Second, and our topic for today, is that the amount of entropy produced by a process is related to the irreversibility of that process. This fact is mathematically inscribed in an idea called “the fluctuation theorem.” We can derive a very simple, special case of it by imagining that we have a single molecule which can take on two different, distinguishable shapes which we will call A and B. So we have a two-state system consisting of one thing that can transition between the two states:

A ⇆ B

When the molecule is in shape (state) A, there is some chance that a thermal wiggle (fluctuation) will come along and rearrange it into state B. If you know this probability and the temperature, you can calculate the rate of the A-to-B transition. We call this the forward rate. Likewise, when in shape B, there is some chance that it will flip back into A, and thus also a rate for the reverse transition. The definition of equilibrium is that the forward and reverse rates for every possible process in a system are equal.

Irreversibility, on the other hand, is characterized by the rate of a process in one direction being much larger than the rate in the reverse direction, yielding a very large ratio between the two rates. So if A rapidly transitions to B, but B only very slowly or almost never turns back into A, the transition can be considered irreversible. You can see that this is a matter of degree, but the ratios involved in chemistry can be so large that a process is well approximated as absolutely irreversible. If, for example, the time one expects to wait for B to turn back into A is much longer than the time one expects B to *continue to exist*, then you don't expect to see A again at any point in the future.
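To make the "matter of degree" concrete, here is a minimal simulation sketch of the two-state molecule. It assumes the standard picture of a memoryless (Markovian) flipper: the dwell time in each state is exponentially distributed with mean 1/rate. The rate values, function name, and units are illustrative choices, not anything from the text. When the forward and reverse rates are equal, the molecule splits its time evenly between A and B; when the forward rate dwarfs the reverse rate, the molecule is almost always found in B, which is what we mean by calling the transition effectively irreversible.

```python
import random

def simulate_two_state(k_f, k_r, n_transitions=100_000, seed=0):
    """Simulate a single molecule hopping A <-> B.

    k_f: rate of the forward A -> B transition (illustrative units, 1/time)
    k_r: rate of the reverse B -> A transition
    Returns the fraction of total time spent in state B.
    """
    rng = random.Random(seed)
    state = "A"
    time_in = {"A": 0.0, "B": 0.0}
    for _ in range(n_transitions):
        # The rate of leaving the current state sets the dwell time there.
        rate = k_f if state == "A" else k_r
        # Waiting time until the next thermal "flip" is exponential.
        time_in[state] += rng.expovariate(rate)
        state = "B" if state == "A" else "A"
    return time_in["B"] / (time_in["A"] + time_in["B"])

# Equal rates: equilibrium, time is split ~50/50 between A and B.
print(simulate_two_state(k_f=1.0, k_r=1.0))
# Forward rate 1000x the reverse: effectively irreversible, the
# molecule spends ~99.9% of its time in B and A is rarely seen.
print(simulate_two_state(k_f=1000.0, k_r=1.0))
```

The fraction of time in B should approach k_f / (k_f + k_r), since the mean dwell in A is 1/k_f and in B is 1/k_r; a rate ratio of 1000 already pins the molecule in B 99.9% of the time, and real chemical ratios can be vastly larger.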