A paper by Jeremy England at MIT made headlines this month with the audacious claim that he had discovered the thermodynamic basis for how life emerged from matter. Reading through the article, one comes away with the impression that this “new” idea is simply that organisms exist because they disperse energy, which is to say produce entropy. Mixis readers will know, however, that the concept of an organism as a dissipative structure is by no means new. For people unfamiliar with the history of the interaction between thermodynamics, biology, and ecology, the names Alfred Lotka, Ilya Prigogine, Harold Morowitz, and Jeffrey Wicken are good places to start. Eric Schneider and Dorion Sagan’s book "Into the Cool" provides an accessible history of many of these ideas, and of course, much to my adviser’s chagrin, there is the stuff I have written while I should have been doing research. All that says nothing about England’s contribution, however, which is real and substantive, so let’s talk about it.
The concept of entropy has been mired in confusion for much of its popular history, and misunderstandings of the second law still provide fuel for zealous creationists. In my ongoing attempt to make entropy a simple and intuitive notion, I have come to prefer two descriptions. First, entropy quantifies how much energy (in the form of radiation or matter) is dispersed during a process. This is because, in accordance with the first law, energy cannot be created or destroyed, only spread out and mixed up. Frank Lambert has long been a proponent of this “call it dispersal, not disorder” approach.
Second, and our topic for today, is that the amount of entropy produced by a process is related to the irreversibility of that process. This fact is mathematically inscribed in an idea called “the fluctuation theorem.” We can derive a very simple, special case of it by imagining that we have a single molecule which can take on two different, distinguishable shapes which we will call A and B. So we have a two-state system consisting of one thing that can transition between the two states:
A ⇆ B
When the molecule is in shape (state) A, there is some chance that a thermal wiggle (fluctuation) will come along and rearrange it into state B. If you know this probability and the temperature, you can calculate the rate for the A to B transition. We call this the forward rate. Likewise, when in shape B, there is some chance that it will flip back into A, and thus also a rate for the reverse transition. The definition of equilibrium is that the forward and reverse rates for all possible processes in a system are equal. Irreversibility, on the other hand, is characterized by the fact that the rate of a process in one direction is much larger than the rate in the reverse direction, yielding a very large ratio between the rates. So if A rapidly transitions to B but B very slowly or almost never turns back into A, then the transition can be considered irreversible. You can see that it is a matter of degree, but the ratios involved in chemistry may be so large that a process is well approximated as absolutely irreversible. If, for example, the time one expects to wait for B to turn back into A is much longer than the time one expects B to continue to exist, well then you don’t expect to see A again at any point in the future.
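To make the rates-and-ratios picture concrete, here is a minimal sketch of that two-state molecule as a random process. The per-step flip probabilities are made-up numbers, chosen only so that the forward rate dwarfs the reverse rate:

```python
import random

# Illustrative two-state system: one molecule hopping between shapes A and B.
# These per-step flip probabilities stand in for thermally activated rates;
# the specific values are invented for demonstration.
P_A_TO_B = 0.20   # forward: chance per time step that A flips to B
P_B_TO_A = 0.002  # reverse: chance per time step that B flips back to A

def simulate(steps, seed=1):
    """Track how much time the molecule spends in each state."""
    rng = random.Random(seed)
    state, time_in = "A", {"A": 0, "B": 0}
    for _ in range(steps):
        time_in[state] += 1
        p_flip = P_A_TO_B if state == "A" else P_B_TO_A
        if rng.random() < p_flip:
            state = "B" if state == "A" else "A"
    return time_in

occupancy = simulate(1_000_000)
# At steady state, the time ratio B/A settles near the rate ratio
# P_A_TO_B / P_B_TO_A = 100, so the molecule is almost always found as B.
print(occupancy["B"] / occupancy["A"])  # close to 100
```

With a forward/reverse ratio of 100 the molecule still flips back now and then; crank the ratio up by many more orders of magnitude, as in real chemistry, and you would essentially never see A again.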
So what the fluctuation theorem tells us is that as the ratio between the forward and reverse rates of a process gets larger and larger, we expect more and more entropy to be produced as a result of that process. Let’s turn that around, though, causally: the reason that a process may be spontaneous in one direction and unlikely, approaching a limit of impossible, in the other is the direction of increasing entropy set by the arrow of time. Energy is always dispersing because we live after the big bang, and if something is possible in one direction but not in reverse, that fact is a result of the energy that the process disperses. Once dispersed, energy will not return to a more concentrated form without energy from some other portion of the universe consequently being dispersed in a more than compensating amount. Mathematically, irreversibility is an exponential function of entropy production, so the fluctuation theorem can be rendered in a mixed math/plain language form like so:
(forward rate / reverse rate) = exp[entropy change of the surroundings, in units of Boltzmann’s constant]
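As a quick numeric sketch (the rates below are invented purely for illustration), rearranging that relation gives the entropy change of the surroundings as the natural log of the rate ratio:

```python
import math

# Boltzmann's constant in joules per kelvin
K_B = 1.380649e-23

# Hypothetical rates for a process and its reverse (illustrative numbers only)
forward_rate = 1.0e6   # events per second
reverse_rate = 1.0e-2  # events per second

# Taking the log of both sides of the fluctuation relation: the entropy
# change of the surroundings, in units of k_B, is ln(forward / reverse).
entropy_in_kB = math.log(forward_rate / reverse_rate)
entropy_in_J_per_K = K_B * entropy_in_kB

print(f"{entropy_in_kB:.1f} k_B")  # ln(1e8) ≈ 18.4 k_B
```

Note how forgiving the logarithm is: a rate ratio of a hundred million corresponds to only about 18 k_B of entropy, which is why even modest entropy production makes a process overwhelmingly one-directional.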
Ok, but people have known this for more than 100 years. What they didn’t know for certain until the ’90s, however, was that this statement is true whether or not a system is in thermal equilibrium with its surroundings. This is another topic for another time, but deriving statements that are true for systems arbitrarily displaced from equilibrium is HARD. Ilya Prigogine got a Nobel Prize for writing down expressions for the time derivatives of the entropy change in what is called a “near-equilibrium regime.” Until recently, and still today if you ask the opinions of older thermodynamicists, a common viewpoint was that nothing persuasive can be said generally about non-equilibrium systems and that their unpredictable dynamics lie entirely outside the time-reversible world of mechanics. Happily, this view is receding, and England’s work will hopefully be a major step forward in popularizing non-equilibrium ideas in the larger scientific community.
Ok, so the fluctuation theorem isn’t restricted to equilibrium systems: so what? Well, that means it has general validity for any macroscopic process that we can successfully represent in terms of a macrostate/microstate scheme. Among those schemes are the macrostates “a jar containing one bacterium” and “a jar containing two bacteria.” You can read the paper for yourself; I think it’s fairly accessible if you don’t get hung up on the math notation. The summary is that using this coarse-graining scheme, England is able to calculate the minimum entropy that must be produced during a bacterial replication process, and it turns out that E. coli are clearly very close to being perfect replicating machines. This is a wonderful result, but two points in the paper, for me, are more interesting than the calculation itself.
First, the generality of the fluctuation theorem extends the second law for macroscopic, irreversible processes. What England has done for replication applies generally to anything we can successfully represent in terms of a macrostate with a corresponding probability distribution of microstates. In plainer language, the second law already told us that the entropy of the universe must increase during any real process. We now know that in addition, if we can find a way to estimate the probability or rate of a process and the probability or rate of the reverse process, then we can say by how much the entropy of the universe must increase during that process, not just that it must be greater than zero. The full generality of that statement is difficult to appreciate, and I’m still pondering the immense consequences.
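To give a feel for the kind of number such an estimate produces, here is a toy calculation in the spirit of that statement, not England’s own (his accounting is more careful). Treat replication as the forward process and spontaneous disintegration as the reverse; the growth rate below is E. coli-like, while the decay timescale is a pure placeholder I made up:

```python
import math

# Toy lower bound on entropy production per replication event, using the
# rate-ratio form of the fluctuation relation. NOT England's actual
# calculation; the decay timescale is a made-up placeholder.
g = 1 / (20 * 60)                    # growth: one division per ~20 minutes
delta = 1 / (100 * 365 * 24 * 3600)  # decay: falls apart once per century (hypothetical)

# If g / delta <= exp(entropy produced), then at minimum:
min_entropy_in_kB = math.log(g / delta)
print(f"at least {min_entropy_in_kB:.0f} k_B per replication")
```

Even with a generously long made-up decay time, the bound comes out to only a few tens of k_B, tiny compared to the heat a real cell dissipates while dividing; that gap is what lets England say how close real bacteria come to the thermodynamic limit.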
Second, in regard to biology, the results of his calculation lead England to the following insight:
“We often think of the main entropic hurdle that must be overcome by biological self-organization as being the cost of assembling the components of the living thing in the appropriate way. Here, however, we have evidence that this cost for aerobic bacterial respiration is relatively small, and is substantially outstripped by the sheer irreversibility of the replication reaction as it churns out copies that do not easily disintegrate into their constituent parts.”
In other words, the process of replication is driven forward by its irreversibility; irreversibility is the mechanism of the second law’s probabilistic mandate to randomize matter and energy. The organization among the parts of the organism which is engendered by this irreversibility is, in fact, a minor component of the overall energetics of the process. The organization is easily “paid for” by the heat (entropy) given off by the chemistry of replication, and the amount of entropy is related to the “durability” of the structure because the rate of the reverse process is the rate at which a bacterium will spontaneously decompose into its components. Thus England is able to conclude that no matter what coarse-graining scheme one selects in order to count microstates and macrostates, “the resulting stochastic population dynamics must obey the same general relationship entwining heat, organization, and durability.”
In my own more romantic, anthropic terms, the fact that bacteria grow rapidly but don’t fall apart readily is, therefore, an indication that the production of new bacteria is an excellent way for the universe to do something irreversible and thereby get itself closer to the finish line it secretly desires above all else: the heat death, wherein all matter will finally be able to lay down its burden and relax.