Entropy, ecology and evolution

Not long ago, while diving through our webring on an evening not unlike any other, I stumbled across a reference to an article published in MDPI’s Entropy journal titled Entropy, Ecology and Evolution: Toward a Unified Philosophy of Biology, and the ideas it presented struck me as one of the most beautiful expressions of what life is, as both structure and process.
To paraphrase the abstract: ecology in the classical sense is the science that studies the interaction between organisms and their environment, and an ecosystem is defined as a community of living organisms in conjunction with the nonliving components of their environment, interacting as a system. While different branches of biological science have focused on elucidating details of structure and function in narrower scopes centered around heritability (genetics), structure (molecular and cellular biology, histology and anatomy), and metabolism (physiology and energetics), the essay proposes that the overarching context of thermodynamics that controls all biological processes and the evolution of life remains understudied as an originating foundation.
The essay is an attempt at reframing the titular three concepts in a more generalized context that is consistent with the fundamental physical processes that govern natural change, specifically the flow and degradation of energy and the creation of disorder. Biologists have focused on the structures of life that are the byproduct of a larger universal process of the unwinding of the great spring of order in the cosmic clock and the degradation of useful energy in the universe. The proposal of the essay is that seeing biology as a thermodynamic process governed by the most fundamental laws of physics brings a newfound clarity and understanding of life, its structure, evolution and meaning.
Immediately, these ideas became a point of fascination and have taken up residence in a corner of my mind, cycling through my brain over the past few days. This only happens every so often, and this time I wanted to savor the occurrence by trying to build an understanding and intuition here, in writing. The essay can be somewhat dense in places, and its format assumes some prior knowledge, so I’ll try to lay things out in a way that will hopefully help with finding and placing the pieces of this grand puzzle.
On that point, a word of warning: the text below is at once an adaptation, an extension, and possibly a mutation of the original. While it intends to present the same ideas with some added context (and a few omissions, for the sake of brevity), it is a derivative work, and it reuses a lot of the structure and phrasing of the original. The pieces of it that are quoted verbatim are framed as blocks of code; the rest is a loose remix by yours truly.
Introduction
The first section of the essay recounts the context set by Darwin’s theory of evolution, which refocused biology on the study of evolution and how it drives the emergence of structure and adaptation. Darwin observed that biological reproduction tends to produce far more offspring than the resources in the environment can support, and that variation within a species that was more advantageous for survival and reproduction would be preserved through the process that came to be called natural selection. The struggle for existence thus became seen as the driving mechanism for evolutionary change and the origin of species.
Although biological evolution can be studied as the interplay between heritability, structure and metabolism, the focus on studying the emergent structures and the high-level processes of evolution can overlook the overarching context of thermodynamics. From that fundamental perspective, organisms are self-replicating dissipative structures.
All structure and non-random aggregation in nature is driven by a greater degree of disaggregation and disorder in the larger system in which it is contained. This may be more familiar when stated in the form that the entropy of an isolated system always increases for irreversible processes. While theoretical reversible processes could maintain thermal equilibrium and result in a total entropy change of zero, they remain a theoretical case, as all natural processes are irreversible and will result in the total entropy increasing. These are statistical processes in the sense that there are more ways to be disordered than ordered, more ways for something to be broken than fixed, and more ways to be dead than alive.
A dissipative structure is something that builds and maintains order by creating more disorder in its environment. Organisms are dissipative structures, or rather systems of dissipative structures. All biochemical reactions are governed in their direction and rate by free energy. A reaction proceeds spontaneously when it releases free energy, transforming highly ordered energy capable of doing work into degraded thermal energy with a lower capacity to do work. The molecules, tissues and systems in an organism’s body are dissipative structures built and maintained through continuous consumption of free energy and degradation of order. Organisms have evolved through natural selection to maximize their ability to maintain and reproduce their organization through efficient utilization of energy, and this is directly linked to increasing entropy (through the second law of thermodynamics). Organisms are self-replicating dissipative structures.
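The free-energy criterion above can be sketched numerically. This is a minimal illustration of the textbook Gibbs relation ΔG = ΔH − TΔS; the function name and the numbers plugged in are my own illustrative choices, not values from the essay.

```python
# Sketch of the free-energy criterion for spontaneity: a process is
# spontaneous when the Gibbs free energy change is negative,
# delta_G = delta_H - T * delta_S.

def gibbs_free_energy(delta_H, temperature, delta_S):
    """Return delta_G in kJ/mol given delta_H (kJ/mol), T (K), delta_S (kJ/(mol*K))."""
    return delta_H - temperature * delta_S

# Illustrative numbers: an exothermic, entropy-increasing reaction at body temperature.
dG = gibbs_free_energy(delta_H=-20.0, temperature=310.0, delta_S=0.05)
print(dG)      # -35.5 kJ/mol
print(dG < 0)  # True -> the reaction proceeds spontaneously
```

Note that both terms push the same way here; a reaction that absorbs heat can still be spontaneous if the entropy term TΔS is large enough to make ΔG negative.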
Throughout the introduction’s reasoning, I think I notice the traces of why the change of focus from evolution to thermodynamics is proposed. Although evolution is a powerful lens through which to view the structures and processes of the natural world, it does not seem to be a central process. It’s hard to put into words, but it feels like a second-order process. Thermodynamic processes just feel more fundamental to me. The essay proposes that the central process is the dissipation of free energy according to the second law of thermodynamics, and that evolution is better conceptualized as the emergence of self-replicating dissipative structures that, through natural selection, become increasingly more efficient at degrading free energy.
The structures of life, on which biologists have focused their attention in describing, measuring, quantifying, and explaining, are really the byproduct, the "dissipative structures", created by the grander universal process of the unwinding of the great spring of order and degradation of useful energy in the universe.
Part 1 - Heat, work, entropy and free energy
There are four laws of thermodynamics.
The zeroth law allows the definition of temperature in a non-circular way without reference to entropy, its conjugate variable. This is an empirical definition, and it establishes the transitive relation between the temperatures of multiple bodies in thermal equilibrium. The law is often stated in the following form: “If two systems are both in thermal equilibrium with a third system, then they are in thermal equilibrium with each other.”
The first law is a version of the conservation of energy, adapted for thermodynamic processes. In general, the conservation law states that the total energy of an isolated system is constant; that energy can be transformed from one form to another but can be neither created nor destroyed. According to the author, with the first law, energy supplanted work as a central principle of physics.
The second law recognizes a fundamental asymmetry in nature: the irreversibility of natural processes, and in many cases, the tendency of natural processes to lead to spatial homogeneity of matter and energy. One of the simplest formulations of the second law is the Clausius statement, which states that heat does not spontaneously pass from a colder to a hotter body. An important corollary can be stated here: although the total quantity of energy must be conserved in any process, the distribution of that energy and its utility in doing work changes in an irreversible manner.
The third law deals with properties of matter at very low temperatures, and can be stated as: “A system’s entropy approaches a constant value as its temperature approaches absolute zero.” Another consequence of the third law is that matter cannot be brought to a temperature of absolute zero in a finite number of steps.
Although work can be converted into heat, and heat can be converted to work (with some loss), it is important to note the difference in the effort and sophistication needed to produce heat on the one hand and work on the other. For instance, producing heat from petrol is as simple as supplying heat initially to trigger combustion and oxygen to fuel the spontaneous chemical reactions that release energy from complex organic molecules and produce byproducts of water and carbon dioxide. On the other hand, producing work from petrol involves an internal combustion engine made of many parts built and assembled with high precision. The question arises: why is it so much harder to produce work than heat from a given amount of energy?
The fundamental challenge in any biological or technological task is to extract ordered motion from disordered motion, for therein lies the difference between heat and work.
The Carnot engine (a maximally efficient heat engine operating on the Carnot cycle) can be useful in illustrating this idea. The efficiency of such a heat engine can be shown to depend only on the temperatures of the hot and cold reservoirs, and to be strictly less than 100% in the absence of either a hot reservoir at an infinitely high temperature or a cold reservoir at a temperature of absolute zero. An efficiency less than one means that a real engine in a real environment cannot convert heat into work completely.
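The Carnot bound is a one-line formula, so it is easy to make concrete. The sketch below computes η = 1 − T_cold/T_hot; the function name and the reservoir temperatures are illustrative choices of mine.

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible to work between two reservoirs (kelvin)."""
    if t_cold <= 0 or t_hot <= t_cold:
        raise ValueError("require t_hot > t_cold > 0")
    return 1.0 - t_cold / t_hot

# A steam-engine-like case: 500 K boiler, 300 K environment.
print(carnot_efficiency(500.0, 300.0))  # 0.4 -> at most 40% of the heat becomes work
```

The formula makes the two limiting cases in the paragraph visible: efficiency approaches 1 only as t_cold approaches absolute zero or t_hot grows without bound, neither of which a real engine has access to.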
After each cycle, the working fluid must be brought back to its initial state. During isothermal expansion, the entropy of the fluid increases during its intake of heat. These two combined mean that an equal amount of entropy must flow out of the working fluid before the next cycle can begin. The entropy of the working fluid can only change when it is in contact with either the hot or the cold reservoir. An equal amount of entropy flowing to the hot reservoir would require all the work done during the first half of the cycle to be undone in the second half. Work can only exit the system if the entropy flows out to the cold reservoir while the engine keeps running forward.
Although the net entropy change in the fluid must be zero after each cycle, the sum of the entropy changes in the hot and cold reservoirs can be shown to be greater than zero, leading to an increase in the total entropy of the surroundings. During heat transfer, without an infinite length of time for thermal equilibrium to be maintained, the temperature of the fluid is slightly less than the temperature of the hot reservoir or slightly more than the temperature of the cold reservoir.
The entropy change of a system can be defined as the heat transferred divided by the temperature of the system during the transfer. During heat intake, the fluid may gain all the heat lost by the hot reservoir, but if the temperature of the fluid was slightly lower, the increase in entropy it experienced ends up being slightly higher than the decrease in entropy experienced by the hot reservoir. The same difference in temperature can occur as entropy flows out of the fluid while it expels heat into the cold reservoir, leading to the cold reservoir gaining more entropy than the fluid loses. Any such infinitesimal imperfection in either direction leads to the cold reservoir gaining more entropy than the hot reservoir loses, and hence to a net increase in the total entropy of the system and its surroundings. For a real heat engine, the total thermodynamic process is irreversible.
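The bookkeeping in that paragraph fits in a few lines. Using ΔS = Q/T, the sketch below (with illustrative numbers of my own) shows that any finite temperature gap between the reservoir and the fluid makes the total entropy change positive.

```python
# Entropy bookkeeping for heat transfer across a small temperature gap:
# the receiving body is slightly colder, so it gains slightly more entropy
# than the source loses, and the total entropy rises.

def entropy_change(heat, temperature):
    """delta_S = Q / T for heat transferred at (approximately) constant T."""
    return heat / temperature

Q, T_hot, gap = 1000.0, 500.0, 1.0            # J, K, K (illustrative values)
dS_hot   = entropy_change(-Q, T_hot)          # reservoir loses Q at T_hot
dS_fluid = entropy_change(+Q, T_hot - gap)    # fluid gains Q at a slightly lower T
print(dS_hot + dS_fluid > 0)  # True: any finite gap makes the transfer irreversible
```

Shrinking `gap` toward zero drives the net entropy production toward zero, which is exactly the idealized reversible limit that requires infinitely slow heat transfer.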
When we grasp that energy disperses, and only through its dispersal can there be any work, aggregation, organization or structure created, we grasp the fundamental engine of all nature. Work involves coherent motion; heat involves incoherent motion. Only by creating an excess of incoherent motion can we increase coherent motion in any local subsystem.
In terms of the behaviour of individual atoms, the tendency of energy to disperse leads to a dispersal of their location and the equilibration of their energy levels. This is not a teleological process; it arises ontologically from the ways that particles happen to interact and transfer energy among themselves. Any action that transpires and structure that is created is a consequence of these interactions. The cooling of a heat engine as it discards energy into a cold sink is one example of a more general idea of cooling, and analogous processes of cooling can be found to underlie chemical reactions and all other processes that maintain order or concentrate energy.
Cooling is the process through which the total energy of the system remains the same, but the distribution of energy spontaneously equilibrates and disperses, and becomes more disordered.
The disorder of a system can be quantified through entropy, a concept central to thermodynamics. In statistical mechanics, Boltzmann’s entropy formula relates the entropy S of an ideal gas to the multiplicity W (the number of microstates corresponding to a particular macrostate in a thermodynamic system) through the equation S = k_B · ln W, where k_B is the Boltzmann constant. This formula applies only to systems in which each microstate is equally probable, and can be thought of as a corollary of the more general equation which also considers the probability of each microstate.
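The relation between the two formulas is easy to verify numerically. The sketch below implements Boltzmann’s S = k_B ln W alongside the more general Gibbs form S = −k_B Σ pᵢ ln pᵢ and checks that they agree when all W microstates are equally probable; function names and the value of W are my own illustrative choices.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B * ln(W) for a macrostate with W equally probable microstates."""
    return K_B * math.log(W)

def gibbs_entropy(probabilities):
    """S = -k_B * sum(p * ln p): the general form for arbitrary microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# With equal probabilities p_i = 1/W, the Gibbs form reduces to Boltzmann's.
W = 1024
equal = [1.0 / W] * W
print(math.isclose(boltzmann_entropy(W), gibbs_entropy(equal)))  # True
```

Skewing the probabilities away from uniform only lowers the Gibbs entropy, which is one way of seeing why the equal-probability macrostate (thermal equilibrium) is the maximum-entropy state.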
As an example, in thermodynamics, the universe is usually divided into a system of interest and its surroundings. In that case, the entropy of Boltzmann’s microscopically specified system can be identified with the system entropy in classical thermodynamics. The microstates of such a thermodynamic system of interest are not equally probable. For instance, if a system of interest is kept at a fixed temperature by allowing contact with a heat bath, high energy microstates are less probable than low energy microstates. In more direct terms, heat is unlikely to flow from the colder system to a hotter system. In less direct terms, for a given macrostate of the full system, the probability of microstates that correspond to a macrostate of the system of interest in which it is hotter than its surroundings is less than the probability of microstates in which it is in thermal equilibrium with its surroundings.
Nevertheless, it is fair to say that the entropy of a system observed to be in a particular macrostate is proportional to the logarithm of the number of microstates that correspond to that macrostate. Maximum entropy occurs when the full system is at thermal equilibrium and there is no net flow of energy from one part of the system to another. Except for chance fluctuations, the system will remain in equilibrium forever.
This simple idea of random dissipation of order and energy is deceptively profound. Boltzmann’s equation describes the statistical way in which systems evolve. This illustrates the fundamental irreversibility of natural change. All structures and events correspond to the evolution of the universe through successive states of increasing probability, with disordered, dispersed energy states having higher probability than organized and concentrated states.
Part 2 - Organisms are self-replicating dissipative structures
Regarding ecosystems, it has been said that they are not only more complex than we think, but that they are more complex than we can think (Frank Egler). The interaction of a vast number of organisms with the structure of their environment is inconceivably complex. The profound idea that emerges from our previous notion - that the random dissipation of energy gives rise to all structure and change - is that ecosystems are structured and maintained by the simplest thermodynamic process that can be conceived: the random dissipation of order and energy.
At this point, the author briefly describes how chemical reactions are similar to the processes of heat engines, with reference to chemical thermodynamics. For me, chemistry is a topic even more distant from my understanding than physics, so I’ll skip over all that here, and point out only a single sentence with far-reaching implications: “so long as reactions are linked such that one change may be constructive and lead to a local reduction in entropy, chemical and biological systems can grow increasingly complex.” The mere fact that, in the right conditions, reactions can constructively interfere and produce increasing complexity feels unobvious and circumstantial, similar to the feelings evoked by the anthropic principle, I guess.
This thermodynamic process underlies the major biochemical pathways, even ones as fundamental to life on Earth as photosynthesis. In oxygenic photosynthesis, the incoming low-entropy, high-energy photons of solar radiation are first used, in Photosystem II, to drive reactions that store a lesser part of that energy in an intermediate product, discarding most of the energy as heat. A similar process happens in Photosystem I, where another photon drives the reaction, storing some of the energy in the intermediate product and discarding most of it as heat. In this process, energy is taken from photons to drive chemical reactions up the free energy ladder, storing usable energy, but causing an overall degradation of the quality of energy received as input.
Much of the incoming energy is discarded as heat, with much less ability to do work. On a longer timescale, if Earth is in a state of radiative equilibrium, the energy of the original photons is radiated away into space as a greater number of high-entropy, low-energy photons of a longer wavelength.
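That last point can be made roughly quantitative. Per unit of energy, the entropy carried by thermal radiation scales approximately as 1/T, and the mean photon energy scales as T, so in radiative equilibrium the ratio of outgoing to incoming entropy (and photon count) per joule is roughly T_sun/T_earth. The temperatures below are standard rough values, and the calculation is my own back-of-the-envelope sketch, not a figure from the essay.

```python
# Rough entropy bookkeeping for Earth's radiative balance: the same energy
# arrives as few high-energy solar photons and leaves as many low-energy
# thermal photons, carrying far more entropy out than came in.

T_SUN = 5800.0    # K, solar photosphere (approximate)
T_EARTH = 255.0   # K, Earth's effective emission temperature (approximate)

# Per joule, outgoing entropy / incoming entropy ~ T_SUN / T_EARTH,
# and roughly the same ratio holds for the number of photons.
print(round(T_SUN / T_EARTH, 1))  # 22.7 -> each solar photon leaves as ~20 thermal photons
```

The factor of roughly twenty is the entropy budget that every dissipative structure in the biosphere, from chloroplasts to ecosystems, ultimately draws on.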
Conversely, the citric acid cycle is a biochemical pathway used by respiring organisms to release energy stored in nutrients. During the cycle, the energy originally stored is degraded through a number of intermediate reactions in which oxygen is consumed and carbon dioxide is released, while the energy is made available in molecules such as ATP and GTP. At each step, some of the energy released is degraded and discarded as heat, increasing the entropy of the universe.
Biochemical heat engines drive all biological processes, and these heat engines run by dissipating order and increasing disorder. They function because they transform ordered motion and concentrated energy into disordered motion and dispersed energy.
Dissipative structures build and maintain order by degrading energy and producing disorder in their environment. These structures arise as a consequence of the spontaneous dispersal and degradation of energy and the associated increase in entropy and disorder. They are sustained by a flow of energy, without which they decay. During any process that produces (increases or maintains) a dissipative structure, the rate of entropy generation in the universe is increased, as energy is dissipated more rapidly in the presence of a dissipative structure than in its absence.
Life on Earth also causes the entropy of the universe to increase at a greater rate than it would if the photons had fallen on inanimate rock. Without the source of their energy, organisms will die and their dissipative structures will decay.
They are thermodynamic instabilities driven by the flow of energy and the transduction and degradation of that energy.
More accurately, organisms are systems of dissipative structures. All the molecules, tissues, organs and systems in their body are dissipative structures built and maintained by the continued consumption and degradation of order and energy. Beyond this homeostasis, organisms have also evolved to reproduce themselves, making them self-replicating dissipative structures. Evolution can be understood as the process through which variants of these structures that are better able to efficiently consume energy (and thereby increase entropy) are selected for. Life does not evolve first and then just happen to dissipate energy as a side effect.
Life is an emergent property of a system that has evolved to dissipate energy efficiently and increase entropy, not the other way around.
Part 3 - Ecosystems are networks of self-replicating dissipative structures
Proceeding upwards on the cosmic ladder, the essay tries to show how heat engine dynamics and the central process of the random dissipation of energy and the increase of entropy can be observed at the scale of ecosystems. The structure of ecosystems can also be phrased in terms of emergent dissipative structures driven by the flow of energy and the increase in entropy.
In ecosystems, energy is transferred among so-called trophic levels and becomes degraded, dispersed and diminished, going from higher to lower quality and from greater to lesser quantity. This transformation and degradation of energy within organisms is analogous to the decrease in usable energy along the food web and energy pyramid.
Autotrophs are organisms that can convert abiotic (non-living) sources of energy into energy stored in organic compounds. They do not need a living source of material or energy and are the (primary) producers in a food chain, at the first trophic level. As an example, photoautotrophs can utilize light energy and inorganic compounds to produce organic materials to sustain their own metabolism, such as plants, algae and cyanobacteria. In contrast, heterotrophs obtain matter and energy by either consuming material produced by autotrophs directly, or indirectly, by eating other heterotrophs.
As a rule of thumb, on Earth, the loss of usable energy between trophic levels is on the order of 90% in terrestrial food webs and somewhat less in aquatic food webs. The dependence of heterotrophs on autotrophs and other heterotrophs builds up towards multiple trophic levels in many ecosystems, with increasingly attenuated production at each level. Due to the high loss at each level, it is rare to find more than three or four trophic levels. However, aquatic systems can more commonly display additional trophic levels due to their higher efficiency and stronger partitioning of trophic niches.
The energetic loss between trophic levels is a direct analogy to the concept of thermodynamic cooling that drives the heat engine model of extracting work from heat.
In trophic transfer, there are losses of both nutrients and energy at multiple points. For instance, herbivores do not eat all the plant mass produced, and they do not assimilate all of the biomass they do eat; much of it is lost as excrement, urine and respiration products. Nor do they convert all of the energy they assimilate into biomass. The inherent thermodynamic inefficiency of metabolism and subsequent dependent biochemical pathways means that a large portion of the assimilated energy is used for homeostasis and does not build new biomass. As consumption and production layer on each other along the pathways of energy flow, their efficiencies multiply and lead to a sharp reduction in the fraction of net primary productivity available at higher trophic levels.
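The multiplication of efficiencies can be sketched directly. Using the rule-of-thumb ~10% transfer per level mentioned above (the starting amount and per-level values below are illustrative numbers of my own):

```python
from functools import reduce

def energy_at_level(primary_production, efficiencies):
    """Usable energy remaining after multiplying per-level transfer efficiencies."""
    return reduce(lambda energy, eff: energy * eff, efficiencies, primary_production)

# 10,000 kJ fixed by plants, then three transfers at ~10% efficiency each:
remaining = energy_at_level(10_000.0, [0.1, 0.1, 0.1])
print(round(remaining, 6))  # 10.0 -> only ~10 kJ reaches the fourth trophic level
```

Three successive 90% losses leave a thousandth of the original energy, which is why food chains rarely stretch past a fourth level: the compounding is geometric, not additive.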
Consumption (assimilation) and production efficiencies vary among heterotrophs with factors such as taxonomic group, feeding strategy (herbivory, omnivory, and so on), and mode of body temperature regulation (metabolic or external).
The paragraphs above describe the energetics of ecosystems as aggregate systems. However, it is important to appreciate that ecosystems have spatial structure and temporal dynamics that drive pattern-process relationships that govern distribution, abundance, fitness and evolution of organisms.
After the above, the essay mentions a debate in landscape ecology that centers around attempts to link thermodynamic processes to landscape patterns. While some pursue ecological thermodynamics, others think that thermodynamics has nothing to do with landscape ecology and that the entire endeavour should be abandoned.
This is where I think I found evidence of a feeling about the fundamental difficulties of using low-level understanding (thermodynamics) to describe structures and processes that are multiple layers of organization and emergence away from the scales at which thermodynamics is usually studied (the molecular scale). Emergence seems to complicate that transfer of understanding to larger distances and higher layers of organization. What seems interesting to me here is the proposition that various processes on larger scales could simply be described as thermodynamic systems and reasoned about as thermodynamic processes. It does seem like a beautiful idea, anyways.
Conclusion
While digesting the later sections, I kept trying to figure out what my conclusion should be, what it should be about. There are a number of possible philosophical consequences of these ideas that I think would be interesting to explore, such as the double-edged tendency of systems to evolve towards efficiently maximizing entropy and the inseparability of dissipation from the creation of complexity.
It would only be a conclusion in name, though. A published essay may be expected to have a section so titled due to its format, but this text is just an imprint of my attempt at ingesting these ideas, far from any point in time when I could distill them into a form that would deserve such a title. Still, I can already say that the essay has put the diversity, beauty and complexity that we see around ourselves (which we can choose to maintain and create) in a new light for me.
There is a grandeur to this view of things, with its simple, undirected process underlying all creation and all structure, and whilst the cosmos has gone on unwinding, according to the second law of thermodynamics, endless forms are being and will be emergent.