A logical question to be expected from students: “How could life develop, that is, change and evolve from simple, primitive organisms into the complex forms existing today, while at the same time there is a generally observed decline and disorganization, as described by the second law of thermodynamics?” The explanations in biology textbooks relied upon by students and instructors are incomplete. A necessary but insufficient premise is that only the total entropy of a system must increase. In this article, I present background information for a lesson plan on entropy and question biology textbook presentations of the second law and of how life could evolve despite it. The principal concept is that biological information in macromolecules provides fresh insight into evolution in the earth’s thermodynamic system.

Experience teaches that neat and tidy children’s rooms become disorganized and haphazard (Figure 1A). Similarly, primordial granite mountain peaks wear away to sand, ancient structures fall into ruins, and your favorite shoes wear out. The universal experience of deterioration and disorganization is the basis of the second law of thermodynamics: the entropy of the world tends to increase (Clausius, 1865). Entropy increases as things come apart and energy is wasted. In contrast to this observed disorganization, over the ages there has been a progression in size, beginning with the smallest of organisms and leading to giant sequoias and blue whales. From the simplest origin, the complex human brain develops. A logical question to be expected from students is “How could evolution occur while at the same time there is generally increasing disorganization?” If biology textbooks are the source of the answer for instructors and students, misconceptions can be created, because the textbook presentations are incomplete. Here, I present information for a lesson plan developed to give a more complete answer, with the following topics: understanding entropy; a digest of biology textbook presentations on the second law and how life could evolve in its presence; questioning the textbooks; information in macromolecules, the key; and entropy in chemical evolution. The suggested principal concept is that biological information in macromolecules provides fresh insight into evolution in the earth’s thermodynamic system of general decline and disorganization. As further objectives, students could be expected to better understand the important but perhaps unfamiliar topic of entropy, and to appreciate that a variety of answers to a complex question can be published, some of which may be incomplete or unsatisfactory.

Figure 1.

(A) “I blame entropy” (after Thompson, 2012). (B) To bring about organization, an arranging process is necessary that employs effort (energy) and entails a form of information processing (i.e., knowing where things are, where they are to go, and a path between).


Entropy in Terms of Energy Unavailable to Do Work

Entropy increases when organisms use food as a source of energy to do work, as in human muscular effort. Complex, ordered molecules such as glycogen stored in muscle tissue are broken down to supply that energy; part of it performs work, and the rest is released to the environment as metabolic heat. Because nutrient energy cannot be converted to work by cells at 100% efficiency, some of the energy is unavailable for work. “Energy unavailable to do work” is one definition of entropy.
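
For instructors who want to make this idea quantitative, the following is a minimal sketch in Python (the 1000 J of nutrient energy and 25% muscular efficiency are assumed values chosen only for illustration): the energy not converted to work is released as heat, and dividing that heat by the absolute body temperature estimates the entropy handed to the surroundings.

```python
# Minimal illustration: entropy passed to the surroundings when muscle
# converts nutrient energy into work. All numbers are assumptions chosen
# for teaching, not measurements.
nutrient_energy_J = 1000.0          # chemical energy consumed (assumed)
efficiency = 0.25                   # fraction converted to useful work (assumed)
body_temperature_K = 310.0          # approximate human body temperature

work_J = nutrient_energy_J * efficiency
heat_J = nutrient_energy_J - work_J                      # energy unavailable to do work
entropy_to_surroundings = heat_J / body_temperature_K    # dS = Q / T

print(f"Work done: {work_J:.0f} J")
print(f"Heat released: {heat_J:.0f} J")
print(f"Entropy increase of surroundings: {entropy_to_surroundings:.2f} J/K")
```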

Life requires a constant input of energy to maintain order, and without energy the complex structures of living systems would not exist. The steady flow of energy necessary to sustain a living system increases entropy. Cain et al. (2009) illustrate this principle using a tool shed, unmaintained and falling into disrepair, being restored by human effort (energy and work; see Figure 2).

Figure 2.

The disorder of a system tends to increase unless it is countered by an input of energy. Unattended organized systems, such as this wooden tool shed, tend to lose their order and become disarrayed. An input of energy, here in the form of human effort, is needed to maintain order and structural organization (After Cain et al., 2009).


Entropy in Terms of Organization & Probability

That entropy is some form of energy is only a partial understanding. Entropy is also identified with order and organization. Entropy as unavailable energy and entropy in terms of organization or order appear to be different but are, in reality, the same quantity. In this second understanding of entropy, the trend toward disorder is described quantitatively in terms of possibilities and probability. For example, there are far more possible arrangements of an untidy, messy room than of a neat, orderly one. Because the untidy arrangements make up most of all possibilities, neat plus untidy, the untidy room is the more probable state and has the higher entropy. Boltzmann (1844–1906) expressed this concept mathematically: S = k ln W, where S is entropy, k is Boltzmann’s constant, ln is the natural logarithm, and W is the number of microstates (distinct arrangements) corresponding to the observed state. The entropy difference between a less organized and a more organized state is therefore ΔS = k ln(W2/W1), which is positive because the less organized state corresponds to many more microstates and is therefore the more probable.
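
To let students see these numbers, here is a minimal sketch in Python of a toy “room” model (the item and location counts are invented for illustration): it counts arrangements and applies S = k ln W to show why the untidy state is overwhelmingly more probable and has the higher entropy.

```python
import math

# Toy "room" model: each of n_items objects can occupy any of n_spots locations.
# The neat room allows exactly one arrangement (everything in its proper place);
# counting every possible arrangement gives the total number of microstates.
# The item and location counts are invented for illustration.
k_B = 1.380649e-23            # Boltzmann's constant, J/K

n_items = 20
n_spots = 10

W_neat = 1                    # one microstate: everything in its place
W_total = n_spots ** n_items  # all possible arrangements

S_neat = k_B * math.log(W_neat)    # S = k ln W; a single microstate gives S = 0
S_total = k_B * math.log(W_total)

print(f"Arrangements of the neat room: {W_neat}")
print(f"Arrangements of any kind: {W_total}")
print(f"Entropy difference: {S_total - S_neat:.2e} J/K")
# Untidy arrangements vastly outnumber the single tidy one, so a randomly
# shuffled room is almost certainly untidy: higher probability, higher entropy.
```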

There is a persistent bias toward randomization (i.e., disorder and disorganization) in nature because random arrangements are more numerous and, therefore, more probable than orderly ones. Every process and reaction is subject to this tendency toward disorder unless it is counteracted by some dynamic. It should be noted that “order” has several meanings, some of which may be inappropriately associated with the second law. Nevertheless, order and organization are related to entropy as described by Boltzmann’s view of the second law.

Textbook Explanations of Evolution & Entropy

Building on the concept of entropy as unavailable energy, biology textbooks focus on the model of an open thermodynamic system, as in these three examples:

As with other spontaneous events, we must distinguish between the system and its surroundings. The second law of thermodynamics indicates only that the total entropy in the universe must increase; the disorder within one part of the universe (the system) can decrease at the greater expense of its surroundings…. Life operates on a similar principle. Living organisms are able to decrease their own entropy by increasing the entropy of their environment. (Karp, 2010)

One of the fundamental characteristics of ecosystems is that they must have access to an external source of energy to drive the biological and ecological processes that produce these localized accumulations of negative entropy.... Virtually all ecosystems (and life itself) rely on inputs of solar energy to drive the physiological processes by which biomass is synthesized from simple molecules. (McGrath, 1999)

During the early history of life, complex organisms evolved from simpler ancestors. For example, we can trace the ancestry of the plant kingdom from much simpler organisms called green algae to more complex flowering plants. However, this increase in organization over time in no way violates the second law. The entropy of a particular system, such as an organism, may actually decrease as long as the total entropy of the universe – the system plus its surroundings – increases. Thus, organisms are islands of low entropy in an increasingly random universe. The evolution of biological order is perfectly consistent with the second law of thermodynamics. (Reece et al., 2011)

And here is a complementary textbook approach to the apparent anomaly of increasing entropy alongside increasingly complex organisms:

Consider the human body, with its highly organized tissues and organs composed of large, complex molecules. This level of complexity appears to be in conflict with the second law but is not, for two reasons. First, the construction of complexity also generates disorder. Constructing 1 kg of a human body requires the metabolism of about 10 kg of highly ordered biological materials, which are converted into CO2, H2O, and other simple molecules…. [M]etabolism creates far more disorder (more energy is lost to entropy) than the amount of order (total energy; enthalpy) stored in 1 kg of flesh. Second, life requires a constant input of energy to maintain order. Without this energy, the complex structures of living systems would break down. Because energy is used to generate and maintain order, there is no conflict with the second law of thermodynamics. (Sadava et al., 2011)

Questions

There are questions that point out inadequacies of the textbook answers. What is the basis for asserting that the total entropy of the system must increase but not that of its parts? The statement of the second law came about from observations of individual processes on earth, not of earth’s system in total. Clausius would have observed wasted heat and Boltzmann would have seen disorder, all at the local level. In all processes and reactions in earth’s system, there is a tendency for entropy to increase, and it will increase unless counteracted by some dynamic. Like the first law of thermodynamics, which states that energy is neither created nor destroyed, the second law is a fundamental scientific law and applies to every process and reaction.

What then is the justification for “islands of low entropy” and “localized accumulations of negative entropy” to arise from a sea of increasing randomness – the remainder of things on earth? Why is it that living entities are mainly the “islands,” and the “sea” consists primarily of inanimate things? These questions are significant because these apparently anomalous local accumulations are the consequence of many adaptations, with succeeding generations being more complex. It cannot be readily concluded that increasing total entropy of earth’s system and surroundings alone accounts for these anomalies.

Is energy from the sun the cause? There are striking similarities between the responses to sunlight of a sunflower and of a solar panel (see Figure 3). A sunflower takes in solar energy and, by photosynthesis, converts it into chemical energy. This energy can be the constant input needed to sustain and grow “islands of low entropy.” A solar panel also takes in the sun’s radiation and, by the photovoltaic effect, produces electrical energy. However, an inanimate solar panel will continually deteriorate and produce less and less electrical energy. What accounts for the difference? It is apparent that the sunflower has something in its composition, beyond what the panel has, that causes the marked difference in response to solar energy.

Figure 3.

Examples of contrasting responses to solar energy: a sunflower grows, capturing and storing more energy, while a solar panel deteriorates, producing less and less electrical energy. What causes the difference?


Similar contrasting responses are observed even within organisms: “Unlike the parts of a cell which simply deteriorate if isolated, whole cells can be removed from a plant or animal and cultured in a laboratory where they will grow and reproduce for extended periods of time” (Karp, 2010). Again a question: What makes a part of a cell react as expected by the second law, and yet the whole cell grows and reproduces? Organisms, such as sunflowers, mature, decline, and die (even if placed in a situation where there is adequate sunlight and nutrients) just as expected by the second law. This behavior is not caused by change in total entropy of earth’s system or the sun’s radiant energy, but by some other governing mechanism. “Remarkably, cells within the body generally die ‘by their own hand’ – the victims of an internal program that causes cells that are no longer needed or cells that pose a risk of becoming cancerous to eliminate themselves” (Karp, 2010).

Consider the premise that if the entropy gain associated with disorder is greater than loss of entropy to organization and order, then organized life comes about. Turning again to the inanimate world for comparison, a classic example of increasing entropy is the melting of ice in a glass in a warm room. Energy flows into the glass, and the ordered H2O molecules of ice melt into the more random molecular arrangement of liquid water. This increase in disorder does not induce or otherwise cause any corresponding increase in organization. The idea of processes that generate more entropy than is lost somehow causing organization is not valid. Similarly, a power-generating station or a motor generator needs a constant input of fuel energy to continue to function but will wear away and not display the characteristics of life. A net gain in entropy and a steady flow of energy can be necessary but are not sufficient to explain life’s anomaly.
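
The ice-water example can be made concrete with a standard worked calculation, sketched below in Python using the familiar textbook values of about 6.01 kJ/mol for the enthalpy of fusion of ice and a melting point of 273.15 K.

```python
# Entropy gained when one mole of ice melts at its melting point.
# Standard values: enthalpy of fusion ~6010 J/mol, T = 273.15 K.
delta_H_fusion = 6010.0     # J/mol, heat absorbed by the melting ice
T_melt = 273.15             # K

delta_S = delta_H_fusion / T_melt    # dS = q_rev / T for a phase change
print(f"Entropy gain on melting: {delta_S:.1f} J/(mol*K)")   # about 22 J/(mol*K)
# The ordered crystal becomes more randomly arranged liquid water; the energy
# flowing in raises entropy and produces no compensating organization.
```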

Organization & Biological Information

Examples of organization are readily found in nature. Honeybees build honeycomb nests based on a regular hexagonal cell design, as shown in Figure 4. Is this activity an improvisation, learned from parents, or reinvented with each generation? Bees must inherit an algorithm, a step-by-step procedure, to accomplish the construction. Being inherited, the algorithm can be traced through genes to informational macromolecules. A defining difference between living and nonliving entities is the presence of DNA and RNA. The sequence of nucleotides in DNA carries the information, used by way of RNA, that runs cellular activities and the program for making more of these molecules (Karp, 2010). “Organisms evolve through changes in their genetic information” (Hillis et al., 2012). It is to be deduced that information of a biological nature, along with energy input, is the means by which “islands of low entropy” can emerge and grow within a thermodynamic system tending toward disorganization and increasing entropy.
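
The idea that a compact, inherited procedure can specify a large, ordered structure can be illustrated with a short sketch (standard hexagonal-packing geometry; the code is only an analogy, not a model of bee behavior): a few fixed rules generate the centers of an arbitrarily large honeycomb-like array.

```python
import math

def honeycomb_centers(rows, cols, cell_radius=1.0):
    """Return (x, y) centers for a hexagonally packed array of cells.

    A short, fixed procedure (an 'algorithm') specifies an arbitrarily large,
    ordered structure; by analogy, bees inherit the building procedure rather
    than a blueprint of the finished comb.
    """
    dx = math.sqrt(3) * cell_radius       # horizontal spacing between centers
    dy = 1.5 * cell_radius                # vertical spacing between rows
    centers = []
    for r in range(rows):
        x_offset = dx / 2 if r % 2 else 0.0   # stagger alternate rows
        for c in range(cols):
            centers.append((c * dx + x_offset, r * dy))
    return centers

for x, y in honeycomb_centers(rows=3, cols=4):
    print(f"({x:5.2f}, {y:5.2f})")
```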

Figure 4.

An illustration of organization created by bees, a structure of hexagonal cells (http://office.microsoft.com/en-us/images/results.aspx?qu=bees&ex=2#ai:MP900316870).


The title of a recent paper, “Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality” (Toyabe et al., 2010), suggests that new understanding is emerging. This experimentally demonstrated relationship between information and energy supports an explanation of evolution that depends on information. The informational aspect of macromolecules such as DNA and RNA has emerged as a prime consideration in explaining biological growth in size and complexity.
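
The scale of this information-energy connection can be illustrated with the widely cited bound that at most kT ln 2 of work can be extracted per bit of information at temperature T; the sketch below simply evaluates that textbook bound at an assumed room temperature and is not a reproduction of the experimental values of Toyabe et al.

```python
import math

# Upper bound on the work extractable per bit of information at temperature T:
# W_max = k_B * T * ln 2 (the Szilard/Landauer bound). Evaluated at an assumed
# room temperature purely to show scale; not the measured values of Toyabe et al.
k_B = 1.380649e-23      # Boltzmann's constant, J/K
T = 300.0               # assumed room temperature, K

work_per_bit = k_B * T * math.log(2)
print(f"Maximum work per bit at {T:.0f} K: {work_per_bit:.2e} J")   # ~2.9e-21 J
```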

It is not a new idea that sequences of nucleotides could be the source of the anomaly of living entities amid normally increasing entropy. Nobel laureate Erwin Schrödinger wrote in his 1944 book What Is Life?: “How does the living organism avoid decay? The obvious answer is: By eating, drinking, breathing and (in the case of plants) assimilating” (Schrödinger, 1992). Nutrient and sunlight energies are necessary to enable biological order without violating the laws of thermodynamics. He also made an apparently overlooked observation:

An organism’s astonishing gift of concentrating a “stream of order” on itself and thus escaping the decay into atomic chaos – of “drinking orderliness” from a suitable environment – seems to be connected with the presence of the “aperiodic solids” the chromosome molecules, which doubtless represent the highest degree of well-ordered atomic association we know of – much higher than the ordinary crystal…. (Schrödinger, 1992)

We now know the “aperiodic solid” to be DNA and related macromolecules such as RNA, the carriers of biological information.

The need for information to overcome disorganization can be illustrated using the untidy room of Figure 1. Mader & Windelspecht (2013) reason that “a neat room is more organized but less stable than a messy room, which is disorganized but more stable. How do you know a neat room is less stable than a messy room? Consider that a neat room always tends to become more messy.” To counteract this tendency, effort is required; this corresponds to energy input. Also necessary is an organizing process, such as that of a parent who puts everything back in its place (see Figure 1B). Bringing about an orderly arrangement entails the processing of information (i.e., knowing where things are, where they are to go, and a path between).

Another important consideration in understanding the role of information in evolutionary change can be seen in an organism’s repair of wear and injury. An automobile fender with a scratch will remain that way, or corrode to a worse condition, unless there is intervention. A human body with a scratch will initiate self-repair. Species would not be likely to continue to exist if they had no repair and maintenance functions, because wear and injury are universally experienced. Healing and restoring functions are extraordinary; they are performed on an as-needed basis and require information about where, when, and how to act. As noted earlier, Cain et al. (2009) stated that organized systems that are not maintained tend to lose their organization, and that a constant input of energy is needed to maintain order. Both energy and information (as supplied by a knowledgeable worker) are needed to maintain order in complex structural organizations. Humans are complex systems and need to be maintained; the body’s mechanisms for doing so are sophisticated and varied. Sometimes external intervention is needed, as in dental procedures: a dentist uses x-rays as a source of information, in addition to the dental work itself, to counteract the natural tendency toward deterioration. Similarly, “For example, failure of a cell to correct a mistake when it duplicates its DNA may result in a debilitating mutation, or a breakdown in a cell’s growth-control safeguards can transform the cell into a cancer cell with the capability of destroying the entire organism” (Karp, 2010).

Entropy in Chemical Evolution

The processes for the creation of the first molecules of life would be characterized thermodynamically by the Gibbs (1839–1903) free energy equation: the change in free energy ΔG equals the change in enthalpy ΔH (energy absorbed or released as heat during a process at constant pressure) of the system minus the product of the absolute temperature (in kelvin) and the change in entropy ΔS of the system: ΔG = ΔH − T ΔS. Entropy, therefore, is a significant factor in chemical evolution.
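
A brief sketch (with invented values of ΔH and ΔS for a hypothetical ordering step) shows how the sign of ΔG, and therefore spontaneity, is decided by the balance of the enthalpy and entropy terms.

```python
def delta_G(delta_H, delta_S, T):
    """Gibbs free energy change dG = dH - T*dS (J/mol).
    A negative dG means the process can proceed spontaneously at temperature T."""
    return delta_H - T * delta_S

# Invented values for a hypothetical ordering step that absorbs heat (dH > 0)
# and lowers the entropy of the system (dS < 0):
dH = 25_000.0     # J/mol (assumed)
dS = -100.0       # J/(mol*K) (assumed)

for T in (273.15, 310.0, 373.15):
    dG = delta_G(dH, dS, T)
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:6.1f} K: dG = {dG:8.0f} J/mol ({verdict})")
```

With these assumed values, ΔG is positive at every temperature; an ordering, entropy-lowering step of this kind must therefore be coupled to an outside source of free energy.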

Because life requires an informational molecule, the culmination of chemical evolution would be a copolymer, perhaps RNA or something like it, because RNA is a most versatile molecule (Lewis et al., 2007). These early molecules must possess encoded information to be able to (1) metabolize to provide energy, (2) construct containment to sustain the nonequilibrium chemistry, and (3) self-replicate. “Consider the state of nucleic acids, and proteins in all your cells: at equilibrium they are not polymers, they would be hydrolyzed to monomer nucleotides and amino acids” (Raven et al., 2011). Because this information necessitates a specific, nonrandom arrangement of nucleotides, generating it must be accompanied by entropy production elsewhere in the system (Andrieux & Gaspard, 2008).
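
To connect sequence and entropy, a minimal sketch follows (the short RNA string is an arbitrary example, not a biologically meaningful sequence): it counts how much information a definite nucleotide sequence represents, since each position singles out one of four bases and a sequence of length N corresponds to one arrangement out of 4^N.

```python
import math

# Information carried by a definite nucleotide sequence. Each position can be
# one of 4 bases, so a specific sequence of length N is 1 arrangement out of
# 4**N: 2 bits (k ln 4 in entropy units) per monomer. The sequence below is
# an arbitrary illustrative string, not a biologically meaningful one.
k_B = 1.380649e-23    # J/K

sequence = "AUGGCUAAGCGUUAA"
N = len(sequence)

information_bits = N * math.log2(4)           # 2 bits per nucleotide
entropy_equivalent = k_B * N * math.log(4)    # k ln(4**N) per molecule

print(f"Length: {N} nucleotides")
print(f"Information content: {information_bits:.0f} bits")
print(f"Entropy equivalent per molecule: {entropy_equivalent:.2e} J/K")
```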

Conclusions

Inadequate answers are found in biology textbooks to the question “How could life develop (i.e., change and evolve from simple, primitive organisms into the complex forms existing today) in view of the second law of thermodynamics?” It cannot be readily concluded that increasing total entropy of earth’s system alone explains why living organisms arise and change within a sea of inanimate things. Material for a lesson plan, developed to show the informational functioning of macromolecules such as RNA and DNA, provides fresh insight into evolution in the earth’s thermodynamic system of general decline and disorganization. Entropy is a significant factor in the Gibbs free energy equation, which is relevant to chemical evolution.

Acknowledgments

Insightful comments were given by Professor Marjorie L. Reaka, Biology Department, University of Maryland. Helpful suggestions were provided by ABT reviewers and Dr. Clark and Judith Foulke. Laura Hoff provided the tool shed illustration. Child’s room photographs are courtesy of Sarah Baker.

References

Andrieux, D. & Gaspard, P. (2008). Nonequilibrium generation of information in copolymerization processes. Proceedings of the National Academy of Sciences USA, 105, 9516–9521 [and see review by C. Jarzynski, pp. 9451–9452].
Cain, M.L., Yoon, C.K. & Singh-Cundy, A. (2009). Discover Biology: Core Topics, 4th Ed. New York, NY: Norton.
Clausius, R. (1865). [“Die Entropie der Welt strebt einem Maximum zu.”] Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Annalen der Physik und Chemie, 125, 353–400.
Hillis, D.M., Sadava, D.E., Heller, H.C. & Price, M.V. (2012). Principles of Life. Sunderland, MA: Sinauer Associates.
Karp, G. (2010). Cell and Molecular Biology: Concepts and Experiments, 6th Ed. Hoboken, NJ: Wiley.
Krogh, D. (2011). Biology: A Guide to the Natural World, 5th Ed. San Francisco, CA: Benjamin Cummings.
Lewis, R., Parker, B., Gaffin, D. & Hoefnagels, M. (2007). Life, 6th Ed. New York, NY: McGraw-Hill.
Mader, S.S. & Windelspecht, M. (2013). Human Biology, 11th Ed. New York, NY: McGraw-Hill.
McGrath, K.A., Ed. (1999). World of Biology. Farmington Hills, MI: Gale.
Raven, P., Johnson, G.B., Mason, K.A., Losos, J.B. & Singer, S.S. (2011). Biology, 9th Ed. New York, NY: McGraw-Hill.
Reece, J.B., Urry, L.A., Cain, M.L., Wasserman, S.A., Minorsky, P.V. & Jackson, R.B. (2011). Campbell Biology, 9th Ed. San Francisco, CA: Benjamin Cummings.
Sadava, D.E., Hillis, D.M., Heller, H.C. & Berenbaum, M. (2011). Life: The Science of Biology, 9th Ed. Sunderland, MA: Sinauer Associates.
Schrödinger, E. (1992). What Is Life? [Reprint, including Mind and Matter & Autobiographical Sketches]. Cambridge, UK: Cambridge University Press.
Thompson, M. (2012). [Cartoon.] The New Yorker. May 7, p. 12.
Toyabe, S., Sagawa, T., Ueda, M., Muneyuki, E. & Sano, M. (2010). Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality. Nature Physics, 6, 988–992.