Communicative & Integrative Biology. 2016 Jul 22;9(4):e1187348. doi: 10.1080/19420889.2016.1187348

Discourse on order vs. disorder

Arto Annila a,b, Keith Baverstock c
PMCID: PMC4988435  PMID: 27574534

ABSTRACT

The second law of thermodynamics is, on the one hand, understood to account for the irrevocable flow of energy from high potentials down to low, and, on the other hand, seen to imply an irreversible increase of disorder. This tension between the 2 stances is resolved in favor of free energy consumption when entropy is derived from the statistical mechanics of open systems. The change in entropy is shown to map directly to the decrease in free energy without any connotation of disorder. An increase of disorder, just as of order, is found to be merely a consequence of free energy consumption. The erroneous association of entropy with disorder stems from the unwarranted assumption that a system could undergo changes of state without concomitant dissipation, i.e., a change in energy.

Keywords: disorder, free energy, the principle of increasing entropy, the principle of least action, the second law of thermodynamics

Introduction

The second law of thermodynamics is generally regarded as the supreme law of nature. Its import is common sense: heat flows from hot to cold, never the reverse. The second law can also be stated alternatively: entropy cannot but increase. The irrevocable increase in entropy, however, is not as tangible a notion as the irreversible flow of energy from high to low. So, what does entropy mean?

On the one hand, entropy S is quite frequently equated with disorder.1-3 On the other hand, when the entropy change dS is multiplied with the temperature T, the result TdS is a perceptible change in energy.4,5 So, it is pertinent to ask: how does increasing disorder relate to decreasing free energy? This question has remained open despite many studies,6-12 and hence continues to be at the forefront of scientific inquiry into life given in terms of physics.13,14

On the one hand, life, by its numerous processes that consume free energy in insolation, food, etc., is no different from any other process, for instance the inanimate processes that level off temperature gradients.15,16 On the other hand, life's tendency to organize is often seen as opposing the irrevocably increasing disorder.17,18 True enough, plants, animals and other kingdoms of life do display amazing complexity. Then again, some minerals may crystallize into gigantic monoliths with astounding degrees of order. Do not these contrasting examples alone imply that neither disorder nor order is an end in itself, but only an outcome of the irreversible consumption of free energy? So, how did entropy ever become associated with disorder?

On the origin of misconception

Boltzmann was impressed by Darwin's tenet of evolution by natural selection, and wanted to see evolution as a manifestation of natural law.19 Boltzmann realized that complicated systems comprising numerous particles are best described in statistical terms, though he considered only an ideal gas. According to his kinetic theory of gases,20 entropy S = kB ln W is the logarithm of the probability (Wahrscheinlichkeit) W multiplied by Boltzmann's constant kB. For Boltzmann the probability meant the number of possible ways the state of a system could be realized from the various positions and momenta of its copious constituents. The logarithm of W served to give a convenient additive measure, known as entropy S.
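The convenience of the logarithm can be made concrete with a minimal sketch (ours, not from the original; kB in SI units): for two independent subsystems the microstate counts multiply, while the entropies add.

```python
import math

def boltzmann_entropy(W, kB=1.380649e-23):
    """S = kB ln W: the logarithm turns a multiplicative count of
    microstates W into an additive measure of entropy."""
    return kB * math.log(W)

# For two independent subsystems the microstate counts multiply,
# while the entropies add -- the additivity Boltzmann was after.
W1, W2 = 10**6, 10**9
S_combined = boltzmann_entropy(W1 * W2)
S_sum = boltzmann_entropy(W1) + boltzmann_entropy(W2)
```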

Boltzmann associated increasing entropy with increasing disorder by reasoning that an orderly ensemble of gas molecules has a low number of possible positions and momenta, whereas when dispersing to macroscopic uniformity, the number of possible permutations will be high, corresponding to maximum microscopic disorder. Notably, Boltzmann did not consider dispersal in energetic terms, because the ideal gas, unlike any real substance, was imagined to become disordered without any change in energy, i.e., without dissipation. In other words, Boltzmann disregarded the very force that drives systems toward thermodynamic balance. Thereby his definition of entropy became conceptually detached from energy, and hence it deviates from reality.

Thus, it remains for us to formulate how the irrevocable decrease in free energy relates to the probable evolution of a system toward thermodynamic balance in its surroundings. To this end we are guided by common observations that the surrounding density in energy dictates the course of a system. For example, water molecules on a cold surface, when warmed up, will vaporize and disperse. Conversely when the gas cools down, the molecules will condense back on the cold surface. Likewise, seedlings will organize their growth toward sunshine, whereas when left in darkness their growth disperses aimlessly. Thus, the probable course of a system from one state to another is driven by the energy difference between the system and its surroundings, not by disorder or order that are merely consequences of free energy consumption. Eventually, when all free energy has been consumed, the system and its surroundings have attained common density in energy, i.e., the most probable state.

Probability in energetic terms

Although it is next to impossible to know exactly how a complex system is composed of its entities, we may nevertheless depict the state of a system formally by placing its constituents on the levels of an energy diagram.21-23 In this scale-free manner we can express in energetic terms the probability Pj for a population of entities, labeled with j, to exist. Then, by considering all populations, we can formulate the total probability P = ΠPj for the entire system to exist in its surrounding density in energy.

We begin by assigning each j-entity with a distinct energy attribute Gj, given relative to the average energy kBT of the system at temperature T. The j-entities that populate a distinct energy level in numbers Nj (Fig. 1) house altogether the density in energy24 Nj exp(Gj/kBT). Its logarithm is known as the chemical potential μj = kBT ln Nj + Gj.
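As a concrete reading of this definition, here is a short Python sketch (our illustration, in units where kBT = 1): the chemical potential grows logarithmically with the population size Nj and linearly with the energy attribute Gj, so a dense population of low-energy entities can house the same potential as a sparse population of high-energy ones.

```python
import math

def chemical_potential(N_j, G_j, kBT=1.0):
    """mu_j = kBT ln(N_j) + G_j: the logarithm of the density in energy
    N_j exp(G_j / kBT), expressed in units where kBT = 1."""
    return kBT * math.log(N_j) + G_j

# A dense population of low-energy entities houses the same chemical
# potential as a sparse population of high-energy entities.
mu_dense = chemical_potential(N_j=1000, G_j=0.0)
mu_sparse = chemical_potential(N_j=10, G_j=math.log(100.0))
```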

Figure 1. The system is depicted in terms of an energy-level diagram. At each level, indexed by k, there is a population of Nk individuals, each with energy Gk. The size of Nk is proportional to the probability Pk. When an entity in the population Nk transforms to an entity in the population Nj, horizontal arrows indicate the paths of transformation available for changes in the potential energy bound in matter, and vertical wavy arrows denote concurrent changes driven by energy in light. The vertical bow arrows mean exchange of indistinguishable entities without changes in energy. The system evolves, step by step, via absorptive or emissive jk-transformations, mediated or catalyzed by the entities themselves, toward a more probable partition of entities, eventually arriving at a stationary-state balance where the levels are populated so that the average energy kBT equals that in the system's surroundings. A sufficiently statistical system will evolve gradually because a single step of absorption or emission is a small perturbation of the average energy. Hence at each step of evolution the outlined skewed quasi-stationary partition does not change much. This maximum-entropy distribution accumulates along a sigmoid curve (dotted) which on a log-log scale (inset) is a straight line of entropy S vs. [chemical] potential energy μ.

The probability

P_j = \Big[ \prod_{k=1} N_k \, e^{-\Delta G_{jk}/k_B T} \, e^{+i\Delta Q_{jk}/k_B T} \Big]^{N_j} \Big/ N_j! \qquad (1)

for the population Nj to exist depends on the density in energy Nk exp(Gk/kBT) that is bound in its surrounding substrates, labeled with k, in numbers Nk, each with energy Gk, as well as on the flux of photons whose energy matches the energy difference ΔGjk = Gj – Gk per entity between the j-products and the k-substrates. This influx or efflux of energy to the system, i.e., dissipation, is denoted by iΔQjk. The imaginary part merely indicates that the vector potential from the surroundings to the system, or vice versa, is orthogonal to the scalar [chemical] potential. The division by the factorial Nj! accounts for the inconsequential exchange of identical entities (Fig. 1). The indexing includes transformation stoichiometry by running from k = 1 to an unknown upper limit that will eventually be reached when the system attains thermodynamic balance with its surroundings. The product form Πk in Eq. 1 ensures that if any one vital k-ingredient is missing altogether, the j-population cannot exist, i.e., Pj = 0, and likewise, if no flux of energy couples from the surroundings to the system, the jk-transformation cannot take place.

The total probability for the system comprising the diverse populations, indexed with j, is simply the product of the probabilities Pj

P = \prod_{j=1} P_j = \prod_{j=1} \Big[ \prod_{k=1} N_k \, e^{-\Delta G_{jk}/k_B T} \, e^{+i\Delta Q_{jk}/k_B T} \Big]^{N_j} \Big/ N_j! \qquad (2)

Again the product form Πj ensures that if any one vital j-population is missing altogether, the system cannot exist, i.e., P = 0. Equation 2 is the complete formal account of a system, but due to its product forms it is not particularly amenable to analysis.

The logarithm of P, when multiplied with kB, will return a convenient additive measure, known as entropy

S = k_B \ln P = k_B \ln \Big[ \prod_{j=1} \Big( \prod_{k=1} N_k \, e^{-\Delta G_{jk}/k_B T} \, e^{+i\Delta Q_{jk}/k_B T} \Big)^{N_j} \Big/ N_j! \Big] \approx \sum_{j=1} \Big[ N_j k_B + N_j \sum_{k=1} \big( -\Delta\mu_{jk} + i\Delta Q_{jk} \big) \big/ T \Big] \qquad (3)

obtained by denoting the potential difference Δμjk = μj – μk and using Stirling's approximation ln Nj! ≈ Nj ln Nj – Nj. The entropy formula (Eq. 3) is easy to understand in tangible energetic terms when multiplied with temperature T. Then it is apparent that the first term ΣjNjkBT means energy that is bound in the diverse populations Nj. The second term ΣjNjΣk(−Δμjk + iΔQjk) means free energy that is still present between the system and its surroundings, and hence available for consumption by the various jk-transformation mechanisms.
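The passage from Eq. 2 to Eq. 3 can be checked numerically. The following Python sketch (our illustration, restricted to a single substrate class k, real-valued energies and no dissipation term, with kBT = 1) compares the exact logarithm of Pj with the Stirling-approximated form Nj + Nj(μk − μj)/kBT:

```python
import math

def ln_P_exact(N_j, N_k, dG, kBT=1.0):
    # ln P_j for one substrate class k and dQ = 0:
    # P_j = (N_k * exp(-dG/kBT))**N_j / N_j!
    return N_j * (math.log(N_k) - dG / kBT) - math.lgamma(N_j + 1)

def ln_P_stirling(N_j, N_k, G_j, G_k, kBT=1.0):
    # Stirling's approximation ln N_j! ~ N_j ln N_j - N_j turns
    # ln P_j into N_j + N_j * (mu_k - mu_j) / kBT, i.e., S / kB.
    mu_j = kBT * math.log(N_j) + G_j
    mu_k = kBT * math.log(N_k) + G_k
    return N_j + N_j * (mu_k - mu_j) / kBT

G_j, G_k = 1.5, 2.0
exact = ln_P_exact(N_j=500, N_k=2000, dG=G_j - G_k)
approx = ln_P_stirling(N_j=500, N_k=2000, G_j=G_j, G_k=G_k)
# for a statistical population (here N_j = 500) the two agree closely
```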

The probable evolution

The open system will evolve from one state to another by consuming free energy via various jk-transformations. The associated flux of energy carriers, e.g., photons, from the system to its surroundings or vice versa leads to the increase in entropy, until all energy differences have leveled off. When all energy is bound, there are no longer driving forces and the system is stationary. At the maximum entropy state there is no net flow of quanta between the system and its surroundings, and hence at the thermodynamic balance there is no gain or loss of energy either.

The equation of motion from one state to another is obtained by differentiating entropy (Eq. 3)

\frac{dS}{dt} = \sum_{j=1} \frac{dS}{dN_j} \frac{dN_j}{dt} = \frac{1}{T} \sum_{j=1} \frac{dN_j}{dt} \sum_{k=1} \big( -\Delta\mu_{jk} + i\Delta Q_{jk} \big) \geq 0 \qquad (4)

using the chain rule. The two-term product shows that when there are resources, i.e., free energy Aj = Σk(−Δμjk + iΔQjk) > 0, the population Nj will increase, i.e., dtNj > 0, by consuming it. Conversely, when the resources have been over-depleted, i.e., the free energy (known also as chemical affinity) Aj = Σk(−Δμjk + iΔQjk) < 0, the population will downsize, i.e., dtNj < 0. Thus, the product is always non-negative, i.e., the second law dS ≥ 0 always holds, without any distinction between animate and inanimate. The consumption of free energy is invariably irreversible. The flow of time couples inherently with the flow of energy because both time and energy are attributes of the force carriers.
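The content of Eq. 4 can be illustrated with a toy numerical model (our sketch, not the authors' code): a single jk-transformation is stepped forward with dNj/dt = σAj/kBT, and the entropy production (dNj/dt)Aj/T is checked to be non-negative at every step, until the populations settle at the stationary balance μj = μk. The dissipated quanta ΔQjk are assumed to be carried off by the surroundings and are not tracked explicitly.

```python
import math

def simulate(N_j=1.0, N_k=1000.0, G_j=0.0, G_k=2.0,
             sigma=1.0, kBT=1.0, dt=0.1, steps=20000):
    """Euler steps of dN_j/dt = sigma * A_j / kBT for one jk-transformation.
    A_j = mu_k - mu_j is the free energy driving substrates k into
    products j; the dissipation dQ_jk is assumed to be absorbed by the
    surroundings and is not tracked."""
    dS_rates = []
    for _ in range(steps):
        mu_j = kBT * math.log(N_j) + G_j
        mu_k = kBT * math.log(N_k) + G_k
        A_j = mu_k - mu_j                 # free energy (chemical affinity)
        dN = sigma * (A_j / kBT) * dt
        N_j += dN                         # products grow ...
        N_k -= dN                         # ... as substrates are consumed
        dS_rates.append((dN / dt) * A_j)  # dS/dt ~ (dN_j/dt) A_j / T, T = 1
    return N_j, N_k, dS_rates

N_j, N_k, dS_rates = simulate()
# entropy production stays non-negative at every step, and the stationary
# balance mu_j = mu_k gives the ratio N_j / N_k = exp((G_k - G_j) / kBT)
```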

When evolution has consumed all forms of free energy, thermodynamic balance is attained and dS = 0. The free energy minimum state is Lyapunov-stable,25 so that any perturbation δNj away from a steady-state population Njss will cause a decrease in entropy, ΔS(δNj) < 0, and concurrently an increase in its rate of change, dtS(δNj) > 0. In other words, the further away Nj is from Njss, the larger is the restoring force Aj.

At times it is claimed that the entropy of an animate system could decrease at the expense of an entropy increase elsewhere. Such an assertion would entail that the force carriers, i.e., quanta of action, would emerge from nothing or vanish to nothing. This would violate causality. The conservation of quanta requires that the population change dtNj = ΣkσjkAjk/kBT is proportional to the free energy via the various mechanisms, denoted by σjk, that facilitate free energy consumption through the jk-transformations. These mechanisms are customarily referred to as characteristics, traits, capabilities, etc., among animates and as attributes and properties among inanimate systems. For example, a digestive tract in its entirety is a mechanism to consume free energy in food. Likewise a catalyst will speed up the free energy consumption of a chemical reaction.

It is worth emphasizing that the evolutionary equation (Eq. 4) cannot be solved because the variables of motion, e.g., the population changes dNj/dt, cannot be separated from their driving forces, Aj. This is to say that the natural processes are path-dependent.22,23 In accord with observations the final outcome is not simply a function of the initial conditions.

The evolutionary equation (Eq. 4) shows that the notions of entropy and its increase do not express anything more than that which can be expressed in energetic terms. Explicitly, entropy (Eq. 3) does not relate to disorder. This unnatural and ill-founded interpretation follows from Boltzmann's idealized derivation of entropy. When the energy of a system is defined to be constant, the system cannot change its state, i.e., evolve by acquiring or losing quanta. Only an imaginary system may disperse to randomness without dissipation.

The evolutionary equation (Eq. 4) states only implicitly that entropy will increase in least time. To see clearly that nature abhors gradients,11 we take a convenient continuum approximation of the evolutionary equation using the continuum definition of chemical potential μj = (∂U/∂Nj) in terms of the scalar potential U, and likewise of dissipation Qj = (∂Q/∂Nj) in terms of the vector potential Q, to obtain

T\frac{dS}{dt} = -\frac{dU}{dt} + i\frac{dQ}{dt} = -\frac{dV}{dt} = \frac{d(2K)}{dt} = -\mathbf{v}\cdot\nabla U + i\frac{dQ}{dt} \qquad (5)

where the definitions of velocity v = dtx and the spatial gradient ∇ = d/dx have been used, as well as the shorthand notation V for the total potential that combines both the scalar U and vector Q potentials. The equality TdtS = dt(2K) between the change in entropy and the change in kinetic energy is obtained by projecting the force F = dtp onto the velocity v, i.e., dt(2K) = v·dtp = v·F = TdtS.

When dividing Eq. 5 by velocity, the resulting equation reveals that the force directed down along the potential energy gradient, F = –∇V, is equivalent to the path's direction up along the entropy gradient, F = dtp = TS. Thus, we conclude that the second law of thermodynamics (Eqs. 4 and 5) and Newton's second law of motion are equivalent expressions for transformations from one state to another in the continuum limit, as they should be.26,27 Moreover, the 2nd law of thermodynamics is recognized as equivalent to the principle of least action in its original form, where kinetic energy is the integrand of the action ∫2Kdt = ∫p·vdt to be minimized.28,29
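For completeness, the chain of identities behind this equivalence can be written out (a restatement of the steps above, assuming a time-independent potential so that dV/dt = v·∇V):

```latex
\begin{aligned}
\frac{d(2K)}{dt} &= \mathbf{v}\cdot\frac{d\mathbf{p}}{dt}
                  = \mathbf{v}\cdot\mathbf{F}
                  = -\mathbf{v}\cdot\nabla V
                  = T\,\frac{dS}{dt},\\[2pt]
\text{so that, dividing by } \mathbf{v}:\quad
\mathbf{F} &= \frac{d\mathbf{p}}{dt} = -\nabla V = T\,\nabla S .
\end{aligned}
```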

Conclusions

Our derivation of the principle of increasing entropy clearly differs from that of Boltzmann, and hence our reasoning that natural processes consume free energy in the least time also departs from those deductions where entropy is associated with disorder. In the end, when judging one tenet over another, only the correspondence with observations matters. According to the thermodynamics of open systems, all systems evolve to attain energetic balance with their surroundings in least time, rather than seeking some absolute equilibrium. Therefore, there is nothing perplexing about non-equilibrium thermodynamics, and equilibrium thermodynamics is nothing but the dynamics of a free energy minimum state, i.e., a stasis.

We draw particular attention to the fact that the evolutionary equation (Eq. 4) reproduces the patterns that are ubiquitous in nature,30-33 namely skewed, nearly lognormal distributions and logarithmic spirals that accumulate along sigmoid growth and decline curves, and hence follow mostly straight lines on log-log plots, i.e., comply with power laws. There is no profound puzzle about self-organization and no true mystery about emergence either.34-37 Both phenomena are merely manifestations of least-time free energy consumption. Moreover, the non-deterministic evolutionary equation complies with the observation that natural processes are path-dependent. Even the whole Universe has its evolutionary history.38,39

In this context it is of interest to comment on an opinion article40 that this journal carried last year about a view of life based on the Indian philosophy Vedānta, as it maintained that abiogenesis is not possible. The rationale presented is that life and consciousness are not separable and that life only comes from life. It is not our purpose to argue against this philosophical view, except to say that, as it is presented, as an alternative to the “ontological view” (Darwinian biology), which the authors allege regards the “organism as a complex machine [and] presumes life as just a chance occurrence, without any inner purpose,” it is not the only alternative. We take a holistic view of life based on physics and chemistry,41 serving the purpose of eliminating disequilibria in energy deposition in the locally most efficient way available. With this view we see no impediment to consciousness, itself an ambiguous notion,42 being part of this process, or to its having been initiated by abiogenesis.

The application of thermodynamics to non-equilibrium, non-isolated systems has been an active area of research over many decades. We would argue that the assumption that increasing entropy is synonymous with disorder has had a major disruptive effect on biological thought. Among the most notable work in this area has been that of Ilya Prigogine and co-workers.7 Recognizing that some evolving systems produce ordered dissipative structures spontaneously, apparently in violation of the second law, i.e., with dS/dt < 0, Prigogine and colleagues proposed that in such systems the change in entropy is given by the sum of 2 terms, one for the entropy generated within the system (dSi/dt) and the other accounting for the entropy of the exchange of energy across the boundary of the open system (dSe/dt). The second law, dS/dt > 0, would hold if dSe/dt were sufficiently negative to offset increases in dSi/dt due to dissipation. In other words, despite the appearance of dS/dt < 0, another un-measurable entropy term was acting in compensation. Under this interpretation, in the practical case of Bénard cells, where ordered dissipative structures occur spontaneously, dSe/dt < 0 would be assumed to offset the obligatory increase in internal entropy. In fact, Bénard cells are a direct demonstration of increased entropy appearing as order rather than disorder; the term dSe/dt is unnecessary, and in this relatively simple case it is difficult to see what exactly it would represent.

Others have explored the thermodynamic implications of non-equilibrium systems from a perspective similar to the “least action” approach taken here. For example, it has been proposed43 that if a stationary state (or equilibrium state) of a system is perturbed, the system will respond by trying to restore itself by the most efficient means. This approach is based upon an earlier re-statement of the laws of thermodynamics44,45 to the effect that an isolated system, e.g., gas molecules, in which internal constraints, typically compartments, are removed, will acquire a unique state of equilibrium characterized by maximum entropy. The authors resolve what they call “the Schrödinger paradox,” the ability of organisms to retain their highly organized state against the supposed degrading effects of the second law,17 by invoking the importation of high-quality energy by organisms at the expense of creating disorganization in their environments. Interestingly, Schneider and Sagan provide empirical evidence against Penrose's proposed resolution46 of the same supposed paradox, namely that organisms effectively feed on the products, e.g., vegetation, of low-entropy high-energy photons and emit high-entropy low-energy photons back to space. Measurements of energy input and output for ecosystems show, however, that the more mature an ecosystem, the more free energy it dissipates, corresponding to a lower emission of long-wave radiation and a lower surface temperature, to which the latent heat of evaporation resulting from transpiration partly contributes.

Avery47 addresses the same problem, but proposes that the free energy that drives the life process contains information, coined thermodynamic information, that can act to enhance molecular order. The Gibbs free energy present at any time in a system, as a result of its chemical species, represents the amount of available thermodynamic information, and that which is not dissipated as heat can be retained in the complex internal chemistry of the system as order. Utilization of this information internally offsets the supposed disordering effects of entropy increases by driving the molecular machinery “to build up statistically unlikely, e.g., information containing, structures.” What Avery proposes is in fact a circuitous route to the conclusion we have drawn without invoking the novel concept of thermodynamic information: the dissipation of free energy can lead to complex and ordered molecular structures which are, in fact, not unlikely, but the most probable states of the system.

Similar arguments have been given by others,48,49 but more recently the physicist Jeremy England has claimed a breakthrough implying that the origin of life was as inevitable as a rock falling under gravity.13,14 He proposes that if a group of energy-driven particles hops over a barrier to adopt one of 2 more ordered states, each emitting low-grade heat in the process, the one that emits the most heat, or concomitantly falls to the lower internal energy state, will be favored. This favored state will be predisposed to dissipate more of the external driving energy and thus undergo further transitions to even more ordered states. By this route, he argues, life would inevitably arise from inanimate material given an external driving source of energy, the sun. The dynamic basis of this model is textbook Newtonian mechanics, F = ma, which is time-reversible, in contrast to Eq. 5 divided by velocity. The dissipated heat serves to increase the barrier to the reverse transition, thus favoring the transition to more ordered states and eventually the self-assembly of life. This he describes as a general thermodynamic mechanism for self-organization. The argument omits 2 important points: 1) that life is based on metabolism and, therefore, chemistry, and 2) that a dissipative system can, under the right conditions, self-assemble simply because the self-assembled entity is the more probable, as described herein.

Finally, it is important to realize that entropy, although a macrostate variable, cannot be directly measured but has to be inferred. Therefore, entropy is easily subject to ambiguous definitions. However, the consistent and comprehensive definition in Equation 3 makes entropy easy to understand in energetic terms by multiplying with T. Specifically, the entropy of one system multiplied by T can be free energy for another, whether embedded or not. So, for example, energy stored by one organism can be nutrient for another, and information, which is also a form of free energy,50 generated in one system can be applied in another.

Thus, we conclude that the alleged qualitative distinction between animate and inanimate is without substantiation, and that the contest between order and disorder is contrived.51 According to the 2nd law of thermodynamics there are only quantitative differences between the diverse means and mechanisms to consume free energy, as well as between the outcomes of natural processes. Consequently we find the discourse on order vs. disorder obsolete.

Disclosure of potential conflicts of interest

No potential conflicts of interest were disclosed.

References

  • [1]. Anderson G. Thermodynamics of Natural Systems. Cambridge, UK: Cambridge University Press; 2005.
  • [2]. Greven A, Keller G, Warnecke G. Entropy – Princeton Series in Applied Mathematics. Princeton, NJ: Princeton University Press; 2003.
  • [3]. Boltzmann L. Lectures on Gas Theory. New York, NY: Dover (reprint); 1896.
  • [4]. Lambert FL. Disorder – a cracked crutch for supporting entropy discussions. J Chem Educ 2002; 79:187; http://dx.doi.org/10.1021/ed079p187
  • [5]. Atkins PW, de Paula J. Physical Chemistry. Oxford, UK: Oxford University Press; 2006.
  • [6]. Lotka AJ. Elements of Mathematical Biology. New York, NY: Dover; 1925.
  • [7]. Nicolis G, Prigogine I. Exploring Complexity. New York, NY: Freeman; 1989.
  • [8]. Morowitz H. Energy Flow in Biology. New York, NY: Academic Press; 1968.
  • [9]. Dougherty JP. Foundations of non-equilibrium statistical mechanics. Philos Trans R Soc Lond A 1994; 346:259-305; http://dx.doi.org/10.1098/rsta.1994.0022
  • [10]. Ulanowicz RE, Hannon BM. Life and the production of entropy. Proc R Soc Lond B 1987; 232:181-92; http://dx.doi.org/10.1098/rspb.1987.0067
  • [11]. Schneider ED, Kay JJ. Life as a manifestation of the 2nd law of thermodynamics. Math Comp Model 1994; 19:25-48; http://dx.doi.org/10.1016/0895-7177(94)90188-0
  • [12]. Brooks DR, Wiley EO. Evolution as Entropy: Toward a Unified Theory of Biology. Chicago, IL: The University of Chicago Press; 1986.
  • [13]. England JL. Dissipative adaptation in driven self-assembly. Nat Nanotechnol 2015; 10:920; PMID:26530021; http://dx.doi.org/10.1038/nnano.2015.250
  • [14]. England JL. Statistical physics of self-replication. J Chem Phys 2013; 139:121923; PMID:24089735; http://dx.doi.org/10.1063/1.4818538
  • [15]. Sharma V, Annila A. Natural process – natural selection. Biophys Chem 2007; 127:123-8; PMID:17289252; http://dx.doi.org/10.1016/j.bpc.2007.01.005
  • [16]. Annila A, Annila E. Why did life emerge? Int J Astrobiol 2008; 7:293-300; http://dx.doi.org/10.1017/S1473550408004308
  • [17]. Schrödinger E. What is Life? The Physical Aspect of the Living Cell. Cambridge, UK: Cambridge University Press; 1948.
  • [18]. Lehninger A. Principles of Biochemistry. 2nd ed. New York, NY: Worth Publishers; 1993. ISBN 0-87901-711-2.
  • [19]. Boltzmann L. The Second Law of Thermodynamics (Theoretical Physics and Philosophical Problems). New York, NY: Springer-Verlag; 1974.
  • [20]. Boltzmann L. Wissenschaftliche Abhandlungen, Vol. 1. Hasenöhrl F, ed. Leipzig: Barth; 1909. Reissued: New York, NY: Chelsea; 1969.
  • [21]. Annila A, Salthe S. Physical foundations of evolutionary theory. J Non-Equilib Thermodyn 2010; 35:301-21.
  • [22]. Mäkelä T, Annila A. Natural patterns of energy dispersal. Phys Life Rev 2010; 7:477-98; PMID:21030325; http://dx.doi.org/10.1016/j.plrev.2010.10.001
  • [23]. Annila A. Natural thermodynamics. Physica A 2016; 444:843-52; http://dx.doi.org/10.1016/j.physa.2015.10.105
  • [24]. Gibbs JW. The Scientific Papers of J. Willard Gibbs. Woodbridge, CT: OxBow Press; 1993-1994.
  • [25]. Strogatz SH. Nonlinear Dynamics and Chaos with Applications to Physics, Biology, Chemistry and Engineering. Cambridge, MA: Westview; 2000.
  • [26]. Kaila VRI, Annila A. Natural selection for least action. Proc R Soc A 2008; 464:3055-70; http://dx.doi.org/10.1098/rspa.2008.0178
  • [27]. Tuisku P, Pernu TK, Annila A. In the light of time. Proc R Soc A 2009; 465:1173-98; http://dx.doi.org/10.1098/rspa.2008.0494
  • [28]. Annila A. All in action. Entropy 2010; 12:2333-58; http://dx.doi.org/10.3390/e12112333
  • [29]. De Maupertuis P-LM. Les loix du mouvement et du repos déduites d'un principe metaphysique. Histoire de l'Acad Roy Sci Belleslett 1746; 267-94.
  • [30]. Gaddum JH. Lognormal distributions. Nature 1945; 156:463-6; http://dx.doi.org/10.1038/156463a0
  • [31]. Limpert E, Stahel WA, Abbt M. Log-normal distributions across the sciences: keys and clues. Bioscience 2001; 51:341-52; http://dx.doi.org/10.1641/0006-3568(2001)051%5b0341:LNDATS%5d2.0.CO;2
  • [32]. Grönholm T, Annila A. Natural distribution. Math Biosci 2007; 210:659-67; PMID:17822723; http://dx.doi.org/10.1016/j.mbs.2007.07.004
  • [33]. Bejan A, Marden JH. The constructal unification of biological and geophysical design. Phys Life Rev 2009; 6:85-102; PMID:20416845; http://dx.doi.org/10.1016/j.plrev.2008.12.002
  • [34]. Sneppen K, Bak P, Flyvbjerg H, Jensen MH. Evolution as a self-organized critical phenomenon. Proc Natl Acad Sci USA 1995; 92:5209-13; PMID:7761475; http://dx.doi.org/10.1073/pnas.92.11.5209
  • [35]. Annila A, Kuismanen E. Natural hierarchy emerges from energy dispersal. BioSystems 2009; 95:227-33; PMID:19038306; http://dx.doi.org/10.1016/j.biosystems.2008.10.008
  • [36]. Salthe SN. Evolving Hierarchical Systems: Their Structure and Representation. New York, NY: Columbia University Press; 1985.
  • [37]. Pernu TK, Annila A. Natural emergence. Complexity 2012; 17:44-7; http://dx.doi.org/10.1002/cplx.21388
  • [38]. Schwarz DJ, Starkman GD, Huterer D, Copi CJ. Is the low-l microwave background cosmic? Phys Rev Lett 2004; 93:221301; PMID:15601079; http://dx.doi.org/10.1103/PhysRevLett.93.221301
  • [39]. Bielewicz P, Gorski KM, Banday AJ. Low-order multipole maps of CMB anisotropy derived from WMAP. MNRAS 2004; 355:1283-302; http://dx.doi.org/10.1111/j.1365-2966.2004.08405.x
  • [40]. Shanta BN. Life and consciousness – the Vedāntic view. Commun Integr Biol 2015; 8(5):e1085138; PMID:27066168; http://dx.doi.org/10.1080/19420889.2015.1085138
  • [41]. Annila A, Baverstock K. Genes without prominence: a reappraisal of the foundations of biology. J R Soc Interface 2014; 11:20131017; PMID:24554573; http://dx.doi.org/10.1098/rsif.2013.1017
  • [42]. Annila A. On the character of consciousness. Front Syst Neurosci 2016; 10:27; PMID:27065819; http://dx.doi.org/10.3389/fnsys.2016.00027
  • [43]. Schneider ED, Sagan D. Into the Cool: Energy Flow, Thermodynamics and Life. Chicago, IL: University of Chicago Press; 2004.
  • [44]. Hatsopoulos GN, Keenan JH. Principles of General Thermodynamics. New York, NY: Wiley; 1965.
  • [45]. Kestin J. A Course in Thermodynamics. New York, NY: McGraw-Hill; 1979.
  • [46]. Penrose R. Cycles of Time: What Came Before the Big Bang. London, UK: Vintage Books; 2011.
  • [47]. Avery JS. Information Theory and Evolution. London, UK: World Scientific; 2012.
  • [48]. Bejan A. The Physics of Life: The Evolution of Everything. New York, NY: St. Martin's Press; 2016.
  • [49]. Lucia U. Bioengineering thermodynamics: an engineering science for thermodynamics of biosystems. IJoT 2015; 18:254-65.
  • [50]. Karnani M, Pääkkönen K, Annila A. The physical character of information. Proc R Soc A 2009; 465:2155-75; http://dx.doi.org/10.1098/rspa.2009.0063
  • [51]. Annila A, Kolehmainen E. On the divide between animate and inanimate. J Syst Chem 2015; 6:1-3; PMID:25904989; http://dx.doi.org/10.1186/s13322-014-0006-2

