Abstract
Homo sapiens' cultural evolution has far outpaced its biological evolution and led us into an impasse.
Subject Categories: Evolution & Ecology, History & Philosophy of Science
In our time, the tension between the immutable constraints of human nature and the rapid advances of modern civilization is creating an atmosphere of uncertainty, insecurity and threat. Science and its stepsister, technology, are mainly responsible for this state of affairs. It is no surprise then that apocalyptic doom predictions are becoming a fashion. Yet, it is in the very nature of living systems as ontotelic structures pursuing their permanence to negate such doom prospects. This essay attempts to analyze this incongruity by naturalizing it.
A spell and the lure of simplicity
The tension between human nature and modern civilization poses a challenge for biology and for the theory of knowledge implied by it. Indeed, biology in the early 21st century may resemble physics at the end of the 19th century. At that time, many physicists believed that physics was and would remain solidly based on Newton's laws of mechanics and the forces acting between bodies, and that the goal of physics was to reduce all physical phenomena to the basic principles of mechanics. Isaac Newton had solved the mysteries of the universe and his successors could merely add footnotes to his great work.
In 1986, evolutionary biologist Richard Dawkins introduced his book The Blind Watchmaker in a similar vein: “This book is written in the conviction that our own existence once presented the greatest of all mysteries, but that it is a mystery no longer, because it is solved. Darwin and Wallace solved it, though we shall continue to add footnotes to their solution for a while yet.” Dawkins had gained fame with his 1976 book The Selfish Gene, in which he presented his theory of life. The concept was simple: Life began with the formation of a “replicator,” a molecule capable of reproducing itself which thereby gained an advantage over other molecules within the primordial soup. Eventually, replicators became genes, sequences of DNA, that are carried by organisms, “robot vehicles blindly programmed to preserve the selfish molecules known as genes.” The evolution of life is the natural selection for those genes that survive in the struggle for persistence, while the organism exhibits only transient existence. Accordingly, the genes are the exclusive units of natural selection. Dawkins' theory, often dubbed “genocentrism” or “the gene's‐eye view of evolution,” was welcomed by some as the final blow to anthropocentrism: as Copernicus reoriented the solar system and moved humans out of its centre, the selfish genes and not humans became the pivots of life.
The tension between human nature and modern civilization poses a challenge for biology and for the theory of knowledge implied by it.
The simple notion of a tidy collection of “selfish genes” is not sufficient to explain experimental data on the complexity of genomes. A genome may be conceived as a complicated machinery with two main functions: to preserve the cellular memory of the knowledge about the world and to transmit it to successive generations; and to maintain cellular homeostasis by actively responding to signals from the environment. However, the ongoing genome sequencing efforts have forced us to change our concept of a gene as a clearly defined DNA sequence that codes for a protein.
It is hard to imagine why and how such a fluid and hard‐to‐define entity as “the gene” can be a fundamental and exclusive unit of Darwinian selection.
Of the three billion base pairs in the human genome, only 2% correspond to genes that are translated into proteins. The remaining 98% were once considered “junk DNA,” but further research has since ascribed regulatory functions to most of it: as “switches” that control when and where genes are expressed. The richness and complexity of this regulatory network is astounding: 70,000 promoters, 400,000 enhancers and a myriad of noncoding RNAs of known and unknown function vastly outnumber the roughly 20,000 protein‐coding genes. Moreover, rather than a linear arrangement of genes and DNA sequences, it is the genome's three‐dimensional structure that is central to interpreting its regulatory and transcriptional architecture.
It is hard to imagine why and how such a fluid and hard‐to‐define entity as “the gene” can be a fundamental and exclusive unit of Darwinian selection. Since Dawkins formulated his idea of the selfish gene, biologists have differed in their evaluation of it: some hailing it as a groundbreaking discovery, some rejecting it as nonsense and others favoring a multi‐level selection theory. Let us hear from an independent arbiter, the cosmologist Lee Smolin (2011): “…biologists seem to be endlessly arguing about the scale on which natural selection operates. Does it operate on the ecosystem, on the species, on the individual, on the gene? The one key lesson about self‐organized systems that physicists have learned is that they're what we call critical systems, which are systems in which significant correlations are evolving on every possible scale. So the answer, I imagine, is that evolution must be taking place simultaneously on a large varieties of scales.”
The continuing success of Dawkins' doctrine may be better understood if we read him as a poet rather than a scientist, with all the idiosyncrasies of an artist: grand imagination, rich metaphors, self‐confidence and even fanaticism. Opponents were stultified by the prodigious eloquence of Dawkins, who bluntly denounced Konrad Lorenz and other evolutionary biologists as “totally and utterly wrong.” His acolytes took up his emotionally charged rhetoric, labelling their adversaries as “naive” or as “heretics.”
Dawkins gained even more fame as a militant antireligionist. In 1991, he described religious beliefs as “mind‐parasites” and believers as “faith sufferers” or “patients.” Here is the irony of an evolutionary biologist dismissing the possibility that religion, as part of human mythophilia and spirituality, may have been positively selected for in the “environment of evolutionary adaptedness.”
Another aspect of human mythophilia is the craving for simplicity in our explanation of the world. Our thinking is monocausal or at best oligocausal, admitting only one or a few causes of events. Still: what if the fundamental axiom of selfish genes and their dominance were false? What if life did not begin as a molecular replicator? We know that replication of genes requires energy and thus metabolism. What if genes are not active agents in evolution, but later acquisitions as bookkeepers of evolutionary gains? If this were the case, the lure of simplicity would end. The intricacy of genetic networks may be analogous to that of biochemical and neural networks, and their plasticity may be, according to a recent idea of biologist Steven Frank (2023), involved in adaptive responses of organisms.
Naturalized anthropocentrism
The “gene's‐eye view of evolution” is a metaphor—it is not the view of genes, but the view of Dawkins and his followers and that view is anthropocentric, “through the human lenses.” Each species experiences the world in its own species‐specific way through sensory organs and molecules with which it registers specific aspects of its environment, evaluates them and acts accordingly. As a result, each species has its own species‐specific cognition and its knowledge represents species‐specific reality. In this respect, bacteria, flies or bats are not different from humans.
Our modern civilization is a product of cultural evolution creating ever more complex artifacts and it enables humans to live in two worlds, the physical and the symbolic.
But evolution has equipped humans with a unique capability: the ability to make artifacts, which makes humans an “animal artefaciens.” Such artifacts, from stone tools to steam engines to computers and RNA vaccines, have enabled a cumulative cultural evolution. The knowledge gained during biological evolution is stored in the biological memory, the genome. The knowledge acquired from cultural evolution is remembered differently: it is stored in the artifacts themselves. In addition to material artifacts, humans have also created symbolic artifacts: concepts, social institutions, art and science (Fig 1). Our modern civilization is a product of cultural evolution creating ever more complex artifacts and it enables humans to live in two worlds, the physical and the symbolic (Kováč, 2023).
Figure 1. Prometheus by Gustave Moreau, 1868, Oil on Canvas. Wikipedia/Public Domain.
In ancient Greek mythology, Prometheus was a titan who stole fire from the gods and gave it to humankind. For this deed, he was punished by the gods who chained him to a mountain where an eagle would come every day to eat from his liver. The Prometheus myth is a metaphor of humans acquiring knowledge to shape the world.
Cultural evolution has a ratchet‐like character: it leads to the cumulative accretion of knowledge. This accretion caused a revolution in physics at the turn of the 19th and 20th centuries. Statistical physics, the theory of relativity and quantum mechanics revolutionized science and philosophy, not least by highlighting the limitations of human efforts to understand the world. According to the 18th‐century philosopher Immanuel Kant, these limitations define a human‐specific reality, which is separated from the real, objective world by what we may call Kant's barriers, locking the human species into a “Kant cage.” Kant explained that human cognition consists of two forms of intuition, space and time, and twelve categories of logic. These represent a priori knowledge, which does not stem from experience, but makes cognition from observation or experiment possible.
The onset of statistical physics revealed additional limitations of human cognition. We are capable of sensing the physical world, the macroworld, but we have long been unaware of the microworld, the world of atoms and molecules. When we observe an event, we experience a “macrostate” of the macroworld and can only speculate about the “microstates” underneath the macroworld. What draws contemporary biology closer together with statistical and quantum physics is the question about the role of the human subject in observing and understanding the world. Newtonian science was based on a third‐person view of nature: the objective world is both independent of the human observer and reproducible. The guiding principle of scientific objectivity was to “let nature speak for itself.”
What draws contemporary biology closer together with statistical and quantum physics is the question about the role of the human subject in observing and understanding the world.
Today the dominance of Newton's world view in science is coming to an end. It started in the 1920s, when Niels Bohr, in his discussions with Einstein on epistemological problems in atomic physics and on wave/particle duality in quantum physics, introduced the principle of complementarity. Depending on the measuring instruments, we can interpret an elementary physical entity either as a wave or as a particle. Even if the two descriptions appear divergent or contradictory, according to Bohr they are both true and equally essential in representing the true nature of the world; they are complementary. Bohr's younger colleague Wolfgang Pauli wrote in 1952: “The general problem of the relation between psyche and physis, between inside and outside, can hardly be regarded as solved by the term “psychophysical parallelism,” which was advocated in the last century. It would be most satisfactory of all if the physis and the psyche could be regarded as complementary.”
The principle of complementarity may be applied to today's efforts in biology to understand the nature of consciousness, in which David Chalmers (1995) recognized “the feeling which accompanies awareness of sensory information” as “the hard problem.” Here, one must distinguish between the Newtonian “third‐person” approach and Edmund Husserl's “first‐person” approach of philosophical phenomenology. Contemporary efforts by cognitive scientists and philosophers to naturalize phenomenology, pioneered in 1997 by Francisco Varela, are an attempt to bridge the two approaches, to complement them.
Applying without understanding
There is a famous quote by Richard Feynman, made at a conference at Cornell University in 1964: “I think I can safely say that nobody understands quantum mechanics.” Would this apply to modern biology as well? In general, one needs to distinguish between the terms grasping and understanding. We can apply scientific knowledge in practice without understanding it. Quantum physics is a good example: we will soon be able to exploit the miraculous power of quantum computing without understanding quantum mechanics.
When Craig Venter reported in 2010 that he and his team had constructed an artificial bacterium by inserting synthetic DNA into another bacterium stripped of its own DNA, David King of the Human Genetics Alert watchdog said, “What is really dangerous is the ambition of these scientists for complete and unfettered control over nature, which many people describe as “playing God”. Scientists' understanding of biology falls far short of their technical capabilities.”
The evolution of science goes on, and the gap between grasping and understanding is growing. As early as 1985, evolutionary culturologists Robert Boyd and Peter J. Richerson anticipated that “we shall replace a world which we do not understand by a model of the world we do not understand.” In 2008, a discussion among experts on the Edge website was sparked by a text by entrepreneur Chris Anderson, entitled The End of Theory: Will a Flood of Data Make the Scientific Method Obsolete? According to Anderson, huge amounts of data and applied mathematics will replace other research tools: given sufficient data, the numbers speak for themselves. The issue was discussed again in 2012 at the Altenberg symposium Data Intensive Biology: Why Google will not replace science. Its organizers, Werner Callebaut and Isabella Sarto‐Jackson, referred to microbiologist Carl Woese's (2004) warning that “a society that allows biology to become an engineering discipline, that allows this science to slip into the role of changing the living world without trying to understand it, is a danger to itself.”
Energy and entropy
The driving force behind the cumulative growth of scientific knowledge is the second law of thermodynamics. It was formulated in the middle of the 19th century by physicist Rudolf Clausius, who was one of the founders of phenomenological thermodynamics, a science that, at that time of the industrial revolution, busied itself with improving the performance of steam engines. From the investigation of a purely practical problem a law was born that—according to the famous statement from physicist Arthur S. Eddington—holds “the supreme position among the laws of Nature.”
From the investigation of a purely practical problem a law was born that—according to the famous statement from physicist Arthur S. Eddington—holds “the supreme position among the laws of Nature.”
The second law of thermodynamics has various formulations. One of the most common states that “the total entropy of a system either increases or remains constant in any spontaneous process; it never decreases.” The term “entropy” itself was invented by Rudolf Clausius. He expressed it by a rather abstract mathematical formula according to which the change in entropy equals the reversible heat transfer between the system and its environment divided by the absolute temperature: ΔS = Q_rev/T, where S is the symbol for entropy, Q for heat and T for absolute temperature.
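As a check on Clausius' formula, here is a minimal worked example (the latent-heat figure and the 100 g of ice are standard textbook values chosen for illustration, not taken from the essay): the entropy gained by melting ice reversibly at its melting point.

```python
# Entropy change in a reversible process: dS = Q_rev / T (Clausius).
# Illustrative numbers: melting 100 g of ice at 0 degrees C (273.15 K);
# the latent heat of fusion of ice, ~334 J/g, is a textbook value.
latent_heat_fusion = 334.0   # J per gram (assumed textbook value)
mass = 100.0                 # grams
T = 273.15                   # kelvin, melting point of ice

Q_rev = latent_heat_fusion * mass   # reversible heat absorbed, in joules
delta_S = Q_rev / T                 # entropy change, in J/K
print(f"dS = {delta_S:.1f} J/K")    # roughly 122 J/K
```

Because the process is isothermal, a single division suffices; for processes where T changes, the ratio would have to be integrated along the path.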
Energy is a conserved quantity: the law of conservation of energy states that energy can be converted in form, but it cannot be created or destroyed. Like matter, energy is irregularly distributed in the universe and it flows down gradients in two ways. One appears to us humans as work, the other we perceive as heat. In 1956, Zoran Rant proposed the name exergy for that fraction of energy which, in a reversible process, can be used for doing work. The remaining fraction is called anergy; it represents the waste heat and corresponds to the “lost energy,” a term introduced in 1849 by physicist William Thomson, who later became Lord Kelvin.
While we live in the physical world, we conceptualize and create hypotheses about the physical world exclusively in the symbolic world.
The increase in entropy is a measure of the decline of energy gradients, the “dissipation” of energy, its depreciation, the diminishing of its usefulness for doing work. From the fact that entropy is increasing, Clausius concluded that the universe has an arrow of time that points forward: spontaneous events are unidirectional. Thus, the universe is headed toward the “heat death,” when exergy becomes zero and energy, in terms of human needs, is rendered worthless.
The concept of entropy became clearer when Clausius' phenomenological thermodynamics was complemented in the late 19th century by Ludwig Boltzmann and Josiah Willard Gibbs' statistical thermodynamics. It describes the microworld, the world of atoms and molecules, which we humans cannot see, because we are beings of the macroworld. The motion of a single molecule in the microworld is governed by Newton's laws of motion, and these laws are symmetrical with respect to time: motion in one direction is equivalent to motion in the opposite direction. However, a large number of moving molecules will collide with each other randomly. For calculating their dynamics, Boltzmann introduced probabilistic assumptions, which are absent in Newtonian mechanics. He derived his famous law for the entropy S: S = k ln Ω, where ln is the natural logarithm and Ω is the number of microstates that realize a single macrostate. The constant k is now called Boltzmann's constant and is denoted k_B. Its value is a very small number, 1.38 × 10⁻²³ J/K, where J (joule) is the unit of energy and K (kelvin) the unit of absolute temperature.
The probability of a given microstate i corresponds to p_i = 1/Ω. From this, Gibbs derived the formula for thermodynamic entropy: S = −k_B Σ_i p_i ln p_i. Boltzmann also proved that the reversibility of microstates in systems where many microstates form a single macrostate leads to macroscopic irreversibility directed toward thermodynamic equilibrium as the most probable state.
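A small numerical sketch (an illustration added here, not part of the original argument; the microstate count is arbitrary) confirms that Gibbs' formula reduces to Boltzmann's law when all Ω microstates are equally probable:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Thermodynamic entropy S = -k_B * sum(p_i * ln p_i)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

omega = 10_000                      # number of microstates (arbitrary choice)
uniform = [1 / omega] * omega       # equiprobable microstates, p_i = 1/omega

S_gibbs = gibbs_entropy(uniform)
S_boltzmann = k_B * math.log(omega) # S = k_B * ln(Omega)

# For the uniform distribution the two expressions coincide.
print(S_gibbs, S_boltzmann)
```

For any non-uniform distribution over the same Ω microstates, `gibbs_entropy` returns a smaller value, consistent with equilibrium being the most probable (maximum-entropy) macrostate.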
Information entropy
The concept of entropy became more complicated when mathematician Claude Shannon published his theory of communication in the middle of the 20th century and introduced “information entropy.” In communication, messages are transmitted between their sender and their receiver, and information entropy is a measure of how clear or unclear a message is: S_inform ≈ uncertainty. Formally, the formula for information entropy resembles the Gibbs formula for thermodynamic entropy: S_inform = −K Σ_i p_i ln p_i. But the constant K does not have the numerical value of the Boltzmann constant k_B: it is equal to 1 and can be dropped from the formula. This substantially changes the dimension in which the two entropies are expressed and their numerical values. The magnitudes of the two entropies are incommensurable. Thermodynamic entropy has a given dimension, J/K, from the Boltzmann constant, whereas information entropy is a dimensionless quantity and cannot be expressed in SI units. While the logarithmic expression in the formula for thermodynamic entropy is in natural logarithms, information entropy is commonly expressed in logarithms of base 2, as S_inform = −Σ_i p_i log₂ p_i, and its unit is a bit or alternatively a byte.
The formal similarity of the two terms does not imply their identity. Consequently, it is advisable to use the term entropy as a noun with an adjective. Thermodynamic entropy S_therm, as represented by statistical thermodynamics, expresses the inability of humans to describe in detail those states of the microworld that constitute a particular macrostate. In communication theory, the information entropy S_inform represents the uncertainties the receiver of a particular message has before and after receiving the message; the difference between the two quantities is what we call “information.” In the real world, thermodynamic entropy is constantly increasing; it would seem that it has no limit. On the other hand, information entropy is maximal when all probabilities in Shannon's formula are the same. This is the case when uncertainty equals randomness and the message carries zero information.
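These properties are easy to verify numerically. The sketch below, with illustrative coin-toss distributions chosen for this example, computes Shannon's information entropy in bits and shows that it is largest for the uniform distribution:

```python
import math

def shannon_entropy_bits(probs):
    """Information entropy H = -sum(p_i * log2 p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = [0.5, 0.5]     # equal probabilities: maximal uncertainty
biased_coin = [0.9, 0.1]   # the receiver is already fairly sure

H_fair = shannon_entropy_bits(fair_coin)      # exactly 1 bit
H_biased = shannon_entropy_bits(biased_coin)  # about 0.47 bits

# Entropy peaks when all probabilities are equal: a message from the
# biased source resolves less uncertainty, i.e. carries less information.
print(H_fair, H_biased)
```

Note that the result is a pure number of bits, with no J/K anywhere: the incommensurability with thermodynamic entropy is visible directly in the code.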
The progress in constructing virtual realities, soon amplified by artificial intelligence, may quickly transfer the human species from living in the physical, material world to living in the symbolic world.
The differences between the two entropies led theoretical physicist Edwin T. Jaynes (2003), in his book entitled Probability Theory: The Logic of Science, to state “We must warn at the outset that the major occupational disease of this field is a persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for example, by such observed quantities as pressure, volume, temperature, magnetization, of some physical system. They should never be called by the same name; the experimental entropy makes no reference to any probability distribution, and the information entropy makes no reference to thermodynamics. Many textbooks and research papers are flawed fatally by the author's failure to distinguish between these entirely different things; and in consequence proving nonsense theorems.” Incidentally, many experts have debated the anthropomorphic nature of entropy, which could perhaps be considered part of the same problem as the role of the subject in quantum physics.
In applying Shannon's information theory, Jaynes based his approach on a thesis he had already put forward in 1957: humans live in a world about which they lack complete information. The only rational way to create new knowledge is to generate experimental and observational data to update uncertainties and probabilities, a procedure called “Bayesian inference.” Jaynes also proposed the MaxEnt principle: of all possible probability distributions, the one that leaves a subject with the greatest uncertainty is preferable, because it does not imply more than what he or she knows.
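A minimal sketch of Bayesian updating in Jaynes' spirit (the coin example, the grid discretization and the observation counts are all illustrative assumptions, not from the essay): we start from the maximum-entropy choice, a uniform prior over a coin's unknown bias, and let the observed data reshape the distribution.

```python
# Bayesian inference sketch: update beliefs about a coin's bias theta,
# starting from the uniform (MaxEnt) prior, given observed tosses.
# A discretized grid stands in for a continuous prior, for simplicity.

thetas = [i / 100 for i in range(1, 100)]  # candidate biases 0.01..0.99
prior = [1 / len(thetas)] * len(thetas)    # MaxEnt choice: uniform prior

heads, tails = 7, 3                        # hypothetical observations

# Likelihood of the data under each candidate bias (binomial kernel).
likelihood = [t**heads * (1 - t)**tails for t in thetas]

# Bayes' rule: posterior is proportional to prior times likelihood.
unnorm = [p * l for p, l in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]

# The posterior now concentrates near the observed frequency of heads.
best = thetas[posterior.index(max(posterior))]
print(best)  # 0.7
```

Each new batch of tosses can be fed through the same rule with the current posterior as the next prior, which is exactly the cumulative updating of uncertainties that Jaynes describes.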
Data and information
All of this brings us back to the human‐specific duality of physical and symbolic worlds (Kováč, 2023). While we live in the physical world, we conceptualize and create hypotheses about the physical world exclusively in the symbolic world. We then search for answers in the physical world by observing it, measuring it, or experimenting on it. In this way, we accumulate data. The difference between the uncertainties before and after a measurement represents potential information by which we may enrich our knowledge. The knowledge would become part of our symbolic world, but we exploit it to act in the physical world and fashion the latter for our benefit.
… there is wisdom in accepting with dignity the end of the human species in the impasse of evolution while appreciating and admiring how wonderful the experiment was and the magnificent works humankind has accomplished.
By acting in the physical world we dissipate energy and raise the entropy of the universe. Dealing with knowledge in the symbolic world also needs energy, but several orders of magnitude less. According to Landauer's principle, named after the physicist Rolf Landauer, the processing of one bit of Shannon's information entropy requires an energy equal to or greater than k_B·T·ln 2, which, at room temperature, is approximately 2.9 × 10⁻²¹ J. The biophysicist Mikhail Volkenshtein (1972) reasoned that by reading (in the symbolic world) all the books humans have written so far, we would increase the thermodynamic entropy by a smaller amount than if we brought (in the physical world) the water in a small cup to a boil.
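Both figures in this paragraph can be checked with back-of-the-envelope arithmetic (the room temperature, the cup size and the specific-heat value below are illustrative assumptions, not from the essay):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Landauer limit: minimal energy to process (erase) one bit at temperature T.
T_room = 300.0                              # kelvin, roughly room temperature
landauer = k_B * T_room * math.log(2)
print(f"Landauer limit: {landauer:.2e} J")  # about 2.9e-21 J

# Volkenshtein-style comparison: entropy gained by heating a small cup of
# water (assumed 200 g) from 20 to 100 degrees C: dS = m * c * ln(T2/T1).
m, c = 200.0, 4.18                          # grams; J/(g*K), water
dS_cup = m * c * math.log(373.15 / 293.15)
bits_equivalent = dS_cup / (k_B * math.log(2))
print(f"Cup of water: {dS_cup:.0f} J/K, ~{bits_equivalent:.1e} bits")
```

The cup works out to the thermodynamic equivalent of roughly 10²⁵ bits, vastly more than any plausible estimate of the information content of all books ever written, which is the point of Volkenshtein's comparison.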
It is important to keep this in mind. We do not live in the “information age.” The word information has been used and abused even more than the word entropy. It is more correct to say that we live in a digital age, which is also the age of the data deluge. Information entropy, which we count in bits or bytes, is growing enormously. In 2010, humanity produced 2.5 exabytes (the prefix exa denotes 10¹⁸, a quintillion) of new data every day, more than it had produced in its entire existence up to 2003. By 2025, we will have an estimated 163 zettabytes (zetta denotes 10²¹, a sextillion) of data at our disposal. But these are just data, from which we still need to extract information and thereby increase our knowledge. Even the tiny fraction of information that we are able to extract grows at an ever‐increasing rate and, in accordance with the principle of maximum entropy production, it accelerates the dissipation of energy in the material world. It is making the world we live in increasingly complex, with problems increasingly difficult to solve and the future increasingly unpredictable.
Looking at the stars
The enormous consumption of exergy and generation of entropy have brought contemporary humanity into an unprecedented dire state of climate change, accumulation of material waste, biodiversity loss, global migrations and political tensions. The situation may be aptly described by oft‐quoted words, uttered by a character in one of Oscar Wilde's plays: “We are all in the gutter, but some of us are looking at the stars.”
Is there a way out? The obvious solution to curb environmental deterioration is a radical reduction in material consumption. However, such a move at the global level and across generations is highly improbable for two reasons: first, it would run counter to market dynamics, which appear to require permanent economic growth; and second, it would oppose human hedonotaxis, the unceasing search for pleasure and minimization of discomfort.
Two measures are conceivable to overcome this resistance to change: a replacement of political democracy by autocracy or outright dictatorship to enforce desirable human behavior, or the application of genetic manipulation to modify human nature and make it compatible with the requirements of modern times. The former was refuted in the 20th century by the failure of the communist social experiment, and the latter appears more dangerous than the recent upsurge of artificial intelligence.
Taking into account the incommensurability of the two entropies, we can imagine another solution that would have humanity thriving without radical alterations of politics or human nature. The progress in constructing virtual realities, soon amplified by artificial intelligence, may quickly transfer the human species from living in the physical, material world to living in the symbolic world. Social bonds in the real world will eventually wither away as most human needs will be completely satisfied in the symbolic virtual world. Odi et amo, I hate and I love: negative emotions, pain and fear, will only serve to achieve a catharsis, to strengthen the intensity of positive emotions, pleasure and delight. World travel will be virtual, with no need to burn tons of fuel to fly from continent to continent. The requirement for exergy will be many orders of magnitude lower, and the larger part will be consumed by microprocessors. Even though present‐day computers use about a billion times more exergy than the Landauer limit, the energy efficiency of computing continues to improve.
Virtual sex may well become the first technology of virtual reality to be commercialized. The ensuing cessation of human reproduction has been considered the one painless alternative to the numerous scenarios that foresee the extinction of Homo sapiens as a consequence of violent conflicts and unimaginable suffering (Kováč, 2015). A sweet utopia to counter the dystopias? A race against time? Perhaps. But essentially a unique challenge for the young generation and its imagination and creativity.
There is a snag in the whole story. Cutting down energy dissipation by moving from the physical to the symbolic world would contradict the principle of maximum entropy production, which seems to be a cosmic imperative. Life has evolved to speed up the dissipation of energy and, according to the epistemic principle, to advance cognition at the expense of usable energy. We may conceive of evolution as steady experimentation, progressing in a maze with myriads of blind alleys in which miscarried experiments get stuck. Cultural evolution has enabled the abnormally rapid growth of human knowledge, far in excess of that allowed by biological evolution. This substantiates considering the human species an anomaly, a miscarried experiment.
The meaning and value of the life of each human individual, and of the species as a whole, is that it was lived.
It does not, however, devalue human life, nor deprive it of meaning. A minority of humans, designated by Aristotle as magnanimous people, have the privilege, thanks to scientific knowledge, to gain insights into the evolution of the cosmos and of life. Given that living entities at all levels of the hierarchy have a limited lifetime (birth, growth, aging and death), there is wisdom in accepting with dignity the end of the human species in the impasse of evolution while appreciating and admiring how wonderful the experiment was and the magnificent works humankind has accomplished.
After the extinction of Homo sapiens, a sort of “species euthanasia,” the increase of entropy on Earth and the successive accumulation of knowledge, driven by the second law of thermodynamics, will return to their normal rates. In the time that is left to us, we should attempt to find the final theory of everything that physicists have been striving for. It might not be a theory of the elementary constituents of the world, but a theory of the mind and of its relationship to the universe. What if we do not achieve it? What if the goal surmounts the competence of science itself? Perhaps the magnanimous people would then resort to the arts.
We are closing our part as actors in a thrilling play. Life will not stop progressing. A good ground for ultimate optimism. The meaning and value of the life of each human individual, and of the species as a whole, is that it was lived. Let us recall the legacy that the philosopher Michel de Montaigne left us more than four centuries ago: “The surest sign of wisdom is constant cheerfulness; her state is like that of things above the moon, ever serene.”
Supporting information
EMBO reports (2023) 24: e58377
References
- Chalmers DJ (1995) Facing up to the problem of consciousness. J Consciousness Stud 2: 200–219
- Frank SA (2023) Precise traits from sloppy components: perception and the origin of phenotypic response. Entropy 25: 1162
- Jaynes ET (2003) Probability theory: the logic of science. Cambridge: Cambridge University Press
- Kováč L (2015) Closing human evolution: life in the ultimate age. Berlin: Springer
- Kováč L (2023) Cognition: from the cosmos down to molecules – and back. EMBO Rep 24: e57263
- Smolin L (2011) https://www.edge.org/conversation/lynn_margulis‐lynn‐margulis‐1938‐2011‐gaia‐is‐a‐tough‐bitch
- Volkenshtein M (1972) Perekrestki nauki [Crossroads of Science]. Moscow: Nauka
- Woese CR (2004) A new biology for a new century. Microbiol Mol Biol Rev 68: 173–186