Abstract
The second law of thermodynamics accounts for the irreversibility of processes in the universe. Although a statement about increasing disorder, it also plays a central role in creating order. Structuring is a way to increase the rate of dissipation of matter and energy. This is the reason why chemical reactions on Earth have produced a profusion of structures. Chemical structures with particularly high stability, maintained by continual dissipation, are designated, somewhat arbitrarily, as living systems. To preserve stability, organisms are unceasingly performing ontic work, assisted by epistemic work. Biological evolution is an ongoing process of knowledge acquisition (cognition) and, correspondingly, of growth of complexity. The acquired knowledge represents epistemic complexity. Biological species are the main “bookkeepers” of acquired knowledge, with individual members of the species functioning as “explorers” of novelty. Science, a human species-specific mode of acquiring knowledge, abounds in metaphors no less than art does. In the postgenomic era, the metaphor of information, along with the related metaphor of selfish genes, may need reconsideration and/or complementation. The world of great complexity, which is becoming the focus of study of contemporary biology, may require—as is the case in quantum physics—descriptions based on the principle of complementarity. Embodied knowledge, molecular engine, ontic and epistemic work, and triggering may become parts of a new conceptual armory.
Key Words: cognitive biology, complexity, information, knowledge, principle of complementarity, selfish gene, thermodynamics
Introduction
The term “information” is among the most frequent terms in common conversation, in the mass communication media, and also in science. By the end of 2003, a Google search for the word “information” returned 3.1 × 10⁸ entries; in February 2007, 2.1 × 10⁹ entries. The doubling time of the item may thus have been about 13 months (which may be compared with the doubling time of computer processing power, given by Moore's law, of about 18 months). In common parlance, we live in an information society, in the era of the information explosion; we daily search for and receive huge amounts of information. In biology, organisms are often conceived of as informational systems, DNA as a carrier of “genetic information”, and the brain is considered to be an extremely powerful information processor. The influential biologist John Maynard Smith wrote in 2000: “A central idea in contemporary biology is that of information. Developmental biology can be seen as the study of how information in the genome is translated into adult structure and evolutionary biology of how the information came to be there in the first place.”1 Early in 2007, Google gave 1.73 × 10⁸ hits on the joint couple “information” and “biology”. The figure indicates how difficult and risky a challenge it is for a reviewer who attempts to give sense to such an overwhelming amount of data.
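The doubling-time estimate can be checked roughly, treating the two counts as comparable and taking the interval from the end of 2003 to February 2007 as about 38 months:

t_doubling ≈ 38 months / log₂(2.1 × 10⁹ / 3.1 × 10⁸) = 38/2.76 ≈ 13.8 months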
The very term “genetic information” has undergone an evolution, from the idea of the genome as a blueprint, a map, or a code, up to the now mostly favored idea of the genome as a recipe.1–3 Countless attempts have been published to apply the theory of information to the analysis of genomes, receptors, and intra- and intercellular communication. The notion of information probably rendered a valuable service in unraveling the processes of protein synthesis and topogenesis and in deciphering nucleic acid and protein sequences. Yet the contemporary state of biology may require a rethinking of the vague and overly extensive use of the term. This has already been suggested and attempted by many investigators (for more recent papers, reviewing also the ideas of preceding investigators, see refs. 4–8). The present paper adds some further arguments and proposals on how to demarcate the term and complement it with other notions. As a corollary, the distinction made between data, information, instruction, command, and knowledge may have a bearing on the use, disuse and abuse of the term information in other branches of the natural sciences, and also in the cultural (social and human) sciences and the humanities.
Elaboration of Conceptions
Epistemological prolegomena.
It has been estimated that the total number of species of animals and plants on Earth may be in the range of 5 to 15 million.9 Framed in the language of thermodynamics, a biological species represents a system, and everything that exists outside the system is the environment. The part of the environment that interacts in some fashion with the system, and hence has a detectable influence on it, is called the surroundings.10 A living system represents a hierarchy of nested structures and consists, at each level of the hierarchy, of individuals, subjects (cells, organisms, groups). Every living system is equipped with a fixed number of distinct sensors. The sensors interact with a specific part of the surroundings, which we call, following Jakob von Uexküll, the Umwelt (Fig. 1).11 Each biological species has its own species-specific Umwelt. The sensors parcel the world into a limited number of items, the differents, and the number of distinct sensors and their specificity determine how fine the eventual graining of the world is. By its complement of sensors, any species distinguishes and assigns significance to a selected set from a myriad of otherwise meaningless objects and events of its surroundings.12 It is from the collection of differents marked by significances that a species-specific reality is constructed. The Umwelt, which is, in the form of reality, experienced by the subjects is, in the terminology of Edmund Husserl, their Lebenswelt.13
Figure 1.
Organism as a chemical system in its environment.
In this respect, the human species is no exception. As has already been stated, if bacteria had consciousness and were capable of self-reflection, their world view would definitely be “bacteriocentric”: they would see themselves as placed at the center of the universe. The human Umwelt is the world of medium dimensions and low complexity, the macroworld.14 The world of very small dimensions, the microworld, that of very great dimensions, the megaworld, and those of great complexity, the multiworlds (psycho- and socioworlds and possibly others), are not directly accessible to us. Indeed, in human perception and conception, they are separated from the macroworld by boundaries that have been named Kant's barriers (Fig. 2). As a biological species, we were selected in biological evolution for living in the macroworld and, until the onset of cultural evolution, we did not have the slightest idea about the existence of the other worlds.
Figure 2.
Human Umwelt is the world of medium dimensions and low complexity, the macroworld. Other worlds are separated from it by Kant's barriers.
Cultural evolution has singled out our species from all other biological species. Humans have the biological capacity to make artifacts. Artifacts have drawn the human species out of its biological confinement and elevated it to a qualitatively novel evolutionary level. The dynamics of the evolution of artifacts is autonomous, only slightly dependent on the intentions and will of human actors. It has dramatically transformed the human Umwelt and surroundings, and increasingly affects the environment of life on Earth as a whole. Artifacts also become new surroundings for humans. In addition to material artifacts, there are also social artifacts. One of them is science. Science itself generates, and is in turn dependent upon, a specific category of material artifacts, scientific instruments. They circumscribe the epistemic horizon of scientific inquiry, delimiting the nature of the questions that can be meaningfully posed by science. It is the available research techniques that promote imaginative speculations to the status of scientific hypotheses.15
Alfred Lotka called mechanical tools and machines “exosomatic instruments”, and the term has been popularized by Georgescu-Roegen.16 In the same vein, the measuring instruments of science can be considered “exosomatic sensors.” As the number of these exosomatic sensors has grown, the human Umwelt has been getting larger. Gradually, the worlds lying behind Kant's barriers have become domains of human interest. Humans are able to construct a reality which now comprises not only the macroworld, but also the micro-, mega- and multiworlds. However, the human mind lacks the competence to conceptualize the worlds behind Kant's barriers. In quantum physics, which deals with the microworld, we claim that the electron is a particle when interpreting data measured by one type of instrument, and we claim that it is a wave on the basis of measurements made by another type of instrument. What, then, “really is” an electron? Both ideas, electron as particle and electron as wave, are correct, even though they rule each other out. Our mind is simply insufficient to grasp the microworld in a more appropriate, consistent way. The recourse to such a duality of representation was named by Niels Bohr the principle of complementarity.17
The present state of quantum physics reflects the paradoxical situation to which humankind has been brought by cultural evolution. Even though we can exploit the phenomena of the quantum world in our technology, and quantum computers may soon raise our computing power by many orders of magnitude, the statement made by Richard Feynman in 1965 continues to hold18: “I think I can safely say that nobody today understands quantum mechanics.” The discrepancy between how much we can do, how much we can intervene in and manipulate the world, on the one hand, and how little we understand it, on the other, is increasing. This is true not only of quantum physics. As science penetrates ever farther behind Kant's barriers, the gulf is getting wider between discursive science, which aims at achieving understanding and sharing it, and instrumental science or technoscience, which manipulates the world and creates ever more artifacts with ever higher speed.
Being locked in their Lebenswelt, humans use concepts of this Lebenswelt in both discursive and instrumental science. According to Allott,19 in any language the full lexicon derives its internal structure from the representation in language of the basic human behavioral abilities of action, perception and speech. This may also apply to the “lexicon” of concepts, with probably no more than five hundred constituting primitives, “conceptual primordials”, possibly constant for all humans of all times. All other, more abstract, concepts are derived from these primordials, and all may be, more or less, metaphors. The language of science is vastly metaphorical, no less than that of art. As pointed out by George Lakoff,20 metaphors are primarily conceptual constructions, and indeed are central to the development of thought. Our understanding of the world is constrained by these forms of cognition.
However much metaphors enlarge our description of the world and refine the graining of the description, they do not prevent a pervasive cognitive fallacy that can be called the “fallacy of pseudogeneralization”. One possibility is to inspect different objects or events, finely discriminate between them, and find a common denominator, which represents a generalization from particular observations. But it is also possible to observe different objects and events in a coarse manner, without perceiving any difference between them. In the latter case, the outcome of the coarse graining is not generalization but “pseudogeneralization.” One commits the fallacy of pseudogeneralization if one uses a single concept, a single name, for different entities.
The notion of information has in most cases the character of a metaphor. This applies also to the excessive use of the notion in biology.
Natural History of the Concept of Information
Multiple use of the information metaphor.
In antiquity, the originally Latin word “information” was rarely used and had the sense of moulding, of giving shape. The mediaeval scholastics, inspired by Aristotle, conceived the term in the philosophical sense of giving form to matter as a substance. Later, the term also acquired the meaning of informing someone, of molding or giving form to their mind, that is, of teaching. The term appeared in cultivated language and gradually became common in everyday language. Dictionaries explain information in the sense of news, data, things told, items of knowledge. Until the middle of the 20th century, the word was virtually absent from scientific writing. Leo Szilard, in his path-breaking study of 1929 on the relationship between entropy and measurement, did not use the term information.21 In the second half of the 20th century, the concept of information made a massive, triumphal entrance with cybernetics, communication theory, systems theory, game theory, and computer science. It has permeated biology, psychology, sociology, economics, and political science. Every aspect of human behavior has come to be considered as consisting in collecting, processing, transforming and storing information. The term gradually lost its anthropocentric status; information transactions have been ascribed to cells, genes, machines, molecules. We have reached a state in which some physicists proclaim that information is, along with matter and energy, the third substance of the world, and all processes in the universe are expounded as information processing.
The problem with such a broad concept of information is similar to the problem with the concept of God: a concept that explains everything explains nothing. The import of such an all-encompassing concept of information for discursive science is questionable. It also largely lacks the property that has been considered the main virtue, if not a prerequisite, of a scientific concept: it is not a quantity that can be measured and used in calculations. It can be of value for instrumental science only if it is substantially narrowed down and reduced to a measurable quantity. That is the case of information in the conception of Claude Shannon.
Shannon's concept of information.
Claude Shannon published his conception of information in 1948.22 He worked it out as a mathematical basis for the theory of communication. It was meant to serve to optimize and economize the transmission of messages. Shannon's information does not relate to the qualitative aspect of messages. It allows optimization of the transmission per unit energy expenditure, per unit time, or per unit cost of any other item of practical importance in the design of communication networks. In Shannon's time, when sending messages by telegraphy was still in use, the price of a telegram depended not on the importance of its contents but merely on the number of words. And it is precisely the number of words, or digits, to be transmitted, not the contents, which determines the efficiency of transmission and is thus of interest to a communication engineer.
Shannon's conception has been described and popularized in innumerable books and papers. For the purpose of the present essay, its interpretation by Myron Tribus, presented in his book23 and two coauthored papers,24,25 seems the most enlightening. According to Tribus, a subject has a well-formulated question Q if he/she can enumerate all possible answers, without necessarily knowing which answer is correct. If one asks something without knowing what the possible answers are, one has instead requested help in formulating the question. When asking a question Q, a questioner always has some prior knowledge about it, which allows assigning probabilities pᵢ to the various possible answers. If an answer is impossible, its p = 0; if an answer is certain, its p = 1. Borrowing a term from thermodynamics, in fact as a metaphor, Shannon defined entropy S in terms of a well-formulated question Q and knowledge X about Q:
S(Q|X) = −K ∑ pᵢ ln pᵢ
K is an arbitrary scale factor, pᵢ is the probability of the ith answer, and ln pᵢ is its natural logarithm. If the logarithm to base 2 is used and K = 1, the unit of entropy is the bit. As is the case with thermodynamic entropy, S is at a maximum if all the probabilities are equal. In this special case we can describe X quantitatively (as expressing maximum ignorance about Q, or indifference with respect to all answers); otherwise X is a qualitative concept. Accordingly, entropy is a quantitative measure of the uncertainty of the questioner. Shannon defined the information in a message as the difference between the uncertainty the questioner had before receiving the message and after receiving the message:
I = S(Q|X) − S(Q|X')
X' is the knowledge of the questioner after receiving the message. If the message was relevant to the question, X' is greater than X was, even if the difference is not generally calculable. In general, knowledge may be considered a quantifiable variable, but one expressed as an ordinal rather than a cardinal quantity, by applying the algebra of inequalities, always in comparison with a reference state. Incidentally, in probability theory we can speak of a probability only with respect to a probability space set up in advance by a subject.
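A minimal numerical sketch of this formulation, in Python and with invented probability values, computes the entropy of a well-formulated question from the prior probabilities and the information in a message as the drop in entropy once the probabilities are reassigned:

```python
import math

def entropy_bits(probs):
    """S(Q|X) = -sum(p * log2 p) over the possible answers, in bits (K = 1)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A well-formulated question with two possible answers and no reason to
# prefer either: prior knowledge X assigns p = 0.5 to each answer.
prior = [0.5, 0.5]

# After a partially reliable message, knowledge X' reassigns the probabilities.
posterior = [0.9, 0.1]

S_before = entropy_bits(prior)      # 1.00 bit: maximum uncertainty
S_after = entropy_bits(posterior)   # ~0.47 bit
info = S_before - S_after           # I = S(Q|X) - S(Q|X'), ~0.53 bit
print(f"S(Q|X) = {S_before:.2f}, S(Q|X') = {S_after:.2f}, I = {info:.2f} bits")
```

With a fully reliable message the posterior entropy falls to zero and I equals the full one bit, which is precisely the situation of the molecular sensor and the telephone call discussed next.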
When Shannon's conception is formulated in this manner, it is obvious that the values of entropy and information, and indeed the very occurrence of information, depend on the receiver of a message, that is, on a subject which is asking a question and expecting a reduction of its uncertainty about the possible answers. If the subject is a simple organism, like a bacterium, all its prior knowledge is built in as a result of the evolutionary past of the species. If the subject is an organism with the ability to learn, the evolutionary knowledge is supplemented with new pieces of knowledge acquired by conditioning in the course of individual life. If the subject is a molecular sensor, with a simple question and two alternative answers (on and off), the maximum entropy (in the case that both answers are equally probable) has a magnitude of only one bit.
The same applies to a human being as a receiver of a message. If the subject has a relative in a hospital who has undergone surgery, and estimates the prior probability of the intervention's success at fifty percent, a telephone message reporting the outcome of the surgery will carry entropy (and information) with a value of just one bit. For a communication engineer, who cares about the reliability and economy of the telephone connection between the sender and receiver of messages, the contents of a specific message, and the single message itself, are of no relevance. For the engineer, it is the set of all possible messages that can be transmitted, the properties of the communication channel, and the manner in which messages are encoded that are of importance, and the relevant entropy has a different, substantially larger, magnitude.
It is not only the prior knowledge of a subject, but also its interest and intention, that determine the quantity of information. If a human observer were interested in a physical system and his/her intention were to specify all quantum states of the system at a particular instant, the all-encompassing message that an appropriate measuring device would furnish—unimaginable so far—would represent a large difference of entropies and hence a very large value of Shannon's information. It is only in this singular case that Shannon's entropy would be equal to thermodynamic entropy. Discussions and controversies about the relationship between Shannon's and thermodynamic entropy, which have lasted for more than half a century (reviewed in refs. 26–29), could have been largely avoided had it been accepted as self-evident that it is always a subject with intentions and prior knowledge that determines the fineness of the graining of the world, and thereby also the magnitudes of entropy and information.
The concept of information as a case of the fallacy of pseudogeneralization.
It is a matter of fundamental importance to restrict, in scientific discourse, the concept of information to Shannon's information. Identifying or equating the notion of “information” with the notions of “data”, “order”, “sequence”, “knowledge”, “instruction”, or “command” is a confusion that should be dismissed. We input data into our computers, not information, and the computers process data, not information. The mass communication media do not serve, with rare exceptions, for the production and transmission of information. Databases of sequences of nucleic acids and proteins are simply depots of data, not information databases.
There is a certain analogy between work and heat on the one hand and information on the other. As emphasized in a rigorous textbook of thermodynamics,30 work and heat have the dimension of energy, but they are not energy; they are two modes of transferring energy between a system and its surroundings. Work and heat are not “things”; they are processes between initial and final states of the energy of a system. Something similar can be said of information. Information about a particular thing or event is equal to the difference between the final and initial knowledge of a subject about the thing or event. It is achieved by collecting, selecting and processing available data in such a way that they rearrange the probabilities of possible answers to a well-formulated question. Work and heat are not static entities; they are nowhere generated and nowhere stored; they are two different ways of transforming energy. It is just as wrong to speak of generating, processing and storing information. We may use another simile and compare information to a photon. Just as there is no stationary photon, there is no stationary information. Information is not a substance of the world, not a “thing”. However fashionable the phrase “to change data into information” may be, it is incorrect if it implies that information is something “objective”, independent of the prior knowledge of its recipient.
The quantity of a datum (though not its quality; data can be imprecise, false or senseless), recorded in the form of a sequence of symbols, can be translated into a sequence of the two digits 0 and 1 and, as such, be expressed as a number of bits. If a subject were interested in the sequence of symbols in the original record and had no prior knowledge, his/her uncertainty would be reduced by the same number of bits. It is only the process of reducing this uncertainty, and hence of increasing the knowledge of the subject, that represents information. The record itself is not information. Referring to Schrödinger,31 we can call the record a “script” (Schrödinger's original word was “code-script”). Full knowledge of the sequence of bases in a nucleic acid will reduce to zero the prior uncertainty of a biochemist who was interested in knowing the sequence. To obtain the complete data, the biochemist would use a specific “exosomatic organ”, a sequencer.
Almost two decades before DNA sequencing became commonplace, Lila Gatlin32 carried out pioneering investigations of the probabilities of base distributions in DNA and summed up her results in a book entitled “Information theory and the living system”. In fact, however, as Peter Sibbald et al33 pointed out in a paper in which they commended Gatlin's work, “we read along a DNA sequence and measure the amount of information or ‘surprise’ that each symbol conveys to us.” The sequence of bases in a nucleic acid of a cell reduces the uncertainty of cellular subjects, which exist at entirely different levels than a human does, and their uncertainty is qualitatively and quantitatively quite different from the uncertainty of the biochemist. Let us take the restriction enzyme EcoRI as a cellular subject. The enzyme EcoRI can be conceived as a sensor that attaches meaning to a single sequence of bases, GAATTC. The magnitude of its uncertainty is obviously quite different from that of a biochemist sequencing the entire DNA. This reservation also applies to impressive attempts to understand molecular recognition by using the mathematics of information theory (e.g., refs. 34 and 35, among many others): quantitative analysis of the sequences and frequencies of bases has no bearing on the chemistry of molecular recognition, which determines the affinity and specificity of the recognition process.
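The disparity between the enzyme's and the biochemist's uncertainties can be made explicit under the simplifying assumption of equiprobable, independent bases (an illustrative sketch only, not a claim about real genome statistics):

S_EcoRI = log₂ 4⁶ = 12 bits; S_biochemist = log₂ 4ⁿ = 2n bits

For a bacterial genome of a few million bases, the biochemist's prior uncertainty is thus of the order of 10⁷ bits, while the enzyme's question about the hexamer under its active site spans at most 12 bits, and only one bit if the question is the binary “cut or do not cut”.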
If information is a matter of a well-formulated question, only a subject that already exists can pose such questions. If a system, as a potential receiver of messages, is only under construction, it makes no sense to attribute uncertainty to it. The question “what is going to be next?”, if asked by a system under construction, would not be a well-formulated question. Accordingly, the construction of a system does not consist in receiving information but, if the system has a designer, in receiving instructions. Instructions can be recorded as a script, and a set of instructions can constitute an algorithm of the construction process. As scripts, instructions can be measured and expressed in bits, but this does not mean that instruction equals information, let alone that it is identical with information. Brooks and Wiley36 proposed the term “instructional information” and maintained that it “comprises a physical array whose informational properties depend only on the properties internal to the system in question” (p. 35). With such a conception, we would have reification not only of information but also of instruction. Instruction has nothing to do with uncertainty and probability; it is subject-dependent and appears to be, in fact, antithetical to information. In the same sense, a child who learns at school scarcely receives any information; it receives commands and, mainly, instructions. These are but three of a number of ways by which a human subject processes data coming from the surroundings.
It follows from this analysis that calling nucleic acids “information carriers” appears to be a misnomer. There have been lasting, largely fruitless, speculations as to whether nucleic acids are “carriers” of data, programs, information, etc. According to Douglas Hofstadter, in the case of nucleic acids not only are programs and data intricately woven together, but the interpreter of programs, the physical processor, and even the language are included in this intimate fusion.37 Since Hofstadter wrote his book, the subject has not been clarified but has become much more tangled. Nucleic acids definitely do not function simply as a store of instructions. Already in 1968, Conrad Waddington remarked that a genotype is like a set of axioms, for instance Euclid's, and a phenotype is like a three-volume treatise on Euclidean geometry.38
Hofstadter pointed to “level mixing in the cell”. But there is also level mixing at other levels of the hierarchy, both lower and upper. At any level, there is always a subject, which is either posing its level-specific questions or receiving commands on how to act. It is the very nature of the subject that needs clarification.
Life as Knowledge-Accumulating Systems
Subjectibility: The Janus face of the second law of thermodynamics.
It has often been said that the second law of thermodynamics is the most important law of nature. It is the prime determinant of the irreversibility of events, of the “arrow of time”, of evolution. Without it, nothing new would happen in the universe; there would only be an eternal tautology of being.39 Various formulations have been given to the law. In its most general formulation, it can be expressed as the principle of indifference. The description of the principle preceded thermodynamics; its origin can be found in philosophy, in Leibniz's principle of sufficient reason. It also forms the basis of the mathematical theory of probability (Table 1).
Table 1.
Principle of indifference
| (1) In philosophy: Nothing happens without a reason (G.W. Leibniz, 1646–1716). |
| (2) In theory of probability: All states or events are regarded as being equally probable as long as we can see no reason why one should occur rather than another. |
| (3) In thermodynamics: Spontaneously, in the absence of forces, matter and energy tend to achieve maximum spreading, dissipation, to eliminate inequalities, to occupy all accessible states, to annul all gradients, to abolish any privileged, improbable positions in space and time, to reach equilibrium. |
In nature, the second law holds in the presence of forces. The forces of nature are for the most part associative ones. Smaller entities of matter associate and form larger entities: the conversion of potential energy to thermal energy, energy spreading, is accompanied by aggregation of matter.40 In addition, several investigators in recent decades reached the conclusion that the formation of material structures is a way of accelerating the process of annulling energy gradients in the universe. This prompted Schneider and Kay to reformulate the second law of thermodynamics.41 Although the second law is a statement about increasing disorder, it also plays a central role in creating order. Thermodynamic systems, exposed to the forces of their surroundings, are moved away from equilibrium, and they utilize all avenues available to counter the applied gradients. As the applied gradients increase, so does the system's ability to oppose further movement from equilibrium. Structuring is a way to increase the rate of dissipation. The term “dissipative structure”, introduced by Ilya Prigogine and his collaborators, thus takes on, according to Schneider and Kay, a new meaning. No longer does it mean just increasing dissipation of matter and energy, but dissipation of gradients as well. Dissipative structures are simply gradient dissipators.
A similar idea was expressed by Rod Swenson.42 However, he did not consider it a reformulation, or an extension, of the second law, but a new law of maximum entropy production: the system will select the path, or assembly of paths, out of the available paths, that minimizes the potential or maximizes the entropy at the fastest rate given the constraints. Stuart Kauffman proposed a tentative fourth law of thermodynamics, according to which “the workspace of the biosphere expands, on average, as fast as it can in its constructing biosphere”.43 Eric Chaisson also finds a direction towards growth of complexity in the evolution of the universe, the measure of which is an ever increasing rate of energy flow through a system of given mass-energy density.44 In fact, the law of maximum energy flow through the system of organic nature, put forth by Alfred Lotka and called by him the law of evolution, may have been an anticipation of all these later formulations.45
In addition, the expansion of the universe, so fast that the universe has been unable to remain internally in equilibrium, has been proposed as an apparently independent cause of the formation of structures.36,46,47 The very clumps of matter can serve the second law in this nonequilibrium world by facilitating entropy production by way of energy gradient degradation.
In any case, because of the unidirectional effect of the second law, the universe is replete with thermodynamic systems that intensively and quickly dissipate the energy and matter in their surroundings. The dissipation enables the systems to keep themselves unchanged and to increase their complexity. Such systems were named “autocatakinetic systems” by Swenson and “autonomous agents” by Kauffman. They are able not only to maintain their onticity, but also to increase in size, to self-organize, to break up (eventually the source of reproduction in living systems), to spread to new locales and there find new gradients of energy to be dissipated. The work used for maintaining the permanence of a system, its onticity, corresponds to “ontic work”. Different autocatakinetic systems compete for their own maintenance and for the degradation of gradients in their surroundings. Not all such systems need be dissipative structures. Conservative structures, constructions, are organized systems that maintain their stability and a distance from equilibrium because high kinetic barriers prevent or retard their destruction and dissipation. We can call those structures, both dissipative and conservative, which maintain their permanence in surroundings undergoing dissipation, the subjects. This is not a metaphor, but a primitive, a conceptual primordial, a definition. The propensity of the world, ensuing from the second law of thermodynamics, to create subjects may be designated as subjectibility. If we want to have a third “substance” in the world, in addition to matter and energy, it is not information but subjectibility.
Knowledge as thermodynamic depth and as work capacity.
A specific kind of Swenson's autocatakinetic systems, or Kauffman's autonomous agents, are living systems. According to Swenson, nonliving autocatakinetic systems are captives of their local potentials. If the local energy resource is removed from a nonliving autocatakinetic system, the system “dies”. In the case of a living system, when food in the vicinity of an organism is depleted, the usual reaction is an increase in activity rather than a decrease or stoppage of activity. The perceptual guidance of movements, and the movement enhancement of opportunities to perceive, extend particular benefits to organisms in their hunt for, and consummation of, energy resources. As aptly put by Konrad Lorenz,48 “Life is an eminently active enterprise aimed at acquiring both a fund of energy and a stock of knowledge, the possession of one being instrumental to the acquisition of the other. The immense effectiveness of these two feedback cycles, coupled in multiplying interaction, is the pre-condition, indeed the explanation, for the fact that life had the power to assert itself against the superior strength of the pitiless inorganic world.”
Alfred Lotka cited Henri Poincaré's observation that it is easy to put one grain of oat into a bag filled with grains of wheat, but almost impossible to retrieve the single oat grain out of a bag full of wheat.45 Both Poincaré and Lotka presented the example as a model of irreversible processes. Much work must be done, much energy dissipated, if one has to take out grain after grain. If, however, one had a measuring device capable of distinguishing grains of oat from grains of wheat and of precisely locating the grain of oat in the bag, retrieving the single grain of oat would be as easy as taking out any grain of the huge amount of wheat. The measuring device would carry embodied knowledge of how to distinguish the two kinds of grain. Let us consider two autonomous agents that have at their disposal only a limited quantity of energy, can accomplish but a limited amount of work, and whose survival depends on finding the single grain of oat. Obviously, the chance of survival would be much higher for the agent capable of recognizing, locating and picking out the oat grain without needing to inspect all the grains one after another. Poincaré's example demonstrates how important embodied knowledge is for the survival of a subject in its environment.
Knowledge embodied in a system corresponds to its epistemic complexity.14 Its increase determines the arrow of biological evolution. According to Hans Kuhn, in the course of evolution organisms gain in quality.49 The quality represents knowledge and is measured by the total number of bits that had to be discarded before the evolutionary stage under consideration was reached. Kuhn's measure of knowledge is related to a similar measure of complexity, for which Seth Lloyd and Hans Pagels introduced the term “thermodynamic depth”.50 They identified the complexity of a thing with the amount of informational and thermodynamic effort involved in putting it together. The measure of complexity of a macroscopic state, d, of a system that has arrived at that state by the ith possible trajectory is −k ln pᵢ, where pᵢ is the probability that the system has arrived at d by the ith trajectory and k is an arbitrary positive constant. They defined the depth of the state to be:
D(d) = −k ln pᵢ
The arbitrary multiplicative constant is chosen to be the Boltzmann constant for systems whose successive configurations can be described in the physical space of statistical mechanics. In this case the depth, now called the thermodynamic depth, becomes the entropy passed on, during the evolution, to degrees of freedom other than those needed in the specification of the final state. In other words, the thermodynamic depth may be equated with the quantity of energy dissipated in the past, as the system evolved through its specific trajectory up to the present state.
What is essential in this measure of complexity is the fact that the present appearance of a state says nothing of its evolutionary past and hence of its complexity. The atomic structure, the sequence of digits describing it, or the blueprint specifying the object is not sufficient. Seth Lloyd has put forth the following illuminating example51: a sand castle built by a child is thermodynamically much deeper than a sand dune. When measuring the thermodynamic depth of the sand castle, we must account not only for the natural processes, the geological and meteorological forces that conspire to make dunes, but also for the child's conscious acts, and even for the evolutionary processes that created the child itself. Replacing the term “thermodynamic depth” by the term “knowledge”, we grasp immediately the epistemological significance of the example.
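A toy calculation in Python, with invented trajectory probabilities, may make the point concrete: two states that look identical can have very different depths, because D(d) depends only on how probable the trajectory to the state was, not on the state's present appearance.

```python
import math

def thermodynamic_depth(p_trajectory, k=1.0):
    """D(d) = -k * ln(p_i): the depth of a state reached via a trajectory
    of probability p_i (Lloyd and Pagels); k is an arbitrary positive constant."""
    return -k * math.log(p_trajectory)

# Invented, purely illustrative probabilities: wind piling sand into a dune
# is a relatively likely trajectory; a child (itself the outcome of a long
# evolutionary trajectory) building a castle of the same sand is vastly
# less probable.
p_dune, p_castle = 1e-3, 1e-30

print(f"depth(dune)   = {thermodynamic_depth(p_dune):.1f}")    # ~ 6.9
print(f"depth(castle) = {thermodynamic_depth(p_castle):.1f}")  # ~69.1
```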
Knowledge, implicated in epistemic complexity, means, however, something more than just thermodynamic depth. Knowledge embodied in a system also means the capacity to do ontic work, necessary for maintaining the onticity, the permanence, of the system. And also to do epistemic work: to sense, measure and record the properties of the surroundings in order to counter their destructive effects and, in the case of more knowledgeable systems, to anticipate them. The epistemic complexity of an agent thus enfolds two processes: one of the evolutionary past, and another of the total set of potential actions to be performed in the future. In fact, evolution seen as thermodynamic deepening appears, from another angle, as a process of continual increase of the distance from equilibrium as a measure of epistemic complexity.14
Knowledge carried by human-made artificial systems has a still more intricate property: its quantity seems to be a function not only of the creator of an artifact, but of its users as well. Consider a portrait of the poet Anna Achmatova drawn by the artist Amedeo Modigliani (Fig. 3).52 Its epistemic complexity encompasses both the creative skill and craft of the artist and the complex effects it may have on beholders of the drawing. But if we digitized the drawing, composed of a few simple lines, and expressed it as a number of bits, we would capture nothing of its richness and appeal. Incidentally, the same applies to the epistemic complexity of an ovum. It comprises not only the script of the DNA sequences of its protein-coding genes, but also the other parts of the genome, and presumed, but so far essentially unknown, hereditary frames, such as core constituents of membranes, the cytoskeleton and nucleosome proteins. We also know essentially nothing of the epigenetic rules53 that may successively unfold in embryogenesis, starting from the inherited asymmetry of the cell components of the ovum.
Figure 3.
A portrait of Anna Achmatova by Amedeo Modigliani. The complexity and embodied knowledge are not “stored” in the few lines of the picture.
Biological species as the main bookkeepers of evolutionarily acquired knowledge.
The dominance of the paradigm of information has its parallel in the domination of genocentrism, the idea that the main unit of natural selection, and, by implication, the main actor of evolution, is an individual gene or, more generally, an individual replicator. According to the widely popularized view of Richard Dawkins, genes (and memes) are self-replicating entities that compete with each other through adaptations that enhance their own replication.54 Dawkins opposed any view that attributed biological adaptations to the operation of selection processes at higher levels of organization—organisms, groups, populations or species. Only slowly is the conception of genocentrism being replaced by the conception of multilevel selection.55,56 The most radical position has been that of Stephen Gould, according to which “Darwinian individuals” can be found across the entire biological hierarchy, from genes through organisms and demes to species.57 All these individuals are hierarchically nested one within another. Particularly important for him has been selection at the species level. Gould and Lloyd wrote that “Manifesting the resilience of our usual metaphors for stubborn persistence the formerly anathematized concept of supraorganismal selection has emerged from previous calumny to a new status of intense discussion and growing importance.”58
From the epistemological point of view, the idea that the major players of the evolutionary game are species makes sense, even though this idea, propounded by biologists of the “pre-genocentric era”, for instance by Konrad Lorenz,59 was later rejected and even ridiculed. Individual organisms can be considered as copies of the knowledge that has accumulated in the evolution of a species; the bookkeeper of this knowledge is the epistemic complexity of the particular species, not that of its individual members. Lloyd and Pagels argue with substance that the complexity of an object does not much increase with the number of its copies.50 “Seven bulls need not be much more complex than one bull. It took billions of years for the earth to evolve one bull; but one bull and a few compliant cows will produce seven bulls relatively speedily.”
In view of Tribus's description of information, a biological species as a whole contains a fixed set of well-formulated questions and also a fixed set of possible answers to each of them. It also has a fixed number of sensors. Individual organisms are just copies of the same ontic and epistemic system. Every biological species is epistemically closed, both semantically—as argued by Rosen60 and Pattee61—and syntactically: for every species, the rules according to which data from the sensors are processed are specific and unchangeable. Species are also reproductively closed, which, according to Gould and Lloyd, has some similarity with the immune system of individual organisms.57 The immune system, as Jerne62 and Edelman63 believe, is a kind of hermetically sealed system that contains all the possible responses to the external antigen world.
Considering life in this way, we have to admit that there are far fewer information transactions in the living world than has usually been assumed. In the case of simple species, unable to learn, stimuli from the surroundings function only as triggers, setting off pre-prepared responses as fixed action patterns. In species that can learn, data received from the surroundings just serve, in the form of information, to reallocate the probabilities of possible responses or, in other words, to choose a response from a constant set of possible responses. This also applies to organisms possessing a central nervous system. Data, perceived and transformed, are used to update hypotheses continually present in the brain and to choose one from preformed alternatives of action. Friedrich Hayek anticipated this already in 1952, when he wrote:64 “An event of entirely new kind, which has never occurred before, and which sets up impulses which arrive in the brain for the first time, could not be perceived at all.”
But individual organisms are not identical copies of some unique Platonic pattern of a species. They serve as arrangements by which a species “explores” new, undiscovered aspects of its surroundings and by which it expands its Umwelt. Some mutations, as evolutionary innovations, represent, in the terminology of Tribus's model, additions to the set of possible answers to an existing question. A major hereditary change may change the question itself or add a novel question to the set of existing ones. Speciation may, indeed, represent a modification of the quality and quantity of the questions which living systems are able to pose to their environment.
A well-posed question—that is, a question with a complete set of all possible answers—can be called an “inquisitory” question. The nonknowledge of a subject asking an inquisitory question is uncertainty: a state in which a subject knows that he (she, it) knows, but is uncertain about the reliability and precision of what he knows. In uncommon situations, in particular situations of danger, organisms may also ask questions for which all possible answers cannot be specified; such questions can be called “exquisitory”. The nonknowledge implied in an exquisitory question corresponds to confusion: a subject knows that he does not know and is in a state of frustration. A subject that does not know that he does not know is in a state of ignorance. Questions that would correspond to such a state would be meaningless. We can thus make a classification of nonknowledge and knowledge (Table 2).
Table 2.
Classification of knowledge and nonknowledge
| Nonknowledge | Question | Knowledge |
| Uncertainty | Inquisitory question | Intrinsic knowledge |
| Confusion | Exquisitory question | Extrinsic knowledge |
| Ignorance | Meaningless question | Surplus knowledge |
Reducing the uncertainty linked to inquisitory questions yields intrinsic knowledge, which is actually not entirely novel knowledge: it leads to more precision in the existing knowledge. Information is implicated only in this particular, intrinsic, knowledge. To characterize a process furnishing extrinsic knowledge, we may introduce, in analogy with information, the neologism “exformation.” Surplus knowledge is that particular knowledge which is specific to a single species existing on Earth, Homo sapiens. It is the knowledge generated in cultural evolution, as has already been suggested in the first part of this study.
Concluding Remarks
The aim of this study has been to present arguments that some common concepts may no longer be of explanatory and heuristic value in contemporary biology. The excessive use of the terminology of information has naturally promoted the metaphor of life as a computer. In the cognitive sciences, the computer continues to be a much-favored model of mind, and also of life as a whole. But life as we know it, natural life (n-life), is a chemical system, and cognition is a property of such a chemical system.12 Artificial life (a-life) need not be founded on chemical principles. Chemical interactions differ fundamentally from other kinds of interactions, such as mechanical combinations of Lego parts. Putting together two pieces of Lego does not bring about a qualitative change, but compounding molecules of hydrogen and oxygen creates a novelty which, at least in a description available to a human subject, was not inscribed in the precursors. Chemistry is a science of emergence. Electromagnetic interactions between atoms produce a profusion of molecules, with properties qualitatively distinct from those of their constituents. Molecules combine and/or self-assemble into supramolecular structures, which function as molecular engines. The richness of chemical novelties, at both the molecular and supramolecular levels, a necessary consequence of the second law of thermodynamics, makes the evolution of life into a permanent fashion parade, and natural selection becomes not a device for creating novelty but one for pruning, for culling—or, to put it poetically, for culling the choicest verses from the poem of life.
The concept of information and the metaphor of the computer are closely linked to the concept of genocentrism and the metaphor of the selfish gene. Thomas Kuhn depicted the dynamics of science as a succession of replacements of paradigms, and he used as an example the replacement of Ptolemy's geocentric paradigm by Copernicus's heliocentric paradigm.65 The conception of genocentrism may have undergone a similar evolution as had, in the past, Ptolemy's conception of geocentrism: to account for new observations, ever new epicycles had to be added. The simpler conception of Copernicus could do without epicycles. The metaphors of selfish genes, of information as a property of the world and as the main feature of life, of the brain as a computer, appear to be acquiring an ever more Ptolemaic character. This does not mean, however, that a new “Copernican” paradigm should be simple. On the contrary, biological “Ptolemaism” may be abandoned precisely because of its excessive, albeit elegant, simplicity. Biology now focuses on the world of great complexity. But that is a world that lies behind Kant's barriers.
This is apparent from the difficulties we may face when attempting to describe knowledge understood as thermodynamic depth. The dynamics of the past, the evolutionary trajectory, and, at the same time, the dynamics of the future, the sets of possible actions, are “frozen” in a form which may not be expressible in any linear script. Erwin Schrödinger ingeniously anticipated this in his prophetic book “What is life?”31 He attempted to envisage the genetic “code-script” as a “four-dimensional pattern”, containing the entire pattern of an individual's future development and of its functioning in the mature state. “Law-code”, “executive power”, “architect's plan” and “builder's craft”—in one! As he put it with prescience, “This is a marvel—than which only one is greater; one that, if intimately connected with it, yet lies on a different plane. I mean the fact that we, whose total being is entirely based on a marvelous interplay of this very kind, yet possess the power of acquiring considerable knowledge about it. I think it is possible that this knowledge may advance to little short of a complete understanding—of the first marvel. The second may well be beyond human understanding.”
Biology may have reached a similar stage as physics did with quantum mechanics. Howard Pattee, who has long professed that the classical structure-function relation in biology needs two modes of description, neither derivable from nor reducible to the other, should be credited for his perspicacity.66 Just as the interpretations of the electron as a particle and as a wave are complementary, the things and events revealed by contemporary biology may require descriptions based on the principle of complementarity.
Acknowledgements
This study was supported in part by Howard Hughes Medical Institute Research Grant No. 55005622. I thank my colleagues Jozef Nosek and Lubomír Tomáška for advice and valuable discussions.
Footnotes
Previously published online as a Plant Signaling & Behavior E-publication: http://www.landesbioscience.com/journals/psb/abstract.php?id=4113
References
- 1. Maynard Smith J. The concept of information in biology. Philosophy of Science. 2000;67:177–194.
- 2. Dawkins R. The blind watchmaker. New York: Norton; 1987.
- 3. Williams GC. The pony fish's glow. New York: Basic Books; 1997.
- 4. Stuart CIJM. Bio-informational equivalence. J Theor Biol. 1985;113:611–636. doi: 10.1016/s0022-5193(85)80183-1.
- 5. Emmeche C, Hoffmeyer J. The semiotic metaphor in biology. Semiotica. 1991;84:1–42.
- 6. Conrad M, Marijuan PC, editors. Proceedings of the First Conference on Foundations of Information Science: From computers and quantum physics to cells, nervous systems, and societies. BioSystems. 1996;38:87–266. doi: 10.1016/0303-2647(95)01578-7.
- 7. Markoš A, editor. Readers of the book of life. Oxford: Oxford University Press; 2002.
- 8. Capurro R, Hjørland B. The concept of information. Ann Rev Information Sci Technol. 2003;37:343–411.
- 9. May RM. The dimensions of life on Earth. In: Nature and human society: The quest for a sustainable world. Washington, DC: National Academy Press; 2000.
- 10. Wark K. Thermodynamics. New York: McGraw-Hill; 1983.
- 11. von Uexküll J. Streifzüge durch die Umwelten von Tieren und Menschen. Hamburg: Rowohlt; 1956. (In German).
- 12. Kováč L. Life, chemistry and cognition. EMBO Reports. 2006;7:562–566. doi: 10.1038/sj.embor.7400717.
- 13. Husserl E. Die Krisis der europäischen Wissenschaften und die transzendentale Phänomenologie. Den Haag: Biemel; 1976. (In German).
- 14. Kováč L. Fundamental principles of cognitive biology. Evolution and Cognition. 2000;6:51–69.
- 15. Kováč L. Physics, mind, society: back and forth. Appl Magn Reson. 2007;31:11–28.
- 16. Georgescu-Roegen N. The entropy law and the economic process. Cambridge, MA: Harvard University Press; 1971.
- 17. Bohr N. Atomphysik und menschliche Erkenntnis, vol. I. Braunschweig: Vieweg; 1974. (In German).
- 18. Feynman RP. The character of physical law. Cambridge, MA: MIT Press; 1965.
- 19. Allott R. Natural origin of language. Knebworth, UK: Able; 2001.
- 20. Lakoff G, Johnson M. Metaphors we live by. Chicago, IL: University of Chicago Press; 1980.
- 21. Szilard L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Zeitschrift für Physik. 1929;53:840–856. (In German).
- 22. Shannon CE. A mathematical theory of communication. Bell Syst Tech J. 1948;27:379–423, 623–656.
- 23. Tribus M. Thermostatics and thermodynamics. Princeton: Van Nostrand; 1961.
- 24. Tribus M, McIrvine EC. Energy and information. Sci Amer. 1971;224:179–190.
- 25. Costa de Beauregard O, Tribus M. Information theory and thermodynamics. Helv Phys Acta. 1973;47:238–247.
- 26. Pierce JR. An introduction to information theory. 2nd ed. New York: Dover; 1980.
- 27. Leff HS, Rex AF, editors. Maxwell's demon: Entropy, information, computing. Bristol: Hilger; 1990.
- 28. Yockey HP. Information theory and molecular biology. Cambridge: Cambridge University Press; 1992.
- 29. Adami C. Introduction to artificial life. New York: Springer; 1998.
- 30. Bazarov IP. Termodinamika. Moscow: Vysshaya shkola; 1983. (In Russian).
- 31. Schrödinger E. What is life? Cambridge: Cambridge University Press; 1944.
- 32. Gatlin LL. Information theory and the living system. New York: Columbia University Press; 1972.
- 33. Sibbald PP, Banerjee S, Maze J. Calculating higher order DNA sequence information measures. J Theor Biol. 1989;136:475–483. doi: 10.1016/s0022-5193(89)80159-6.
- 34. Berg OG, von Hippel PH. Selection of DNA binding sites by regulatory proteins. J Mol Biol. 1987;193:723–750. doi: 10.1016/0022-2836(87)90354-8.
- 35. Schneider TD. Measuring molecular information. J Theor Biol. 1999;201:87–92. doi: 10.1006/jtbi.1999.1012.
- 36. Brooks DR, Wiley EO. Evolution as entropy. 2nd ed. Chicago, IL: The University of Chicago Press; 1986.
- 37. Hofstadter DR. Gödel, Escher, Bach: An eternal golden braid. Harmondsworth: Penguin Books; 1980.
- 38. Waddington CH. The basic ideas of biology. In: Waddington CH, editor. Towards a theoretical biology, vol. 1: Prolegomena. Edinburgh: Edinburgh University Press; 1968.
- 39. Prigogine I, Stengers I. Dialog mit der Natur. München: Piper; 1981. (In German).
- 40. Wicken JS. Evolution, thermodynamics, and information. New York: Oxford University Press; 1987.
- 41. Schneider E, Kay J. Life as a manifestation of the second law of thermodynamics. Math Comp Model. 1994;19:25–48.
- 42. Swenson R. Spontaneous order, autocatakinetic closure, and the development of space-time. Annals NY Acad Sci. 2000;901:311–319. doi: 10.1111/j.1749-6632.2000.tb06290.x.
- 43. Kauffman S. Investigations. New York: Oxford University Press; 2000.
- 44. Chaisson E. Cosmic evolution: The rise of complexity in nature. Cambridge, MA: Harvard University Press; 2001.
- 45. Lotka AJ. Elements of mathematical biology. New York: Dover; 1956.
- 46. Layzer D. The arrow of time. Sci Amer. 1975;233:56–69.
- 47. Landsberg PT. Can entropy and “order” increase together? Phys Lett A. 1984;102:159–169.
- 48. Lorenz K. Behind the mirror. New York: Harcourt Brace Jovanovich; 1977.
- 49. Kuhn H. Origin of life and physics: Diversified macrostructure—inducement to form information-carrying and knowledge-accumulating systems. IBM J Res Develop. 1988;32:37–46.
- 50. Lloyd S, Pagels H. Complexity as thermodynamic depth. Ann Phys. 1988;188:186–213.
- 51. Lloyd S. The calculus of intricacy. The Sciences. 1990:38–44.
- 52. Belsky Lagazzi I. Anna Achmatova. 1990. (http://www.larici.it)
- 53. Lumsden CJ, Wilson EO. Genes, mind and culture: The coevolutionary process. Cambridge, MA: Harvard University Press; 1981.
- 54. Dawkins R. The selfish gene. Oxford: Oxford University Press; 1976.
- 55. Sober E, Wilson DS. Unto others: The evolution and psychology of unselfish behavior. Cambridge, MA: Harvard University Press; 1998.
- 56. Michod RE. Darwinian dynamics: Evolutionary transitions in fitness and individuality. Princeton, NJ: Princeton University Press; 1999.
- 57. Gould SJ. The structure of evolutionary theory. Cambridge, MA: Harvard University Press; 2002.
- 58. Gould SJ, Lloyd EA. Individuality and adaptation across levels of selection: How shall we name and generalize the unit of Darwinism? Proc Natl Acad Sci USA. 1999;96:11904–11909. doi: 10.1073/pnas.96.21.11904.
- 59. Lorenz K. On aggression. London: Methuen; 1966.
- 60. Rosen R. Life itself. New York: Columbia University Press; 1991.
- 61. Pattee HH. Evolving self-reference: Matter, symbols and semantic closure. Communication and Cognition. 1995;12:9–27.
- 62. Jerne NK. The generative grammar of the immune system. Science. 1985;229:1057–1059. doi: 10.1126/science.4035345.
- 63. Edelman G. Bright air, brilliant fire: On the matter of the mind. New York: Basic Books; 1992.
- 64. Hayek FA. The sensory order: An inquiry into the foundations of theoretical psychology. Chicago, IL: University of Chicago Press; 1952.
- 65. Kuhn T. The structure of scientific revolutions. Chicago, IL: University of Chicago Press; 1970.
- 66. Pattee HH. The complementarity principle in biological and social structures. J Soc Biol Struct. 1978;1:191–200.

