Frontiers in Systems Neuroscience
2020 Aug 13;14:57. doi: 10.3389/fnsys.2020.00057

Renewed Perspectives on the Deep Roots and Broad Distribution of Animal Consciousness

Louis N Irwin 1,*
PMCID: PMC7438986  PMID: 32903840

Abstract

The vast majority of neurobiologists have long abandoned the Cartesian view of non-human animals as unconscious automatons—acknowledging instead the high likelihood that mammals and birds have mental experiences akin to subjective consciousness. Several lines of evidence are now extending those limits to all vertebrates and even some invertebrates, though graded in degrees as argued originally by Darwin, correlated with the complexity of the animal’s brain. A principal argument for this view is that the function of consciousness is to promote the survival of an animal—especially one actively moving about—in the face of dynamic changes and real-time contingencies. Cognitive ecologists point to the unique features of each animal’s environment and the specific behavioral capabilities that different environments invoke, thereby suggesting that consciousness must take on a great variety of forms, many of which differ substantially from human subjective experience.

Keywords: cognition, awareness, evolution, arthropods, cephalopods, vertebrates

Introduction

Ever since Darwin, students of animal behavior and the nervous system have generally regarded consciousness as a product of the brain, subject to the influence of natural selection (Richards, 1987). Just as the complexity of brains varies across the full range of animal taxa, Darwin (1871) and his contemporary, Romanes (1883), argued that cognition has emerged in grades of complexity over the evolutionary history of animals.

Notwithstanding this early biological approach to mental phenomena, the scientific study of consciousness entered into a hiatus during the first half of the 20th century, accentuated by the rise of genetics (Richards, 1987), the turn toward behaviorism in psychology (Griffin, 1984), and some difficult philosophical problems (Nagel, 1974; Chalmers, 1995; Dennett, 2017). In recent years, however, neuroscientists have started applying their increasingly sophisticated techniques to the study of mental activity, and neurobiologists have resumed consideration of what ecology, ethology, and evolutionary history have to suggest about the original contention of Darwin and his contemporaries that consciousness is a function of the brain molded by natural selection (Churchland, 2007; Engel, 2010).

Perhaps because consciousness is so often thought of as a thing instead of a process (Rose, 1976), it most often is expressed as a binary possibility—either it exists or it does not (Chaisson, 1987; Humphrey, 1992; LeDoux, 2019). And among those animals capable of it, consideration is seldom given to varying degrees or alternative modes of consciousness. The goal of this article is to briefly review the perspectives from cognitive ecology and comparative neurobiology that suggest a return to the view of Darwin and his contemporaries is warranted, and to point out that if indeed consciousness has been forged through natural selection to be suited uniquely to the environment the animal inhabits, logic suggests that it must be multimodal, occurring in highly variable forms largely remote from human experience.

Definition

Of the modern attempts to define consciousness, I prefer that of John (2003) who described it as “the subjective awareness of momentary experience interpreted in the context of personal memory and the present state.” The terms “awareness” and “experience” are themselves difficult to define without circular references, and “subjective” necessarily invokes phenomenological (personal) experience resistant to objectivization, but all of them can plausibly be attributed to at least some animals with nervous systems of sufficient complexity.

Four features of consciousness deemed irreducible by Feinberg (2012) serve as useful elaborations of John’s definition. He posited that consciousness is: (1) referential, or experienced as occurring outside the head; (2) unified, or perceived as coherent scenes, sensations, events, or emotions; (3) qualitatively variable, consisting of sensory gradations and variations within modalities (as in colors, sounds, intensities, etc.); and (4) causational, in its ability to trigger subsequent mental activity and affect behavior.

To summarize by integrating John’s definition with Feinberg’s features, consciousness is the personal awareness of unified and qualitatively textured current or recalled experience, perceived as existing in the animal’s external or bodily environment, with the capacity to induce further mental activity and/or behavioral action.

Nature and Function

Any argument about the origin and varieties of consciousness must begin with a consideration of its nature and function. From clinical observations and personal experience, human consciousness appears in gradations from marginal awareness to full and focused attention. It should be noted, however, that attention and consciousness are not the same thing. Stimuli can be attended to unconsciously, and subjects can be conscious of experiences that they are not attending to (Koch and Tsuchiya, 2007).

In the literature of comparative animal psychology, references to roughly three degrees of consciousness are common. The first degree arises from the detection of physical stimuli at the body’s periphery or interior, capable of eliciting an adaptive reflex. This is referred to as “primary” (Edelman, 2003), or “sensory” (Feinberg and Mallatt, 2016), consciousness. It requires some level of arousal or vigilance to make the animal receptive to stimulation and capable of motor activity but does not necessarily imply rich, textured, or complex content. Some authors consider awareness of feelings or affect as distinguishable from but comparable in degree to sensory consciousness (Feinberg and Mallatt, 2016). The second degree of consciousness is self-awareness (Churchland, 2002; Lou et al., 2020), which Churchland (2013) argued is just another form of perception. The highest degree of consciousness is meta-cognition or the conscious knowledge that individuals have about their cognitive capacities (Smith et al., 2003; Al Banna et al., 2016). The second and third levels of consciousness do imply increasingly complex content. As used in this article, consciousness refers to its primary form (both sensory and affective), unless stated otherwise.

A common approach to discerning the function of consciousness is to focus on the key features of subjective experience: its stabilizing properties and qualitative richness, dynamic integration, situatedness, and intentionality (Pennartz et al., 2019). James (1884), for instance, held that consciousness serves to direct attention and dampen chaotic cerebral activity. He viewed it as a means of focusing on one of several simultaneously possible objects or trains of thought (James, 1890). As he noted, “My experience is what I agree to attend to” (James, 1884). A modern version of the same view was expressed by Edelman (2003): “the neural systems underlying consciousness arose to enable high-order discriminations in a multidimensional space of signals,” and qualitative differences in content (qualia) provide a basis for those discriminations.

Many authors have seen the integrative nature of consciousness as central to its function (Edelman, 2003; Tononi, 2012; Miyahara and Witkowski, 2019). In some cases, the integration is directed toward managing the multiplicity of incoming sensory information. In others, it is seen as essential for accessing memory stores with which it evaluates the context and significance of incoming information. Edelman saw consciousness as the integration of perceptual and motor events together with memory to construct a multimodal scene in the present (Edelman, 2003). For Pennartz et al. (2019), the biological function of consciousness is to present the subject with a multimodal, situational survey of the surrounding world and body, subserving complex decision-making and goal-directed behavior.

Consciousness is particularly important for enabling a mobile animal to evaluate real-time changes in its situation and behave appropriately in its environment (Merker, 2005). Indeed, the essence of consciousness for some authors revolves around its role in evaluating appropriate actions to take in given circumstances, anticipating the consequences of those actions, and updating the animal’s position and orientation as it moves through space (Churchland, 2002; Engel, 2010; Clark, 2016; Buzsáki, 2019). Griffin (1984) asserted that an animal is conscious if it is aware of what it is doing or intending to do. As an appreciation for the central role of place in the cognitive landscape has grown (Irwin and Irwin, 2020), so has the realization that movement through its milieu is largely how an animal mentally creates the dimensions of its environment (Merleau-Ponty, 1945; Sheets-Johnstone, 1999).

Every hypothesis about the function of consciousness has at its core the view that it enables the animal to make optimal use of the information available to it, which has obvious survival value. This is not to assert that consciousness is necessary for every beneficial activity: adaptive behaviors, from the tropisms of unicellular microbes to complex instinctual behaviors in vertebrates, can occur without conscious reflection. But the ability to maximize the utility of information, integrate it with memory, and focus on the elements most critical for survival, either in the moment or over the longer term, is a biological capability susceptible to favorable natural selection.

Cognitive Ecology

Jakob von Uexküll (1926) was an early proponent of the argument that organisms experience life in terms of species-specific, spatio-temporal, “self-in-world” frames of reference. On this view, each organism actively creates and reshapes its own perceptual world. Consequently, the minds of different organisms differ, which follows from the individuality and uniqueness of the history of every single organism. Gibson (1977) coined the term affordance to denote what the environment uniquely makes possible for any given species. Many observers have noted the need for an animal’s perception and cognition to match the affordances of its environment. Gallagher (1997) argued that “Consciousness and the brain develop and function within a form of existence that is already defined by the world it inhabits.” Anderson (2014) observed that neural organization should be understood in terms of “the brain’s differential propensities to influence the organism’s response to the various features or affordances in its environment.”

There is strong evidence that brains have evolved to respond to environmental pressures (Marino, 2005). Various measures of brain size are positively correlated with: (1) feeding innovation, learning, and tool use; (2) size of behavioral repertoire; (3) social complexity; (4) dietary complexity; and (5) unpredictability of the environment. Ecological principles like the unpredictability of resources in space and time may drive different types of cognition (Lefebvre and Sol, 2008).

An alternative view is that the complexity of the environment in general, rather than the unique features of different environments, determines the nature of consciousness (Shumway, 2008; Sol, 2009; Mettke-Hofmann, 2014). However, brain evolution does not necessarily proceed from simple to complex, small to large, or diffuse to differentiated (Bullock, 1984; Kaas, 2002). Marino (2005) believes that brain evolution relates more to environmental change than to environmental complexity. On the other hand, neural and behavioral complexity are clearly correlated (Parker and McKinney, 1999; Neubauer, 2012), so the role that complex environments play in shaping the nature of consciousness cannot be dismissed. Nonetheless, a symposium on the evolution of neurobiological specializations in mammals concluded that there are no simple brains, and that brains reflect ecology (Marino and Hof, 2005). Ethologists and comparative animal psychologists have echoed this perspective. Hodos (1986), for instance, argued that natural selection optimizes mechanisms of perception maximally appropriate for the ecological demands of each species, rather than the complexity of information processing in a general sense.

Not everyone has embraced an emphasis on ecological determinants of cognition. Macphail and Bolhuis (2001) claimed to find no convincing support for an ecological account of cognition, though their argument seems to focus on mechanisms of learning specifically more than on consciousness (Bolhuis, 2015). The fact is that all animals exist in an ecological setting, and to the extent that natural selection responds to ecological mandates (Marino, 2005; Sol et al., 2005), it is reasonable to assume that the nature of an animal’s conscious awareness must be attuned to its environmental setting.

Neurobiology

While evolutionary “gradations” in consciousness have been advocated since the time of Darwin (1871) and Romanes (1883), consciousness has usually been discussed as a bimodal phenomenon, with some evolutionary threshold needing to be achieved for consciousness to appear in an all-or-none fashion (Jaynes, 1976; Humphrey, 1992; LeDoux, 2019). But neuroanatomy, neurochemistry, and adaptive behavior are not discontinuous over evolutionary time scales, so the sudden emergence of consciousness where it did not previously exist is not plausible. Rather, a gradual increase in the resolution and complexity of awareness, coupled with increasing integration of perception with memory as brains became larger and more complex during phylogeny, is logically more persuasive (Rose, 1976; Bullock, 1984; Griffin, 1984; Tononi, 2004; Tononi and Koch, 2015; Mercado, 2008; Koch, 2012; Kiverstein and Rietveld, 2018).

This does not mean that discontinuities in the degree of consciousness do not exist across some phyletic lines. Stark differences in the complexity of consciousness between invertebrates and vertebrates, between birds and mammals and the other vertebrate classes, and especially between humans and non-human primates, are generally assumed. What neurobiological evidence can be advanced in support of any of these assumptions?

Feinberg and Mallatt (2016) have provided the most detailed and sophisticated analysis of the occurrence of consciousness across the animal kingdom. They posit seven indicators of sensory consciousness (a large number of neurons, three or more synaptic relays before the pre-motor center, isomorphic organization, reciprocal interactions, multisensory convergence, a neuroanatomical locus for memory, and a selective attention mechanism), as well as five indicators of affective consciousness (operantly learned response to reward or punishment, behavioral distinction between good and bad outcomes, frustration behavior, self-delivery of analgesics, and approach to reinforcing drugs). Using these criteria, Feinberg and Mallatt (2016) propose that some arthropods (especially among the insects and crustaceans), the cephalopods (octopi, squids, nautiloids), and all vertebrates are likely conscious. While others—especially those focused on the assessment of meta-cognition in mammals and birds—are more skeptical (Humphrey, 1992; Butler, 2008; LeDoux, 2019), the capacity of all vertebrates to experience primary and affective consciousness has widespread support (Griffin, 1984; Koch, 2012; Feinberg and Mallatt, 2016).
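The twelve indicators lend themselves to a simple checklist treatment. The sketch below is an illustrative tally only, not Feinberg and Mallatt’s actual assessment procedure; the indicator keys and the example species profile are paraphrased assumptions introduced here for demonstration:

```python
# Hypothetical checklist inspired by Feinberg and Mallatt's (2016) indicators.
# Indicator names and the example profile are illustrative assumptions,
# not the authors' published scoring method.

SENSORY_INDICATORS = [
    "many_neurons", "three_plus_synaptic_relays", "isomorphic_organization",
    "reciprocal_interactions", "multisensory_convergence",
    "memory_locus", "selective_attention",
]

AFFECTIVE_INDICATORS = [
    "operant_learning", "good_bad_discrimination", "frustration_behavior",
    "analgesic_self_delivery", "drug_approach",
]

def tally(profile):
    """Count how many sensory and affective indicators a species profile satisfies."""
    sensory = sum(profile.get(k, False) for k in SENSORY_INDICATORS)
    affective = sum(profile.get(k, False) for k in AFFECTIVE_INDICATORS)
    return sensory, affective

# Invented example profile: all sensory indicators plus three affective ones.
honeybee = {k: True for k in SENSORY_INDICATORS + AFFECTIVE_INDICATORS[:3]}
print(tally(honeybee))  # prints (7, 3)
```

Such a tally makes explicit that the framework yields graded evidence (counts of satisfied indicators) rather than a binary verdict, consistent with the article’s argument against all-or-none consciousness.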

Because the neurobiology of all the vertebrates is known in much greater detail than that of the invertebrates, distinctions among the vertebrates can be discerned. For example, brain mass relative to body mass clusters similarly among bony fishes, amphibians, and reptiles, with birds and mammals together at a higher level, and cartilaginous fishes in between those two groups. Neuron numbers are in the tens of millions for the ectothermic classes, hundreds of millions in most mammals, and billions in most primates (Herculano-Houzel, 2017). To the extent that neural complexity can be related to degrees of consciousness, the range for both is great across the vertebrates overall.

The waking state, and therefore the capacity for awareness of the environment, has long been recognized as dependent on activation of a diffuse network of neurons emanating from the ancestral brain stem of vertebrates (Moruzzi and Magoun, 1949). This circuitry appears to be necessary for sensory awareness, but does not in itself provide the content of experience; that is assumed to be obtained from perceptions formulated in various regions of the cerebral cortex, which also holds distributed memory (Koch, 2012) and is reciprocally connected to various nuclei in the thalamus.

The neural substrate for the complex cognitive functions associated with higher-level consciousness in mammals and birds is based on patterns of circuitry that are substantially less elaborated, with some components actually lacking, in reptiles, while the major thalamopallial circuits involving sensory relay nuclei are conspicuously absent in amphibians (Butler and Cotterill, 2006). By these criteria, the potential for higher-level consciousness in reptiles and amphibians appears to be lower than in birds and mammals.

A plausible blend of the above observations is that all vertebrates have the capacity for primary (sensory and affective) consciousness, but that the qualitative resolution and detail of conscious processes have risen as neocortical cell numbers have increased across different taxa.

While neural complexity may be marginal in insects (~1 million neurons in bees), many display behavior that goes well beyond simple reflexes and conditioned responses (Menzel, 2012; Giurfa, 2013; Chittka and Wilson, 2019). The complexity of the cephalopod brain, with >100 million neurons, exceeds that of frogs (Godfrey-Smith, 2016). Many arthropods and some cephalopods have complex neural regions involved in learning (hence in memory storage)—“mushroom bodies” in insects, and the vertical lobe in octopi and squid. The behavior of arthropods and cephalopods is highly variable, in keeping with the variety of environments the animals occupy. Ecological psychologists argue that behavior—hence the nature of consciousness presumed to be needed to make the behavior possible—varies with the environmental constraints on each animal’s survival. Sensory consciousness may therefore be created by diverse neural architectures, often through convergent evolution (Kaas, 2002; Emery and Clayton, 2004; Emery, 2006; Lisney and Collin, 2006; Lefebvre and Sol, 2008; Feinberg and Mallatt, 2016). It also follows, however, that the nature of that consciousness may vary greatly from that experienced by humans (Nagel, 1974).

Ancestral vertebrates and arthropods were present in the early Cambrian, 510 million years ago (mya). The earliest cephalopods date from about 490 mya. If primary consciousness was present in some members of all three groups, which are very distantly related, the origin of consciousness is ancient and, in all probability, evolved independently (Feinberg and Mallatt, 2016).

Discussion

Modern science avoids subjectivity as much as possible, so the phenomenological, or subjective, nature of personal experience has always been the primary challenge to scientific research on consciousness. To that has been added centuries of introspection that treated consciousness as an entity, but one lacking material substance. Yet other subjective aspects of mental activity, like perception, emotion, learning, and dreaming have advanced progressively toward neuroscientific illumination. If consciousness were viewed more like those phenomena, as a process, perhaps the challenge to its scientific study would not seem so formidable.

The adaptive nature of consciousness, which focuses on the intimate interaction between body, brain, and environment, provides an ecological and evolutionary platform for the study of consciousness familiar to all biologists. The brain, and therefore the potential for consciousness, develops and functions in every species within a form of existence that is defined by the world it inhabits (Gallagher, 1997).

This article has argued that consciousness is more widespread than is generally believed, is generated by a diversity of neural substrates, has evolved independently several times, and appears over a range of degrees in a variety of forms (Feinberg and Mallatt, 2016). Technological advances in monitoring and visualizing brain activity in humans will surely illuminate human consciousness in ever-growing detail (Varela, 1996; Crick and Koch, 1998; Seth et al., 2006; Massimini et al., 2009; van Vugt et al., 2018). By combining the known electrophysiological correlates of place perception and motor activity (O’Keefe, 1990; Finkelstein et al., 2016; Moser et al., 2017) with established neural indicators of awareness (Moruzzi and Magoun, 1949; Frith, 2002; Hameroff, 2010), a deeper understanding of the interaction between motor control and consciousness will be achieved (Stocker, 2016).

More attention should now be turned as well to the study of other animals in their natural environments, combining ethological techniques developed over the decades with increasingly sophisticated neuroscience methodologies (Morris, 2005; Gallagher and Zahavi, 2008; Boly et al., 2013; Reiter et al., 2017; Pennartz et al., 2019), to reveal indicators of consciousness in other species. Several other strategic and experimental approaches for assessing comparative cognition (including consciousness) are proposed in Irwin and Irwin (2020).

Author Contributions

The author confirms being the sole contributor of this work and has approved it for publication.

Conflict of Interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Al Banna M., Redha N. A., Abdulla F., Nair B., Donnellan C. (2016). Metacognitive function poststroke: a review of definition and assessment. J. Neurol. Neurosurg. Psychiatry 87, 161–166. 10.1136/jnnp-2015-310305 [DOI] [PubMed] [Google Scholar]
  2. Anderson M. L. (2014). After Phrenology: Neural Reuse and the Interactive Brain. Cambridge, MA: MIT Press. [Google Scholar]
  3. Bolhuis J. J. (2015). Evolution cannot explain how minds work. Behav. Processes 117, 82–91. 10.1016/j.beproc.2015.06.008 [DOI] [PubMed] [Google Scholar]
  4. Boly M., Seth A. K., Wilke M., Ingmundson P., Baars B., Laureys S., et al. (2013). Consciousness in humans and non-human animals: recent advances and future directions. Front. Psychol. 4:625. 10.3389/fpsyg.2013.00625 [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Bullock T. H. (1984). The application of scientific evidence to the issues of use of animals in research: the evolutionary dimension in the problem of animal awareness. IBRO News 12, 9–11. [Google Scholar]
  6. Butler A. B. (2008). Evolution of brains, cognition, and consciousness. Brain Res. Bull. 75, 442–449. 10.1016/j.brainresbull.2007.10.017 [DOI] [PubMed] [Google Scholar]
  7. Butler A. B., Cotterill R. M. (2006). Mammalian and avian neuroanatomy and the question of consciousness in birds. Biol. Bull. 211, 106–127. 10.2307/4134586 [DOI] [PubMed] [Google Scholar]
  8. Buzsáki G. (2019). The Brain from Inside Out. New York, NY: Oxford University Press. [Google Scholar]
  9. Chaisson E. (1987). The Life Era: Cosmic Selection and Conscious Evolution (Authors Guild Backprint.com ed.). Boston: Atlantic Monthly Press. [Google Scholar]
  10. Chalmers D. J. (1995). Facing up to the problem of consciousness. J. Consciousness Stud. 2, 200–219. [Google Scholar]
  11. Chittka L., Wilson C. (2019). Expanding consciousness. Amer. Sci. 107, 364–369. 10.1511/2019.107.6.364 [DOI] [Google Scholar]
  12. Churchland P. S. (2002). Self-representation in nervous systems. Science 296, 308–310. 10.1126/science.1070564 [DOI] [PubMed] [Google Scholar]
  13. Churchland P. S. (2007). Neurophilosophy: the early years and new directions. Funct. Neurol. 22, 185–195. [PubMed] [Google Scholar]
  14. Churchland P. M. (2013). Matter and Consciousness. 3rd Edn. Cambridge, MA: MIT Press. [Google Scholar]
  15. Clark A. (2016). Surfing Uncertainty: Prediction, Action, and the Embodied Mind New York: Oxford University Press. [Google Scholar]
  16. Crick F., Koch C. (1998). Consciousness and neuroscience. Cereb. Cortex 8, 97–107. 10.1093/cercor/8.2.97 [DOI] [PubMed] [Google Scholar]
  17. Darwin C. (1871). The Descent of Man. 2nd Edn. New York: A. L. Burt. [Google Scholar]
  18. Dennett D. C. (2017). From Bacteria to Bach and Back. New York, NY: W.W. Norton and Co. [Google Scholar]
  19. Edelman G. M. (2003). Naturalizing consciousness: a theoretical framework. Proc. Natl. Acad. Sci. U S A 100, 5520–5524. 10.1073/pnas.0931349100 [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Emery N. J. (2006). Cognitive ornithology: the evolution of avian intelligence. Philos. Trans. R. Soc. Lond. B Biol. Sci. 361, 23–43. 10.1098/rstb.2005.1736 [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Emery N. J., Clayton N. S. (2004). The mentality of crows: convergent evolution of intelligence in corvids and apes. Science 306, 1903–1907. 10.1126/science.1098410 [DOI] [PubMed] [Google Scholar]
  22. Engel A. K. (2010). “Directive minds: how dynamics shapes cognition,” in Enaction: Toward a New Paradigm for Cognitive Science, eds Stewart J., Gapenne O., Di Paolo E. A. (Cambridge, MA: MIT Press; ), 219–243. [Google Scholar]
  23. Feinberg T. E. (2012). Neuroontology, neurobiological naturalism, and consciousness: a challenge to scientific reduction and a solution. Phys. Life Rev. 9, 13–34. 10.1016/j.plrev.2011.10.019 [DOI] [PubMed] [Google Scholar]
  24. Feinberg T. E., Mallatt J. M. (2016). The Ancient Origins of Consciousness: How the Brain Created Experience. Cambridge, MA: MIT Press. [Google Scholar]
  25. Finkelstein A., Las L., Ulanovsky N. (2016). 3-D maps and compasses in the brain. Annu. Rev. Neurosci. 39, 171–196. 10.1146/annurev-neuro-070815-013831 [DOI] [PubMed] [Google Scholar]
  26. Frith C. (2002). Attention to action and awareness of other minds. Conscious. Cogn. 11, 481–487. 10.1016/s1053-8100(02)00022-3 [DOI] [PubMed] [Google Scholar]
  27. Gallagher S. (1997). Mutual enlightenment: recent phenomenology in cognitive science. J. Consciousness Stud. 4, 195–214. [Google Scholar]
  28. Gallagher S., Zahavi D. (2008). The Phenomenological Mind: An Introduction to Philosophy of Mind and Cognitive Science. 1st Edn. New York, NY: Routledge. [Google Scholar]
  29. Gibson J. J. (1977). “The theory of affordances,” in Perceiving, Acting and Knowing: Toward an Ecological Psychology, eds Shaw R., Bransford J. (Hillsdale, NJ: Lawrence Erlbaum Associates; ), 67–82. [Google Scholar]
  30. Giurfa M. (2013). Cognition with few neurons: higher-order learning in insects. Trends Neurosci. 36, 285–294. 10.1016/j.tins.2012.12.011 [DOI] [PubMed] [Google Scholar]
  31. Godfrey-Smith P. (2016). Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness. New York, NY: Farrar, Straus and Giroux. [Google Scholar]
  32. Griffin D. R. (1984). Animal Thinking. Cambridge, MA: Harvard University Press. [Google Scholar]
  33. Hameroff S. (2010). The “conscious pilot”-dendritic synchrony moves through the brain to mediate consciousness. J. Biol. Phys. 36, 71–93. 10.1007/s10867-009-9148-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Herculano-Houzel S. (2017). The Human Advantage: How Our Brains Became Remarkable. Cambridge, MA: MIT Press. [Google Scholar]
  35. Hodos W. (1986). “The evolution of the brain and the nature of animal intelligence,” in Animal Intelligence: Insights into the Animal Mind, eds Hoag R., Goldman L. (Washington, DC: Smithsonian Press; ), 77–87. [Google Scholar]
  36. Humphrey N. (1992). A History of the Mind: Evolution and the Birth of Consciousness. New York, NY: Copernicus Springer-Verlag. [Google Scholar]
  37. Irwin L. N., Irwin B. A. (2020). Place and environment in the ongoing evolution of cognitive neuroscience. J. Cogn. Neurosci. [Epub ahead of print]. 10.1162/jocn_a_01607 [DOI] [PubMed] [Google Scholar]
  38. James W. (1884). On some omissions of introspective psychology. Mind 9, 1–26. 10.1093/mind/os-ix.33.1 [DOI] [Google Scholar]
  39. James W. (1890). Principles of Psychology. New York, NY: Henry Holt. [Google Scholar]
  40. Jaynes J. (1976). The Origins of Consciousness in the Breakdown of the Bicameral Mind. Boston: Houghton Mifflin. [Google Scholar]
  41. John E. R. (2003). A theory of consciousness. Curr. Dir. Psychol. Sci. 12, 244–250. 10.1046/j.0963-7214.2003.01271.x [DOI] [Google Scholar]
  42. Kaas J. (2002). Convergences in the modular and areal organization of the forebrain of mammals: implications for the reconstruction of forebrain evolution. Brain Behav. Evol. 59, 262–272. 10.1159/000063563 [DOI] [PubMed] [Google Scholar]
  43. Kiverstein J. D., Rietveld E. (2018). Reconceiving representation-hungry cognition: an ecological-enactive proposal. Adapt. Behav. 26, 147–163. 10.1177/1059712318772778 [DOI] [PMC free article] [PubMed] [Google Scholar]
  44. Koch C. (2012). Consciousness: Confessions of a Romantic Reductionist. Cambridge, MA: MIT Press. [Google Scholar]
  45. Koch C., Tsuchiya N. (2007). Attention and consciousness: two distinct brain processes. Trends Cogn. Sci. 11, 16–22. 10.1016/j.tics.2006.10.012 [DOI] [PubMed] [Google Scholar]
  46. LeDoux J. (2019). The Deep History of Ourselves: The Four-Billion-Year Story of How We Got Conscious Brains. New York, NY: Viking. [Google Scholar]
  47. Lefebvre L., Sol D. (2008). Brains, lifestyles and cognition: are there general trends? Brain Behav. Evol. 72, 135–144. 10.1159/000151473 [DOI] [PubMed] [Google Scholar]
  48. Lisney T. J., Collin S. P. (2006). Brain morphology in large pelagic fishes: a comparison between sharks and teleosts. J. Fish Biol. 68, 532–554. 10.1111/j.0022-1112.2006.00940.x [DOI] [Google Scholar]
  49. Lou H. C., Rømer Thomsen K., Changeux J.-P. (2020). The molecular organization of self-awareness: paralimbic dopamine-GABA interaction. Front. Syst. Neurosci. 14:3. 10.3389/fnsys.2020.00003 [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Macphail E. M., Bolhuis J. J. (2001). The evolution of intelligence: adaptive specializations versus general process. Biol. Rev. Camb. Philos. Soc. 76, 341–364. 10.1017/s146479310100570x [DOI] [PubMed] [Google Scholar]
  51. Marino L. (2005). Big brains do matter in new environments. Proc. Natl. Acad. Sci. U S A 102, 5306–5307. 10.1073/pnas.0501695102 [DOI] [PMC free article] [PubMed] [Google Scholar]
  52. Marino L., Hof P. R. (2005). Nature’s experiments in brain diversity. Anat. Rec. A Discov. Mol. Cell. Evol. Biol. 287, 997–1000. 10.1002/ar.a.20261 [DOI] [PubMed] [Google Scholar]
  53. Massimini M., Boly M., Casali A., Rosanova M., Tononi G. (2009). A perturbational approach for evaluating the brain’s capacity for consciousness. Prog. Brain Res. 177, 201–214. 10.1016/s0079-6123(09)17714-2 [DOI] [PubMed] [Google Scholar]
  54. Menzel R. (2012). The honeybee as a model for understanding the basis of cognition. Nat. Rev. Neurosci. 13, 758–768. 10.1038/nrn3357 [DOI] [PubMed] [Google Scholar]
  55. Mercado E., III. (2008). Neural and cognitive plasticity: from maps to minds. Psychol. Bull. 134, 109–137. 10.1037/0033-2909.134.1.109 [DOI] [PubMed] [Google Scholar]
  56. Merker B. (2005). The liabilities of mobility: a selection pressure for the transition to consciousness in animal evolution. Conscious. Cogn. 14, 89–114. 10.1016/s1053-8100(03)00002-3 [DOI] [PubMed] [Google Scholar]
  57. Merleau-Ponty M. (1945). Phénoménologie de la Perception (Phenomenology of Perception) (C. Smith, Trans.). London: Routledge and Kegan Paul.
  58. Mettke-Hofmann C. (2014). Cognitive ecology: ecological factors, life-styles, and cognition. Wiley Interdiscip. Rev. Cogn. Sci. 5, 345–360. doi: 10.1002/wcs.1289
  59. Miyahara K., Witkowski O. (2019). The integrated structure of consciousness: phenomenal content, subjective attitude, and noetic complex. Phenom. Cogn. Sci. 18, 731–758. doi: 10.1007/s11097-018-9608-5
  60. Morris D. (2005). Animals and humans, thinking and nature. Phenom. Cogn. Sci. 4, 49–72. doi: 10.1007/s11097-005-4257-x
  61. Moruzzi G., Magoun H. W. (1949). Brain stem reticular formation and activation of the EEG. Electroencephalogr. Clin. Neurophysiol. 1, 455–473. doi: 10.1016/0013-4694(49)90219-9
  62. Moser E. I., Moser M. B., McNaughton B. L. (2017). Spatial representation in the hippocampal formation: a history. Nat. Neurosci. 20, 1448–1464. doi: 10.1038/nn.4653
  63. Nagel T. (1974). What is it like to be a bat? Philos. Rev. 83, 435–450. doi: 10.2307/2183914
  64. Neubauer R. (2012). Evolution and the Emergent Self: The Rise of Complexity and Behavioral Versatility in Nature. New York, NY: Columbia University Press.
  65. O’Keefe J. (1990). A computational theory of the hippocampal cognitive map. Prog. Brain Res. 83, 301–312. doi: 10.1016/s0079-6123(08)61258-3
  66. Parker S., McKinney M. (1999). Origins of Intelligence: The Evolution of Cognitive Development in Monkeys, Apes, and Humans. Baltimore: Johns Hopkins University Press.
  67. Pennartz C. M. A., Farisco M., Evers K. (2019). Indicators and criteria of consciousness in animals and intelligent machines: an inside-out approach. Front. Syst. Neurosci. 13:25. doi: 10.3389/fnsys.2019.00025
  68. Reiter S., Liaw H. P., Yamawaki T. M., Naumann R. K., Laurent G. (2017). On the value of reptilian brains to map the evolution of the hippocampal formation. Brain Behav. Evol. 90, 41–52. doi: 10.1159/000478693
  69. Richards R. J. (1987). Darwin and the Emergence of Evolutionary Theories of Mind and Behavior. Chicago: University of Chicago Press.
  70. Romanes G. J. (1883). Mental Evolution in Animals. London: Kegan Paul, Trench.
  71. Rose S. P. R. (1976). The Conscious Brain (Updated ed.). New York, NY: Vintage.
  72. Seth A. K., Izhikevich E., Reeke G. N., Edelman G. M. (2006). Theories and measures of consciousness: an extended framework. Proc. Natl. Acad. Sci. U S A 103, 10799–10804. doi: 10.1073/pnas.0604347103
  73. Sheets-Johnstone M. (1999). The Primacy of Movement, Vol. 14. Amsterdam: John Benjamins Publishing.
  74. Shumway C. A. (2008). Habitat complexity, brain, and behavior. Brain Behav. Evol. 72, 123–134. doi: 10.1159/000151472
  75. Smith J. D., Shields W. E., Washburn D. A. (2003). The comparative psychology of uncertainty monitoring and metacognition. Behav. Brain Sci. 26, 317–339; discussion 340–373. doi: 10.1017/s0140525x03000086
  76. Sol D. (2009). Revisiting the cognitive buffer hypothesis for the evolution of large brains. Biol. Lett. 5, 130–133. doi: 10.1098/rsbl.2008.0621
  77. Sol D., Duncan R. P., Blackburn T. M., Cassey P., Lefebvre L. (2005). Big brains, enhanced cognition, and response of birds to novel environments. Proc. Natl. Acad. Sci. U S A 102, 5460–5465. doi: 10.1073/pnas.0408145102
  78. Stocker K. (2016). Place cells and human consciousness: a force-dynamic account. J. Consciousness Stud. 23, 146–165.
  79. Tononi G. (2004). An information integration theory of consciousness. BMC Neurosci. 5:42. doi: 10.1186/1471-2202-5-42
  80. Tononi G. (2012). Integrated information theory of consciousness: an updated account. Arch. Ital. Biol. 150, 56–90. doi: 10.4449/aib.v149i5.1388
  81. Tononi G., Koch C. (2015). Consciousness: here, there and everywhere? Philos. Trans. R. Soc. Lond. B Biol. Sci. 370:20140167. doi: 10.1098/rstb.2014.0167
  82. van Vugt B., Dagnino B., Vartak D., Safaai H., Panzeri S., Dehaene S., et al. (2018). The threshold for conscious report: signal loss and response bias in visual and frontal cortex. Science 360, 537–542. doi: 10.1126/science.aar7186
  83. Varela F. (1996). Neurophenomenology: a methodological remedy for the hard problem. J. Consciousness Stud. 3, 330–349.
  84. Von Uexküll J. (1926). Theoretical Biology. New York, NY: Harcourt, Brace and Co.
