Barron and Klein (1) define consciousness very narrowly, excluding most of the attributes commonly associated with it (e.g., self-awareness). By limiting the definition to a form of information processing, they find that insects qualify. However, to put this conclusion in perspective, by the same definition some robots are also conscious and capable of subjective experience. Autonomous, self-navigating robots with motivational circuitry [e.g., (2)] fit Barron and Klein’s definition of a conscious entity (1). These robots create an integrated “neural” simulation of themselves in space, use appropriate internal and external information to do so, and can have their behavior influenced by their motivational state (3). Like a bee, these robots can actively hunt for things beyond their immediate sensory environment, something that Barron and Klein (1) argue is a key requirement for subjective experience. These robots often use biomimetic neural structures, such as central processors with feed-forward, feed-backward, and recurrent connections. These artificial brains allow robots to integrate information, pursue motivational goals, direct attention to salient environmental features (i.e., exhibit selective attention), and make appropriate behavioral choices (3, 4). Although these examples of artificial intelligence (AI) show impressive abilities, even the AI community does not consider them examples of consciousness (3).
The analogy between circuits within the insect cerebral ganglion and vertebrate midbrain is interesting, but the enormous difference in neuronal number between the two raises the possibility that some elements that may be critical for consciousness are missing in insects. For example, insects do not appear to have circuits that subserve emotional behavior in the same way that vertebrates do (5). Insects appear to be under greater selective pressure than vertebrates to reduce the cost and size of their brain (6). This pressure probably reduces selection for traits such as subjective experience (e.g., emotional experience) in this group. Although I do not disagree with Barron and Klein (1) that comparative neurobiological studies will help us understand the evolution of neural mechanisms for a wide variety of abilities, I am not sure that I would look to the insects to learn about the evolution of consciousness. The constraints on the size of insect nervous systems (6) may preclude its development in this group.
Therefore, Barron and Klein’s conclusion that insects are conscious (1) is surprising only because the word “conscious” carries certain connotations. Stripped of the term “consciousness,” their conclusions are not controversial. For example, a bee without the ability to “simulate the state of the animal’s own mobile body within the environment” would be incapable of foraging and navigation. What is surprising is not that bees can form neural simulations of their environment, but the labeling of this ability as consciousness.
Footnotes
The author declares no conflict of interest.
References
- 1. Barron AB, Klein C. What insects can tell us about the origins of consciousness. Proc Natl Acad Sci USA. 2016;113(18):4900–4908. doi: 10.1073/pnas.1520084113.
- 2. Ames H, et al. The animat: New frontiers in whole brain modeling. IEEE Pulse. 2012;3(1):47–50. doi: 10.1109/MPUL.2011.2175638.
- 3. Reggia JA. The rise of machine consciousness: Studying consciousness with computational models. Neural Netw. 2013;44:112–131. doi: 10.1016/j.neunet.2013.03.011.
- 4. Ferreira C, Santos CP. Combining central pattern generators and reflexes. Neurocomputing. 2015;170:79–91.
- 5. Adamo SA. Do insects feel pain? A question at the intersection of animal behaviour, philosophy and robotics. Anim Behav. 2016. doi: 10.1016/j.anbehav.2016.05.005.
- 6. Sterling P, Laughlin S. Principles of Neural Design. Cambridge, MA: MIT Press; 2015.