Frontiers in Psychology
. 2016 Aug 9;7:1206. doi: 10.3389/fpsyg.2016.01206

Subjectivity: A Case of Biological Individuation and an Adaptive Response to Informational Overflow

Jakub Jonkisz 1,*
PMCID: PMC4977275  PMID: 27555835

Abstract

The article presents a perspective on the scientific explanation of the subjectivity of conscious experience. It proposes plausible answers to two empirically valid questions: the ‘how’ question concerning the developmental mechanisms of subjectivity, and the ‘why’ question concerning its function. Biological individuation, which is acquired in several different stages, serves as a provisional description of how subjective perspectives may have evolved. To the extent that an individuated informational space seems the most efficient way for a given organism to select biologically valuable information, subjectivity is deemed to constitute an adaptive response to informational overflow. One of the possible consequences of this view is that subjectivity might be (at least functionally) dissociated from consciousness, insofar as the former primarily facilitates selection, the latter action.

Keywords: subjectivity, conscious experience, biological individuation, information, selection, action

Introduction

The subjective or experiential aspect of consciousness constitutes the most perplexing, and supposedly the hardest, problem for science. There is insufficient space to analyse the complicated problem of qualia here (for that purpose, see, for example, Dennett, 1988; Crane, 2000) or to explain the meanings and interrelations between such concepts as phenomenal consciousness, first-personal perspective, what-it-is-like-ness, for-me-ness and the like (Nagel, 1974; Jackson, 1982; Levine, 1983, 2001; Block, 1995; Chalmers, 1995, 1996; Kriegel, 2006; Bayne, 2009). However, it seems quite reasonable to assert that it is subjectivity that is in fact crucial in all those cases1. How might science account for the subjectivity of conscious experiences? What is the relation between consciousness and subjectivity? Are there conscious states that are not subjective, or subjective states that are not conscious? The article presents an empirically supported hypothesis about the biological origins and function of subjectivity and gives provisional answers to these questions.

Basic Facts and a Basic Fallacy

From the inside (or subjectively), consciousness consists of a variety of experiences: i.e., feelings, emotions, desires, sensations, perceptions, dreams, thoughts, etc. From the outside (or objectively), we can observe particular sorts of behavior and can access neurophysiological data, and (in the case of humans) may also obtain verbal reports associated with states of consciousness.

Science, we might say, is essentially objective, and conscious experience essentially subjective: that is just the way they are. Hence, “…subjective, qualitative aspects of consciousness, being private, cannot be communicated directly through a scientific theory, which by its nature is public and intersubjective. Accepting this assumption does not mean that the necessary and sufficient conditions for consciousness cannot be described. It implies only that describing them is not the same as generating and experiencing them” (Edelman and Tononi, 2000, pp. 139–140). Taken literally, the belief that science will ever permit us to know directly what it is like to see in color, or ‘what it is like to be a bat’ or any other being different from ourselves, is fallacious. At the same time, though, we can ask empirically valid and productive questions about the ‘what-it-is-like-ness’ itself. In other words, qualitative experience is a property possessed by certain biological systems, not by any scientific claims or theories. Therefore, if proponents of the so-called knowledge argument (formulated in many versions by, e.g., Broad, Feigl, Nagel, and Jackson), or advocates of the so-called explanatory gap postulate (coined by Levine), claim that science should provide direct knowledge of other subjects’ experiences, it may be assumed that their arguments involve a category error (Pigliucci, 2013). Indeed, the hard problem, formulated this way, “does not require a solution, but rather, a cure” (see Edelman et al., 2011, p. 5).

Empirically Valid Questions

There are two standard questions that may be asked about subjective experience: the ‘why’ and the ‘how’ questions. As an empirically valid question, the ‘why’ question (i.e., ‘Why is there subjective experience?’) asks about function. In a naturalistic scenario, it actually just concerns biological function: i.e., the evolutionary or adaptive reasons for subjectivity, or the advantages furnished by subjective experiences for the functioning of some animal. Meanwhile, two slightly different ‘how’ questions (in the sense of ‘How does it arise?’) may be distinguished: one about the biological origins of subjectivity (‘How did subjectivity develop among species?’), the other about the generation of subjective states in a given organism (‘How are they produced?’). Both are, in fact, mechanistic questions, as the former asks about global or developmental mechanisms, the latter about local or production mechanisms.

Consciousness, Information, and Subjectivity

All Conscious States are Informational States

This claim is quite common nowadays; some would even say that “all the components of consciousness are solely information in various forms” (see Earl, 2014, pp. 8–9). Obviously the very notion of information is crucial here, as not all of its versions seem to fit with consciousness studies (Pockett, 2014). The currently most popular informational approach to consciousness, integrated information theory (IIT), began with a rather classical understanding of information (in terms of uncertainty reduction; see Tononi, 2004, 2008, p. 217). Nevertheless, Koch and Tononi claim that “IIT introduces a novel, non-Shannonian notion of information – integrated information – which can be measured as ‘differences that make a difference’ to a system from its intrinsic perspective, not relative to an observer” (Koch and Tononi, 2013). What notion of information is or should be applied here is a fascinating question in its own right. However, if being internally differentiated is sufficient for being informational for a given system, as IIT assumes, then all conscious states would indeed seem to be informational states – since, in order to be identified at all, states of consciousness must differ relative to one another when viewed from the internal (i.e., subjective) perspective of a given system.
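The classical, Shannon-style reading of information that IIT started from – information as uncertainty reduction – can be illustrated with a minimal numerical sketch. This is a toy calculation only; it does not compute IIT’s integrated information (Φ), and the four-state repertoire is an invented example:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the uncertainty carried by a repertoire of states."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A system with four equally likely internal states carries 2 bits of
# uncertainty. When the system settles into one particular state, that
# uncertainty is fully resolved: in the classical reading, 2 bits of
# information have been generated.
repertoire = [0.25, 0.25, 0.25, 0.25]
before = entropy(repertoire)   # uncertainty before any state is fixed
after = entropy([1.0])         # a single, fully determined state: 0 bits
information_generated = before - after
```

The point of the toy example is only that information, on this classical reading, presupposes a repertoire of mutually distinguishable alternatives – which is why states that do not differ from one another, from the system’s own perspective, carry no information for it.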

Not All Informational States are Conscious

That not all information processing is conscious seems fairly obvious, and is apparently widely endorsed in cognitive science2. Some claim that not only lower-level but also higher-level information processing, which engages prefrontal areas, is not necessarily conscious (van Gaal and Lamme, 2012) – not until recurrent processing enters in (e.g., Lamy et al., 2008). Others maintain that even executive, top-down control might be carried out unconsciously (Kiefer, 2012). What is significant for IIT is that even integrated informational states need not always reach consciousness – as Mudrik and other researchers, including Koch himself, have argued (Mudrik et al., 2011, 2014). This may be the case in at least four situations: when exposure is limited in time and space (a narrow spatiotemporal window), when the information is insufficiently complex semantically, when it is not multimodal, and when it is not novel for the subject (see Mudrik et al., 2014, pp. 491–494). If so, then not all informational states available within a given system will necessarily be conscious, even if they are integrated.

All Informational States are Subjective

This claim is by no means obvious. However, IIT does assume the intrinsic existence of all conscious – i.e., integrated – informational states (as an axiomatic, Cartesian, phenomenological feature; see Tononi and Koch, 2015, p. 5)3. Philosophically speaking, such an assumption is rather fundamental, as it in effect posits a subjective ontology of consciousness (Searle, 1992, 2000a,b). A plausible, naturalistic justification of the subjectivity of informational states is proposed below, without drawing any ontological consequences (as these are by no means definite).

The phenomenon of consciousness is known only in situ and in vivo. Hence, if it is to be described in informational terms, consideration should be given to what kind of information we are dealing with here, in the case of biological systems. To the extent that we understand the natural world, accessibility of information may be considered deeply embedded in our evolutionary background (Feinberg and Mallatt, 2013). Each and every creature is morphologically and physiologically attuned and sensitive only to specific sorts of informational resources (i.e., those most efficacious for its ancestors). In other words, earthly creatures (see Millikan, 2004, p. 9) are naturally constrained to making use of just certain wavelengths and frequencies of light and sound, certain chemicals, and only particular capacities for movement, sensation, etc. These phylogenetic limitations are further modified epigenetically and socially, becoming still more specific and unique at the phenotypical level (Fraga et al., 2005; Swaddle et al., 2005; Bossdorf et al., 2008; Ballestar, 2010; Migicovsky and Kovalchuk, 2011). Finally, informational states accessible within an individual organism, already heavily preselected, are shaped by its very own developmental history, since biological systems are structured by all their past and future encounters (especially connection patterns in their brains; see Frith, 2011). The ultimate form and content of states accessed by an individual subject will be determined by the subject’s current state and situatedness – that is, by the specific internal and external conditions the creature is actually subject to (e.g., individual physiological parameters, location in a particular space, contextual relations, ongoing engagements, expectations, etc.).
As a result, “(a)t any given moment, a process of integration of collective neuronal activity generates an interwoven pattern of responses unique to a particular animal at that particular moment of time” (Edelman et al., 2011, p. 3).

In conclusion, it may be said that information possessing biological value is always individuated (Jonkisz, 2015)4. And this may even be put more strongly: all informational states, whether conscious or not, integrated or not, are biologically individuated within a given organism – i.e., directly accessible only from its internal perspective, meaning only subjectively.

Some Provisional Answers

All the factors described above contribute to the complex process of biological individuation, causing individual differences to occur in each and every organism at virtually every level (genetic, epigenetic, anatomical, behavioral, etc.). Even monozygotic twins reared together, or clones, differ significantly, as it is not possible to mimic all the environmental interactions and physiological conditions involved in forming each phenotype (Pfefferbaum et al., 2004; Fraga et al., 2005; Ballestar, 2010; Maiti et al., 2011; Freund et al., 2013). In the light of this, it seems quite plausible that individual differences are also present at a given organism’s cognitive or epistemic level (in the sense that what the organism is able to perceive or experience will also be unique, meaning directly accessible only from its perspective, i.e., subjectively). Ultimately, it may be said that the process of biological individuation furnishes, at the very least, a provisional answer to the question concerning the origins or developmental mechanisms of subjectivity (‘How did subjectivity develop among species?’). Below, a plausible hypothesis is presented which also answers the question concerning biological function (‘Why is there subjective experience?’).

What adaptive advantages might subjectivity furnish for the functioning of an animal? It is a rather obvious empirical observation that the survival of an animal is determined by its ability to pick out the most efficient patterns of action and adapt them to the changing conditions of the moment. However, in a complex, ever-changing environment, organisms are potentially able to discern a practically infinite amount of information and number of states. There is therefore likely to be strong evolutionary pressure on their ability to filter out the more redundant states and select the most efficient ones. That is probably part of the reason why the plethora of possible informational resources is subject to such a degree of phylogenetic reduction and ontogenetic preselection (as described in “All Informational States are Subjective”). What remains accessible is what is potentially most useful, in a Bayesian sense, and hence most valuable biologically – both for a given species and for the individual organism. More specifically, one may say that those networks of experiential/informational associations that have worked well for an organism in the past form biases triggered by similar future occurrences. Attentional priorities for spotting differences (novelties) in the ongoing unfolding of informational states, together with the emotional markers ascribed to them, enable further learning and adaptation (brain plasticity and neuronal group selection, or so-called neural Darwinism, account well for this picture; see Cleeremans, 2011; Edelman et al., 2011; Frith, 2011; Baars, 2012). From this perspective, organisms may be viewed as Bayesian systems specializing in anticipating what the future will bring and discriminating what is best for them from their individual, unique, and private point of view – i.e., a subjective one.
The process of formation of such an individuated informational space (accessed only subjectively), occurring in several distinct stages (described above), is likely to be the most efficient way of selecting biologically valuable information, and is therefore justified as far as the functioning of the animal is concerned. We may thus conclude that the hypothesis according to which subjectivity might be seen (functionally) as an adaptive response to informational overflow seems valid and plausible.
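The Bayesian reading of this selection process can be sketched with a toy model. The hypotheses (‘food’ vs. ‘threat’), cues, and probabilities below are invented purely for illustration – the article proposes no formal model:

```python
def normalize(d):
    """Rescale values so they sum to 1 (a proper probability distribution)."""
    total = sum(d.values())
    return {k: v / total for k, v in d.items()}

def posterior(prior, likelihood, observation):
    """Bayes' rule: prior biases (past experience) weigh incoming evidence."""
    unnorm = {h: prior[h] * likelihood[h][observation] for h in prior}
    return normalize(unnorm)

# Hypothetical organism tracking whether a stimulus signals food or threat.
prior = {"food": 0.7, "threat": 0.3}   # biases formed by past encounters
likelihood = {                          # how each hypothesis predicts the cues
    "food": {"sweet": 0.8, "rustle": 0.2},
    "threat": {"sweet": 0.1, "rustle": 0.9},
}

# An unexpected cue ('rustle') overturns the prior: 'threat' now dominates,
# so attention and action are steered toward the biologically urgent option.
belief = posterior(prior, likelihood, "rustle")
selected = max(belief, key=belief.get)
```

Under familiar cues, the priors formed by past encounters dominate; a surprising cue shifts the posterior sharply – a crude analog of novelty capturing attentional priority in the picture sketched above.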

The hypothesized function of subjectivity (the selection of overflowing information) is partly supported by empirical evidence pertaining to both the bottom-up and the top-down influence of bodily factors on information processing (see Fleming et al., 2010; Theeuwes, 2010; Rochat, 2011; Shimono et al., 2012; Zhou et al., 2013; Pfeifer et al., 2014). The way our bodies are shaped (by their evolutionary background, developmental conditions, current engagements, and actual physiological state) determines which informational states are available, resulting in the formation of an individuated informational space – the statistically most useful one – for each and every organism.

The foregoing considerations have introduced provisional answers to questions about the function of subjectivity (the ‘why’ question) and its developmental mechanisms (the global ‘how’ question). What still remains is the local ‘how’ question concerning the production of subjective experiences within an organism. The task of discovering experimentally the mechanisms responsible for generating states of consciousness has proved to be by far the most challenging one. We still seem far from the end of that journey, even though much knowledge about the neuronal areas engaged and the processes involved has been gathered (Metzinger, 2000; Crick and Koch, 2003; Hohwy, 2009; Lamme, 2010; Panagiotaropoulos et al., 2012; Baars, 2012). Among the findings so far, a few promising models have been elaborated: e.g., global workspace theory (GWT) and the global neuronal workspace (GNW) model, recurrent processing (RP) theory, and the more abstract IIT model (Baars, 1994, 1996; Dehaene et al., 1998; Dehaene and Naccache, 2001; Tononi, 2004, 2008; Lamme, 2006; Dehaene and Changeux, 2011; Baars et al., 2013; Tononi and Koch, 2015). Indeed, the generation problem seems to represent the most difficult challenge for science, though it does seem to remain tractable from the standpoint of an empirical science of consciousness. Undeniable proof that the hard problem has been solved would probably need to take the form of an at least minimally conscious artifact (Kurzweil, 2012; O’Regan, 2012). Meanwhile, ever more effective treatments enabling patients to regain consciousness after losing vital components of it would offer important indicators that the right path is being followed (Giacino, 2005).

Conclusion

The article presents a provisional account of both the developmental mechanism of subjectivity (described here as biological individuation) and its function (described as an adaptive response to informational overflow). It also points to some fairly unconventional conclusions about the relation between subjectivity and consciousness. All conscious states count as informational (see “All Conscious States are Informational States”), but not all informational states (even those that are integrated) count as conscious (see “Not All Informational States are Conscious”). It is also argued that all informational states are subjective (biologically individuated, see “All Informational States are Subjective”). While it is generally considered quite acceptable to claim that all conscious states are subjective (as is entailed by “All Conscious States are Informational States” and “All Informational States are Subjective”), the conclusion that some subjective states are not conscious (as is entailed by “Not All Informational States are Conscious” and “All Informational States are Subjective”) remains rather unpopular.

Subjectivity, as an adaptive strategy for selecting the most useful informational resources, might in fact be dissociated from consciousness, at least functionally. (That idea was first developed in Jonkisz, 2009; the separation presented here has consequences whose significance is above all explanatory.) Whereas subjectivity, as described here, primarily facilitates selection (based on Bayesian statistics), consciousness may be seen as a phenomenon that primarily facilitates action (Jonkisz, 2015). The overall picture may be briefly described as follows: both the bottom-up bodily factors (genetic, epigenetic, and developmental) and the top-down priming effects or expectations (based on prior experiences) serve not only to reduce the number of possible states, but also to bias accessible ones toward or against certain other informational states (Fleming et al., 2010; Theeuwes, 2010; Rochat, 2011; Shimono et al., 2012; Zhou et al., 2013; Pfeifer et al., 2014). Hence, every creature ultimately ‘works on’ a biologically individuated informational space that consists of all of the possible experiential/informational contents formed within the system5. Those of the subjective, informational contents that are actually utilized in action become conscious, and this is likely to occur in a gradational manner (see Jonkisz, 2015; for the crucial role of action, see also, e.g., Gibson, 1977, 1979; Noë, 2006; Engel et al., 2013).

Our general conclusion, then, is that subjectivity, which develops through many stages of biological individuation, is inherited in the form of our being-in-the-world. We are all structurally, functionally, and phenomenally different (as already mentioned, even monozygotic twins and clones, similar as they are, differ significantly; see Pfefferbaum et al., 2004; Fraga et al., 2005; Ballestar, 2010; Maiti et al., 2011; Freund et al., 2013). A complex and dynamic ecosystem, itself unique, contains myriads of unique, individuated systems with their own private perspectives. Admittedly, subjective Bayesian systems are fallible in their heuristic processing, being prone to illusion and undetected error (Lotto and Purves, 2002; Rossano, 2003; Lotto, 2004), yet this is forgivable, as biological individuation has itself most likely brought about not only a balance between efficacy and economy in organisms’ interactions with their environment, but also the emergence of selves (Rochat, 2011).

Author Contributions

The author confirms being the sole contributor of this work and approved it for publication.

Conflict of Interest Statement

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Funding. This article was supported by an OPUS grant from the National Science Centre of Poland (UMO-2013/11/B/HS1/03968).

1. Subjectivity is understood in the article as the epistemic property of being privately or first-personally accessible (see also Jonkisz, 2012, pp. 60–61).

2. There are, however, exceptions to this claim: e.g., Chalmers, who claims that “…wherever there is a causal interaction, there is information, and wherever there is information, there is experience,” and that “Experience is information from the inside; physics information from the outside” (by ‘experience’ he means ‘phenomenal consciousness’; see Chalmers, 1996, p. 297 and p. 305).

3. IIT also postulates that a physical mechanism needs to exercise a “cause-effect power upon itself” in order to support this feature (pp. 7–8), and characterizes conscious experience as “maximally irreducible intrinsically conceptual structure” (or “Φmax”).

4. The notion of individuation has played a central role in the philosophical theories developed by Simondon, and fits neatly into the context described in the article (Simondon, 2005; Iliadis, 2013).

5. Such an individuated informational space may be seen as analogous to the Global Workspace (Baars, 1996; Edelman et al., 2011; Baars et al., 2013): as an embodied, and probably extended or externalized, version of the latter.

References

1. Baars B. (1994). A thoroughly empirical approach to consciousness. Psyche 1, 1–18.
2. Baars B. (1996). Understanding subjectivity: global workspace theory and the resurrection of the observing self. J. Conscious. Stud. 3, 211–216.
3. Baars B. (2012). The biological cost of consciousness. Nat. Proc. 1–16. doi: 10.1038/npre.2012.6775.1
4. Baars B., Franklin S., Ramsoy T. Z. (2013). Global workspace dynamics: cortical “binding and propagation” enables conscious contents. Front. Psychol. 4:200. doi: 10.3389/fpsyg.2013.00200
5. Ballestar E. (2010). Epigenetics lessons from twins: prospects for autoimmune disease. Clin. Rev. Allergy Immunol. 39, 30–41. doi: 10.1007/s12016-009-8168-4
6. Bayne T. (2009). “Consciousness,” in Routledge Companion to the Philosophy of Psychology, eds Symons J., Calvo P. (Abingdon: Routledge), 477–494.
7. Block N. (1995). On a confusion about a function of consciousness. Behav. Brain Sci. 18, 227–287.
8. Bossdorf O., Richards C. L., Pigliucci M. (2008). Epigenetics for ecologists. Ecol. Lett. 11, 106–115. doi: 10.1111/j.1461-0248.2007.01130.x
9. Chalmers D. (1995). Facing up to the problem of consciousness. J. Conscious. Stud. 2, 200–219.
10. Chalmers D. (1996). The Conscious Mind: In Search of a Fundamental Theory. Oxford: Oxford University Press.
11. Cleeremans A. (2011). The radical plasticity thesis: how the brain learns to be conscious. Front. Psychol. 2, 1–12. doi: 10.3389/fpsyg.2011.00086
12. Crane T. (2000). “The origins of qualia,” in The History of the Mind-Body Problem, eds Crane T., Patterson S. (London: Routledge).
13. Crick F., Koch C. (2003). A framework for consciousness. Nat. Neurosci. 6, 119–126.
14. Dehaene S., Changeux J.-P. (2011). Experimental and theoretical approaches to conscious processing. Neuron 70, 200–227. doi: 10.1016/j.neuron.2011.03.018
15. Dehaene S., Kerszberg M., Changeux J.-P. (1998). A neuronal model of a global workspace in effortful cognitive tasks. Proc. Natl. Acad. Sci. U.S.A. 95, 14529–14534.
16. Dehaene S., Naccache L. (2001). Towards a cognitive neuroscience of consciousness: basic evidence and a workspace framework. Cognition 79, 1–37.
17. Dennett D. C. (1988). “Quining qualia,” in Consciousness in Modern Science, eds Marcel A., Bisiach E. (Oxford: Oxford University Press).
18. Earl B. (2014). The biological function of consciousness. Front. Psychol. 5:697. doi: 10.3389/fpsyg.2014.00697
19. Edelman G., Gally J. A., Baars B. (2011). Biology of consciousness. Front. Psychol. 2:4. doi: 10.3389/fpsyg.2011.00004
20. Edelman G., Tononi G. (2000). “Reentry and the dynamic core: neural correlates of conscious experience,” in Neural Correlates of Consciousness, ed. Metzinger T. (Cambridge, MA: MIT Press), 139–151.
21. Engel A. K., Maye A., Kurthen M., König P. (2013). Where’s the action? The pragmatic turn in cognitive science. Trends Cogn. Sci. 17, 202–209. doi: 10.1016/j.tics.2013.03.006
22. Feinberg T. E., Mallatt J. (2013). The evolutionary and genetic origins of consciousness in the Cambrian Period over 500 million years ago. Front. Psychol. 4:667. doi: 10.3389/fpsyg.2013.00667
23. Fleming S. M., Weil R. S., Nagy Z., Dolan R. J., Rees G. (2010). Relating introspective accuracy to individual differences in brain structure. Science 329, 1541–1543. doi: 10.1126/science.1191883
24. Fraga M. F., Ballestar E., Paz M. F., Ropero S., Setien F., Ballestar M. L., et al. (2005). Epigenetic differences arise during the lifetime of monozygotic twins. Proc. Natl. Acad. Sci. U.S.A. 102, 10604–10609. doi: 10.1073/pnas.0500398102
25. Freund J., Brandmaier A. M., Lewejohann L., Kirste I., Kritzler M., Krüger A., et al. (2013). Emergence of individuality in genetically identical mice. Science 340, 756–759. doi: 10.1126/science.1235294
26. Frith C. D. (2011). What brain plasticity reveals about the nature of consciousness: commentary. Front. Psychol. 2:87. doi: 10.3389/fpsyg.2011.00087
27. Giacino J. T. (2005). The minimally conscious state: defining the borders of consciousness. Prog. Brain Res. 150, 381–395. doi: 10.1016/S0079-6123(05)50027-X
28. Gibson J. (1977). “The theory of affordances,” in Perceiving, Acting, and Knowing, eds Shaw R., Bransford J. (Hillsdale, NJ: Erlbaum), 67–82.
29. Gibson J. (1979). The Ecological Approach to Visual Perception. Boston, MA: Houghton Mifflin.
30. Hohwy J. (2009). The neural correlates of consciousness: new experimental approaches needed? Conscious. Cogn. 18, 428–438. doi: 10.1016/j.concog.2009.02.006
31. Iliadis A. (2013). Informational ontology: the meaning of Gilbert Simondon’s concept of individuation. Communication +1 2:5.
32. Jackson F. (1982). Epiphenomenal qualia. Philos. Q. 32, 127–136.
33. Jonkisz J. (2009). Świadomość i subiektywność – razem czy osobno? Anal. Egzyst. 9, 121–143.
34. Jonkisz J. (2012). Consciousness: a four-fold taxonomy. J. Conscious. Stud. 19, 55–82.
35. Jonkisz J. (2015). Consciousness: individuated information in action. Front. Psychol. 6:1035. doi: 10.3389/fpsyg.2015.01035
36. Kiefer M. (2012). Executive control over unconscious cognition: attentional sensitization of unconscious information processing. Front. Hum. Neurosci. 6:61. doi: 10.3389/fnhum.2012.00061
37. Koch C., Tononi G. (2013). Can a photodiode be conscious? Reply to Searle. The New York Review of Books, March 2013. Available at: http://www.nybooks.com/articles/archives/2013/mar/07/can-photodiode-be-conscious/
38. Kriegel U. (2006). “Consciousness: phenomenal consciousness, access consciousness, and scientific practice,” in Handbook of Philosophy of Psychology and Cognitive Science, ed. Thagard P. (Amsterdam: North-Holland), 195–217.
39. Kurzweil R. (2012). How to Create a Mind: The Secret of Human Thought Revealed. New York, NY: Viking.
40. Lamme V. A. (2006). Towards a true neural stance on consciousness. Trends Cogn. Sci. 10, 494–501. doi: 10.1016/j.tics.2006.09.001
41. Lamme V. A. (2010). How neuroscience will change our view on consciousness. Cogn. Neurosci. 1, 204–220. doi: 10.1080/17588921003731586
42. Lamy D., Mudrik L., Deouell L. Y. (2008). Unconscious auditory information can prime visual word processing: a process dissociation procedure study. Conscious. Cogn. 17, 688–698.
43. Levine J. (1983). Materialism and qualia: the explanatory gap. Pac. Philos. Q. 64, 354–361.
44. Levine J. (2001). Purple Haze: The Puzzle of Consciousness. Oxford: Oxford University Press.
45. Lotto B. (2004). Visual development: experience puts the colour in life. Curr. Biol. 14, 619–621. doi: 10.1016/j.cub.2004.07.045
46. Lotto B., Purves D. (2002). The empirical basis of color perception. Conscious. Cogn. 11, 606–629.
47. Maiti S., Kumar K. H., Castellani C. A., O’Reilly R., Singh S. M. (2011). Ontogenetic de novo copy number variations (CNVs) as a source of genetic individuality: studies on two families with MZD twins for schizophrenia. PLoS ONE 6:e17125. doi: 10.1371/journal.pone.0017125
48. Metzinger T. (ed.) (2000). Neural Correlates of Consciousness: Empirical and Conceptual Questions. Cambridge, MA: MIT Press.
49. Migicovsky Z., Kovalchuk I. (2011). Epigenetic memory in mammals. Front. Genet. 2:28. doi: 10.3389/fgene.2011.00028
50. Millikan R. (2004). Varieties of Meaning. Cambridge, MA: MIT Press.
51. Mudrik L., Breska A., Lamy D., Deouell L. Y. (2011). Integration without awareness: expanding the limits of unconscious processing. Psychol. Sci. 22, 764–770. doi: 10.1177/0956797611408736
52. Mudrik L., Faivre N., Koch C. (2014). Information integration without awareness. Trends Cogn. Sci. 18, 488–496. doi: 10.1016/j.tics.2014.04.009
53. Nagel T. (1974). What is it like to be a bat? Philos. Rev. 83, 435–451.
54. Noë A. (2006). Précis of Action in Perception. Psyche 12, 1–34.
55. O’Regan J. K. (2012). How to build a robot that is conscious and feels. Minds Mach. 22, 117–136. doi: 10.1007/s11023-012-9279-x
56. Panagiotaropoulos T. I., Deco G., Kapoor V., Logothetis N. K. (2012). Neuronal discharges and gamma oscillations explicitly reflect visual consciousness in the lateral prefrontal cortex. Neuron 74, 924–935. doi: 10.1016/j.neuron.2012.04.013
57. Pfefferbaum A., Sullivan E. V., Carmelli D. (2004). Morphological changes in aging brain structures are differentially affected by time-linked environmental influences despite strong genetic stability. Neurobiol. Aging 25, 175–183. doi: 10.1016/S0197-4580(03)00045-9
58. Pfeifer R., Iida F., Lungarella M. (2014). Cognition from the bottom up: on biological inspiration, body morphology, and soft materials. Trends Cogn. Sci. 18, 404–413. doi: 10.1016/j.tics.2014.04.004
59. Pigliucci M. (2013). Science and philosophy: what hard problem? Philos. Now 99, 25.
60. Pockett S. (2014). Problems with theories that equate consciousness with information or information processing. Front. Syst. Neurosci. 8:225. doi: 10.3389/fnsys.2014.00225
61. Rochat P. (2011). The self as phenotype. Conscious. Cogn. 20, 109–119. doi: 10.1016/j.concog.2010.09.012
62. Rossano M. J. (2003). Expertise and the evolution of consciousness. Cognition 89, 207–236.
63. Searle J. (1992). The Rediscovery of the Mind. Cambridge, MA: MIT Press.
64. Searle J. (2000a). Consciousness. Annu. Rev. Neurosci. 23, 557–578.
65. Searle J. (2000b). Consciousness, free action and the brain. J. Conscious. Stud. 7, 3–22.
66. Shimono M., Mano H., Niki K. (2012). The brain structural hub of interhemispheric information integration for visual motion perception. Cereb. Cortex 22, 337–344. doi: 10.1093/cercor/bhr108
67. Simondon G. (2005). L’individuation à la Lumière des Notions de Forme et d’Information. Paris: Millon.
68. Swaddle J. P., Cathey M. G., Cornell M., Hopkinton B. P. (2005). Socially transmitted mate preferences in a monogamous bird: a non-genetic mechanism of sexual selection. Proc. Biol. Sci. 272, 1053–1058. doi: 10.1098/rspb.2005.3054
69. Theeuwes J. (2010). Top-down and bottom-up control of visual selection. Acta Psychol. (Amst.) 135, 77–99. doi: 10.1016/j.actpsy.2010.02.006
70. Tononi G. (2004). An information integration theory of consciousness. BMC Neurosci. 5:42. doi: 10.1186/1471-2202-5-42
71. Tononi G. (2008). Consciousness as integrated information: a provisional manifesto. Biol. Bull. 215, 216–242. doi: 10.2307/25470707
72. Tononi G., Koch C. (2015). Consciousness: here, there and everywhere? Philos. Trans. R. Soc. B 370:20140167. doi: 10.1098/rstb.2014.0167
73. van Gaal S., Lamme V. A. F. (2012). Unconscious high-level information processing: implication for neurobiological theories of consciousness. Neuroscientist 18, 287–301. doi: 10.1177/1073858411404079
74. Zhou L., He Z. J., Ooi T. L. (2013). The visual system’s intrinsic bias and knowledge of size mediate perceived size and location in the dark. J. Exp. Psychol. Learn. Mem. Cogn. 39:6. doi: 10.1037/a0033088
