Proc Natl Acad Sci U S A. 2011 May 9;108(Suppl 3):15617–15623. doi: 10.1073/pnas.1101894108

Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain

Amy Wells Quinkert a,1, Vivek Vimal a, Zachary M Weil a, George N Reeke b, Nicholas D Schiff c, Jayanth R Banavar d, Donald W Pfaff a
PMCID: PMC3176607  PMID: 21555568

Abstract

We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA.

Keywords: deep brain stimulation, food anticipation, logistic equation, reticular formation, activation of behavior


In this brief review, we present a global concept of brain function, generalized arousal (GA), and illustrate the application of four mathematical methods to its description and analysis. Here, in order of the sections below, we (i) use factor analysis to help prove that GA actually exists; (ii) use information theory to characterize stimuli that elicit GA; (iii) resort to the logistic equation to speculate on how GA might make use of nonlinear dynamics; and (iv) having studied GA in the context of the hunger-induced activation of behavior, characterize the behavioral data with a simple Gaussian.

We have proposed that the most powerful and essential activity in any vertebrate nervous system is GA (1). As conceived, GA is universal and fundamental, initiating the activation of all behavioral responses in all vertebrate animals. The operational definition of GA is that a more-aroused animal or human is (i) more responsive to sensory stimuli in all modalities; (ii) more active motorically; and (iii) more reactive emotionally (1). GA's performance requirements are listed in Table 1.

Table 1.

Operating requirements of CNS arousal systems

Operational definition
 Provide alertness to sensory stimuli, body-wide, all sensory modalities
 Drive voluntary motor activity, body-wide, from fidgeting to running marathons
 Fuel emotional reactivity, positive and negative
Operational requirements
 Lability: “Hair triggered,” rapid, not sluggish
 Sensitivity: Especially to the momentary state of the organism
 Convergence: All sensory stimuli activate the same set of arousal subsystems, which, in turn, support each other
 Divergence: Activate cerebral cortex, autonomic nervous systems, and endocrine organs to initiate behavior
 Robustness: Does not fail. Survival of the organism depends on adequate CNS arousal

Evidence for the Existence of GA of the CNS

Three independent lines of evidence indicate that generalized CNS arousal actually exists: statistical, genetic, and mechanistic.

The first line of evidence derives from recently reported behavioral results that show, statistically, the influence of GA (2). We did several experiments with mice, which tapped all three components of the operational definition of GA: S (sensory alertness) measured as motor activity in response to sensory stimuli of various modalities; M (motor activity) measured as spontaneous home cage motor activity; and E (emotional reactivity) measured as motor activity and freezing behavior in a conditioned fear paradigm. These three parameters did not covary either genetically or phenotypically. We analyzed the data using factor analysis, which probes the covariance structure of a large data set—in our case it tabulates the statistical relations among various arousal-related response measures. Factor analysis as applied here “lets the subject (in our case, a mouse) tell us” the structure of its arousal functions. Application of this analysis to our five sets of experimental data allowed us to estimate the contribution of GA, measured as the most generalized, least specific factor, as revealed by an unrotated covariance matrix and a forced one-factor solution (3). Among the five experiments, the lowest contribution by GA to explain the variance in the data was 29.7%, and the highest was 45% (2). Surprisingly, the overall conclusion that GA accounted for approximately one third of the variance of our data held true despite (i) different populations of mice, (ii) different investigators, (iii) different experimental manipulations and details of response measures, and (iv) different configurations of individual, particular factor analysis solutions involving four to six factors for each experiment. Control calculations showed that our result was robust in three ways: (i) the GA factor was never identical to the first factor of any particular multifactor analysis; (ii) it accounted for significantly more real data than in any random-number control; and (iii) nothing similar to the GA factor appeared in a stringent control in which marginal averages were held constant but the individual data entries were scrambled randomly (all three controls significantly different from our result, P < 0.001). All of these calculations indicated that the mathematical structure of arousal functions in the CNS includes a primitive, undifferentiated form we call GA. Thus, these calculations offer the first line of evidence that GA exists.
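
For readers who wish to explore this kind of analysis, the minimal Python sketch below estimates the share of variance captured by a single, unrotated general factor. The data matrix, sample sizes, and variable names are hypothetical, not taken from ref. 2, and scikit-learn's FactorAnalysis stands in for whatever factor-analysis software was used in that study.

# Sketch: estimate the share of variance captured by a single "generalized
# arousal" factor, in the spirit of the forced one-factor solution described
# above. Data, variable names, and sizes are hypothetical, not from ref. 2.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical arousal battery: rows = mice, columns = response measures
# (sensory alertness S, spontaneous motor activity M, emotional reactivity E, ...).
n_mice, n_measures = 100, 6
shared = rng.normal(size=(n_mice, 1))                  # a common "GA-like" influence
data = 0.6 * shared + rng.normal(size=(n_mice, n_measures))

z = StandardScaler().fit_transform(data)               # standardize each measure

fa = FactorAnalysis(n_components=1, rotation=None).fit(z)
loadings = fa.components_[0]                           # loadings on the single factor

# For standardized variables, each variable's total variance is 1, so the
# proportion of variance attributable to the one factor is the mean squared loading.
ga_share = np.mean(loadings ** 2)
print(f"Variance explained by the single general factor: {ga_share:.1%}")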

As a second line of evidence, mice can be bred for high or low GA; to the extent that such a breeding program succeeds, it constitutes independent proof that a function such as GA exists, because one cannot breed for a function that does not exist. Using a high-throughput assay in mice that incorporates all three components of the operational definition of GA, mice that score high in all three components (S, M, and E) are labeled “high arousal,” and mice that score low in all three are labeled “low arousal.” High-arousal males are mated to high-arousal females, and low-arousal males to low-arousal females. With each generation, we expect the two lines to diverge if there is, in fact, a genetic basis for GA. Success through generation 7 of this breeding program has recently been reported (4): animals bred for high arousal have significantly greater measures in our GA assay than animals bred for low arousal.

In addition to the separation of the two lines of mice with the breeding experiment, we found effects of GA on behaviors that require specific forms of arousal. In GA theory, the strength of motivated behaviors that require specific arousal should be modulated by alterations in GA. In the breeding experiment, we found that high levels of GA, as bred, had a significant impact on male sex behavior. The high-arousal male mice exhibited greater excitement and premature mounts in the presence of a receptive female and, having achieved an intromission, ejaculated significantly sooner (4). Additionally, high-arousal animals of both sexes exhibited greater levels of anxiety-like behaviors and reduced exploratory behavior in the elevated plus maze and light–dark box tasks. Taken together, these data from the breeding experiment illustrate the impact of GA on motivated behaviors.

Apart from the above breeding experiment, the functional genomics of CNS arousal can also be analyzed. Strikingly, more than 120 genes can be counted as contributing to arousal physiology. These have been discovered through studies of null mutant mice and through the cloning of genes whose molecular pharmacology had already implicated their gene products in arousal mechanisms (1, 2).

A third independent line of evidence for the existence of GA depends on a wealth of neuroanatomical, neurophysiological, and genomic data on the proximate mechanisms that underlie CNS arousal. One could not cite mechanisms for a CNS function that does not exist. Thus, it is important to note, as summarized previously (1, 5), that neuroscientists are beginning to understand how CNS arousal works in terms of neuroanatomical, neurophysiological, and molecular mechanisms. The classic neuroanatomical pathways ascending from the lower brainstem toward or in the forebrain can signal arousal using norepinephrine, dopamine, serotonin, histamine, and acetylcholine as transmitters.

Neuroanatomy.

Four sensory modalities feed these ascending pathways in obvious ways: touch (including pain), taste, vestibular, and auditory. The ascending pathways (summarized in Fig. 1A) include norepinephrine-containing systems that tend to emphasize projections to the more posterior cerebral cortex (except for occipital cortex) and to support sensory alertness. Dopaminergic systems tend to project more strongly to anterior, frontal cortex and to foster directed motor acts. Serotonergic neurons tend to project preferentially to a more ancient form of cortex (“limbic cortex”) and hypothalamus, and to be involved in emotional behaviors and autonomic regulation. Cholinergic neurons (ACh) in the basal forebrain support arousal by their widespread projections across the cerebral cortex, and pedunculopontine ACh cells drive thalamic neurons for thalamocortical excitation. Histamine-producing neurons likewise have extremely widespread projections, which actually originate in the hypothalamus and are strongly associated with increased CNS arousal.

Fig. 1.

(A) Simplified schematic representation of ascending arousal regulatory pathways. Norepinephrine-containing systems (NE, also known as noradrenergic) tend to emphasize projections to the more posterior cerebral cortex and to support sensory alertness. Dopaminergic systems (DA) tend to project more strongly to anterior, frontal cortex and to foster directed motor acts. Serotonergic (5HT) neurons project preferentially to limbic cortex and hypothalamus and are involved in emotional behaviors and autonomic controls. Cholinergic neurons (ACh) in the basal forebrain support arousal by their widespread projections across the cerebral cortex. Histamine-producing neurons (HA) likewise have extremely widespread projections that actually originate in the hypothalamus and are strongly associated with increased CNS arousal. (B) Simplified schematic representation of descending arousal regulatory pathways. Lateral hypothalamic area (LHA) hypocretin/orexin neurons project to monoamine-expressing cell groups in the lower brainstem and even the spinal cord. Neurons that express oxytocin (OT) and arginine vasopressin (AVP) in the parvocellular portion of the paraventricular hypothalamic nucleus (PVNp) control autonomic arousal through the lower brainstem and spinal cord and affect EEG arousal through projections to locus coeruleus. Hypothalamic neurons containing histamine (HA) in the tuberomammillary nucleus (TMN) have widespread projections and receive inputs from a biological clock, the suprachiasmatic nucleus (SCN). Preoptic area (POA) neurons have descending axons that affect sleep and autonomic physiology. Adapted from ref. 1.

A crucial feature of these ascending arousal pathways is that they show a tremendous degree of redundancy, which protects arousal systems from total failure after small insults. In human patients, focal injuries cause coma or a vegetative state only when there is bilateral damage to a substantial fraction of these pathways, particularly at the mesencephalic/diencephalic junction.

Descending neuroanatomical pathways (summarized in Fig. 1B) projecting from the forebrain toward the brainstem are also important. Lateral hypothalamic area hypocretin/orexin neurons project down to monoamine-expressing cell groups in the lower brainstem and even to the spinal cord. Oxytocin and arginine vasopressin-expressing neurons in the parvocellular portion of the paraventricular hypothalamic nucleus control autonomic arousal through the lower brainstem and spinal cord and affect EEG arousal through projections to locus coeruleus. Histamine-containing hypothalamic neurons in the tuberomammillary nucleus have widespread projections and receive inputs from a “biological clock,” the suprachiasmatic nucleus. Preoptic area neurons have descending axons that affect sleep and autonomic physiology. For example, neurons in the preoptic area connect to lower brain regions, which control the viscera. Likewise, the paraventricular nucleus of the hypothalamus has axonal projections that, in principle, could contribute to all aspects of arousal: cerebral cortical, autonomic, endocrine, and behavioral. In sum, although the ascending arousal systems have relatively few neurons, only sparse abilities to encode particular stimuli, and are responsible for “waking up” the cerebral cortex, descending arousal systems prepare the body for action by empowering reticulospinal neurons to activate our big posture-supporting trunk muscles, modifying sensory excitability, and activating autonomic systems.

Summarizing the neuroanatomy of CNS arousal pathways, we believe they are bilateral, bidirectional, and universal among vertebrate animals including humans and that they are always involved in response potentiation of either approach or avoidance responses (1).

Neurophysiology.

To gather electrophysiological evidence about arousal pathways, one looks for multimodal nerve cells that respond to a wide variety of salient stimuli across a range of sensory modalities. Such nerve cells are found up and down the brainstem (reviewed in ref. 1; see also ref. 6), from the medulla and the pons into the midbrain.

Some of the most fascinating cells in arousal systems are found in the hindbrain reticular formation near its midline toward the bottom of the brain. These large neurons have axons that split into ascending and descending limbs and could contribute both to ascending systems (for arousal of the cerebral cortex) and to descending systems (for arousal of the autonomic pathways controlling cardiovascular events and the viscera) at the same time (references in ref. 1). Regarding their sensory inputs, studied electrophysiologically, some of the neurons in this region have the large receptive fields and multimodal response characteristics expected of neurons serving GA (6–8). Likewise, their involvement in the control over a wide variety of motor activities suggests their capacity to subserve GA (9). Electrical stimulation of neurons in this area of the medullary reticular formation elevates cortical arousal (10), whereas dampening activity of medullary reticular neurons via the activation of GABA receptors decreases behavioral arousal (11). Moreover, in the spirit of “reverse engineering,” these neurons may serve as the middle of a “bow tie” configuration, having a wide range of inputs converging upon them and distributing a wide range of outputs. Control engineering theorist John Doyle (12, 13) has envisioned ways in which such a “bow tie” configuration confers robust system performance in the face of great uncertainty and variability in its environment.

Genomics.

In addition to the genetic information cited above, we note the existence of gene products that are essential to CNS arousal. Among classical neurotransmitters, histamine is an arousal transmitter par excellence. Among neuropeptide genes, hypocretin/orexin not only supports normal forms of arousal; certain mutations in this gene, or in the genes for its receptors, lead to narcolepsy (14, 15).

With three lines of evidence that an elementary function called GA exists, how might we describe environments or stimuli that elicit GA?

Information Theory in the Description of GA

GA is conceived most easily with the use of classic Shannon information theory, which was first applied to nervous systems by MacKay and McCulloch (16) and, more recently, has been introduced in didactic form to theoretical neuroscience by Dayan and Abbott (17). The communications engineer Claude Shannon (18) successfully devised a method useful for quantifying the transfer of information. His equation, Eq. 1 below, states that in a series of events (i = 1 to n):

H = -\sum_{i=1}^{n} p_i \log_2 p_i    [1]

where H is the informational entropy, a measure of information, and p_i is the probability of the ith event. For example, in a binary choice between A and B, information is minimized when the probability of either A or B approaches 1.0 and is maximized at p = 0.5, when uncertainty is greatest. In other words, more information is transferred when the outcome is uncertain. Over several decades, neuroscientists have endeavored to apply information theory to signaling in the CNS (19–22). Of the many different methods for quantifying information, here we refer only to those that stem from Shannon's classic equation above.
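
As a concrete illustration of Eq. 1, the short Python sketch below computes the entropy of a binary event and confirms that it peaks at p = 0.5; the function and the probability values are ours, for illustration only.

# Minimal illustration of Eq. 1: entropy of a binary event is largest at
# p = 0.5 and vanishes as the outcome becomes certain.
import numpy as np

def shannon_entropy(probs):
    """H = -sum_i p_i * log2(p_i), ignoring zero-probability events."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

for p in (0.99, 0.9, 0.5):
    print(f"p = {p:.2f}: H = {shannon_entropy([p, 1 - p]):.3f} bits")
# p = 0.50 gives the maximum of 1 bit for a binary choice.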

Traditional approaches to estimating the information content of temporal series of neuronal action potentials (23) break the time line into a series of narrow bins so that the probabilities of 1’s (spikes) and 0’s (no spike) can be counted. Because such calculations encounter problems related to the choice of bin size, binless methods have been introduced, including the construction of vector spaces and the estimation of interval entropy from an analytic distribution (24–26). Additionally, information calculations are not restricted to single-neuron action potential sequences and can be applied to sets of firing neurons (27). The concept of mutual information, the reduction in uncertainty about the activity of cell A, for example, given knowledge of the activity of cell B that influences cell A, has also been presented as useful for the examination of neural networks (28).
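
A minimal sketch of the binned approach is given below. The spike train is synthetic, and the bin width and word length are arbitrary choices whose influence on the estimate is exactly the problem that motivates the binless methods (24–26).

# Sketch of the binned ("direct") approach referenced above (ref. 23): discretize
# a spike train into 1-ms bins and estimate the entropy of short binary "words".
# Spike times and parameters here are synthetic and purely illustrative.
import numpy as np

def word_entropy(spike_times_s, duration_s, bin_ms=1.0, word_len=8):
    """Entropy (bits) of binary words formed from binned spikes."""
    n_bins = int(duration_s * 1000 / bin_ms)
    binned = np.zeros(n_bins, dtype=int)
    idx = (np.asarray(spike_times_s) * 1000 / bin_ms).astype(int)
    binned[idx[idx < n_bins]] = 1                      # 1 = spike, 0 = no spike
    # Slide a window of word_len bins and count word frequencies.
    words = [tuple(binned[i:i + word_len]) for i in range(n_bins - word_len + 1)]
    _, counts = np.unique(np.array(words), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
duration = 10.0                                        # seconds
spikes = np.sort(rng.uniform(0, duration, size=200))   # ~20 Hz Poisson-like train
print(f"Word entropy: {word_entropy(spikes, duration):.2f} bits per 8-bin word")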

The importance of information content for the activation of behavior is supported by the universality of a phenomenon known as habituation (references in ref. 1). In this context, habituation is defined as the decline in the vigor of a behavioral response when a stimulus is repeated and is considered an example of nonassociative memory (29, 30). This learning process is casually described as information storage. Although information theory has not been used formally in the description of habituation, we can use the language of information theory to describe how informational entropy of a stimulus affects an animal's behavioral response. At the first exposure to a stimulus, the animal knows nothing about the stimulus; therefore, the stimulus’ informational entropy is high, and the animal's behavioral response is correspondingly high. If the stimulus is associated with neither aversive nor beneficial events, the animal learns nothing new with each successive exposure, and the stimulus’ informational entropy decreases. As informational entropy decreases, so does behavioral response to the stimulus. Here, we are content to think of classic Shannon information calculations as a useful metric to predict CNS arousal to a stimulus or a set of stimuli.
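
Because information theory has not been applied formally to habituation, the following toy sketch is purely our own illustration: it treats each presentation as a binary "something new vs. nothing new" event whose estimated probability sharpens with repetition, so that a response proportional to the event's entropy declines, qualitatively mirroring habituation.

# Toy sketch (our own illustration, not a model from the text): with repeated,
# unreinforced exposure, the estimated probability that "nothing new follows"
# the stimulus rises toward 1, its Shannon entropy falls, and a response that
# tracks that entropy declines, as in habituation.
import numpy as np

def binary_entropy(p):
    q = 1 - p
    return -sum(x * np.log2(x) for x in (p, q) if x > 0)

for exposure in range(1, 6):
    p_nothing_new = exposure / (exposure + 1)      # estimate sharpens toward 1
    h = binary_entropy(p_nothing_new)
    print(f"exposure {exposure}: entropy = {h:.2f} bits (predicted response ~ {h:.2f})")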

Nonlinear Dynamics and GA

Theory of How a GA Function May Work.

Although the use of information theory, above, to give a static description of environmental circumstances that can be arousing seems to work adequately, we have also sought an approach that might well describe the dynamics of CNS arousal.

Ever since the discovery of deterministic chaos, scientists have been interested in applying this form of mathematics to biology (31, 32). However, it has been difficult to generate neurobiological data that would show the power of this thinking for explaining aspects of brain function. In our case, no one would believe that GA systems work in a linear manner. Instead we seek a dynamic, nonlinear system capable of providing the rapid lability and great power of amplification that would allow a disturbing sensory stimulus to cause the entire CNS to swing into action. The logistic equation, one of several forms of mathematics that yield deterministic chaos (33), appealed to us, as a first step, consonant with our previously stated hypothesis (34).

For an initial set of experiments we were attracted to the use of the logistic equation because of its capacity for producing large and rapid changes in output and its sensitivity to small changes in the input, and because of its link to our previous theory about phase transitions in the regulation of CNS arousal (34). Of course, other nonlinear dynamic systems have similar characteristics, but the long history of intense work on the logistic equation recommended it to us (35–37).

X_n = R\,X_{n-1}\,(1 - X_{n-1})    [2]

This equation, shown as Eq. 2, describes an output variable X, generated recursively by some closed system at time n, whose value lies between 0 and 1. X_{n-1} denotes the previous state of the system, whereas X_n gives its current state. The resulting time series can be stable or chaotic depending on the constant R, chosen between 0 and 4. We propose that arousal systems take advantage of these kinds of nonlinear dynamics when they are in a chaotic phase. Arousal systems, however, could not operate in a chaotic domain for very long and still produce organized behavioral responses, so we were led to conceive dynamics in which the primitive neural systems that generate CNS arousal operate in a chaotic zone but close to a phase transition, near which critical slowing occurs and the system relaxes to equilibrium much more slowly. As a result of this phase transition, these arousal systems are proposed (Fig. 2) to enter orderly, well-organized states capable of controlling motor behaviors (34).
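
A few lines of Python suffice to illustrate the two regimes of Eq. 2; the particular values of R below are illustrative only, not claims about neural parameters.

# Sketch of Eq. 2: the same recursion approaches a fixed point for small R
# but wanders chaotically for R near 4. Values of R are illustrative.
def logistic_trajectory(r, x0=0.2, n=20):
    """Iterate X_n = R * X_{n-1} * (1 - X_{n-1}) and return the sequence."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print("R = 2.8 :", [round(x, 3) for x in logistic_trajectory(2.8)[-5:]])  # settling toward a fixed point
print("R = 3.9 :", [round(x, 3) for x in logistic_trajectory(3.9)[-5:]])  # chaotic, irregular values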

Fig. 2.

Analogy between phases of matter and our hypothesis for CNS arousal. Upper: Schematic phase diagram for liquid crystals ranging from high temperatures and liquid phase (disordered, “random” molecules on right) to low temperatures and crystalline phase (completely ordered molecules on left). T, temperature. The liquid crystal phase is considered one of the most sensitive phases of matter because of its proximity to a phase transition to the liquid phase. Lower: In the quiescent animal at rest (depicted on right), large numbers of arousal-related neurons (the firing of a typical neuron is illustrated by the vertical lines as a function of time, t) have their rates of firing subject to chaotic dynamics, so that the effects of small perturbations from the arousing stimulus can be amplified selectively and very rapidly. When a movement in response to that stimulus is initiated, cortical and subcortical controls take over, moving the system across the nearby phase transition into the domain of orderly, high rates of firing (depicted on left). The system accrues significant advantages by not only being in the chaotic regime but also because it is poised in the vicinity of a phase transition. Adapted from ref. 34.

Consider, as we have before (34), a classic example of an exquisitely sensitive state of matter characterized by a rapid phase transition, the liquid crystal phase (38). We propose an analogy to a phase transition in CNS arousal systems from a state of quietude to a state in which behavioral activity is rapidly initiated. We suggest that the unaroused state is a “controlled chaotic” phase (39).

Chaotic systems have the potential to exhibit diverse behaviors. The exquisite sensitivity of chaotic systems to tiny perturbations is a powerful means of directing the trajectories in useful ways. Most important, the nonlinear dynamics of deterministic chaos provide exponential amplification of intrinsic fluctuations. Work on the control of chaos has demonstrated how one may use this sensitivity to develop feedback mechanisms for maintaining a system near dynamically unstable trajectories, thus vastly improving the flexibility in its performance. Furthermore, the unpredictability associated with chaos need not be a factor in the CNS because of the relatively short time that the system is in a chaotic state after receiving an arousing sensory input. This sensitivity to initial conditions and the inherent nonlinearity of a chaotic system near a phase transition (Fig. 2) would account for the organism's rapid response to an arousing stimulus.
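
The sensitivity to initial conditions described here can be made concrete with the logistic map of Eq. 2. In the sketch below (our own illustration), two trajectories that begin 10^-9 apart separate by many orders of magnitude within a few dozen iterations.

# Illustration of the sensitivity noted above: in the chaotic regime (R = 3.9),
# two trajectories of Eq. 2 that start a tiny perturbation apart separate
# roughly exponentially before the difference saturates.
def logistic_step(r, x):
    return r * x * (1 - x)

r, x_a, x_b = 3.9, 0.2, 0.2 + 1e-9        # identical except for a 1e-9 perturbation
for n in range(1, 41):
    x_a, x_b = logistic_step(r, x_a), logistic_step(r, x_b)
    if n % 10 == 0:
        print(f"n = {n:2d}: |difference| = {abs(x_a - x_b):.2e}")
# The gap grows by orders of magnitude within a few dozen iterations,
# the kind of rapid amplification of a small perturbation discussed in the text.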

We envision that a sensory input of sufficient magnitude in any modality (6) would trigger a rapid, nonlinear amplification of electrical activity in the defined neural circuits serving arousal. These could be, for example, norepinephrine-producing cells in the locus coeruleus, but they could also be cells expressing other small molecules or gene products related to arousal. When threshold firing rates of these arousal-driving neurons have been reached, the combination of the sensory input and the ascending arousal signals modulate neuronal activity in the cerebral cortex in a synchronized manner consistent with immediate attention. These cortical neurons, through their descending projections, then impose ordered patterns of activity not only to control excitability in arousal pathways but also to generate well-organized motor responses promptly.

The order/chaos phase transition has been considered before as an important feature of biological systems (28, 31). Our hypothesis simply proposes that CNS arousal systems have evolved such that they also take advantage of operating near a phase transition (31, 40). That is, they enjoy the benefits of chaotic lability and flexibility, and then, as soon as the organism is aroused, orderly, regular patterns of activity dominate. Along these lines, Rajan et al. (41) have recently carried out studies of the influence of the resonant effects of external stimuli on large chaotic neural networks and predict that the variance of neural responses ought to be significantly reduced, leading to elimination of chaos at frequencies in the range of many sensory systems. More generally, their calculations underscore the importance of the system being poised in the vicinity of the phase transition between a chaotic regime and one in which the chaos is completely absent.

There are three ways by which one could test the notion that GA systems in the mammalian brain operate in a manner that reflects chaotic dynamics: (i) recording neuronal activity in arousal-related neurons to search for evidence of chaotic attractors; (ii) simulating neural networks and looking for outputs that reflect in some way the operation of the logistic equation or other chaotic dynamics; and (iii) stimulating arousal-related neurons and looking for unusual results when pulse trains dictated by the logistic equation are applied.

Upon reflection, regarding approach (i), recording activity, we realized that even in low-dimensional systems, impossibly large amounts of data would have to be searched for evidence of chaotic dynamics and that the search would be even harder in the inevitable presence of biological noise. Therefore, we have worked on (ii), simulation, and (iii), stimulation. Approach (ii), simulating large neural nets, did lead to sudden breaks in activity at the value R = 3 (42), but neither the implications of that finding nor the network characteristics necessary for producing that finding is yet clear. Therefore, we have concentrated on approach (iii), deep brain stimulation (DBS).

Electrical Stimulation Using Temporally Patterned Pulse Trains.

DBS has been used to increase arousal in humans (43) and rats (10); one study also showed that stimulation can improve cognition in mice (44). However, these studies and all clinical applications of DBS have used simple fixed-frequency temporal patterns. Although neuroscientists observe temporally patterned neuronal responses, especially in sensory systems (45), it is still an open question whether these patterns are actually used by the CNS. A recently published study tested the hypothesis that, in DBS that increases arousal, temporally patterned electrical pulse trains work significantly better to support the animal's initiation of behavior, compared with conventional pulse trains that are physically identical in every way except the temporal patterning (46). Total number of pulses was also held constant.

In this study (46), mice were implanted with bilateral monopolar electrodes either in the ventral hippocampus or the medial thalamus. During the study, behavioral motor activity of each mouse was measured using an infrared home cage monitoring system. After recovery from surgery, animals were stimulated every 3 h during the dark phase of the light cycle for up to 3 d. The distributions of interpulse intervals in the three pulse trains used in this study are shown in Fig. 3A: the standard pattern of Fixed Frequency is in black and the two nonlinear patterns, Nonlinear1 and Nonlinear2, are shown in red and blue, respectively. Each animal was challenged with Fixed Frequency and at least one of the two nonlinear patterns. Order of stimulation for each pattern was counterbalanced as much as possible.
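
To make the design concrete, the hedged sketch below generates a fixed-frequency train and an irregularly patterned train with identical pulse counts and durations. The interpulse-interval distributions of Nonlinear1 and Nonlinear2 in ref. 46 are not reproduced here; the logistic-map-derived intervals are purely our own stand-in.

# Hedged sketch: generate two stimulation pulse trains with the same number of
# pulses and the same total duration, differing only in temporal patterning.
# The actual interpulse-interval distributions of Nonlinear1/Nonlinear2 in
# ref. 46 are not reproduced here; the logistic-map-derived intervals below
# are our own illustrative stand-in.
import numpy as np

def fixed_frequency_train(n_pulses, duration_s):
    """Evenly spaced pulse times over the stimulation epoch."""
    return np.linspace(0, duration_s, n_pulses, endpoint=False)

def patterned_train(n_pulses, duration_s, r=3.9, x0=0.3):
    """Irregular pulse times: intervals drawn from a logistic-map sequence,
    then rescaled so the total duration and pulse count match the fixed train."""
    x, intervals = x0, []
    for _ in range(n_pulses):
        x = r * x * (1 - x)
        intervals.append(0.1 + x)          # keep intervals strictly positive
    intervals = np.array(intervals)
    intervals *= duration_s / intervals.sum()
    return np.cumsum(intervals) - intervals[0]

ff = fixed_frequency_train(n_pulses=1000, duration_s=10.0)
nl = patterned_train(n_pulses=1000, duration_s=10.0)
assert len(ff) == len(nl)                  # identical pulse counts, as in the study
print("Fixed-frequency IPI (s):", np.diff(ff)[:3].round(4))
print("Patterned IPI (s):      ", np.diff(nl)[:3].round(4))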

Fig. 3.

(A) Histogram of interpulse intervals for three temporal patterns of DBS. Counts represent the number of interpulse intervals that fall within the bin range. (B) Behavioral response to three temporal patterns of DBS. All stimulations were physically identical except with respect to temporal patterning. Data represent 10 min during and 10 min immediately after stimulation; are normalized to 10 min immediately before; and are reported as mean ± SEM. **P < 0.01 vs. Fixed Frequency (FF); ##P < 0.01 vs. Nonlinear2 (NL2). Adapted from ref. 46.

The temporal patterning of pulses within DBS affected arousal-related behavior; specifically, Nonlinear1 increased whole-body movement (recorded as activity counts) during and after stimulation more than either Fixed Frequency or Nonlinear2 (Fig. 3B). Although these data are averaged across animals, the same effect can be seen in raw data from an individual. In Fig. 4, two activity outputs are displayed at high temporal resolution for the same animal during three different stimulations. Although we know little about the mechanism by which temporal dynamics affect arousal, these data show that the temporal patterning of a pulse train can make a difference even when all other physical parameters of the pulses are held constant (46). We would not assert that Nonlinear1 and Nonlinear2 are unique in this respect; rather, the results indicate that temporal patterning per se matters in DBS. They also suggest that work on the temporal dynamics of arousal systems will foster a deep understanding of how those systems work.

Fig. 4.

Behavioral response of one mouse to three temporal patterns of DBS. Data reported represent behavior 10 min before, 10 min during, and 10 min after hippocampal stimulation with (A) Fixed Frequency, (B) Nonlinear1, and (C) Nonlinear2. Gray boxes delineate stimulation epochs. Horizontal Activity (Hactv) and Total Distance (Totdist) are presented as sums of activity every second. Adapted from ref. 46.

Gaussian Distribution Describes the Activation of Behavior Due to Hunger

A different approach to the quantification of arousal is to use an experimental manipulation that suddenly increases arousal when it otherwise would remain at a low level. That is, one of the most incisive ways to discover neuronal mechanisms underlying GA is to set up forces that regulate arousal against each other. Mistlberger (47) and a host of other scientists have done this by requiring hungry animals, in the absence of an alarm or any signal other than their hunger, to wake up and become active in order to receive food during the time of day they would normally be sleeping: they generate food-anticipatory activity (FAA). Although the neural network that generates FAA is still under investigation, we emphasize the importance of the ventromedial nucleus of the hypothalamus, a cell group in which we found the earliest activation of neurons that apparently support or even initiate FAA (48). Surprisingly, LeSauter et al. (49) discovered that a Gaussian distribution, shown below in Eq. 3,

f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}    [3]

closely represents the process of accumulating arousal, revealed by a correlation of r = 0.99 between the rising limb of the best-fit Gaussian and the cumulative activity. In this particular case, f(x) represents cumulative behavior at time x. The two parameters μ (mean) and σ (SD) were varied to achieve the excellent fit with the data. As LeSauter et al. (49) reasoned, “the close fit to the Gaussian indicates that the mechanisms underlying these data include a large number of individual neuronal go, no-go decisions with an increasing proportion of ‘go’ decisions as feeding time draws near.”
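
The following sketch shows how such a fit might be performed with scipy's curve_fit on synthetic cumulative-activity data. The amplitude parameter A is our own addition for illustration (Eq. 3 as written is a normalized density), and none of the numbers correspond to the published data.

# Sketch of fitting Eq. 3 to cumulative food-anticipatory activity. The data
# below are synthetic; mu, sigma, and the amplitude A are free parameters
# (A lets the fitted height differ, e.g., between genotypes).
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

rng = np.random.default_rng(2)
t = np.linspace(0, 12, 60)                                   # hours before feeding time
cumulative_activity = gaussian(t, a=100, mu=11, sigma=3) + rng.normal(0, 2, t.size)

(a_fit, mu_fit, sigma_fit), _ = curve_fit(gaussian, t, cumulative_activity,
                                          p0=(80, 10, 2))
rising = t <= mu_fit                                         # compare only the rising limb
r = np.corrcoef(gaussian(t[rising], a_fit, mu_fit, sigma_fit),
                cumulative_activity[rising])[0, 1]
print(f"fit: A = {a_fit:.1f}, mu = {mu_fit:.1f} h, sigma = {sigma_fit:.1f} h, r = {r:.3f}")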

Further, this experiment (49) included an analysis of gene/behavior relations. It compared the FAA of ghrelin gene knockout animals with their wild-type littermates. Again, in the gene knockout animals, the excellent fit of the data to a Gaussian distribution suggests that the decision to activate this appetitive behavior can be understood as a series of repetitive binary choices, but the probability of a positive decision is only approximately half as large in the ghrelin knockout animals. The height of the Gaussian for the ghrelin knockout animals was half that of their wild-type littermates. In this sense, the comparison of the shapes and amplitudes of the Gaussians yields a mathematical description of a gene/behavior relationship.

Multiplicity of Approaches

We have used not just one but multiple quantitative approaches to conceive and describe a primitive, elementary CNS function. First, one of the three lines of evidence for the very existence of generalized CNS arousal depended on factor analysis, a statistical approach that uses a matrix of correlations among quantitative endpoints. Second, our conception of generalized CNS arousal used Shannon's equation to express “information” in mathematical terms. Third, following the initial conceptualization, we theorized that the most fundamental arousal mechanisms in the brainstems of all vertebrate animals operate in a nonlinear zone; a quantitative expression of this idea lay in the logistic equation, a formulation capable of yielding chaotic dynamics. Fourth, a Gaussian function accurately describes the increase in arousal caused by the anticipation of food. We would not claim that these four approaches are unique or that other equally useful approaches do not exist.

In addition to the multiplicity of quantitative approaches described here, it is important to understand that individual differences in this primitive function, GA, may be contributing to behavioral outcomes of various sorts without investigators realizing it, controlling for it, or incorporating it into their interpretations. By providing a quantitative assessment of GA based on simple behavioral measures, we hope to help investigators maximize the effectiveness of their studies and make further inroads into understanding the contribution of more specific variables to behavioral outputs.

In summary, we suspect that even for the analysis of elementary brain functions, let alone complex functions such as learning and memory, well-chosen sets of multiple equations will be required for their description and analysis.

Acknowledgments

We thank Christopher Kim and William Weinberger in the D.W.P. laboratory for their important work supplying some of the data on neural networks and the following readers for their useful feedback: Prof. T. James Matthews, Prof. Peggy Mason, Prof. Ralph Mistlberger, and Dr. Ilia Karatsoreos. This work was funded, in part, by a company, IntElect Medical Inc., in which Cornell University has part ownership. This work is also supported by National Institutes of Health Grants HD-05751 and MH-38273.

Footnotes

Conflict of interest statement: Through the licensing of technology, N.D.S. is a listed Cornell inventor and may benefit in the future from commercialization of intellectual property owned by Cornell.

This paper results from the Arthur M. Sackler Colloquium of the National Academy of Sciences, “Quantification of Behavior” held June 11–13, 2010, at the AAAS Building in Washington, DC. The complete program and audio files of most presentations are available on the NAS Web site at www.nasonline.org/quantification.

This article is a PNAS Direct Submission.

References

1. Pfaff DW. Brain Arousal and Information Theory: Neural and Genetic Mechanisms. Cambridge, MA: Harvard Univ Press; 2006.
2. Garey J, et al. Genetic contributions to generalized arousal of brain and behavior. Proc Natl Acad Sci USA. 2003;100:11019–11022. doi: 10.1073/pnas.1633773100.
3. Gorsuch RL. Factor Analysis. 2nd Ed. Hillsdale, NJ: Lawrence Erlbaum Associates; 1983.
4. Weil ZM, Zhang Q, Hornung A, Blizard D, Pfaff DW. Impact of generalized brain arousal on sexual behavior. Proc Natl Acad Sci USA. 2010;107:2265–2270. doi: 10.1073/pnas.0914014107.
5. Pfaff D, Martin EM, Weingarten W, Vimal V. The central neural foundations of awareness and self-awareness. Prog Theor Phys Suppl. 2008;173:79–98.
6. Martin EM, Pavlides C, Pfaff D. Multimodal sensory responses of nucleus reticularis gigantocellularis and the responses' relation to cortical and motor activation. J Neurophysiol. 2010;103:2326–2338. doi: 10.1152/jn.01122.2009.
7. Leung CG, Mason P. Physiological survey of medullary raphe and magnocellular reticular neurons in the anesthetized rat. J Neurophysiol. 1998;80:1630–1646. doi: 10.1152/jn.1998.80.4.1630.
8. Leung CG, Mason P. Physiological properties of raphe magnus neurons during sleep and waking. J Neurophysiol. 1999;81:584–595. doi: 10.1152/jn.1999.81.2.584.
9. Serafin M, Vidal PP, Mühlethaler M. Electrophysiological study of nucleus gigantocellularis neurons in guinea-pig brainstem slices. Neuroscience. 1996;73:797–805. doi: 10.1016/0306-4522(96)00054-1.
10. Wu HB, Stavarache M, Pfaff DW, Kow LM. Arousal of cerebral cortex electroencephalogram consequent to high-frequency stimulation of ventral medullary reticular formation. Proc Natl Acad Sci USA. 2007;104:18292–18296. doi: 10.1073/pnas.0708620104.
11. Mackersey K, Litvin Y, Pfaff DW, Martin EM. GABA-A receptor antagonism in the medullary gigantocellular reticular nucleus leads to increased responsiveness in isoflurane anesthetized mice. Society for Neuroscience; 2009. Program No. 276.272. 2009 Neuroscience Meeting Planner, Chicago.
12. Csete M, Doyle J. Bow ties, metabolism and disease. Trends Biotechnol. 2004;22:446–450. doi: 10.1016/j.tibtech.2004.07.007.
13. Doyle J, Csete M. Motifs, control, and stability. PLoS Biol. 2005;3:e392. doi: 10.1371/journal.pbio.0030392.
14. Mikkelsen JD, et al. Hypocretin (orexin) in the rat pineal gland: A central transmitter with effects on noradrenaline-induced release of melatonin. Eur J Neurosci. 2001;14:419–425. doi: 10.1046/j.0953-816x.2001.01655.x.
15. Taheri S, Zeitzer JM, Mignot E. The role of hypocretins (orexins) in sleep regulation and narcolepsy. Annu Rev Neurosci. 2002;25:283–313. doi: 10.1146/annurev.neuro.25.112701.142826.
16. MacKay DM, McCulloch WS. The limiting information capacity of a neuronal link. Bull Math Biophys. 1952;14:127–135.
17. Dayan P, Abbott LF. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge, MA: Massachusetts Institute of Technology Press; 2001.
18. Shannon CE. A mathematical theory of communication. Bell Syst Tech J. 1948;27:379–423.
19. Paulin MG, Hoffman LF, Assad C. Distributed coding by single spikes in the bullfrog vestibular nerve: A basis for dynamical computation in neural systems. Neurocomputing. 2004;58-60:73–77.
20. Rokem A, et al. Spike-timing precision underlies the coding efficiency of auditory receptor neurons. J Neurophysiol. 2006;95:2541–2552. doi: 10.1152/jn.00891.2005.
21. Pogosyan A, et al. Elevations in local gamma activity are accompanied by changes in the firing rate and information coding capacity of neurons in the region of the subthalamic nucleus in Parkinson's disease. Exp Neurol. 2006;202:271–279. doi: 10.1016/j.expneurol.2006.06.014.
22. Nemenman I, Lewen GD, Bialek W, de Ruyter van Steveninck RR. Neural coding of natural stimuli: Information at sub-millisecond resolution. PLoS Comput Biol. 2008;4:e1000025. doi: 10.1371/journal.pcbi.1000025.
23. Strong SP, Koberle R, de Ruyter van Steveninck RR, Bialek W. Entropy and information in neural spike trains. Phys Rev Lett. 1998;80:197–200.
24. Victor JD. Binless strategies for estimation of information from neural data. Phys Rev E Stat Nonlin Soft Matter Phys. 2002;66:051903. doi: 10.1103/PhysRevE.66.051903.
25. Reeke GN, Coop AD. Estimating the temporal interval entropy of neuronal discharge. Neural Comput. 2004;16:941–970. doi: 10.1162/089976604773135050.
26. Nemenman I, Bialek W, de Ruyter van Steveninck R. Entropy and information in neural spike trains: Progress on the sampling problem. Phys Rev E Stat Nonlin Soft Matter Phys. 2004;69:056111. doi: 10.1103/PhysRevE.69.056111.
27. Yu YG, Crumiller M, Knight B, Kaplan E. Estimating the amount of information carried by a neuronal population. Front Comput Neurosci. 2010;4. doi: 10.3389/fncom.2010.00010.
28. Greenfield E, Lecar H. Mutual information in a dilute, asymmetric neural network model. Phys Rev E Stat Nonlin Soft Matter Phys. 2001;63:041905. doi: 10.1103/PhysRevE.63.041905.
29. Kandel ER. Cellular mechanisms of learning and the biological basis of individuality. In: Kandel ER, editor. Principles of Neural Science. 4th Ed. New York: McGraw-Hill, Health Professions Division; 2000.
30. Sanderson DJ, et al. Spatial working memory deficits in GluA1 AMPA receptor subunit knockout mice reflect impaired short-term habituation: Evidence for Wagner's dual-process memory model. Neuropsychologia. 2010;48:2303–2315. doi: 10.1016/j.neuropsychologia.2010.03.018.
31. Kauffman SA. The Origins of Order: Self-organization and Selection in Evolution. New York: Oxford Univ Press; 1993.
32. May R. The best possible time to be alive. In: Farmelo G, editor. It Must Be Beautiful: Great Equations of Modern Science. London: Granta Books; 2002. pp. 28–45.
33. Cohen JE. Unexpected dominance of high frequencies in chaotic nonlinear population models. Nature. 1995;378:610–612. doi: 10.1038/378610a0.
34. Pfaff D, Banavar JR. A theoretical framework for CNS arousal. Bioessays. 2007;29:803–810. doi: 10.1002/bies.20611.
35. Anderson RM, May RM. Regulation and stability of host-parasite population interactions. 1. Regulatory processes. J Anim Ecol. 1978;47:219–247.
36. Feigenbaum MJ. Quantitative universality for a class of non-linear transformations. J Stat Phys. 1978;19:25–52.
37. Feigenbaum MJ. Universal metric properties of non-linear transformations. J Stat Phys. 1979;21:669–706.
38. de Gennes PG, Prost J. The Physics of Liquid Crystals. 2nd Ed. Oxford: Oxford Univ Press; 1993.
39. Ott E, Sauer T, Yorke JA. Coping with Chaos: Analysis of Chaotic Data and the Exploitation of Chaotic Systems. New York: Wiley-Interscience; 1994.
40. Kauffman SA, Johnsen S. Coevolution to the edge of chaos: Coupled fitness landscapes, poised states, and coevolutionary avalanches. J Theor Biol. 1991;149:467–505. doi: 10.1016/s0022-5193(05)80094-3.
41. Rajan K, Abbott LF, Sompolinsky H. Stimulus-dependent suppression of chaos in recurrent neural networks. Phys Rev E Stat Nonlin Soft Matter Phys. 2010;82:011903. doi: 10.1103/PhysRevE.82.011903.
42. Vimal V, Quinkert AW, Weingarten WF, Reeke GN, Pfaff DW. Chaotic dynamics reflected in the output of a simulated neural network. Society for Neuroscience; 2010. Program No. 208.231. 2010 Neuroscience Meeting Planner, San Diego.
43. Schiff ND, et al. Behavioural improvements with thalamic stimulation after severe traumatic brain injury. Nature. 2007;448:600–603. doi: 10.1038/nature06041.
44. Shirvalkar P, Seth M, Schiff ND, Herrera DG. Cognitive enhancement with central thalamic electrical stimulation. Proc Natl Acad Sci USA. 2006;103:17007–17012. doi: 10.1073/pnas.0604811103.
45. Wasserman GS. Isomorphism, task dependence, and the multiple meaning theory of neural coding. Biol Signals. 1992;1:117–142. doi: 10.1159/000109318.
46. Quinkert AW, Schiff ND, Pfaff DW. Temporal patterning of pulses during deep brain stimulation affects central nervous system arousal. Behav Brain Res. 2010;214:377–385. doi: 10.1016/j.bbr.2010.06.009.
47. Mistlberger RE. Food-anticipatory circadian rhythms: Concepts and methods. Eur J Neurosci. 2009;30:1718–1729. doi: 10.1111/j.1460-9568.2009.06965.x.
48. Ribeiro AC, LeSauter J, Dupré C, Pfaff DW. Relationship of arousal to circadian anticipatory behavior: Ventromedial hypothalamus: One node in a hunger-arousal network. Eur J Neurosci. 2009;30:1730–1738. doi: 10.1111/j.1460-9568.2009.06969.x.
49. LeSauter J, Hoque N, Weintraub M, Pfaff DW, Silver R. Stomach ghrelin-secreting cells as food-entrainable circadian clocks. Proc Natl Acad Sci USA. 2009;106:13582–13587. doi: 10.1073/pnas.0906426106.
