What is perceptual learning?
Perceptual learning is the experience-dependent enhancement of our ability to make sense of what we see, hear, feel, taste or smell. These changes are permanent or semi-permanent, as distinct from shorter-term mechanisms like sensory adaptation or habituation. Moreover, they are not merely incidental but adaptive, and therefore confer benefits like improved sensitivity to weak or ambiguous stimuli.
Why is it interesting?
Three aspects of perceptual learning make it of general interest. First, perceptual learning reflects an inherent property of our perceptual systems and thus must be studied to understand perception. Second, perceptual learning is robust even in adults and thus represents an important substrate for studying mechanisms of learning and memory that persist beyond development. Third, perceptual learning is readily studied in a laboratory using simple perceptual tasks and thus researchers can exploit well-established psychophysical, physiological and computational methods to investigate the underlying mechanisms.
What is its history?
Perceptual learning was among the earliest research topics in perceptual psychology. Studies from over 150 years ago examined training-induced improvements in the ability to distinguish two points touched to the skin. These improvements included a nearly 100-fold decrease in the distance between two points that could be distinguished when placed on a human subject’s back. The improvements were assumed to be too dramatic and rapid to involve changes in the number of peripheral receptors and instead likely involved changes inside the nervous system. This idea has remained a dominant theme of perceptual learning since that time, with many studies designed to reveal the nature of the underlying neural changes.
Our understanding of the neural changes responsible for perceptual learning has benefitted greatly from the application of signal detection theory to perception. Signal detection theory is a computational framework that describes how to extract a signal from noise, while accounting for biases and other factors that can influence the extraction process. It has been used effectively to describe how the brain overcomes noise from both the environment and its own internal processes to perceive sensory signals. Many theories and models of perceptual learning are directly in the tradition of signal detection theory, relating improvements in behavior to particular changes in how perceptual systems extract sensory-related signals from noise.
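To make this concrete, the sketch below (written in Python, with made-up hit and false-alarm rates rather than data from any particular study) shows the standard equal-variance Gaussian calculation of sensitivity (d′) and criterion in a yes/no detection task: training that genuinely improves perception should increase d′, whereas a shift in criterion alone reflects a change in response bias rather than sensitivity.

```python
# A minimal signal-detection-theory sketch with hypothetical hit and false-alarm
# rates: sensitivity (d') and criterion (c) under the equal-variance Gaussian model.
from scipy.stats import norm

def dprime_and_criterion(hit_rate, false_alarm_rate):
    """Return (d', c) for a yes/no detection task."""
    z_hit = norm.ppf(hit_rate)          # z-transform of the hit rate
    z_fa = norm.ppf(false_alarm_rate)   # z-transform of the false-alarm rate
    d_prime = z_hit - z_fa              # separation of signal and noise distributions
    criterion = -0.5 * (z_hit + z_fa)   # response bias, independent of sensitivity
    return d_prime, criterion

# Hypothetical pre- and post-training performance: sensitivity roughly doubles
# while the criterion (bias) stays near zero.
print(dprime_and_criterion(0.69, 0.31))  # before training: d' close to 1
print(dprime_and_criterion(0.84, 0.16))  # after training:  d' close to 2
```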
How is it different from other kinds of learning?
Training-induced improvements in performance on perceptual tasks do not, by themselves, imply perceptual learning. Other forms of learning, such as those that establish task rules, associations and strategies, can similarly affect performance. Unlike these higher-order forms of learning, however, perceptual learning involves improved sensitivity independent of cognitive, motor or other, non-perceptual factors. Thus, perceptual learning is often measured as decreases in the strength, quality or duration of a stimulus needed to obtain a particular level of accuracy. Applying signal detection theory to these data can help to distinguish changes in perceptual sensitivity from other factors like choice biases. Identifying such changes in sensitivity in the absence of comparable changes in performance for easily perceived stimuli can further distinguish perceptual learning from higher-order task learning, a particularly important consideration for non-human subjects who must learn task rules by trial and error. In addition, perceptual learning is often, but not always, specific to the stimulus configuration used during training, like the location and orientation of visual stimuli in a texture-discrimination task. Such specificity is unlikely to arise with more cognitive adjustments of task performance.
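A hedged sketch of this kind of measurement is given below: a psychometric function (here a two-parameter Weibull, a common but by no means the only choice) is fitted to accuracy data before and after training, and the learning effect is read off as a decrease in the stimulus strength needed to reach a fixed level of accuracy. The stimulus values and accuracies are illustrative, not taken from any real experiment.

```python
# A hedged sketch (synthetic data, not from any real experiment) of how perceptual
# learning is often quantified: the stimulus strength needed to reach a criterion
# level of accuracy, read off a fitted psychometric function, before and after training.
import numpy as np
from scipy.optimize import curve_fit

def weibull(x, alpha, beta):
    """Two-parameter Weibull psychometric function for a 2AFC task (chance = 0.5)."""
    return 0.5 + 0.5 * (1.0 - np.exp(-(x / alpha) ** beta))

coherence = np.array([0.03, 0.06, 0.12, 0.25, 0.5, 1.0])   # e.g., motion coherence
p_before  = np.array([0.52, 0.56, 0.65, 0.80, 0.93, 0.99])  # hypothetical accuracy
p_after   = np.array([0.60, 0.70, 0.84, 0.95, 0.99, 1.00])

for label, p in [("before", p_before), ("after", p_after)]:
    (alpha, beta), _ = curve_fit(weibull, coherence, p, p0=[0.2, 1.5])
    # alpha is the stimulus strength supporting ~82% correct; a decrease with
    # training indicates improved sensitivity rather than a change in bias.
    print(f"{label}: threshold = {alpha:.3f}, slope = {beta:.2f}")
```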
The relationship between perceptual and associative learning merits further comment. Early theories of perceptual learning-like phenomena focused on associative concepts, like the improved ability to associate meaning with particular stimuli. Even recent studies often use similar conditions to study both forms of learning, which can make principled distinctions between the two difficult. However, perceptual learning is now recognized to involve improved sensory processing independent of associated meaning, sometimes even occurring in the absence of directed attention towards, or even perception of, the trained stimulus. Even these findings are somewhat complicated, however, in that association with a reward provided in a different context might be necessary for at least some forms of perceptual learning; this issue merits further study.
Are there different kinds of perceptual learning?
Yes. One of the difficulties in defining perceptual learning is that it can occur under a wide range of conditions that likely reflect an equally diverse set of neural changes. For example, perceptual learning has been described for different sensory modalities, with corresponding changes in different sensory pathways. Moreover, even within each modality, the mechanisms and characteristics of perceptual learning can differ considerably, particularly with respect to two factors: attention and reward processing.
Attention to a particular task or task-relevant sensory feature appears to be necessary for some forms of perceptual learning. Under some conditions, however, perceptual learning can occur for features to which focused attention is not directed, although it is not always as strong as when the feature is attended. Thus, attention can either enable or facilitate perceptual learning.
Likewise, reward can enable some forms of perceptual learning. For example, perceptual learning of a visual feature can occur if reward is given even without focused attention to the feature. Other forms of visual perceptual learning are thought to involve reward-driven changes in how the brain forms perceptual judgments. Such changes likely involve the dopaminergic system, which plays a central role in other forms of reinforcement learning and can drive perceptual learning-related changes in the representation of tones in auditory cortex. The prevalence of such reward-related mechanisms in perceptual learning and their relationship to attentional systems remain active areas of research.
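One way to picture such reward-driven learning, loosely in the spirit of reinforcement-learning accounts like that of Law and Gold (2009) but not a reproduction of any specific published model, is a toy simulation in which a dopamine-like reward prediction error gates changes in the weights that read noisy sensory responses out into a choice. All of the parameters and the stimulus model below are illustrative assumptions.

```python
# A hedged sketch of reward-gated learning of a sensory readout. A reward
# prediction error (reward minus a confidence proxy) modulates updates to the
# weights that turn a noisy population response into a left/right choice.
# The tuning, noise and learning rate are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, learning_rate = 50, 0.02
preferred = np.linspace(-1.0, 1.0, n_neurons)   # toy "tuning" of sensory units
weights = np.zeros(n_neurons)                    # readout weights to be learned

def sensory_response(direction):
    """Noisy population response to a stimulus with direction -1 or +1."""
    return preferred * direction + rng.normal(0.0, 1.0, n_neurons)

for trial in range(5000):
    direction = rng.choice([-1, 1])
    r = sensory_response(direction)
    decision_variable = weights @ r
    choice = 1 if decision_variable > 0 else -1
    p_correct = 1.0 / (1.0 + np.exp(-abs(decision_variable)))   # confidence proxy
    reward = 1.0 if choice == direction else 0.0
    prediction_error = reward - p_correct            # dopamine-like teaching signal
    weights += learning_rate * prediction_error * choice * r    # reward-gated update

print("learned weights correlate with tuning:", np.corrcoef(weights, preferred)[0, 1])
```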
What are the brain mechanisms of perceptual learning?
Three approaches have been used to study mechanisms of perceptual learning. The first is inference from behavior. The specificity of perceptual learning to the stimulus configuration used during training has often been used to argue that the changes must occur in early areas of sensory cortex, where information is represented with similar specificity. However, this is not the only logical possibility: more central changes could also, in principle, exhibit such specificity. Another influential theory, called the reverse hierarchy theory, interprets attention-related differences in perceptual learning in terms of different loci of learning from higher to lower processing levels in the brain. More recently, it has been proposed that the key attentional subsystem is related to alerting (as opposed to orienting or executive function), which helps to ‘tag’ stimulus features to be learned. This idea can help explain the involvement of higher brain areas and the learning of task-irrelevant features when they are presented concurrently with task-relevant ones.
The second approach combines measures of behavior and brain activity using non-invasive techniques like functional magnetic resonance imaging (fMRI) in human subjects. For example, the blood oxygen-level-dependent (BOLD) signals of fMRI are enhanced in regions of primary visual cortex that correspond to the retinotopic location of a trained visual stimulus after perceptual learning, although at least in some cases the changes in BOLD are more transient than the changes in behavior. Recent work has begun to show the importance of sleep to these kinds of changes in primary visual cortex. Moreover, although several studies have not observed strong BOLD signal enhancement in areas higher than V1, one study found enhancement in not only early visual areas but also the parietal cortex, suggesting the involvement of higher areas in perceptual learning.
The third approach is to combine measures of brain activity and behavior in non-human subjects, typically monkeys. Several influential studies identified changes in primary auditory and somatosensory cortices of monkeys that had been trained on discrimination tasks in those modalities. Similar changes have been found in the visual pathway, although changes found in primary visual cortex were small relative to those found in auditory and somatosensory cortex. A primary challenge for these kinds of studies is how exactly to relate neural and behavioral changes. A recent study exploited extensive prior work relating single-neuron activity in multiple brain regions to behavior on a visual motion direction-discrimination task to show that, in monkeys learning the task, visual motion processing changed in a sensory-motor but not a sensory area. This work provided the first single-unit evidence for perceptual learning-related changes well beyond sensory cortex, in areas that interpret stimuli to form perceptual judgments.
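One standard analysis in this tradition, sketched below with synthetic spike counts rather than recorded data, is to compute a ‘neurometric’ discrimination performance for a single neuron as the area under the ROC curve comparing its responses to the two stimulus alternatives; this quantity can then be tracked over training and compared with the animal's behavioral performance.

```python
# A hedged sketch of a neurometric analysis: the area under the ROC curve for a
# neuron's spike counts to two motion directions gives the accuracy of an ideal
# observer using this neuron alone. The Poisson spike counts are synthetic.
import numpy as np

rng = np.random.default_rng(1)

def roc_area(counts_a, counts_b):
    """Probability that a random draw from counts_b exceeds one from counts_a
    (ties counted as half), i.e., single-neuron ideal-observer accuracy."""
    a = np.asarray(counts_a)[:, None]
    b = np.asarray(counts_b)[None, :]
    return np.mean(b > a) + 0.5 * np.mean(b == a)

# Synthetic spike counts for the two directions, before and after "training":
# the separation of the two response distributions grows with learning.
before = roc_area(rng.poisson(10, 200), rng.poisson(12, 200))
after  = roc_area(rng.poisson(10, 200), rng.poisson(16, 200))
print(f"neurometric performance before: {before:.2f}, after: {after:.2f}")
```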
What are the important outstanding questions?
Many of the most basic questions about perceptual learning remain unanswered, particularly those that concern the underlying neural mechanisms. Neural correlates have been identified for only a small fraction of behavioral perceptual learning phenomena. Do these generalize to other tasks? If not, what other mechanisms are used? How can the discrepancies between results from human and animal studies be resolved? Moreover, previous results have been primarily correlative: what are the neural changes that play a causal role in perceptual learning? Answering this question will require other techniques, like the manipulation of neural activity during learning. Finally, much more work is needed to relate identified changes in perceptual processing to cellular and synaptic mechanisms of plasticity.
Many questions also remain unanswered about the computational principles that govern perceptual learning. Many models have been proposed. For example, under some conditions perceptual learning is associated with the sharpening of tuning curves at or near the trained feature, to improve detectability or discriminability of that feature. Some models assume that perceptual learning occurs as a result of signal enhancement or noise reduction in the perceptual pathway. These improvements can be implemented by changes in connectivity between sensory and decision areas. Other models focus on the role of attention. However, it is still not known how to reconcile these different models with each other and with the full range of perceptual learning-related behavioral and physiological phenomena. Are these models mutually exclusive, each applying to different conditions? If so, what are those conditions? If not, do at least some of the models describe different aspects of the same phenomena? More systematic investigations of perceptual learning under different conditions will hopefully clarify these issues.
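A minimal illustration of the thread running through several of these models is sketched below: for a single unit with Gaussian tuning, the discriminability of two nearby stimuli depends on both the local steepness of the tuning curve and the response noise, so either sharpening the tuning curve or reducing the noise improves predicted performance in this illustrative configuration. The numbers are arbitrary, and the single-unit setting is a deliberate simplification of the population and readout models discussed above.

```python
# A hedged, single-unit illustration of why either sharper tuning or lower noise
# can improve discriminability of two nearby stimuli. Parameters are illustrative.
import numpy as np

def dprime(stim_a, stim_b, center, width, gain, noise_sd):
    """d' for discriminating two stimuli from one unit with Gaussian tuning."""
    tuning = lambda s: gain * np.exp(-0.5 * ((s - center) / width) ** 2)
    return abs(tuning(stim_a) - tuning(stim_b)) / noise_sd

# Two stimuli on the flank of the tuning curve, where its slope carries information.
args = dict(stim_a=-1.0, stim_b=1.0, center=5.0, gain=20.0)
print(f"baseline:        d' = {dprime(width=6.0, noise_sd=2.0, **args):.2f}")
print(f"sharper tuning:  d' = {dprime(width=3.0, noise_sd=2.0, **args):.2f}")
print(f"reduced noise:   d' = {dprime(width=6.0, noise_sd=1.0, **args):.2f}")
```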
Contributor Information
Joshua I. Gold, Email: jigold@mail.med.upenn.edu.
Takeo Watanabe, Email: takeo@bu.edu.
Where can I find out more?
- Fahle M. Perceptual learning: specificity versus generalization. Curr Opin Neurobiol. 2005;15:154–160. doi: 10.1016/j.conb.2005.03.010.
- Gilbert CD, Sigman M, Crist RE. The neural basis of perceptual learning. Neuron. 2001;31:681–697. doi: 10.1016/s0896-6273(01)00424-x.
- Karni A, Sagi D. Where practice makes perfect in texture discrimination: evidence for primary visual cortex plasticity. Proc Natl Acad Sci USA. 1991;88:4966–4970. doi: 10.1073/pnas.88.11.4966.
- Law CT, Gold JI. Neural correlates of perceptual learning in a sensory-motor, but not a sensory, cortical area. Nat Neurosci. 2008;11:505–513. doi: 10.1038/nn2070.
- Law CT, Gold JI. Reinforcement learning can account for associative and perceptual learning on a visual-decision task. Nat Neurosci. 2009;12:655–663. doi: 10.1038/nn.2304.
- Sasaki Y, Nanez JE, Watanabe T. Advances in visual perceptual learning and plasticity. Nat Rev Neurosci. 2010;11:53–60. doi: 10.1038/nrn2737.
- Watanabe T, Nanez JES, Koyama S, Mukai I, Liederman J, Sasaki Y. Greater plasticity in lower-level than higher-level visual motion processing in a passive perceptual learning task. Nat Neurosci. 2002;5:1003–1009. doi: 10.1038/nn915.