Proceedings of the National Academy of Sciences of the United States of America. 2021 Feb 1;118(6):e2020434118. doi: 10.1073/pnas.2020434118

Temporal dissociation of neural activity underlying synesthetic and perceptual colors

Lina Teichmann a,b,1,2, Tijl Grootswagers c, Denise Moerel a,b, Thomas A Carlson d, Anina N Rich a,b
PMCID: PMC8017966  PMID: 33526693

Abstract

Grapheme-color synesthetes experience color when seeing achromatic symbols. We examined whether similar neural mechanisms underlie color perception and synesthetic colors using magnetoencephalography. Classification models trained on neural activity from viewing colored stimuli could distinguish synesthetic color evoked by achromatic symbols after a delay of ∼100 ms. Our results provide an objective neural signature for synesthetic experience and temporal evidence consistent with higher-level processing in synesthesia.

Keywords: synesthesia, MEG, MVPA, color


Grapheme-color synesthesia involves involuntary, vivid, and consistent color experiences evoked by achromatic symbols such as letters or digits (1, 2). Similar to mental imagery, synesthesia involves experiences that are not directly driven by sensory input. Unlike imagery, however, synesthetic experiences are highly consistent, evoked involuntarily, and cannot be suppressed, making synesthesia an ideal tool for studying inherently subjective experiences. It also offers a unique perspective on color representations, which range from direct perception through to object color knowledge (e.g., knowing a banana is yellow).

Despite two decades of increasing research, the neural mechanisms underpinning synesthetic colors and their relationship to direct color perception are still unclear (3). Bottom-up theories of synesthesia, such as the cross-activation theory (4, 5), propose that synesthetic colors occur due to additional neuronal connections in early visual areas, resulting in color representations being automatically coactivated when perceiving synesthesia-inducing stimuli. This predicts that color perception and synesthesia involve activation in the same brain regions at similar times (Fig. 1). Top-down theories of synesthesia, such as the conceptual mediation theory (6), propose that associative and conceptual-level processing play a critical role: for synesthetes, when a symbol is processed, the activated semantic network includes color attributes. This predicts that color representations evoked by synesthesia might be similar to perception but should occur later, after conceptual processing (Fig. 1).

Fig. 1.

Temporal predictions for two types of synesthesia theories. Cross-activation theory (4, 5) predicts that color perception and synesthetic color activation occur at the same time in the visual hierarchy, so neural representations evoked by perception should cross-generalize to synesthesia at the same time. Conceptual mediation theory (6) predicts that synesthetic color is an additional conceptual feature of the stimulus. Conceptual features are activated later in the visual hierarchy than color perception, predicting a delay between color representations evoked by direct perception and synesthetic association. Neural activation pattern evoked by direct and synesthetic color perception can be compared by training a classifier on neural data evoked by colored stimuli and tested on data evoked by (achromatic) synesthesia-inducing symbols. Cross-activation theory predicts on-diagonal generalization at early time points because color information in both direct and synesthetic color experiences should occur simultaneously. Conceptual mediation theory predicts later off-diagonal generalization because for synesthetic color to be elicited, there is an additional stage of accessing the concept of the inducing letter.

We used time-resolved neural data recorded with magnetoencephalography (MEG) to test whether the pattern of neural activity during direct color perception also occurs for synesthetic colors and if so, at what time. Two previous MEG studies (7, 8) focused on which brain areas of single synesthetes process synesthetic colors (as have other neuroimaging studies with single cases or groups of synesthetes [e.g., refs. 9–11]). In contrast, we use a whole-brain multivariate approach that does not rely on selection of specific sensors, brain regions, or time points. This allows us to compare the temporal predictions of the cross-activation and conceptual mediation theories.

Results and Discussion

We report MEG data from 18 grapheme-color synesthetes viewing colored shapes and synesthesia-inducing achromatic symbols (Fig. 2) (12). A challenge for exploring brain activity related to synesthesia is that the stimuli that evoke it differ in more ways than just synesthetic color. For example, the letters “A” and “B” might elicit different colors, but they also differ in shape, sound, and word associations. Thus, we cannot simply look at the brain activation patterns these stimuli evoke. Instead, we trained a classification model on sensor activation patterns from synesthetes viewing red- and green-colored shapes and tested it using brain activity from the same participants viewing achromatic symbols reported to evoke synesthetic red or green. If a classifier trained on activity from perceiving red vs. green shapes can successfully predict synesthetic red vs. green based on activity in achromatic trials, there must be a similarity in the neural pattern between these conditions. Using these different stimuli to train and test the classification model prevents any nonhue differences (e.g., shape) between the letters from influencing the model, so we can be confident that color is the key feature driving the effect.

Fig. 2.

Design and analysis. (A) Example colored shapes and synesthesia-inducing symbols (different for each synesthete). (B) Example trial showing stimulus presentation and interstimulus interval (ISI) durations. Participants pressed a button for a different target object per block. Colored shapes and synesthesia-inducing symbols were presented in separate blocks.

We used temporal generalization methods (13) to test whether similar neural patterns occur at any time. The classification model was trained at each time point of the neural data from seeing red- and green-colored shapes and then tested at all time points of the test set. We first checked whether this method could classify direct color perception data (train on half the color trials, test on the remainder) (Fig. 3A). The neural signal of perceiving red and green can be decoded from ∼80 to 200 ms, with classification falling mainly on the diagonal, showing it occurs at the same time in training and testing datasets (Fig. 3A). We then tested our main question of whether color representations evoked by perception and synesthesia are similar. The model trained on neural data from seeing red vs. green shapes at ∼200 ms could generalize to patterns evoked by the induced synesthetic color at ∼300 to 400 ms (Fig. 3B). Thus, the neural similarity between color representations evoked by perception and synesthesia emerged only at later time points and only off the diagonal. Conceptually, this means that the neural color representation present at ∼200 ms during direct perception reappears later, when people experience synesthetic colors.
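In outline, cross-decoding with temporal generalization can be sketched as a toy simulation. The sketch below uses a simple nearest-class-mean classifier as a stand-in for the study's linear discriminant, and all dimensions, onsets, and signal strengths are illustrative assumptions, not the actual MEG data: one simulated dataset carries an early class-specific pattern ("perception"), the other carries the same pattern later ("synesthesia"), producing off-diagonal generalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only (not the real dataset): trials x sensors x time
n_trials, n_sensors, n_times = 40, 16, 30
pattern = rng.normal(size=n_sensors)  # shared class-specific sensor pattern

def make_data(onset):
    """Simulate two-class sensor data in which the class-specific
    pattern is present from `onset` for 5 samples."""
    X = rng.normal(size=(n_trials, n_sensors, n_times))
    y = np.repeat([0, 1], n_trials // 2)
    X[y == 1, :, onset:onset + 5] += pattern[:, None]
    return X, y

# "Perception" data with an early signal; "synesthesia" data with a later one
X_train, y_train = make_data(onset=5)
X_test, y_test = make_data(onset=15)

def nearest_mean_predict(train, labels, test):
    """Minimal linear classifier: assign each test trial to the class
    whose mean sensor pattern is closer (a stand-in for LDA)."""
    m0 = train[labels == 0].mean(axis=0)
    m1 = train[labels == 1].mean(axis=0)
    d0 = np.linalg.norm(test - m0, axis=1)
    d1 = np.linalg.norm(test - m1, axis=1)
    return (d1 < d0).astype(int)

# Temporal generalization: train at each time t1, test at every time t2
acc = np.zeros((n_times, n_times))
for t1 in range(n_times):
    for t2 in range(n_times):
        pred = nearest_mean_predict(X_train[:, :, t1], y_train, X_test[:, :, t2])
        acc[t1, t2] = (pred == y_test).mean()

# In this simulation, accuracy is above chance off the diagonal:
# training where the early signal lives, testing where the late signal lives
print(acc[7, 17])
```

Because the two simulated signals occur at different latencies, the accuracy matrix is high off the diagonal (train early, test late), mirroring the logic behind the delayed generalization in Fig. 3B.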

Fig. 3.

Results. (A) Decoding results for color perception: the object’s color can be decoded from ∼70 to ∼200 ms with above-chance decoding time point combinations falling on the diagonal. (B) Decoding results for the classification model trained on red vs. green shapes and tested on synesthetic colors: the pattern for color perception at ∼200 ms generalizes to the pattern for synesthetic color above chance stretching ∼300 to 400 ms. Colored dots are above-chance decoding time point combinations (P < 0.05, random effect cluster-based permutation tests corrected for multiple comparisons using the maximum cluster statistic across time points). Pixel color shows decoding accuracy at this time point. White pixels are time point combinations not significantly above chance. Gray bars show the number of significant pixels in different time bins.

This study provides an objective verification of synesthetic colors and demonstrates the value of time-resolved decoding methods for studying subjective phenomena. The results show that later stages of direct color perception (∼200 ms) correspond to the neural signal of synesthesia. The finding that color representations are activated later via synesthesia than perception indicates the synesthesia-inducing stimulus requires considerable processing before the color experience is generated. This time course is more consistent with the conceptual mediation than with the cross-activation theory. Integrated with studies of direct color perception (e.g., ref. 14), object color knowledge (e.g., ref. 15), and mental imagery (e.g., ref. 16), our results provide a unique insight into the time course of the influence of knowledge on visual perception. Overall, our results demonstrate a neural signature for the unusual color experiences of synesthetes and strongly support the role of higher-level processing in synesthesia.

Materials and Methods

A detailed materials and methods section can be found in SI Appendix.

Eighteen grapheme-color synesthetes completed a target detection task (Fig. 2A) while we recorded MEG data with whole-head axial gradiometers. In the colored shapes condition, synesthetes viewed three different shapes (17) that were red or green (equated for perceptual luminance at three levels; identical shapes in the two color categories). In the synesthesia-inducing symbol condition, we used six different symbols that consistently evoked red or green synesthetic colors for each individual, presented in black in three different fonts (Fig. 2B).

Linear discriminant classification models combined with time generalization methods (13) were used to test for shared representations evoked by color perception and synesthetic colors. We trained the classification model to distinguish between red- and green-colored shape trials at each time point. Then, we tested the model’s prediction accuracy on the independent data of colored shape trials and on data of the achromatic letter trials at all time points. For the within-color decoding (Fig. 3A), we used a split-half cross-validation for training and testing the classifier (702 trials in training and testing sets). For cross-decoding (Fig. 3B), we used all colored shape trials for training (1,404 trials) and all synesthesia-inducing symbol trials for testing (1,404 trials).
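The two data partitioning schemes can be sketched as follows. The trial counts come from the text; the index lists are placeholders for real trial records, and the random split is one simple way to form the halves (the study's exact split procedure is in SI Appendix):

```python
import random

random.seed(0)

# Placeholder trial indices; the study had 1,404 colored-shape trials
# and 1,404 synesthesia-inducing symbol trials.
shape_trials = list(range(1404))
symbol_trials = list(range(1404))

# Within-color decoding (Fig. 3A): split-half cross-validation,
# 702 colored-shape trials for training and 702 for testing.
random.shuffle(shape_trials)
train_within, test_within = shape_trials[:702], shape_trials[702:]

# Cross-decoding (Fig. 3B): all colored-shape trials for training,
# all achromatic symbol trials for testing.
train_cross, test_cross = shape_trials, symbol_trials
```

Keeping the training (colored shapes) and testing (achromatic symbols) sets fully disjoint in the cross-decoding scheme is what rules out shape or identity confounds: the classifier never sees a symbol during training.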

We based statistical inferences on random effects Monte Carlo cluster statistics with threshold-free cluster enhancement (TFCE) (18) to measure cluster support, as implemented in the CoSMoMVPA toolbox (19). We generated a null distribution using shuffled label permutations (20) and corrected for multiple comparisons by selecting the maximum value across time. We determined statistical significance by comparing decoding accuracies with the 95th percentile of this corrected null distribution.
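The TFCE cluster statistic itself is more involved, but the max-statistic correction logic can be sketched with a simplified sign-flip permutation test on simulated group accuracies. All numbers below are illustrative assumptions, and sign-flipping stands in for the study's label-shuffling scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-subject decoding accuracies (18 subjects x 50 time points,
# chance = 0.5), with above-chance decoding injected in one time window.
n_subj, n_times = 18, 50
acc = rng.normal(0.5, 0.05, size=(n_subj, n_times))
acc[:, 20:30] += 0.08

dev = acc - 0.5                      # deviation from chance per subject
observed = dev.mean(axis=0)          # group-level statistic per time point

# Permutation null: under H0, each subject's deviation is symmetric around
# zero, so randomly flip its sign and recompute the group mean.
n_perm = 1000
max_null = np.empty(n_perm)
for i in range(n_perm):
    flips = rng.choice([-1, 1], size=(n_subj, 1))
    perm_mean = (dev * flips).mean(axis=0)
    # Multiple-comparison correction: keep only the maximum across time
    max_null[i] = perm_mean.max()

# Time points exceeding the 95th percentile of the max-null survive correction
threshold = np.quantile(max_null, 0.95)
significant = observed > threshold
```

Taking the maximum across all time points in each permutation builds a null distribution for the most extreme value anywhere in the data, so a single threshold controls the family-wise error rate over the full time course.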

Raw data and custom analysis code are available at https://doi.org/10.17605/OSF.IO/SBQDW. The study was approved by the Macquarie University Ethics Committee, and participants gave written informed consent.

Supplementary Material

Supplementary File
pnas.2020434118.sapp.pdf (111.9KB, pdf)

Acknowledgments

We acknowledge support from the KIT-Macquarie Brain Research (MEG) Laboratory (Macquarie University); The Menzies Foundation; Australian Research Council (ARC) Future Fellowship FT120100816 (to T.A.C.); and ARC Discovery Grants DP160101300 (to T.A.C.), DP12102835 (to A.N.R.), and DP170101840 (to A.N.R.). We thank C. Baker, B. Mamo, J. Mattingley, S. Wardle, and M. Williams for useful comments.

Footnotes

The authors declare no competing interest.

This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2020434118/-/DCSupplemental.

Data Availability.

All data are publicly available on the authors' Open Science Framework repository: https://doi.org/10.17605/OSF.IO/SBQDW.

References

1. Rich A. N., Bradshaw J. L., Mattingley J. B., A systematic, large-scale study of synaesthesia: Implications for the role of early experience in lexical-colour associations. Cognition 98, 53–84 (2005).
2. Simner J., et al., Synaesthesia: The prevalence of atypical cross-modal experiences. Perception 35, 1024–1033 (2006).
3. Hupé J.-M., Dojat M., A critical review of the neuroimaging literature on synesthesia. Front. Hum. Neurosci. 9, 103 (2015).
4. Ramachandran V. S., Hubbard E. M., Synaesthesia—a window into perception, thought and language. J. Conscious. Stud. 8, 3–34 (2001).
5. Hubbard E. M., Brang D., Ramachandran V. S., The cross-activation theory at 10. J. Neuropsychol. 5, 152–177 (2011).
6. Chiou R., Rich A. N., The role of conceptual knowledge in understanding synaesthesia: Evaluating contemporary findings from a "hub-and-spokes" perspective. Front. Psychol. 5, 105 (2014).
7. Brang D., Hubbard E. M., Coulson S., Huang M., Ramachandran V. S., Magnetoencephalography reveals early activation of V4 in grapheme-color synesthesia. Neuroimage 53, 268–274 (2010).
8. Yokoyama T., et al., Multiple neural mechanisms for coloring words in synesthesia. Neuroimage 94, 360–371 (2014).
9. Ruiz M. J., Dojat M., Hupé J.-M., Multivariate pattern analysis of fMRI data for imaginary and real colours in grapheme-colour synaesthesia. Eur. J. Neurosci. 52, 3434–3456 (2020).
10. Hubbard E. M., Arman A. C., Ramachandran V. S., Boynton G. M., Individual differences among grapheme-color synesthetes: Brain-behavior correlations. Neuron 45, 975–985 (2005).
11. Rich A. N., et al., Neural correlates of imagined and synaesthetic colours. Neuropsychologia 44, 2918–2925 (2006).
12. Teichmann L., Grootswagers T., Moerel D., Carlson T. A., Rich A. N., Temporal dissociation of neural activity underlying synesthetic and perceptual colors. Open Science Framework. 10.17605/OSF.IO/SBQDW. Deposited 22 January 2021.
13. King J. R., Dehaene S., Characterizing the dynamics of mental representations: The temporal generalization method. Trends Cogn. Sci. 18, 203–210 (2014).
14. Zeki S., Marini L., Three cortical stages of colour processing in the human brain. Brain 121, 1669–1685 (1998).
15. Hansen T., Olkkonen M., Walter S., Gegenfurtner K. R., Memory modulates color appearance. Nat. Neurosci. 9, 1367–1368 (2006).
16. Pearson J., The human imagination: The cognitive neuroscience of visual mental imagery. Nat. Rev. Neurosci. 20, 624–634 (2019).
17. Op de Beeck H. P., Baker C. I., DiCarlo J. J., Kanwisher N. G., Discrimination training alters object representations in human extrastriate cortex. J. Neurosci. 26, 13025–13036 (2006).
18. Smith S. M., Nichols T. E., Threshold-free cluster enhancement: Addressing problems of smoothing, threshold dependence and localisation in cluster inference. Neuroimage 44, 83–98 (2009).
19. Oosterhof N. N., Connolly A. C., Haxby J. V., CoSMoMVPA: Multi-modal multivariate pattern analysis of neuroimaging data in Matlab/GNU Octave. Front. Neuroinform. 10, 27 (2016).
20. Stelzer J., Chen Y., Turner R., Statistical inference and multiple testing correction in classification-based multi-voxel pattern analysis (MVPA): Random permutations and cluster size control. Neuroimage 65, 69–82 (2013).




