Proc Natl Acad Sci U S A. 2012 Jul 2;109(29):11854–11859. doi: 10.1073/pnas.1205381109

Fig. 1.

Illustration of object-based neural representations. Here, the auditory scene is illustrated using a mixture of two concurrent speech streams. (A) If a complex auditory scene is not neurally parsed into separate auditory objects, cortical activity (Upper, curve) phase locks to the temporal envelope of the physical stimulus [i.e., the acoustic mixture (Lower, waveform)]. (B) In contrast, for the identical stimulus (illustrated here with the two unmixed speech streams in different colors), a hypothetical neural representation of an individual auditory object would instead phase lock selectively to the temporal envelope of that object alone. (C) The neural representation of an auditory object should, furthermore, adapt to an intensity change of its own object (Upper) but should remain insensitive to intensity changes in another auditory object (Lower). Neither of these modifications to the acoustic stimulus therefore significantly changes the neural representation (compare A and C).
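To make the contrast between panels A and B concrete, the sketch below (not from the paper; all signals, parameters, and the helper names `am_noise` and `slow_envelope` are illustrative assumptions) uses amplitude-modulated noise as a stand-in for two speech streams. It extracts each stream's slow temporal envelope via the Hilbert transform, simulates an "object-based" response that tracks only stream 1, and shows that this response correlates far more strongly with that stream's envelope than with the envelope of the acoustic mixture.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

rng = np.random.default_rng(0)
fs = 1000                      # sample rate (Hz)
t = np.arange(0, 10, 1 / fs)   # 10 s of signal

def am_noise(mod_freq, phase):
    """Amplitude-modulated noise: a crude stand-in for one speech stream."""
    envelope = 1 + np.sin(2 * np.pi * mod_freq * t + phase)
    return envelope * rng.standard_normal(t.size)

def slow_envelope(x, cutoff=10.0):
    """Temporal envelope: magnitude of the analytic signal, low-passed
    below `cutoff` Hz (the slow modulations cortical activity tracks)."""
    b, a = butter(4, cutoff / (fs / 2))
    return filtfilt(b, a, np.abs(hilbert(x)))

stream1 = am_noise(3.0, 0.0)   # "speaker 1", ~3 Hz modulations
stream2 = am_noise(5.0, 1.0)   # "speaker 2", ~5 Hz modulations
mixture = stream1 + stream2    # the physical stimulus (panel A)

env1, env2 = slow_envelope(stream1), slow_envelope(stream2)
env_mix = slow_envelope(mixture)

# Hypothetical object-based response (panel B): tracks stream 1's envelope
# plus noise. A non-parsed response (panel A) would track env_mix instead.
response = env1 + 0.3 * rng.standard_normal(t.size)

for name, env in [("mixture", env_mix), ("stream 1", env1), ("stream 2", env2)]:
    r = np.corrcoef(response, env)[0, 1]
    print(f"correlation with {name:8s} envelope: {r:+.2f}")
```

Running this prints a high correlation with stream 1's envelope and markedly lower correlations with the mixture and with stream 2, which is the signature of a selectively phase-locked, object-based representation that the figure illustrates.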