Neuroscience Insights
2022 Aug 11;17:26331055221119221. doi: 10.1177/26331055221119221

A Distributed Model of Face and Body Integration

Celia Foster 1,2,3
PMCID: PMC9386443  PMID: 35991808

Abstract

Separate face- and body-responsive brain networks have been identified that show strong responses when observers view faces and bodies. It has been proposed that face and body processing may initially be separated in the lateral occipitotemporal cortex and then combined into a whole-person representation in the anterior temporal cortex, or elsewhere in the brain. However, in contrast to this proposal, our recent study identified a common coding of face and body orientation (ie, facing direction) in the lateral occipitotemporal cortex, demonstrating an integration of face and body information at an early stage of face and body processing. These results, in combination with findings showing integration of face and body identity in the lateral occipitotemporal, parahippocampal and superior parietal cortex, and of face and body emotional expression in the posterior superior temporal sulcus and medial prefrontal cortex, suggest that face and body integration may be more distributed than previously considered. I propose a new model of face and body integration, in which areas at the intersection of face- and body-responsive regions integrate specific properties of faces and bodies, while distributed regions across the brain contribute to high-level, abstract integration of shared face and body properties.

Keywords: Face, body, occipitotemporal cortex, orientation, viewpoint, identity, expression


COMMENT ON: Foster C, Zhao M, Bolkart T, Black MJ, Bartels A, Bülthoff I. The neural coding of face and body orientation in occipitotemporal cortex. Neuroimage. 2022 Feb 1;246:118783. doi: 10.1016/j.neuroimage.2021.118783. Epub 2021 Dec 5. PMID: 34879251.

Visual neuroscience has identified networks of brain regions across occipitotemporal cortex that show strong responses to particular stimulus categories, including faces, bodies, scenes and words. 1 Among these categories, the separation of face- and body-responsive brain regions is somewhat surprising, as faces and bodies are part of a single object – a person. This raises the question of why separate face- and body-responsive networks exist. One possibility is that separate networks may allow occipitotemporal cortex to optimally perform the computations needed to encode and extract high-level face and body properties (eg, identities, categories, direction of attention, emotional expressions) from varied low-level visual input. However, as we determine many of these properties using information from both the face and body, face and body information must be integrated somewhere in the brain.

In recent years, several models of face and body integration in the brain have been proposed.2-4 Common to all of these models is the proposal that integration is hierarchical, with faces and bodies initially processed more separately in posterior brain regions and then gradually integrated in more anterior brain regions. Figure 1 illustrates the locations of the main face-responsive regions (Figure 1: red) and body-responsive regions (Figure 1: blue) in occipitotemporal cortex. One model of face and body integration proposed that faces and bodies are processed in a parts-based manner in the occipital face area (OFA) and extrastriate body area (EBA), as whole faces and bodies in the fusiform face area (FFA) and fusiform body area (FBA), and then as whole persons in the anterior temporal cortex. 2 A second model extended this proposal to include some person-responsive voxels in the fusiform gyrus and multimodal person integration in the superior temporal sulcus. 3 Finally, a review paper outlined 4 different models of face and body integration, ranging from face and body networks remaining completely separated in occipitotemporal cortex and integrated elsewhere in the brain, to fully integrated face and body networks with a hierarchical organisation from lateral occipitotemporal cortex through to the anterior temporal cortex. 4 These models thus allow different possibilities for the location of face and body integration in the brain, but it is generally considered that faces and bodies are initially processed more separately, before being integrated and then processed as whole persons.

Figure 1.

Locations of face- and body-responsive regions and coordinates showing integration of face and body properties. Face-responsive regions are shown in red, body-responsive regions in blue and overlapping face- and body-responsive regions in purple. Region locations are based on mean coordinates and volumes reported for the pSTS-body 5 and for all other regions. 6 Spheres show coordinates of face and body orientation integration in green, 6 face and body identity integration in yellow 7 and face, body and voice expression integration in cyan. 8

Abbreviations: ATFA, anterior temporal face area; EBA, extrastriate body area; FBA, fusiform body area; FFA, fusiform face area; OFA, occipital face area; pSTS-body, posterior superior temporal sulcus body area; pSTS-face, posterior superior temporal sulcus face area.

Evidence for Early Integration of Face and Body Properties

Our recent study 6 suggests that this assumption of hierarchical coding may be incorrect, as we found evidence of integration of face and body information at an early stage of face and body processing in the lateral occipitotemporal cortex. We recorded brain responses using functional magnetic resonance imaging (fMRI) while participants viewed images of faces and bodies from different orientations (ie, facing directions). We then used a multivoxel pattern analysis cross-classification approach to test whether patterns of brain responses distinguishing the different face orientations could generalise to responses evoked by the body orientations, and vice versa. Such generalisation would indicate that these regions encode orientation information using a common coding, and would therefore suggest an integration of orientation information from the face and body. Our whole-brain searchlight results showed that we could decode orientation across face and body stimuli bilaterally in a region at the intersection of the face-responsive OFA and the body-responsive EBA (Figure 1: green spheres). Thus, our results demonstrate that face and body orientation information is integrated at a very early stage of face and body processing.
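The cross-classification logic can be sketched on synthetic data: a classifier trained to discriminate orientations from face-evoked voxel patterns is tested on body-evoked patterns, and vice versa. This is a minimal illustration assuming a shared orientation code plus a category-specific offset per stimulus type; the voxel count, signal strengths and noise levels are arbitrary choices for the sketch, not values from the study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_voxels = 50
n_per = 40                       # trials per orientation per stimulus type
y = np.repeat([0, 1, 2], n_per)  # orientation labels (eg, left/front/right)

# A shared orientation signature per orientation, plus a category-specific
# baseline offset (faces vs bodies), plus trial-by-trial noise.
orient_sig = rng.normal(0.0, 1.0, size=(3, n_voxels))
face_off = rng.normal(0.0, 0.5, size=n_voxels)
body_off = rng.normal(0.0, 0.5, size=n_voxels)

def simulate(offset):
    noise = rng.normal(0.0, 0.3, size=(len(y), n_voxels))
    return orient_sig[y] + offset + noise

X_face = simulate(face_off)
X_body = simulate(body_off)

# Cross-classification: train on one stimulus type, test on the other.
clf = SVC(kernel="linear")
acc_face_to_body = clf.fit(X_face, y).score(X_body, y)
acc_body_to_face = clf.fit(X_body, y).score(X_face, y)
print(acc_face_to_body, acc_body_to_face)  # well above chance (1/3) here
```

Above-chance transfer in both directions is the signature of a common orientation code; in the actual study this statistic was computed within searchlight spheres across the whole brain rather than on a single voxel set.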

In addition to this early integration of face and body processing, our results also showed that there is further processing of face orientation beyond the lateral occipitotemporal cortex. We found that the FFA encoded face orientation, but not body orientation, which is consistent with previous findings of face-specific orientation coding in the FFA. 9 Such further processing could potentially aid us in determining a person's direction of attention when the head is turned relative to the body. However, further work testing neural responses to stimuli in which the head is turned away from the body's orientation will be required to determine whether this is the case.

Findings from other studies are also in accordance with early integration of face and body information. In the macaque monkey, the posterior lateral face patch, as well as the middle lateral face patch, responded at the position where a face would be expected to be located when the face was only implied by contextual cues from the presence of a body. 10 This suggests that body information is integrated into the responses of this early face-responsive area. Furthermore, effective connectivity of face and body networks, determined by microstimulation of face- and body-responsive patches, identified some voxels at the intersection of face- and body-responsive patches, including the posterior lateral face patch and the middle superior temporal sulcus body patch, that were activated by microstimulation of both face and body patches. 11 This finding provides further support for integration of face and body information at the intersection of face- and body-responsive regions, including the most posterior regions. Lastly, human transcranial magnetic stimulation (TMS) has also provided evidence for early interactions between face and body processing. TMS applied separately to the right OFA and right EBA impaired both face and body processing at an early time period after stimulus onset. 12 This finding suggests that there is shared processing of faces and bodies that is disrupted by stimulation of these early face- and body-responsive human brain regions.

Integration of Other Shared Face and Body Properties

Where in the brain are other shared properties of faces and bodies integrated? We tested how shared identity, recognised from the face and body shown alone, is encoded in the brain using a similar searchlight cross-classification approach. 7 Participants were trained to recognise individuals from images of the face and body, and their brain activity was then recorded using fMRI as they viewed these face and body images. Searchlight results identified patterns of responses that could generalise across face and body identities in several regions, including a region in the right lateral occipitotemporal cortex close to the coordinates where we found face and body orientation integration, but also more dispersed locations including the right parahippocampal cortex and right superior parietal cortex (Figure 1: yellow spheres). Thus, face and body identity integration appears to involve more dispersed brain regions than face and body orientation integration.
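The searchlight variant of this analysis can be sketched by hand: cross-classification is repeated within a small sphere (here, a cube) centred on every voxel, yielding a whole-brain accuracy map whose peaks mark candidate integration regions. The grid size, the placement of the informative block and the cube radius below are illustrative assumptions on synthetic data, not parameters from the study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
shape = (8, 8, 8)              # toy brain volume
n_per = 20                     # trials per identity per stimulus type
y = np.repeat([0, 1], n_per)   # identity labels

# Identity signatures shared across faces and bodies, confined to a 3x3x3
# block; each stimulus type also gets its own baseline offset in that block.
blk = (slice(2, 5),) * 3
sig = rng.normal(0.0, 1.0, size=(2, 3, 3, 3))

def simulate():
    data = rng.normal(0.0, 0.5, size=(len(y), *shape))
    data[(slice(None),) + blk] += sig[y]
    data[(slice(None),) + blk] += rng.normal(0.0, 0.5, size=(3, 3, 3))
    return data

faces, bodies = simulate(), simulate()

# Searchlight: for each interior centre voxel, train on face patterns within
# the surrounding 3x3x3 cube and test identity decoding on body patterns.
acc = np.full(shape, np.nan)
clf = SVC(kernel="linear")
for x in range(1, 7):
    for yy in range(1, 7):
        for z in range(1, 7):
            sl = (slice(None), slice(x - 1, x + 2),
                  slice(yy - 1, yy + 2), slice(z - 1, z + 2))
            Xf = faces[sl].reshape(len(y), -1)
            Xb = bodies[sl].reshape(len(y), -1)
            acc[x, yy, z] = clf.fit(Xf, y).score(Xb, y)

# Spheres overlapping the informative block decode well above chance (0.5);
# spheres far from it hover around chance.
print(acc[3, 3, 3], np.nanmean(acc[6]))
```

Thresholding such a map across participants is, in essence, how the searchlight coordinates plotted in Figure 1 were obtained.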

Why might identity integration involve dispersed regions? One possibility is that these dispersed regions encode different aspects of face and body identity. Identity coding in the right lateral occipitotemporal cortex might be driven by shared shape properties of face and body identities, and might be orientation-specific. In previous work, we found that we could decode stimulus weight (ie, fatter vs thinner) across face and body stimuli from the EBA, 13 suggesting that there may be a shared coding of this category in lateral occipitotemporal cortex driven by shared shape-related aspects or by an abstract coding of these stimuli. Stimuli in that study were all shown in the same orientation, and although stimuli in our identity decoding were shown from different orientations, it is possible that successful decoding across face and body stimuli was driven by successful decoding within matching orientations. Orientation-specific decoding in lateral occipitotemporal cortex would be consistent with our orientation decoding results, 6 as well as with a general model proposing that posterior category-responsive regions encode view-specific representations. 14 In contrast to view-specific coding in the lateral occipitotemporal cortex, previous work has identified view-invariant coding of face identity in the fusiform and anterior temporal cortex,15,16 and we also identified view-invariant coding of body identity in these regions. 7 Thus, the right parahippocampal cortex, located in the anterior temporal cortex, may potentially encode identity in a more abstract, view-invariant manner. Similarly, the intraparietal sulcus, close to our coordinates in the superior parietal cortex, has been associated with abstract encoding of face and object identity, 17 suggesting that face and body identity may also be encoded in an abstract manner in the parietal cortex. Altogether, these insights suggest that different aspects of identity-related information may be encoded in different regions, but further research is required to understand the exact coding used by these distributed regions.

A third shared face and body property for which shared coding has been tested is emotional expression. Participants viewed movies of faces and bodies (shown separately) displaying emotional expressions, as well as voices expressing the same emotions. 8 Peelen et al 8 then performed a searchlight analysis to identify brain regions that encoded the perceived emotion (eg, happiness, sadness, anger) across face, body and voice stimuli. They identified 2 regions showing abstract integration of emotional expressions: the left posterior superior temporal sulcus (pSTS) and the right medial prefrontal cortex (Figure 1: cyan spheres). Notably, the coordinates of expression integration in the pSTS lie in-between the pSTS-face region, a region known to be involved in encoding facial expressions, 18 and the body-responsive EBA. As decoding of emotional expression generalised across voice stimuli, in addition to face and body stimuli, it is possible that the left-hemisphere specificity of the pSTS results was influenced by language lateralisation in the left hemisphere. Decoding of emotional expression across stimuli in the medial prefrontal cortex shows that emotional expression integration, like identity integration, occurs across distributed brain regions. Emotional expression coding in the medial prefrontal cortex has been shown to be highly abstract, as it also generalises to emotions that are inferred from situational context. 19 Altogether, these results show distributed coding of emotional expression across the brain, and some evidence that different aspects of emotional expressions may be encoded in these different regions.
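Supramodal decoding of this kind extends cross-classification to three modalities: training on two (eg, faces and bodies) and testing on the held-out third (voices). The sketch below illustrates the leave-one-modality-out logic on synthetic data, assuming an emotion code shared across modalities plus modality-specific baselines; all sizes and noise levels are arbitrary.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n_voxels, n_per = 60, 15
y = np.repeat([0, 1, 2], n_per)  # emotion labels (eg, happy/sad/angry)

# One signature per emotion, shared across modalities.
emo_sig = rng.normal(0.0, 1.0, size=(3, n_voxels))

def simulate():
    offset = rng.normal(0.0, 0.5, size=n_voxels)  # modality-specific baseline
    noise = rng.normal(0.0, 0.4, size=(len(y), n_voxels))
    return emo_sig[y] + offset + noise

face, body, voice = simulate(), simulate(), simulate()

# Train on face and body trials together, test on the held-out voice trials.
clf = SVC(kernel="linear")
X_train = np.vstack([face, body])
acc_voice = clf.fit(X_train, np.tile(y, 2)).score(voice, y)
print(acc_voice)  # above chance (1/3) implies a modality-independent code
```

Generalisation to the untrained modality is what licenses the term "supramodal": the classifier cannot rely on any modality-specific feature, only on the shared emotion code.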

A Distributed Model of Face and Body Integration

The experimental findings outlined above demonstrate that several regions are involved in face and body integration, and that different brain regions are involved in the integration of different face and body properties (eg, orientations, identities, emotional expressions). In Figure 2, I give an overview of the brain regions in occipitotemporal cortex that show integration of shared face and body properties, in relation to the known functional properties of face- and body-responsive occipitotemporal regions.18,20 This model highlights the importance of areas at the intersection of face- and body-responsive regions, with the intersection of the OFA and EBA involved in orientation and shape integration, and the intersection of the EBA and pSTS-face in expression integration. Thus, intersecting areas appear to integrate specific face and body properties that may first be processed separately for faces and bodies in the 2 nearby face- and body-responsive regions. Future studies could investigate this proposal by testing whether neuropsychological patients with damage to these intersecting regions show deficits in the integration of specific face and body properties, or whether TMS to these intersecting regions causes disruptions in behavioural tasks that require an integration of face and body processing.

Figure 2.

An overview model of the brain regions involved in visual processing and integration of faces and bodies in occipitotemporal cortex, including their known functional response properties. Face-responsive brain regions are shown in red, body-responsive regions are shown in blue and regions showing integration of face and body properties are shown in purple.

Further distributed regions, beyond the occipitotemporal regions outlined in Figure 2, have also been found to integrate face and body properties, and may do so in a high-level, abstract manner. In the studies outlined above, the right superior parietal cortex was found to encode face and body identity information, and the right medial prefrontal cortex was found to encode face, body and voice emotional expression information,7,8 and both of these regions have also been associated with abstract coding of these respective properties.17,19 Moreover, there may well be further regions, both within and outside of occipitotemporal cortex, that are involved in the integration of other shared face and body properties that have not yet been tested experimentally. In summary, the proposed model outlines a new perspective on face and body integration that is distributed and property-specific, but further research is needed to uncover the full extent of brain regions involved in face and body integration, and the specific coding of information in these regions.

Footnotes

Funding: The author disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Celia Foster is supported by the German Research Foundation (DFG), grant number 6368/4-1, in the DFG/ANR programme for German-French Projects in the Natural, Life, and Engineering Sciences.

Declaration of Conflicting Interests: The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Author Contribution: CF wrote the article and prepared the figures.

References

1. Op de Beeck HP, Pillet I, Ritchie JB. Factors determining where category-selective areas emerge in visual cortex. Trends Cogn Sci. 2019;23:784-797.
2. Harry BB, Umla-Runge K, Lawrence AD, Graham KS, Downing PE. Evidence for integrated visual face and body representations in the anterior temporal lobes. J Cogn Neurosci. 2016;28:1178-1193.
3. Hu Y, Baragchizadeh A, O’Toole AJ. Integrating faces and bodies: psychological and neural perspectives on whole person perception. Neurosci Biobehav Rev. 2020;112:472-486.
4. Taubert J, Ritchie JB, Ungerleider LG, Baker CI. One object, two networks? Assessing the relationship between the face and body-selective regions in the primate visual system. Brain Struct Funct. 2022;227:1423-1438.
5. Ross P, de Gelder B, Crabbe F, Grosbras MH. A dynamic body-selective area localizer for use in fMRI. MethodsX. 2020;7:100801.
6. Foster C, Zhao M, Bolkart T, Black MJ, Bartels A, Bülthoff I. The neural coding of face and body orientation in occipitotemporal cortex. Neuroimage. 2022;246:118783.
7. Foster C, Zhao M, Bolkart T, Black MJ, Bartels A, Bülthoff I. Separated and overlapping neural coding of face and body identity. Hum Brain Mapp. 2021;42:4242-4260.
8. Peelen MV, Atkinson AP, Vuilleumier P. Supramodal representations of perceived emotions in the human brain. J Neurosci. 2010;30:10127-10134.
9. Ramírez FM, Cichy RM, Allefeld C, Haynes JD. The neural code for face orientation in the human fusiform face area. J Neurosci. 2014;34:12155-12167.
10. Arcaro MJ, Ponce C, Livingstone M. The neurons that mistook a hat for a face. eLife. 2020;9:e53798.
11. Premereur E, Taubert J, Janssen P, Vogels R, Vanduffel W. Effective connectivity reveals largely independent parallel networks of face and body patches. Curr Biol. 2016;26:3269-3279.
12. Pitcher D, Goldhaber T, Duchaine B, Walsh V, Kanwisher N. Two critical and functionally distinct stages of face and body perception. J Neurosci. 2012;32:15877-15885.
13. Foster C, Zhao M, Romero J, et al. Decoding subcategories of human bodies from both body- and face-responsive cortical regions. Neuroimage. 2019;202:116085.
14. Bao P, She L, McGill M, Tsao DY. A map of object space in primate inferotemporal cortex. Nature. 2020;583:103-108.
15. Anzellotti S, Fairhall SL, Caramazza A. Decoding representations of face identity that are tolerant to rotation. Cereb Cortex. 2014;24:1988-1995.
16. Guntupalli JS, Wheeler KG, Gobbini MI. Disentangling the representation of identity from head view along the human face processing pathway. Cereb Cortex. 2017;27:46-53.
17. Jeong SK, Xu Y. Behaviorally relevant abstract object identity representation in the human parietal cortex. J Neurosci. 2016;36:1607-1619.
18. Duchaine B, Yovel G. A revised neural framework for face processing. Annu Rev Vis Sci. 2015;1:393-416.
19. Skerry AE, Saxe R. A common neural code for perceived and inferred emotion. J Neurosci. 2014;34:15997-16008.
20. Peelen MV, Downing PE. The neural basis of visual body perception. Nat Rev Neurosci. 2007;8:636-648.
