PLOS Biology
. 2020 Jun 8;18(6):e3000729. doi: 10.1371/journal.pbio.3000729

Is an artificial limb embodied as a hand? Brain decoding in prosthetic limb users

Roni O Maimon-Mor 1,2, Tamar R Makin 1,2,3,*
Editor: Karunesh Ganguly
PMCID: PMC7302856  PMID: 32511238

Abstract

The potential ability of the human brain to represent an artificial limb as a body part (embodiment) has been inspiring engineers, clinicians, and scientists as a means to optimise human–machine interfaces. Using functional MRI (fMRI), we studied whether neural embodiment actually occurs in prosthesis users’ occipitotemporal cortex (OTC). Compared with controls, different prosthesis types were visually represented more similarly to each other, relative to hands and tools, indicating the emergence of a dissociated prosthesis categorisation. Greater daily life prosthesis usage correlated positively with greater prosthesis categorisation. Moreover, when comparing prosthesis users’ representation of their own prosthesis to controls’ representation of a similar looking prosthesis, prosthesis users represented their own prosthesis more dissimilarly to hands, challenging current views of visual prosthesis embodiment. Our results reveal a use-dependent neural correlate for wearable technology adoption, demonstrating adaptive use-related plasticity within the OTC. Because these neural correlates were independent of the prostheses’ appearance and control, our findings offer new opportunities for prosthesis design by lifting restrictions imposed by the embodiment theory for artificial limbs.


How is a prosthetic limb represented in the brains of its users, and how does everyday usage shape that neural representation? Successful prosthesis usage by individuals missing a hand was predicted not by embodiment (hand-similarity), but by a more distinct categorical representation of artificial limbs.

Introduction

The development of wearable technology for substitution (e.g., prosthetic limbs [1], exoskeletons [2]) and augmentation (e.g., supernumerary fingers and arms [3]) is rapidly advancing. Clinical research on prosthetic limbs, the most established form of wearable motor technology to date, teaches us that technological development is necessary but not sufficient for successful device adoption and usage. For example, only 45% of all arm amputees choose to use their prosthesis regularly [4]. The causes of prosthesis rejection are manifold and include awkward control over the device, lack of tactile feedback, and complex training requirements [4–6]. Crucially, successful prosthesis adoption depends on the human brain’s ability to effectively represent and operate the device [7]. A popular (and yet untested) assumption is that amputees reject their prostheses—tools designed to substitute hand function—because the prosthesis does not feel like a real body part (i.e., embodied) [8] and that embodiment can improve prosthesis usage [9–16].

The first challenge in harnessing the potential powers of embodiment to improve prosthesis design and usage is measuring embodiment. Embodiment is an umbrella term for the multisensory association of external objects with body representation, which engages multiple levels from both conceptual and perceptual perspectives (see “Discussion”). From a neural standpoint, embodiment is defined by the successful allocation of brain resources, originally devoted to controlling one’s own body, to represent and operate external objects [17]. Here, we focus on visual embodiment—the successful allocation of visual hand-related neural resources. Visual embodiment is particularly relevant for studying prosthesis representation; this is because prosthesis usage is highly visually guided [18]. Moreover, visual internal models of the body have been suggested as essential gateways for processing multisensory integration that will result in successful bodily ownership [19], a desired feature of prosthesis usage [9,10]. Previous efforts to test visual prosthesis embodiment have been centred on an illusion of body ownership over the prosthesis (most commonly, the rubber hand illusion, which relies on visual manipulations) with mixed success [11,20–23]. Crucially, such efforts focused on measuring visual embodiment but did not associate it with improved prosthesis usage [11,20–26] (though see the work by Graczyk and colleagues [27] for results from 2 individuals). We have recently found that as a whole, one-handed individuals tend to respond neutrally to embodiment statements in relation to their own prostheses [28]. Moreover, these measures of embodiment tend to rely on an explicit sense of body ownership, which might not be a necessary consequence of implicit neuronal embodiment (i.e., reallocation of body part resources to represent or control the prosthesis [17]).

As a more direct means of measuring neural visual embodiment, or how the brain represents prosthetic limbs, we recently assessed activity levels in prosthesis users’ occipitotemporal cortex (OTC) while participants were viewing images of prosthetic limbs, using functional MRI (fMRI) [29]. The OTC is known to contain distinct visual representations of different object categories [30], e.g., hands [31,32], tools [33], faces [34], and bodies [35]. It was previously shown to contain overlapping visual and motor body part selective representations [31,36] and even to respond to touch [37,38]. OTC has been previously implicated in multisensory integration processing related to embodiment [39–41], and OTC’s connectome associates it with hand actions. For example, hand- and body-selective visual areas uniquely connect to the primary sensorimotor hand area [37]. These characteristics qualify the OTC as an ideal candidate for studying action-related visual body representation. We previously found that individuals who used their prosthesis more in their daily lives also showed greater activity in OTC’s hand-selective visual areas when viewing images of prostheses and greater functional coupling with sensorimotor hand areas [29]. This result demonstrates that prosthesis users are able to engage visuomotor hand-selective areas while passively representing a prosthesis. However, it is yet unknown whether prosthesis visual representation actually mimics that of a hand (i.e., neural embodiment). Alternatively, because the visual hand-selective regions partially overlap with handheld tool representation [42], the observed activity gains may reflect the prosthesis being represented as a tool. As a third option, because object categorisation in OTC is thought to be based on objects’ semantic and functional properties [43], expert prosthesis usage may result in the emergence of prosthesis representation as a new ‘category’, diverging from the existing natural categories (e.g., ‘hands’, ‘tools’).
This third alternative is consistent with recent evidence showing that visual expertise can contribute to the shaping of categorical representation in OTC [44] (see ‘Discussion’).

Brain decoding techniques that take advantage of multivoxel representational patterns allow us to reveal the representational structure underlying fMRI activity, e.g., to dissociate overlapping hand and tool representations within the lateral OTC [33,45]. Here, we utilised fMRI data of 32 individuals missing a hand, with varying levels of prosthesis usage (hereafter, ‘prosthesis users’; see Table 1) and 24 two-handed control participants, who have varying life experience of viewing prostheses (see S1 Table). The OTC’s extrastriate body-selective area [35] was independently localised. Two main hypotheses were tested: the embodiment hypothesis, assessing whether prosthetic limbs are in fact represented visually as hands and not tools, and the categorisation hypothesis, assessing whether a new ‘prosthesis’ category has formed. To provide distinct predictions for each of these hypotheses, we studied representational similarities between hand, tool, and upper-limb prosthesis images (both cosmetic—designed to resemble hand appearances—and active—designed to afford a grip, e.g., a ‘hook’; Fig 1A) and compared prosthesis users to controls. Broadly speaking, the visual hand embodiment hypothesis predicts that compared to controls, the various prosthesis conditions in prosthesis users will be more similar to hands than to tools (notice that this prediction also allows us to test the inverse prediction—that prostheses are represented more like tools in one-handers). The categorisation hypothesis predicts that, in prosthesis users, the prosthesis conditions will be more similar to each other relative to hands and tools.

Table 1. Prosthesis users’ demographic details and daily prosthesis usage habits.

Columns: Subject | Gender | Age | Age at amputation | Level of amputation | Missing hand side | Cause | Usage skill (PAL)a | Usage timeb (Cosmetic / Active–Mechanical / Active–Myoelectric) | Prosthesis usage scorec | Own prosthesis conditiond | Own condition hand-likee
01 Male 57 20 Below elbow Left Trauma 0.57 5 0 0 2.99 Cosmetic 1
02 Female 49 0 Below elbow Left Congenital 0.46 4 0 0 1.92 Cosmetic 1
03 Male 59 40 Above elbow Left Trauma 0 0 0 0 −2.42 N/A N/A
04 Female 52 0 Below elbow Right Congenital 0.15 5 1 0 1.00 Cosmetic 1
05 Male 58 27 Above elbow Left Trauma 0.09 5 2 0 0.72 Mechanical 0
06 Male 53 28 Below elbow Left Trauma 0.24 3 5 0 1.43 Mechanical 0
07 Male 52 0 At wrist Left Congenital 0.04 0 3 0 −0.60 N/A N/A
08 Male 41 27 Above elbow Right Trauma 0.09 2 1 0 −0.91 Cosmetic 1
09 Male 48 17 Above elbow Left Trauma 0 2 2 0 −1.34 N/A N/A
10 Female 25 0 At wrist Right Congenital 0 0 0 0 −2.42 Cosmetic 1
11 Male 49 0 Above elbow Left Congenital 0.26 1 4 0 0.98 Mechanical 1
12 Male 37 27 Above elbow Left Trauma 0.28 0 2 0 −0.01 N/A N/A
13 Female 46 38 Below elbow Left Trauma 0 0 0 0 −2.42 Cosmetic 1
14 Female 28 0 At wrist Left Congenital 0 0 0 0 −2.42 N/A N/A
15 Male 64 33 Below elbow Right Trauma 0.33 0 2 5 1.85 Myoelectric 1
16 Male 38 0 Below elbow Left Congenital 0.39 5 0 0 2.14 Cosmetic 1
17 Female 24 18 Below elbow Right Trauma 0 0 0 0 −2.42 Cosmetic 1
18 Female 27 0 Below elbow Left Congenital 0.54 5 0 0 2.84 Cosmetic 1
19 Male 49 37 Above elbow Left Trauma 0 1 0 0 −1.88 Cosmetic 1
20 Male 60 0 At wrist Left Congenital 0.06 2 0 0 −1.05 Cosmetic 1
21 Female 34 0 Below elbow Right Congenital 0.46 5 0 0 2.47 Cosmetic 1
22 Female 36 0 Below elbow Right Congenital 0.57 5 0 0 2.99 Cosmetic 1
23 Female 50 45 Above elbow Left Tumor 0 0 2 0 −1.34 Mechanical 0
24* Female 41 0 Below elbow Left Congenital 0.54 0 0 5 2.84 Myoelectric 1
25* Male 29 24 Through shoulder Left Trauma 0.09 0 0 2 −0.91 Myoelectric 1
27* Male 25 0 Below elbow Left Congenital 0.59 1 0 5 3.08 Myoelectric 1
28* Male 34 0 At wrist Left Congenital 0.11 0 0 3 −0.27 Myoelectric 1
29* Male 25 18 At wrist Left Trauma 0 0 2 0 −1.34 N/A N/A
30 Male 38 0 Below elbow Left Congenital 0 0 2 1 −1.34 Myoelectric 1
31 Female 49 0 At wrist Left Congenital 0 1 0 0 −1.88 Cosmetic 1
32 Male 45 20 Below elbow Right Trauma 0.09 2 0 0 −0.91 Cosmetic 1
33 Male 32 31 Above elbow Left Trauma 0 0 2 0 −1.34 Mechanical 0

aHow frequently users incorporate their prosthesis in an inventory of 27 daily activities (e.g., taking money out of wallet, etc.). Scores of 0 = never, 1 = sometimes, 2 = very often. The sum of all items was divided by the highest possible score, such that individuals were rated on a scale ranging from 0 to 1.

bProsthesis wear time for each prosthesis type: 0 = never; 1 = rarely; 2 = occasionally; 3 = daily <4 h; 4 = daily 4–8 h, 5 = daily >8 h. Bold indicates the user’s primary prosthesis.

cA composite measure of prosthesis wear time and skill.

dThe prosthesis the participant brought on the day of experiment and was shown in the own prosthesis condition.

eRelates to the visual appearance of the prosthesis they viewed in the own prosthesis condition. Note that participant 11’s prosthesis was a mechanical hook with a glove and is therefore hand-like.

*These participants were presented with a myoelectric prosthesis for the active prosthesis condition.

Abbreviations: N/A, not applicable; PAL, prosthesis activity log

Fig 1. Example stimuli and ROI.

Fig 1

(A) Example stimuli from the main 4 experimental conditions (columns, left to right): hands (upper limbs), active prostheses, cosmetic prostheses, hand-held tools. One-handers also observed images from multiple viewpoints of their own prosthesis. One image was shown per trial in an event-related design. (B) Probability maps of the body selective ROI. For each participant and hemisphere, the top 250 most activated voxels within the OTC were chosen based on a headless bodies > objects contrast, providing an independent ROI. ROIs from all participants (n = 56) were superimposed, yielding ROI probability maps. Warmer colours represent voxels that were included in greater numbers of individual ROIs. See S1 Fig for the probability maps of each group separately. Data used to create this figure can be found at https://osf.io/4mw2t/. ROI, region of interest.

Results

Clustering of prostheses types in prosthesis users but not in controls

Analysis was focused on the OTC’s extrastriate body-selective area (EBA) [46,47]. This region of interest (ROI) was independently localised for each participant by choosing the 250 voxels in each hemisphere showing the strongest preference to images of headless bodies over everyday objects (Fig 1B).
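The voxel-selection step described above can be sketched as follows. This is an illustrative reconstruction with toy random data, not the authors' pipeline; the array name `tmap` and its values are our assumptions.

```python
# Illustrative sketch (toy data, not the authors' pipeline): selecting the
# 250 voxels with the strongest bodies > objects preference in one hemisphere.
import numpy as np

rng = np.random.default_rng(1)
tmap = rng.normal(size=5000)  # hypothetical contrast t-values for OTC voxels

# Indices of the 250 most body-selective voxels form the independent ROI
roi = np.argsort(tmap)[-250:]
print(roi.size)
```

In the real analysis this selection is run per participant and per hemisphere on the localiser contrast, keeping the ROI definition independent of the main experimental data.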

To investigate the underlying representational structure within this region, we first characterised multivoxel activity patterns for each participant and condition (hands, tools, cosmetic prostheses that look like hands, and active prostheses that tend to resemble tools rather than hands, Fig 1A; all exemplars are available at https://osf.io/kd2yh/). Distances between each pair of activity patterns (e.g., hands and cosmetic prostheses) were calculated (noise-normalised cross-validated Mahalanobis distances [48]). More similar activity patterns yield smaller distances; in other words, the more dissimilar two patterns are, the greater their relative distance. Because these are multidimensional patterns, one way to visualise the structure is using a dendrogram, or linkage tree, which illustrates how the different conditions cluster. Using this method on the average distances for each group, we found qualitative differences in the representational structure between controls and prosthesis users (Fig 2). For control participants, cosmetic prostheses were clustered with hands, and active prostheses were clustered with tools, reflecting their native intercategorical similarities across conditions (see S2 Fig for visual similarity analysis). For prosthesis users, however, the 2 prosthesis conditions were clustered together, potentially reflecting a newly formed prosthesis category, with tools and hands being further away from both prosthesis conditions.

Fig 2. Representational structure in body-selective visual cortex.

Fig 2

Multidimensional distances between activity patterns of each of the main conditions (hands, tools, cosmetic prostheses, active prostheses) were calculated using representational similarity analysis. (A) Representational dissimilarity matrices for each group showing pairwise distances between the 2 prostheses conditions (active and cosmetic), hands, and tools. (B) To visualise the underlying representational structure, a linkage tree (dendrogram) was calculated in each group of participants, combining information from all pairwise distances (two-handed controls, left; one-handed prosthesis users, right). Pairs of stimuli that are closer together in the multidimensional space are clustered together under the same branch. Longer connections across the vertical axis indicate greater relative distances. In controls, cosmetic prostheses are clustered with hands and active prostheses with tools, reflecting their visual similarities. In prosthesis users, however, the 2 prosthesis types (cosmetic and active) are clustered together, with tools and hands represented dissimilarly from both prostheses. Data used to create this figure can be found at https://osf.io/4mw2t/.

Prosthesis-like (categorical) and not hand-like (embodied) representation of prostheses in prosthesis users

Next, we set out to quantify prosthesis representation using the 2 alternative theoretical frameworks—embodiment versus categorisation. According to the embodiment hypothesis, prosthesis representation should resemble hand representation. This hypothesis predicts that in prosthesis users, each of the 2 prosthesis conditions will be more similar (smaller distance) to hands than to tools, compared to controls (quantified as a hand-similarity index; see “Methods”). Notice that the hand-similarity index is the inverse of a tool-similarity index. Therefore, a significantly negative index should be taken as evidence for association of the prosthesis with a tool. When comparing hand-similarity indices based on the multidimensional distances across participant groups, we found no significant differences (t(54) = 0.47, p = 0.64; Fig 3A). A Bayesian t test provided substantial evidence in support of the null hypothesis (BF = 0.2), i.e., that on average amputees do not visually represent an unfamiliar prosthesis more similarly to a hand than controls. Similar results were observed across the 2 hemispheres (right OTC: t(54) = 0.46, p = 0.65, left OTC: t(54) = 0.54, p = 0.60).
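One plausible form of the hand-similarity index can be sketched as below. The exact formula lives in the paper's Methods, so this version (positive when both prosthesis conditions sit closer to hands than to tools, negative for the tool-like inverse) is an assumption for illustration only, as are the toy distance values.

```python
# Hedged sketch of a hand-similarity index (assumed formula, not verbatim
# from the paper's Methods): average, over the two prosthesis conditions,
# of how much closer each is to hands than to tools.
def hand_similarity(d):
    """d: dict of pairwise crossnobis distances between conditions."""
    cosmetic = d[("cosmetic", "tools")] - d[("cosmetic", "hands")]
    active = d[("active", "tools")] - d[("active", "hands")]
    return (cosmetic + active) / 2.0

# Toy distances in which both prostheses are nearer to hands than to tools
toy = {
    ("cosmetic", "hands"): 0.5, ("cosmetic", "tools"): 0.9,
    ("active", "hands"): 0.6, ("active", "tools"): 0.8,
}
print(hand_similarity(toy))  # positive -> more hand-like than tool-like
```

Because the index is a difference of distances, a significantly negative value directly encodes the tool-similarity reading mentioned above.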

Fig 3. Assessing the embodiment and categorisation hypotheses.

Fig 3

(A–B) A hand similarity index was calculated for each participant to quantify the degree to which both prostheses conditions (cosmetic and active) are more similar to hands than tools. A higher index value indicates greater embodiment (hand similarity of prostheses). (A) A group comparison of the hand similarity index between controls and prosthesis users showed no significant differences (t(54) = 0.47, p = 0.64). Horizontal lines indicate group means and dots indicate individual participants. (B) Correlation between the hand similarity index and prosthesis usage was not significant across users (Pearson’s r(30) = −0.03, p = 0.86). Dark/light circles indicate congenital/acquired one-handers, respectively, and grey shading indicates a bootstrapped 95% confidence interval. (C) Hand (left) and tool (right) distances from users’ ‘own’ prosthesis. Individual distances were normalised by the controls’ group mean distance, depending on the visual features of the ‘own’ prosthesis (hand likeness). A value of 1 indicates similar hand/tool distance to controls’. Users showed significantly greater distances between their own prosthesis and hands (t(25) = 4.33, p < 0.001), contrary to the embodiment hypothesis. Together, these findings demonstrate that hand similarity under the embodiment hypothesis does not adequately explain differences in prosthesis representation in users’ OTC. (D–F) A prosthesis similarity index was calculated for each participant to quantify the degree to which the prostheses’ representations moved away from their natural categories (hands for cosmetic prostheses and tools for active prostheses) and towards one another. (D) A visual illustration of the prosthesis similarity index formula. Arrows pointing outward indicate distances that should grow in users compared to controls (e.g., hands and cosmetic prostheses) and are therefore positively weighted. Arrows pointing inward indicate distances that should shrink in users compared to controls (i.e., cosmetic and active prostheses) and are therefore negatively weighted. (E) Group comparison of the prosthesis similarity index between controls and prosthesis users showed greater prosthesis similarity in users (t(54) = −2.55, p = 0.01). (F) The prosthesis similarity index significantly correlated with prosthesis usage; a higher prosthesis index was associated with greater prosthesis usage (based on wear frequency and skill; Pearson’s r(30) = 0.43, p = 0.01). Together, these findings demonstrate that the categorisation hypothesis explains differences in users’ prosthesis representation in the OTC, both with respect to controls and interindividual prosthesis usage. Data used to create this figure can be found at https://osf.io/4mw2t/. n.s., no significance; OTC, occipitotemporal cortex.

It is possible that the effects of embodiment are present in the subset of users who rely most on their prosthesis for daily function, but are masked at the group level by the wide range of usage in our sample of prosthesis users. We therefore further tested the relationship between visual embodiment and prosthesis usage by correlating the hand-similarity index with a prosthesis usage score, reflecting usage time and habits (see ‘Methods‘). According to the embodiment hypothesis, hand-like prosthesis representation should scale with prosthesis usage. We found no significant correlation between the hand-similarity index and everyday prosthesis usage (or with tool similarity; Pearson’s r(30) = −0.03, p = 0.86), suggesting that prosthesis embodiment is not a strong predictor of usage (Fig 3B).

According to the categorisation hypothesis, prosthesis usage should promote a more independent representation of prostheses. This hypothesis predicts that within our multidimensional space, both prosthesis conditions would move away from the existing natural categories (e.g., the distance between cosmetic prostheses and hands will become larger whereas their distance from tools will become smaller; see Fig 3D) and towards one another (smaller distances between cosmetic and active prostheses). In other words, the 2 different prosthesis types will form a prosthesis cluster within the hand–tool axis. We calculated a prosthesis-similarity index for each participant (see ‘Methods‘ and S3 Fig for intercategory pairwise distances). We found a significant group difference of the prosthesis-similarity index (t(54) = −2.55, p = 0.01; Fig 3E). This indicates that using a prosthesis alters one’s visual representation of different prosthesis types into a distinct category and does so in a way that is more complex than prostheses simply becoming more hand- or tool-like. Although previous studies suggest tool representation is left-lateralised [42,49], the reported effect was robust across both hemispheres (right OTC: t(54) = −2.25, p = 0.03, left OTC: t(54) = −2.66, p = 0.01). Exploratory data-driven analysis run on the one-handers’ pairwise distances comprising the prosthesis-similarity index further revealed that the departure of the active prostheses in particular away from the native ‘tool’ category is linked with their increased association with cosmetic prostheses (see S4 Table).
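Following the weighting scheme described for Fig 3D, the prosthesis-similarity index can be sketched as below. The exact Methods formula may differ (e.g., in normalisation or in which outward distances are included), so this form and the toy numbers are illustrative assumptions only.

```python
# Hedged sketch of a prosthesis-similarity index per Fig 3D's description:
# distances expected to grow with usage (prosthesis to natural category)
# enter positively; the cosmetic-active distance, expected to shrink,
# enters negatively. Assumed formula, not verbatim from the paper.
def prosthesis_similarity(d):
    grow = d[("hands", "cosmetic")] + d[("tools", "active")]
    shrink = d[("cosmetic", "active")]
    return grow - shrink

# Toy distances reproducing the predicted direction: users > controls
user = {("hands", "cosmetic"): 0.8, ("tools", "active"): 0.8,
        ("cosmetic", "active"): 0.4}
control = {("hands", "cosmetic"): 0.5, ("tools", "active"): 0.5,
           ("cosmetic", "active"): 0.9}

print(prosthesis_similarity(user), prosthesis_similarity(control))
```

The index is deliberately agnostic about *where* the prosthesis cluster sits on the hand–tool axis; it only captures how far the two prosthesis conditions have pulled away from their natural categories and towards each other.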

Supporting the interpretation that the different neural representational structure in prosthesis users is associated with prosthesis usage, we found a significant positive correlation between the prosthesis-similarity index and the prosthesis usage score described above (Pearson’s r(30) = 0.43, p = 0.01; Fig 3F). In other words, the more users use a prosthesis, the more they represent the different prosthesis types as a separate category. Conversely, individuals who do not use a prosthesis frequently have a more similar representational structure to that of controls.

The 2 indices for hand- and prosthesis-similarity are not statistically independent, and, as such, we were not able to compare them directly. However, we could use them as a model for predicting daily-life prosthesis usage. Comparing the correlation between prosthesis usage and the indices for the 2 hypotheses (embodiment and categorisation) revealed a significant difference (z = −2.52, p = 0.01). Therefore, at least when considering unfamiliar prostheses, the prosthesis-similarity index is a better predictor of prosthesis usage than the hand-similarity index.
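Because both indices are correlated with the same usage variable, the two correlations are dependent and cannot be compared with an ordinary independent-samples z test. The paper reports a z statistic; one hedged alternative way to make the same comparison (not necessarily the authors' exact test) is a paired bootstrap over participants, sketched here on simulated data.

```python
# Hedged sketch: comparing two dependent correlations (each index vs. the
# shared usage score) with a paired bootstrap. Simulated data, for
# illustration only; the paper's actual test statistic may differ.
import numpy as np

rng = np.random.default_rng(0)
n = 32
usage = rng.normal(size=n)
hand_idx = rng.normal(size=n)                            # unrelated to usage
prosthesis_idx = usage + rng.normal(scale=1.0, size=n)   # tracks usage

def corr_diff(idx):
    """Difference of the two usage correlations on a resample of participants."""
    r_pros = np.corrcoef(usage[idx], prosthesis_idx[idx])[0, 1]
    r_hand = np.corrcoef(usage[idx], hand_idx[idx])[0, 1]
    return r_pros - r_hand

boot = np.array([corr_diff(rng.integers(0, n, n)) for _ in range(2000)])
ci = np.percentile(boot, [2.5, 97.5])
print(ci)  # a CI excluding 0 indicates the correlations reliably differ
```

Resampling whole participants keeps the dependence between the two correlations intact, which is the point of a paired (rather than independent) comparison.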

Prosthesis categorisation does not depend on users’ developmental period of hand loss or prosthesis type

When considering hand-selective neural resources, individuals who were born without a hand might develop different representational structures than those with an acquired amputation [50]. Considering this, we tested whether the hand-similarity index differed between the 2 prosthesis users subgroups (congenital versus acquired) and found no significant differences (t(30) = −0.615, p = 0.54). Moreover, the reported prosthesis-similarity effects prevailed even when accounting for any potential differences between the 2 subgroups, as demonstrated by an analysis of covariance (ANCOVA) with subgroup (congenital versus acquired) as a fixed effect and usage as a covariate. The ANCOVA revealed no significant subgroup effect (F(1,29) = 2.02, p = 0.17), demonstrating that prosthesis-similarity does not significantly differ because of cause/developmental period of hand loss, and a nearly significant usage effect (F(1,29) = 4.11, p = 0.052), indicating that the relationship between prosthesis categorisation and usage is relatively independent from potential subgroup effects.

Beyond cause of limb loss, the users tested in this study also diverged with respect to prosthesis type, shape, and history of usage, involving primary users of cosmetic (40%) and active (41%, comprising mechanical hook [body-powered; 25%] and myoelectric [motor-powered; 16%]) prostheses, as well as nonprosthesis users (16%) and a hybrid user (3%; see Table 1). A key question is what aspects of the prosthesis itself might affect neural prosthesis adoption in OTC. Because our key focus is on prosthesis usage, we looked at whether individuals who primarily use a prosthesis that has a degree of active grip control (e.g., a mechanical hook) show different effects from those who primarily use a cosmetic prosthesis that affords more limited motor control (no grip). Comparing the prosthesis-similarity index of users of the 2 prosthesis types revealed no significant effect (t(20) = 0.055, p = 0.96). Using an ANCOVA with primary prosthesis type in daily life (cosmetic/active prosthesis) as a fixed effect and usage as a covariate revealed no prosthesis type effect (F(1,19) = 0.432, p = 0.52), indicating that the categorisation effect might not depend on the type of prosthesis individuals primarily use. The usage effect remained significant (F(1,19) = 6.01, p = 0.02), indicating that the correlation between usage and categorisation is independent of prosthesis type. Repeating the same analysis with the hand-similarity index revealed no significant effects, indicating that even when accounting for prosthesis type, no relationship is found with visual embodiment. Though null results should be interpreted with caution, our analysis indicates that the categorisation effect observed in prosthesis users is not driven by the prosthesis design or control mechanism but by the functionality the user extracts from it (as reflected in our daily usage scores).

In the active prosthesis condition, a minority of active prosthesis users (n = 5) were shown images of a myoelectric prosthesis (not their own; marked with asterisks in Table 1). Because these are arguably visually distinct from the mechanical hooks seen by the control participants, we repeated the analysis of the prosthesis-similarity index by replacing the subset of relevant pairwise distances relating to the active prosthesis with the mean distances of the prosthesis users’ group (see ‘Methods‘). In this analysis, the observed effect remained but was reduced to a trend (t(54) = −1.87, p = 0.067). Importantly, the correlation between categorisation and usage remained significant (r(30) = 0.48, p = 0.006) even when excluding the myoelectric users from the analysis altogether (r(25) = 0.46, p = 0.015). This further analysis confirms that our findings were not driven by the subset of myoelectric active prostheses.

Own prosthesis representation

The results discussed so far were derived from visual presentation of prosthesis exemplars with which each user was not personally familiar, allowing us to easily compare the results between the users and controls. However, under the embodiment framework, it could be argued that embodiment can only be achieved for the user’s own prosthesis. To account for this possibility, in addition to the general prosthesis types shown to all participants, most prosthesis users (n = 26; see ‘Methods‘ and Table 1) were also shown images of their own prosthesis (for prosthesis users with more than one prosthesis, this refers to the prosthesis each user wore on the day of testing). Because controls do not have a prosthesis of their own, in this analysis, we compared the user’s own prosthesis to the same prosthesis type shown to controls. Therefore, cosmetic ‘own’ prostheses (n = 15) were matched with controls’ general cosmetic condition, and active ‘own’ prostheses (n = 11) were matched with the controls’ general active condition. To allow us to group values across the cosmetic and active prostheses users, the distances between the ‘own’ prosthesis and hands and tools were normalised by the mean distances measured from the control group (using the above-mentioned conditions). Because we hypothesised that altered prosthesis representation is driven by usage, the controls’ averaged distance is used here as a ‘baseline’ measure of how the representation is structured before prosthesis use. This normalised score was entered into a one-sample t test for statistical comparison.
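The normalisation and one-sample test can be sketched as follows; the distance values are invented placeholders, not study data, and the variable names are our assumptions.

```python
# Sketch of the own-prosthesis normalisation step (illustrative numbers):
# each user's own-prosthesis-to-hand distance is divided by the mean
# distance of the matched prosthesis condition in controls, then the
# normalised scores are tested against 1 with a one-sample t test.
import numpy as np
from scipy import stats

# Hypothetical crossnobis distances from each user's own prosthesis to hands
own_to_hand = np.array([1.3, 1.1, 1.4, 1.2, 1.5, 1.1])

# Matched control baseline: controls' mean distance for the same
# prosthesis type (cosmetic or active) to hands
control_baseline = 1.0

normalised = own_to_hand / control_baseline

# Values > 1 mean the own prosthesis is *less* hand-like than in controls
t, p = stats.ttest_1samp(normalised, popmean=1.0)
print(t, p)
```

A mean normalised distance significantly above 1, as in this toy example, is the pattern the paper reports against the embodiment hypothesis; the same logic applies to the tool distances.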

Based on the embodiment hypothesis, users should show greater similarity between their own prosthesis and hands. Instead, we found that users showed significantly greater dissimilarity (greater distances), relative to controls, indicated by a normalised distance that was greater than 1 (t(25) = 2.85, p = 0.009). This analysis therefore shows that users’ own prostheses are less similar to hands, providing direct evidence against the embodiment hypothesis. The normalised ‘own’ prosthesis distance from tools was also found to be significantly greater than 1 (t(25) = 2.91, p = 0.008), further supporting the categorisation interpretation. We also repeated the analysis as described above, but this time we standardised distances for 7 users with an ‘own’ active prosthesis with hand-like visual features (see Table 1) with controls’ cosmetic prosthesis. This complementary analysis produced similar results for hands (t(25) = 4.33, p < 0.001) but not for tools (t(25) = 1.62, p = 0.118; Fig 3C). This means that even when taking both visual-feature and operational considerations into account, prosthetic limbs are not represented as hands in their owner’s visual cortex.

Prosthesis representation beyond EBA

To demonstrate that our results are not specific to our definition of the EBA ROI in the OTC, we repeated the same analysis and found similar results in ‘hand’ and ‘tool’ ROIs within the OTC, generated from the meta-analysis tool Neurosynth [51] (see S1 Text and S4 Fig).

We next explored prosthesis representation beyond OTC. The stimuli used in the current study were specifically designed to interrogate the information content underlying prosthesis representation in body-selective regions in the OTC. Nevertheless, as demonstrated in S5A Fig, the visual stimuli also significantly activated the intraparietal sulcus (IPS), providing us with an opportunity to explore visual prosthesis representation in a dissociated brain area (though notably, this activity was observed less consistently within individual subjects). Because 11 of the participants did not have enough significantly activated voxels within IPS to meet our ROI criteria, we constructed an anatomical ROI based on the Jülich Histological Atlas in FMRIB’s Software Library (FSL), including human intraparietal (hIP)1-3 bilaterally [52,53] (see S5B Fig). With respect to the representational structure, we did not find significant differences between prosthesis users and controls when comparing both the hand- and prosthesis-similarity indices, even when waiving the corrections for multiple comparisons that are customary for exploratory analysis (hand-similarity: t(54) = 0.71, p = 0.48; prosthesis-similarity: t(54) = −0.45, p = 0.66). This could be due to insufficient power to explore this representational structure or possibly due to a different organising principle in this region. However, we did find that users showed significantly greater dissimilarity (greater distances), relative to controls, when comparing the representation of their own prosthesis to that of both hands and tools (hand: t(25) = 10.11, p < 0.001; tool: t(25) = 4.62, p < 0.001, corrected for 2 comparisons; see S5C Fig). This analysis indicates that in parietal cortex, similar to the OTC, users’ own prostheses are less similar to hands and tools, providing further support against the embodiment hypothesis.

Discussion

Here, we used an fMRI brain decoding technique to probe the information content underlying visual prosthesis representation in the OTC. Previous research shows that prosthesis users are able to activate hand-selective areas in the OTC when viewing a prosthesis [29]. These areas, however, encompass a rich representational structure, spanning well beyond visual hand processing. It is therefore unknown whether, by utilising these hand-selective areas, the users’ brain actually represents the prosthesis as a hand or whether it follows a different representational principle. We pitted the embodiment theory against another prominent theory—the categorisation theory—which is well established in visual neuroscience [43] (as detailed below) but to our knowledge has not been explored for wearable devices. Although not directly opposing, both theories generate different predictions on how a prosthesis should be represented in the brain of its user. Contrary to the embodiment theory, we found that prosthesis users represented their own prosthesis unlike a hand. For unfamiliar prostheses, users formed a prosthesis category, distinct from their natural (existing) categories (hands and tools), as demonstrated in controls. Importantly, this shift scales with usage, such that people who use their prosthesis more in daily life show greater independence of the prosthesis category. When comparing the 2 models’ success in explaining interindividual differences between prosthesis users, we found that the prosthesis category model was significantly more correlated with prosthesis usage. This indicates that, for visual representation of unfamiliar prostheses, categorisation provides a better conceptual framework than embodiment.
Together with preliminary results showing that prosthesis users exhibited a less hand-like representation of their own prosthesis in parietal cortex, our results collectively show that neural visual embodiment does not predict successful adoption of wearable technology by the human brain. Despite benefiting from hand-selective cortical resources [29], the prosthesis is not ‘embodied’ into people’s visual hand representation. We also did not find any evidence for greater representation of prostheses as tools. However, because the experimental design and analysis we implemented were a priori focused on visual hand representation in OTC, it is possible that other brain areas might find a different representational ‘solution’ to support the daily use of an artificial limb.

As stated above, an intuitive and increasingly popular view that has been inspiring biomedical design and innovation is that embodiment will optimise the usage of technology, such as upper-limb prostheses [10,11,13–16,54,55]. How can this view be reconciled with our findings? One potential solution originates from the fact that embodiment is a multifaceted phenomenon [17], with distinct levels (i.e., phenomenological—does the prosthesis feel like my hand?; cognitive—do I react to/with the prosthesis like I would to/with my own hand?; neural—is the prosthesis represented via the same mechanisms as a hand? [7]). In another recent study in which we probed the phenomenological and cognitive levels of prosthesis embodiment, we found both to correlate with prosthesis usage [28]. Here, we focused our research on the neural level, because it is entirely unknown whether objective representation of the prosthesis as a body part associates with prosthesis acceptance and usage, let alone benefits it. It is possible, and even likely, that embodiment manifests differently in each of these distinct levels, which may also vary in their importance or even relevance for successful technological implementation [7]. To disentangle the complex concept of ‘embodiment’, future studies should aim to acquire measurements of the different levels of embodiment together with a detailed account of prosthesis skill and use.

A second important consideration is that embodiment is a multisensory process, involving multiple brain areas beyond visual cortex [41,56,57]. Here, we focused on visuomotor processing, and our experimental approach does have some limitations that should be considered. For example, our use of static images was designed to drive activity in the OTC but not in other sensorimotor-related regions in the parietal and frontal regions [58], thereby limiting our conclusions to high-level visual cortex. The use of generic hand images that are not the participants’ own hands can also be limiting when approaching questions of embodiment. Here, it is important to mention that despite using a profoundly nonecological task in the scanner, the resulting brain representations, as captured with our prosthesis-similarity index, correlated significantly with the extent of everyday prosthesis usage. Therefore, despite the inherent limitations of our fMRI task, our task outcomes are ecologically relevant. Still, it is possible that other brain areas involved more directly in motor planning would produce other representational structures with respect to hand representation. Future research aimed at this question should take into consideration that at present, commercially available prosthesis motor control is fundamentally different from that of motor control of the hand, producing further potential challenges for sensorimotor embodiment.

Thirdly, and most speculatively, it is possible that the prosthesis may still be represented in OTC as a body part but one that is not a hand. After all, prosthesis users have strong semantic and sensorimotor knowledge of a hand (all users in the study had one intact hand; acquired amputees had prior experience of their missing hand, including lingering phantom sensations; see also the work by Striem-Amit and colleagues [59] for related findings showing normal visual hand representation in individuals born without both hands). Yet their experience with operating a prosthesis is fundamentally different from that of a hand. If body representation is not strictly tuned to the specific fine-grained features of a given body part (e.g., the digits of the hand) but is instead also represented at a higher level (e.g., effectors [60] or based on other functionality features [61]), then the dissociation of prostheses from hand representation observed in our study should not be taken as evidence for lack of embodiment per se but rather lack of embodiment as a hand. In this context, the previously reported recruitment of hand-specific visual cortical regions could reflect an underlying embodied representation of a prosthesis as a (nonhand) body part. Therefore, we propose that future studies of artificial limb embodiment should not be limited to identifying and/or quantifying hand representation (as is the current common practice, e.g., using the rubber hand illusion).

Instead of visual hand embodiment (or tool-like representation), we found that a significant proportion of the individual differences in prosthesis usage can be predicted by the extent of prosthesis categorisation within the visual body-selective area, providing a significantly better model than hand embodiment. This result is also consistent with the known organising principle of the OTC, in which categorical representation reflects our knowledge of grouping and patterns, which are not necessarily present in the bottom-up sensory inputs [43,62–64]. Moreover, categorical representation in OTC was shown to reflect individual differences in conceptual knowledge [65]. Accordingly, people who acquire a specific visual expertise, such as car experts, show increased activity in object-selective areas in OTC (for a review, see the work by Harel [66]). Research on object experts, therefore, provides compelling evidence for the role of visual learning and experience in shaping and fine-tuning categorical representation. Although various studies have demonstrated a relation between expertise and activation, few studies performed multivariate analyses, and those that did reported mixed results [44,67–70]. For example, Martens and colleagues, 2018, found that expertise did not alter the representational configuration of the category of expertise, whereas McGugin and colleagues, 2015, who studied car representation of experts in the Fusiform Face Area, reported that car representation became more similar to that of faces with expertise. In this context, our present results provide a novel perspective on visual expertise. This is because our results show divergence of prosthesis representation from the ‘natural’ categories normally existing in this brain area, consistent with the formation of a new categorical representation. In other words, rather than refining a category, prosthesis usage results in the formation of a new category.
Gomez and colleagues, 2019, recently reported that childhood experience with playing a Pokémon videogame relates to increased classification of Pokémon, compared with other related categories, in the ventral occipital cortex [44]. Extending this finding, we report that categorical prosthesis representation correlates with (ongoing) visuomotor experience in adulthood. As these effects were found in both congenital and acquired one-handers, this prosthesis categorisation effect does not seem to relate to a specific developmental period. Because prosthesis usage relies on visuomotor expertise, it is difficult for us to disentangle the relative contribution of perceptual expertise to the observed prosthesis categorisation. Further research examining the representation of prostheses in purely perceptual experts (e.g., prosthetists) will provide an interesting test case for our expertise hypothesis.

A further distinguishing feature of prosthesis users compared to other experts is that they not only have increased experience with prosthetic limbs but also, arguably, a reduction in exposure to hands, at least from a first-person perspective. This raises the question of whether the OTC becomes reorganised to support a new visual function (prosthesis control, known to strongly rely on visual feedback [18,71]), where reorganisation is the process by which a specific brain area qualitatively changes its input–output dynamics to support a novel representation. In this context, in recent years, adaptive behaviour has been suggested [60,72–75] and challenged [59,76,77] as a causal driver of brain reorganisation. According to this framework, the function of the deprived cortex is not reassigned; instead, it is the input (prosthesis versus hand) that changes, while the local processing and resulting output persist (domain specificity [78,79]). For example, in a study conducted in individuals with a congenitally missing limb, multiple body parts used to compensate for the missing limb’s function benefited from increased activity in the missing hand’s sensorimotor cortical area [60], replicated in the recent work by Hahamy and Makin [80]. It was, therefore, suggested that opportunities for reorganisation may be restricted by a predetermined functional role of a given brain area, e.g., hand resources will only support processing related to hand-substitution behaviours (other body parts or a prosthesis). This framework has been successfully implemented to demonstrate that the categorical organisation of OTC is preserved following congenital blindness [81–83]. For example, the OTC body area was shown to be selectively activated by tactile [84] and auditory [85] inputs conveying hand/body information.
Our findings advance beyond these studies by demonstrating that a parallel form of reorganisation can occur even when the relevant sensory pathway (vision) is largely unaffected, further highlighting the role of daily behaviour in shaping brain organisation across the life span.

Finally, our results suggest that the relationship between prosthesis representation and usage is independent of key design and usage features of the artificial device (such as visual mimicry of the human hand) and cause of limb loss (congenital or acquired). This should inspire future efforts in neurologically informed substitution and augmentative artificial limb design to not be strictly confined to biomimetics, a design principle that is currently highlighted in the development of substitution technology [86] (e.g., the vine prosthesis [87]).

To conclude, we provide a novel neural correlate for the adoption of wearable technology that is distinct from visual embodiment of a prosthesis as a hand. Successful prosthesis usage, in terms of both wear time and habit in daily life, was predicted not by visual embodiment (hand-similarity) or tool-similarity but by a more distinct categorical representation of artificial limbs. Understanding whether the brain can treat a prosthesis as a hand and whether this hand-like representation provides a real advantage for prosthesis users will have important implications on future design and assessment of wearable technology. Considering the limitations related to our focus on visual prosthesis representation in passive settings, we are currently unable to offer a sweeping answer to how the entire brain represents artificial limbs. However, our findings provide an important alternative to the highly prominent embodiment theory that needs to be considered. As such, much more research is necessary to provide a comprehensive understanding of the neural basis of successful prosthesis usage in the human brain.

Methods

Ethics statement

This study was approved by the Oxford University’s Medical Sciences interdivisional research ethics committee (Ref: MSD-IDREC- C2-2014-003). Written informed consent and consent to publish was obtained in accordance with ethical standards set out by the Declaration of Helsinki.

Participants

Thirty-two individuals missing an upper limb (one-handed prosthesis users, mean age [SD] = 42.3 [11.8], 12 females, 8 missing their right hand) were recruited to take part in the study (Table 1). Sixteen prosthesis users lost their hand following an amputation, and sixteen had a unilateral congenital upper-limb below-elbow deficiency (due to complete arrest of hand development). One additional one-hander was recruited to the study but did not participate in the scanning session because of claustrophobia. In addition, 24 age- and gender-matched two-handed controls (age = 41.7 [13.1]; 12 females; 8 left-handed) took part in the study. Fourteen of the control participants were family members, friends, or held professional relationships with prosthesis users, resulting in passive visual experience of prosthesis usage. All participants took part in a larger study, involving multiple tasks (https://osf.io/kd2yh/). Univariate data from the fMRI task reported here were previously published [29].

Prosthesis usage measurements

Prosthesis usage was assessed by combining two highly correlated measurements of usage: prosthesis wear frequency and a prosthesis activity log (PAL) [29,88]. Participants rated their prosthesis wear frequency on a scale: 0 = never, 1 = rarely, 2 = occasionally, 3 = daily (<4 hours), 4 = daily (4–8 hours), 5 = daily (>8 hours). Some participants used more than one type of prosthesis; in such cases, the measurement from the most frequently used prosthesis was taken. The PAL is a revised version of the motor activity log [89] as described in the work by Makin and colleagues [74]. In brief, participants were requested to rate how frequently (0 = never, 1 = sometimes, 2 = always) they incorporate their prosthesis in an inventory of 27 daily activities, with varying degrees of motor control. PAL is calculated as the sum of the frequency ratings across all activities divided by the maximum possible sum (27 × 2), yielding a score between 0 and 1. This questionnaire, indexing bimanual usage, was previously validated using limb acceleration data [74] and behavioural lab testing [60], collected in ecological settings. Because neither measure is able to fully capture prosthesis use, both the prosthesis wear frequency and PAL were Z-transformed and summed to create a prosthesis usage score.
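As a rough illustration, the scoring described above can be sketched in Python/NumPy. Function and variable names are ours, not taken from the study's pipeline, and this is a simplified reconstruction of the stated formulas:

```python
import numpy as np

def pal_score(ratings):
    """PAL: 27 daily activities rated 0 (never), 1 (sometimes), 2 (always);
    the score is the sum of ratings divided by the maximum possible sum (27 * 2)."""
    ratings = np.asarray(ratings, dtype=float)
    assert ratings.shape == (27,) and ratings.max() <= 2
    return ratings.sum() / (27 * 2)

def usage_score(wear_freq, pal):
    """Z-transform each measure across participants, then sum,
    as described for the combined prosthesis usage score."""
    z = lambda x: (x - x.mean()) / x.std()
    return z(np.asarray(wear_freq, float)) + z(np.asarray(pal, float))
```

For example, a participant who always uses the prosthesis in every listed activity would score a PAL of 1; the combined usage score is only meaningful relative to the rest of the sample, since it is Z-scored across participants.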

Stimuli

Participants viewed still object images of the following categories: (i) hands (upper limbs with and without the arm, in multiple orientations and configurations, and of different sizes, genders, and skin-colours; hereafter hands); (ii) man-made hand-held tools; (iii) cosmetic prostheses (aesthetically hand-like but nonoperable); (iv) active prostheses (affording a grip; either mechanical hooks or myoelectric prostheses); and (v) (when available) participants’ own prosthesis (more details below). For hands and prosthesis images, the effector was matched to the prosthesis users’ missing-hand side or the nondominant hand in controls (e.g., participants missing their left hand were presented with ‘left-handed’ hands/prostheses). Headless bodies and typically nonmanipulable man-made object images were also included for localising independent ROIs (see “OTC ROI” section). Additional conditions that were also included in the fMRI paradigm and initial analysis but not reported here were dismorphed images (originally intended to account for low-level visual activity but discarded after univariate analysis revealed increased activity in OTC) and lower limbs (intended as a control body part but not included in final analysis).

Images used for the main stimulus categories can be found at https://osf.io/kd2yh/. All images had their background removed, normalised for size, placed on an equiluminant grey background, and overlaid with a red fixation square. Nonprosthesis conditions were taken from an online database and involved multiple exemplars, whereas each of the 3 prosthesis conditions involved multiple shots of a single prosthesis of different orientations. We chose to use multiple configurations of hands and prostheses to probe the experience of congenital one-handers and control participants (who mostly see prostheses or missing-side hands, respectively, from a third-person perspective). We note that previous studies probing visual hand representation in similar populations [88], including specifically in OTC [59], used a similar approach. We further note that the few studies finding differences between egocentric/allocentric [90] or self/others [91] visual hand representation in OTC identified lateralised effects to the right OTC, whereas our effects are comparable across hemispheres. Prosthesis images of other users’ prostheses (in the cosmetic and active conditions) or of the participant’s prosthesis (in the ‘own’ condition) were taken by the experimenters prior to the functional MRI session. The subset of the active prosthesis users (marked with an asterisk in Table 1) were shown another myoelectric prosthesis in the active prosthesis condition.

In the ‘own’ prosthesis condition, all prosthesis users who had brought their prosthesis to the study were presented with images of their own prostheses, either cosmetic or active (n = 26; see Table 1). All other participants (i.e., the remaining 6 prosthesis users who did not bring a prosthesis and all control participants) were shown pictures of their own shoe instead. Shoes were selected as a familiar external object that was intended to exert similar cognitive effects (e.g., in terms of arousal) as the prosthesis and therefore minimise differences in the scan time course across groups. Because we had no a priori interest in studying shoe representation, the shoe condition was not included in further analysis.

Post hoc shape similarity analysis [92] confirmed that the prosthesis images spanned a diverse range, resulting in similar shape dissimilarity for the 2 prosthesis types with respect to hand and tool exemplars (S2 Fig). It is highly likely that other measurements of visual similarity (e.g., based on contrast/colour comparison or perceptual judgements) would reveal more distinct intercategorical (dis)similarities. However, any such visual (dis)similarities should impact intercategorical representational similarity in the control group. As such, in the present study, all key analyses are interpreted with respect to the controls, providing us with a representational ‘baseline’.

Experimental design

Each condition consisted of 8 different images. In each trial, a single image from one of the conditions was shown for 1.5 s, followed by 2.5 s of fixation. Participants were engaged in a one-back task and were asked to report whenever an image was repeated twice in succession. This occurred once for each condition within each functional run, resulting in 9 trials per condition per run (8 distinct exemplars and 1 repetition). This design was repeated in 4 separate functional runs, resulting in 36 events per condition. First-order counterbalancing of the image sequences was performed using Optseq (http://surfer.nmr.mgh.harvard.edu/optseq), which returns an optimal image presentation schedule. Run order was varied across participants. The specifics of this design were validated against an event-related design with a jittered interstimulus interval and a block design during piloting (n = 4). Stimuli were presented on a screen located at the rear end of the scanner and were viewed through a mirror mounted on the head coil. Stimulus presentation was controlled by a MacBook Pro running the Psychophysics Toolbox in MATLAB (The MathWorks, Natick, MA).

MRI data acquisition

The MRI measurements were obtained using a 3-Tesla Verio scanner (Siemens, Erlangen, Germany) with a 32-channel head coil. Anatomical data were acquired using a T1-weighted magnetization prepared rapid acquisition gradient echo sequence with the parameters: TR = 2040 ms, TE = 4.7 ms, flip angle = 8°, and voxel size = 1 mm isotropic resolution. Functional data based on the blood oxygenation level-dependent signal were acquired using a multiband gradient echo-planar T2*-weighted pulse sequence [93] with the parameters: TR = 1300 ms, TE = 40 ms, flip angle = 66°, multiband factor = 6, voxel size = 2 mm isotropic, and imaging matrix = 106 × 106. Seventy-two slices with slice thickness of 2 mm and no gap were oriented in the oblique axial plane, covering the whole cortex, with partial coverage of the cerebellum. The first dummy volume of each scan was saved and later used as a reference for coregistration. Additional dummy volumes were acquired and discarded before the start of each scan to reach equilibrium. Each functional run consisted of 256 volumes.

Preprocessing and first-level analysis

fMRI data processing was carried out using FEAT (FMRI Expert Analysis Tool) version 6.00, part of FSL (www.fmrib.ox.ac.uk/fsl). Registration of the functional data to the high resolution structural image was carried out using the boundary based registration algorithm [94]. Registration of the high resolution structural to standard space images was carried out using FLIRT [95,96] and was then further refined using FNIRT nonlinear registration [97,98]. The following prestatistics processing was applied: motion correction using MCFLIRT [96]; nonbrain removal using BET [99]; B0-unwarping using a separately acquired field-map; spatial smoothing using a Gaussian kernel of FWHM 4 mm; grand-mean intensity normalisation of the entire 4D data set by a single multiplicative factor; highpass temporal filtering (Gaussian-weighted least-squares straight line fitting, with sigma = 50 s). Time-series statistical analysis was carried out using FILM with local autocorrelation correction [100]. The time series model included trial onsets and button presses convolved with a double-gamma HRF; 6 motion parameters were added as confound regressors. Indicator functions were added to model out single volumes identified to have excessive motion (>1 mm). A separate regressor was used for each high-motion volume; no more than 9 such volumes were found for any individual run (3.5% of the entire run).
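The motion-scrubbing step described above (one indicator regressor per volume exceeding 1 mm of motion) can be sketched as follows. This is an illustrative NumPy reconstruction of the general technique, not the FSL implementation, and the function name is ours:

```python
import numpy as np

def motion_scrub_regressors(displacement_mm, threshold=1.0):
    """Build one indicator (one-hot) confound regressor per high-motion volume.

    displacement_mm: per-volume motion estimate (e.g., framewise displacement).
    Returns the (n_volumes, n_bad) regressor matrix and the indices of the
    flagged volumes.
    """
    disp = np.asarray(displacement_mm, dtype=float)
    bad = np.flatnonzero(disp > threshold)          # volumes exceeding 1 mm
    regressors = np.zeros((disp.size, bad.size))
    regressors[bad, np.arange(bad.size)] = 1.0      # one column per bad volume
    return regressors, bad
```

Appending these columns to the GLM design matrix effectively removes the flagged volumes' contribution to the parameter estimates without altering the temporal structure of the run.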

OTC ROI

Because the focus of the study was on visual representation of prostheses in the OTC, the representational similarity analysis was restricted to individualised ROIs. ROIs of the extrastriate body area (EBA) [47] were identified bilaterally in each participant, by selecting the top 250 activated voxels, in each hemisphere, in the headless bodies > nonmanipulable objects contrast. Voxel selection was restricted to the lateral occipital cortex, inferior and middle temporal gyri, occipital fusiform gyrus, and temporal occipital fusiform cortex (all bilateral), as defined by the Harvard-Oxford atlas [101]. Voxels from both hemispheres were treated as a single ROI (see S3 Table for all analyses repeated for each hemisphere separately). Mean activity within the ROI for each participant was calculated by averaging the parameter estimate (beta) for each condition across all 500 voxels.
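A minimal sketch of the top-voxel ROI selection, assuming a flattened voxel array with per-voxel hemisphere labels (names and data layout are illustrative; the original analysis used FSL tools and the Harvard-Oxford atlas mask):

```python
import numpy as np

def top_voxel_roi(contrast_map, anatomical_mask, hemi_labels, n_per_hemi=250):
    """Select the top-n activated voxels per hemisphere for a localiser
    contrast (here, bodies > objects), restricted to an anatomical mask.

    contrast_map: 1-D array of voxelwise contrast statistics.
    anatomical_mask: boolean array marking voxels inside the atlas mask.
    hemi_labels: array of 'L'/'R' labels, one per voxel.
    Returns a boolean ROI mask with up to n_per_hemi voxels per hemisphere.
    """
    roi = np.zeros(contrast_map.size, dtype=bool)
    for hemi in ('L', 'R'):
        candidates = np.flatnonzero(anatomical_mask & (hemi_labels == hemi))
        order = candidates[np.argsort(contrast_map[candidates])[::-1]]  # descending
        roi[order[:n_per_hemi]] = True
    return roi
```

With n_per_hemi = 250 and both hemispheres pooled, this yields the 500-voxel bilateral ROI described above.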

Representational similarity analysis

To assess the hand–prosthesis–tool representation structure within the ROI, pairwise distances between conditions were calculated using a multivariate approach, generally known as representational similarity analysis [102]. Before performing the multivariate analysis, we examined differences in univariate activity in the ROI, which could drive differences in the multivariate analysis. When comparing activity levels (averaged across all voxels) between controls and prosthesis users within this region, no significant differences were found for each of the image conditions of hands, tools, and prostheses (p > 0.1 for all; see S2 Table), indicating that the 2 groups did not activate this region significantly differently. We then continued with the multivariate analysis. For each participant, parameter estimates of the different conditions and GLM residuals of all voxels within the ROI were extracted from each run’s first-level analysis. To increase the reliability of the distance estimates, parameter estimates underwent multidimensional normalisation based on the voxels’ covariance matrix calculated from the GLM residuals. This was done to ensure that parameter estimates from noisier voxels would be down-weighted [48]. Cross-validated (leave-run-out) Mahalanobis distances (also known as LDC—linear discriminant contrast) [48,103] were then calculated between each pair of conditions. Analysis was run on an adapted version of the RSA Toolbox in MATLAB [103], customised for FSL [104]. Visualisation of the distances in a dendrogram was performed using the plotting functions available in the RSA Toolbox.
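The cross-validated (leave-one-run-out) Mahalanobis distance can be illustrated as below. This is a simplified NumPy sketch of the LDC logic, not the MATLAB RSA Toolbox code; it assumes beta estimates organised as runs × conditions × voxels and applies the noise precision inside the inner product rather than as an explicit prewhitening step (the two are algebraically equivalent here):

```python
import numpy as np
from itertools import combinations

def crossnobis(betas, noise_cov):
    """Leave-one-run-out cross-validated Mahalanobis (LDC) distances.

    betas: (n_runs, n_conditions, n_voxels) parameter estimates.
    noise_cov: (n_voxels, n_voxels) covariance estimated from GLM residuals.
    Returns a symmetric (n_conditions, n_conditions) distance matrix;
    cross-validation makes the distance estimate unbiased (it can go negative).
    """
    n_runs, n_cond, n_vox = betas.shape
    prec = np.linalg.inv(noise_cov)                 # noise precision matrix
    dist = np.zeros((n_cond, n_cond))
    for a, b in combinations(range(n_cond), 2):
        d = 0.0
        for test in range(n_runs):                  # leave one run out
            train = np.delete(np.arange(n_runs), test)
            delta_train = betas[train, a].mean(0) - betas[train, b].mean(0)
            delta_test = betas[test, a] - betas[test, b]
            d += delta_train @ prec @ delta_test    # cross-validated inner product
        dist[a, b] = dist[b, a] = d / (n_runs * n_vox)
    return dist
```

Because the training and test pattern differences come from independent runs, pure noise averages out to zero, which is why LDC distances are interpretable on a ratio scale.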

Hand-similarity and prosthesis-similarity indices

Two indices were calculated to test each of the aforementioned predictions—one based on embodiment and the other on the categorical structure of OTC. Each index’s formula was designed so that positive values support the prediction. Therefore, similar to GLM contrasts, each of the relevant pairwise distances was weighted with a positive multiplier if, under the specific prediction, that distance should grow (decreased similarity) in prosthesis users, and a negative multiplier if it should shrink (greater similarity). For instance, the embodiment hypothesis predicts that for both prostheses (active and cosmetic) the distance from tools would grow (will be less similar), whereas the distance from hands will shrink (will be more similar). Therefore, the formula used to calculate the Hand-Similarity index was (Tools↔CosmP + Tools↔ActP) − (Hands↔CosmP + Hands↔ActP), in which ‘↔’ indicates the distance between a pair of conditions. Using the same logic, the prosthesis-similarity index was calculated as a measurement of how much the 2 prosthesis conditions are represented as a separate cluster away from their respective ‘native’ categories (cosmetic prostheses resembling hands and active prostheses resembling tools). The index was calculated using the following formula: 3(Hands↔CosmP + Tools↔ActP) − 2(Hands↔ActP + Tools↔CosmP + ActP↔CosmP); see Fig 3D for a visualisation of the formula. To control for individual differences in absolute distance values, both indices were standardised by the individuals’ distances between hands and tools (i.e., the residuals after accounting for the variance in the Hands↔Tools distance; see S3 Table for all analyses performed with the raw index values).
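The two index formulas translate directly into code. A hedged sketch (pairwise distances stored in a dict keyed by condition-pair tuples; the standardisation step takes residuals after regressing each index on the Hands↔Tools distance; all names are illustrative):

```python
import numpy as np

def similarity_indices(d):
    """Compute the hand-similarity and prosthesis-similarity indices from
    pairwise distances d, keyed e.g. d['hands', 'cosm'].
    Positive values support the corresponding prediction."""
    hand_sim = (d['tools', 'cosm'] + d['tools', 'act']) \
             - (d['hands', 'cosm'] + d['hands', 'act'])
    prosth_sim = 3 * (d['hands', 'cosm'] + d['tools', 'act']) \
               - 2 * (d['hands', 'act'] + d['tools', 'cosm'] + d['act', 'cosm'])
    return hand_sim, prosth_sim

def standardise(index, hands_tools):
    """Residuals of the index after regressing out the Hands<->Tools
    distance (with intercept), across participants."""
    X = np.column_stack([np.ones_like(hands_tools), hands_tools])
    beta, *_ = np.linalg.lstsq(X, index, rcond=None)
    return index - X @ beta
```

Note that the multipliers in each index sum to zero (2 − 2 and 6 − 6), so, like a GLM contrast, each index is insensitive to a uniform offset added to all pairwise distances.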

As mentioned earlier (see ‘Stimuli’), 5 active prosthesis users viewed images of myoelectric prostheses as the active prosthesis condition while all other participants viewed mechanical hooks under that condition. Because this creates a possible bias in our analysis, we attempted to remedy it in two ways. The first was to replace all distances that involved the active prosthesis condition with the mean distances of the rest of the users’ group. In other words, before calculating the indices, we replaced the distances (Tools↔ActP, Hands↔ActP, ActP↔CosmP) for these 5 individuals with the mean distances of the remaining 27 users. The second was to remove these individuals from the analysis altogether.

Analysis by prosthesis type

To test the influence of prosthesis type on users’ prosthesis-similarity index, we repeated the same analysis as described above and compared the hand- and prosthesis-similarity indices between individuals using a cosmetic prosthesis (n = 13) and active prosthesis users (n = 9). The following participants were excluded from this analysis: (1) 5 individuals not using a prosthesis (usage time of 0 = never); (2) the 5 participants, mentioned above, who viewed myoelectric prostheses as the active prosthesis condition.

Own prosthesis representation

In this analysis we aimed to assess each user’s own prosthesis representation, defined by its distance from hands and tools. Our approach was designed to overcome 2 challenges: First, controls do not have an ‘own’ prosthesis; and second, the visual and operational features of different prostheses may vary significantly and need to be accounted for before similarity measures can be averaged across all prosthesis users. Therefore, for each participant, we normalised (divided) the ‘own’ prosthesis distance by the mean distance of a similar-looking/operating prosthesis as found in controls. This produced a measure that reflects the magnitude of the representational shift of the individual’s ‘own’ prosthesis from the average distance of the control participants. A value of one would therefore indicate no difference between controls and the user’s representation of their own prosthesis. For the 7 prosthesis users wearing an active prosthesis that had hand-like visual features (see Table 1 for a full breakdown of prosthesis types), we repeated the analysis twice over, standardising their distances with controls’ active and cosmetic prosthesis distances. This allowed us to account for both visual and operational features.
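A minimal sketch of this normalisation, together with the one-sample t test against 1 described later under ‘Statistical analysis’ (illustrative names; computed here with plain NumPy rather than a statistics package):

```python
import numpy as np

def own_prosthesis_shift(own_dists, control_mean_dist):
    """Each user's own-prosthesis distance (e.g., from hands) divided by the
    mean distance of a similar-looking/operating prosthesis in controls.
    A ratio of 1 indicates no representational shift relative to controls."""
    return np.asarray(own_dists, dtype=float) / control_mean_dist

def one_sample_t_vs_one(ratios):
    """t statistic for H0: mean ratio == 1, since the control means were
    used to construct the individual ratios."""
    r = np.asarray(ratios, dtype=float)
    return (r.mean() - 1.0) / (r.std(ddof=1) / np.sqrt(r.size))
```

Ratios greater than 1 (as reported for users' own prostheses relative to hands) indicate greater dissimilarity than in controls.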

IPS ROI

The IPS ROI was generated using the Juelich Histological Atlas, including all voxels that had more than 30% probability of being within the grey matter of the IPS areas hIP1, hIP2, and hIP3 [52,53] in both hemispheres.

Statistical analysis

All statistical testing was performed using SPSS (version 24), with the exception of the Bayesian analysis, which was run on JASP [105]. Comparisons between prosthesis users and two-handed controls were performed using a two-tailed Student t test. To test the relationship between the indices and individuals’ prosthesis use, a usage score, described above, was correlated with the indices using a two-tailed Pearson correlation. An ANCOVA with prosthesis usage as a covariate was used to test the contribution of several factors such as cause of limb loss and type of prosthesis used. Own prosthesis analyses were performed using a one-sample t test, comparing the mean to one, as the control means were used to calculate the individual indices. To interpret key null results, we ran a Bayesian one-tailed t test, with the Cauchy prior width set at 0.707 (default). We interpreted the test based on the well-accepted criterion of a Bayes factor smaller than 1/3 [106,107] as supporting the null hypothesis. Corrections for multiple comparisons were included for exploratory analysis when resulting in significant differences, as indicated in the results section. To minimise flexible analysis, which could lead to p-hacking [108], only a limited set of factors, prespecified in the initial stages of the analysis or based on the reviewers’ comments, were tested in this manuscript. Further exploratory analysis on other recorded clinical factors can be conducted using the full demographic details: https://osf.io/kd2yh/.

Supporting information

S1 Text. Supplementary results: ‘Hand’ and ‘Tool’ ROI analysis.

Additional analyses conducted on ‘hand’ and ‘tool’ ROIs generated from the meta-analysis tool Neurosynth. ROI, region of interest.

(DOCX)

S1 Table. Control participants’ demographics.

This table was taken from the supplementary material of van den Heiligenberg and colleagues (2018), which used the same control cohort.

(XLSX)

S2 Table. Group comparison of average activity levels.

Group comparison of average activity levels between controls and prosthesis users within the visual body-selective ROI. Results are shown for both the bilateral ROI (500 voxels) and for each hemisphere separately (250 voxels each). ROI, region of interest.

(XLSX)

S3 Table. Confirmatory additional analyses.

A summary table for analyses of the hand-similarity index and prosthesis-cluster index: group comparisons (controls versus prosthesis users) and correlations with prosthesis usage. The table includes the results reported in the paper (bilateral ROI), results within the same ROI using the raw indices without controlling for the hand–tool distance (bilateral ROI, raw), and results for the indices computed in each hemisphere separately. ROI, region of interest.

(XLSX)

S4 Table. PCA of pairwise distances.

To explore which of the pairwise distances contributed to the observed prosthesis categorisation effect, we ran a data-driven analysis (PCA) on the 5 distances of the one-handed group. Values in the table are the weights given to each distance within a component. The first component shows a ‘main effect’ of interindividual differences, in which some individuals have overall larger distances than others across all condition pairs. In our calculated indices, we control for this effect by normalising each individual’s selectivity indices by their Hands ↔ Tools distance (see ‘Methods’). The second component explains almost half of the remaining variance (after accounting for the interindividual differences captured by component 1). In this component, individuals showing greater distances between the active prosthesis and tool conditions also show greater similarity between the active and cosmetic prosthesis conditions. In other words, when the active prosthesis condition moves away from the tool category, it also tends to move closer to the cosmetic prostheses (as seen from the high weights and opposite signs of these 2 distances in the second component). This data-driven analysis provides further support for the hypothesised categorical shift in prosthesis representation. PCA, principal component analysis.

(XLSX)
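The PCA described in the S4 Table caption can be reproduced in outline with scikit-learn. The data below are synthetic; the columns are hypothetical stand-ins for the 5 condition-pair distances, and the sample size is arbitrary.

```python
# Sketch: data-driven PCA over pairwise representational distances.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
# Rows = participants; columns = the 5 condition-pair distances,
# e.g., Hands<->Tools, Hands<->Active, Hands<->Cosmetic, ...
distances = rng.normal(1.0, 0.2, size=(22, 5))

pca = PCA()                         # keep all 5 components
scores = pca.fit_transform(distances)
weights = pca.components_           # per-component weight on each distance
explained = pca.explained_variance_ratio_
```

Inspecting the signs and magnitudes in `weights` row by row is what identifies patterns such as the opposite-signed active–tool and active–cosmetic weights reported in the caption.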

S1 Fig. Group probability maps for visual body-selective ROIs in control participants and prosthesis users.

All individual visual ROIs were superimposed per group, yielding corresponding probability maps. Warmer colours represent voxels that were included in greater numbers of individual ROIs. Data used to create this figure can be found at https://osf.io/4mw2t/. ROI, region of interest.

(TIF)

S2 Fig. Intercategorical shape dissimilarities.

(A) Two exemplars from the ‘hand’ and ‘active prosthesis’ categories. (B) All exemplars shown to each individual participant were submitted to a visual shape similarity analysis (Belongie and colleagues, 2002), in which intercategorical pairwise shape similarity was assessed. (C) A histogram showing the intercategorical similarity of the cosmetic (blue) and active (red) prosthesis exemplars shown to one participant, with respect to hand exemplars (all exemplars are available at https://osf.io/kd2yh/). As demonstrated in this example, these dissimilarity ranges were largely overlapping. (D) This intercategory dissimilarity analysis was repeated for each participant (based on the specific prosthesis exemplars shown to them), and mean histogram values were averaged. As indicated in the resulting matrix, cosmetic and active prostheses did not show strong differences in similarity, on average. This is likely due to the wide range of exemplars/shapes used in the study data set. Data used to create this figure can be found at https://osf.io/4mw2t/.

(TIF)

S3 Fig. Pairwise distances in EBA.

(A) Pairwise distances between patterns of activation for hands, cosmetic prostheses, active prostheses, and tools. In the labels, ‘↔’ indicates the distance between a pair of conditions. Within the plot, x indicates the group’s mean. (B) Same as panel A, but with each distance standardised by the individual’s distance between hands and tools. (C) A table illustrating the direction of the effect predicted by each index. Data used to create this figure can be found at https://osf.io/4mw2t/. EBA, extrastriate body-selective area.

(TIF)

S4 Fig. Neurosynth ‘hand’ and ‘tool’ ROIs.

Using the association maps for the words ‘hand’ (blue) and ‘tools’ (green), ROIs were defined using all significant voxels within the OTC. These ROIs are projected on an inflated brain for visualisation. Surface and volume masks can be found at https://osf.io/4mw2t/. OTC, occipitotemporal cortex; ROI, region of interest.

(TIF)

S5 Fig. IPS analysis.

(A) Univariate activations in prosthesis users. Results of the group-level univariate contrast of (Active Prosthesis + Cosmetic Prosthesis) > Objects show that the IPS is also activated. (B) The IPS region of interest was taken from the Juelich Histological Atlas (30% probability of hIP1, hIP2, and hIP3). (C) Hand (left) and tool (right) distances from users’ ‘own’ prosthesis in the IPS. Individual distances were normalised by the controls’ group mean distance, depending on the visual features of the ‘own’ prosthesis (hand-likeness). A value of 1 indicates a hand/tool distance similar to controls. Users showed significantly greater distances between their own prosthesis and hands (t(25) = 10.11, p < 0.001), contrary to the embodiment hypothesis. A significant increase in the distance of the ‘own’ prosthesis from tools was also observed (t(25) = 4.62, p < 0.001). Data used to create this figure can be found at https://osf.io/4mw2t/. hIP, human intraparietal; IPS, intraparietal sulcus.

(TIF)

Acknowledgments

We thank the authors of our previous manuscript [29] and, in particular, Fiona van den Heiligenberg and Jody Culham for contributions in experimental design, data collection, and helpful advice on data analysis. We thank Opcare for help in participant recruitment; Chris Baker and Stephania Bracci for their comments on the manuscript; and our participants and their families for their ongoing support of our research.

Abbreviations

ANCOVA

analysis of covariance

EBA

extrastriate body-selective area

fMRI

functional MRI

FSL

FMRIB’s Software Library

hIP

human intraparietal

IPS

intraparietal sulcus

OTC

occipitotemporal cortex

PAL

prosthesis activity log

ROI

region of interest

Data Availability

Full study protocol, key materials, and full clinical/demographic details are available from the Open Science Framework at https://osf.io/kd2yh/. Data used in all the reported analyses and figures are available from the Open Science Framework at https://osf.io/4mw2t/. Full raw data, including each participant/ROI fMRI BOLD activity values across voxels/conditions/runs, will be available upon request.

Funding Statement

This work was supported by a Wellcome Trust Senior Research Fellowship (https://wellcome.ac.uk/, grant number: 215575/Z/19/Z), an ERC Starting Grant (https://erc.europa.eu/, grant number: 715022 EmbodiedTech), and a Sir Henry Dale Fellowship jointly funded by the Wellcome Trust (https://wellcome.ac.uk/) and the Royal Society (https://royalsociety.org/, 104128/Z/14/Z), all awarded to T.R.M. R.O.M.M. is supported by the Clarendon scholarship (http://www.ox.ac.uk/clarendon) and University College, Oxford (http://www.univ.ox.ac.uk/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

  • 1.Osborn LE, Dragomir A, Betthauser JL, Hunt CL, Nguyen HH, Kaliki RR, et al. Prosthesis with neuromorphic multilayered e-dermis perceives touch and pain. Sci Robot. 2018;3: eaat3818 10.1126/scirobotics.aat3818 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Soekadar SR, Witkowski M, Gómez C, Opisso E, Medina J, Cortese M, et al. Hybrid EEG/EOG-based brain/neural hand exoskeleton restores fully independent daily living activities after quadriplegia. Sci Robot. 2016;1: eaag3296 10.1126/scirobotics.aag3296 [DOI] [PubMed] [Google Scholar]
  • 3.Penaloza CI, Nishio S. BMI control of a third arm for multitasking. Sci Robot. 2018;3: eaat1228 10.1126/scirobotics.aat1228 [DOI] [PubMed] [Google Scholar]
  • 4.Jang CH, Yang HS, Yang HE, Lee SY, Kwon JW, Yun BD, et al. A Survey on Activities of Daily Living and Occupations of Upper Extremity Amputees. Ann Rehabil Med. 2011;35: 907 10.5535/arm.2011.35.6.907 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Engdahl SM, Christie BP, Kelly B, Davis A, Chestek CA, Gates DH. Surveying the interest of individuals with upper limb loss in novel prosthetic control techniques. J Neuroeng Rehabil. 2015;12: 53 10.1186/s12984-015-0044-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Østlie K, Lesjø IM, Franklin RJ, Garfelt B, Skjeldal OH, Magnus P. Prosthesis rejection in acquired major upper-limb amputees: a population-based survey. Disabil Rehabil Assist Technol. 2012;7: 294–303. 10.3109/17483107.2011.635405 [DOI] [PubMed] [Google Scholar]
  • 7.Makin TR, de Vignemont F, Faisal AA. Neurocognitive barriers to the embodiment of technology. Nat Biomed Eng. 2017;1: 0014 10.1038/s41551-016-0014 [DOI] [Google Scholar]
  • 8.Murray CD. Embodiment and Prosthetics In: Gallagher P, Desmond D, MacLachlan M, editors. Psychoprosthetics: State of the Knowledge. London: Springer London; 2008. pp. 119–129. 10.1007/978-1-84628-980-4_9 [DOI] [Google Scholar]
  • 9.Beckerle P, Kirchner EA, Christ O, Kim S-P, Shokur S, Dumont AS, et al. Feel-Good Robotics: Requirements on Touch for Embodiment in Assistive Robotics. Front Neurorobot. 2018;12: 1–7. 10.3389/fnbot.2018.00084 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Giummarra MJ, Gibson SJ, Georgiou-Karistianis N, Bradshaw JL. Mechanisms underlying embodiment, disembodiment and loss of embodiment. Neuroscience and Biobehavioral Reviews. Pergamon; 2008. pp. 143–160. 10.1016/j.neubiorev.2007.07.001 [DOI] [PubMed] [Google Scholar]
  • 11.Rognini G, Petrini FM, Raspopovic S, Valle G, Granata G, Strauss I, et al. Multisensory bionic limb to achieve prosthesis embodiment and reduce distorted phantom limb perceptions. Journal of Neurology, Neurosurgery and Psychiatry. BMJ Publishing Group Ltd; 2019. pp. 833–836. 10.1136/jnnp-2018-318570 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Hellman RB, Chang E, Tanner J, Helms Tillery SI, Santos VJ. A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss. Front Hum Neurosci. 2015;9: 1–10. 10.3389/fnhum.2015.00001 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Tyler DJ. Neural interfaces for somatosensory feedback: bringing life to a prosthesis. Curr Opin Neurol. 2015;28: 574–581. 10.1097/WCO.0000000000000266 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Pazzaglia M, Molinari M. The embodiment of assistive devices-from wheelchair to exoskeleton. Physics of Life Reviews. Elsevier; 2016. pp. 163–175. 10.1016/j.plrev.2015.11.006 [DOI] [PubMed] [Google Scholar]
  • 15.Longo MR, Sadibolova R, Tamè L. Embodying prostheses–how to let the body welcome assistive devices: Comment on “The embodiment of assistive devices—from wheelchair to exoskeleton” by M. Pazzaglia and M. Molinari. Phys Life Rev. 2016;16: 184–185. 10.1016/j.plrev.2016.01.012 [DOI] [PubMed] [Google Scholar]
  • 16.Marasco PD, Hebert JS, Sensinger JW, Shell CE, Schofield JS, Thumser ZC, et al. Illusory movement perception improves motor control for prosthetic hands. Sci Transl Med. 2018;10: 1–13. 10.1126/scitranslmed.aao6990 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.De Vignemont F. Embodiment, ownership and disownership. Conscious Cogn. 2011;20: 82–93. 10.1016/j.concog.2010.09.004 [DOI] [PubMed] [Google Scholar]
  • 18.Antfolk C, D’Alonzo M, Rosén B, Lundborg G, Sebelius F, Cipriani C. Sensory feedback in upper limb prosthetics. Expert Rev Med Devices. 2013;10: 45–54. 10.1586/erd.12.68 [DOI] [PubMed] [Google Scholar]
  • 19.Tsakiris M. My body in the brain: A neurocognitive model of body-ownership. Neuropsychologia. 2010;48: 703–712. 10.1016/j.neuropsychologia.2009.09.034 [DOI] [PubMed] [Google Scholar]
  • 20.Ehrsson HH, Rosén B, Stockselius A, Ragnö C, Köhler P, Lundborg G. Upper limb amputees can be induced to experience a rubber hand as their own. Brain. 2008;131: 3443–3452. 10.1093/brain/awn297 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Marasco PD, Kim K, Colgate JE, Peshkin MA, Kuiken TA. Robotic touch shifts perception of embodiment to a prosthesis in targeted reinnervation amputees. Brain. 2011;134: 747–58. 10.1093/brain/awq361 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Mulvey MR, Fawkner HJ, Radford HE, Johnson MI. Perceptual Embodiment of Prosthetic Limbs by Transcutaneous Electrical Nerve Stimulation. Neuromodulation Technol Neural Interface. 2012;15: 42–47. 10.1111/j.1525-1403.2011.00408.x [DOI] [PubMed] [Google Scholar]
  • 23.D’Alonzo M, Clemente F, Cipriani C. Vibrotactile Stimulation Promotes Embodiment of an Alien Hand in Amputees With Phantom Sensations. IEEE Trans Neural Syst Rehabil Eng. 2015;23: 450–457. 10.1109/TNSRE.2014.2337952 [DOI] [PubMed] [Google Scholar]
  • 24.Rosén B, Ehrsson HH, Antfolk C, Cipriani C, Sebelius F, Lundborg G. Referral of sensation to an advanced humanoid robotic hand prosthesis. Scand J Plast Reconstr Surg Hand Surg. 2009;43: 260–266. 10.3109/02844310903113107 [DOI] [PubMed] [Google Scholar]
  • 25.Schmalzl L, Kalckert A, Ragnö C, Ehrsson HH. Neural correlates of the rubber hand illusion in amputees: A report of two cases. Neurocase. 2014;20: 407–420. 10.1080/13554794.2013.791861 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Collins KL, Guterstam A, Cronin J, Olson JD, Ehrsson HH, Ojemann JG. Ownership of an artificial limb induced by electrical brain stimulation. Proc Natl Acad Sci U S A. 2017;114: 166–171. 10.1073/pnas.1616305114 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Graczyk EL, Resnik L, Schiefer MA, Schmitt MS, Tyler DJ. Home use of a neural-connected sensory prosthesis provides the functional and psychosocial experience of having a hand again. Sci Rep. 2018;8: 1–17. 10.1038/s41598-017-17765-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Maimon-Mor RO, Obasi E, Lu J, Odeh N, Kirker S, MacSweeney M, et al. Communicative hand gestures as an implicit measure of artificial limb embodiment and daily usage. medRxiv. 2020; 2020.03.11.20033928. 10.1101/2020.03.11.20033928 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Van Den Heiligenberg FMZ, Orlov T, MacDonald SN, Duff EP, Henderson Slater D, Beckmann CF, et al. Artificial limb representation in amputees. Brain. 2018;141: 1422–1433. 10.1093/brain/awy054 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Kriegeskorte N, Mur M, Ruff DA, Kiani R, Bodurka J, Esteky H, et al. Matching Categorical Object Representations in Inferior Temporal Cortex of Man and Monkey. Neuron. 2008;60: 1126–1141. 10.1016/j.neuron.2008.10.043 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Orlov T, Makin TR, Zohary E. Topographic Representation of the Human Body in the Occipitotemporal Cortex. Neuron. 2010;68: 586–600. 10.1016/j.neuron.2010.09.032 [DOI] [PubMed] [Google Scholar]
  • 32.Bracci S, Caramazza A, Peelen M V. Representational Similarity of Body Parts in Human Occipitotemporal Cortex. J Neurosci. 2015;35: 12977–12985. 10.1523/JNEUROSCI.4698-14.2015 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Bracci S, Peelen M V. Body and object effectors: The organization of object representations in high-level visual cortex reflects body-object interactions. J Neurosci. 2013;33: 18247–18258. 10.1523/JNEUROSCI.1322-13.2013 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Tsao DY, Livingstone MS. Mechanisms of Face Perception. Annu Rev Neurosci. 2008;31: 411–437. 10.1146/annurev.neuro.30.051606.094238 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Downing PE, Peelen M V. Body selectivity in occipitotemporal cortex: Causal evidence. Neuropsychologia. 2016;83: 138–148. 10.1016/j.neuropsychologia.2015.05.033 [DOI] [PubMed] [Google Scholar]
  • 36.Astafiev S V, Stanley CM, Shulman GL, Corbetta M. Extrastriate body area in human occipital cortex responds to the performance of motor actions. Nat Neurosci. 2004;7: 542–548. 10.1038/nn1241 [DOI] [PubMed] [Google Scholar]
  • 37.Tal Z, Geva R, Amedi A. The origins of metamodality in visual object area LO: Bodily topographical biases and increased functional connectivity to S1. Neuroimage. 2016;127: 363–375. 10.1016/j.neuroimage.2015.11.058 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Beauchamp MS, LaConte S, Yasar N. Distributed representation of single touches in somatosensory and visual cortex. Hum Brain Mapp. 2009;30: 3163–3171. 10.1002/hbm.20735 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Limanowski J, Lutti A, Blankenburg F. The extrastriate body area is involved in illusory limb ownership. Neuroimage. 2014;86: 514–524. 10.1016/j.neuroimage.2013.10.035 [DOI] [PubMed] [Google Scholar]
  • 40.Limanowski J, Blankenburg F. Integration of visual and proprioceptive limb position information in human posterior parietal, premotor, and extrastriate cortex. J Neurosci. 2016;36: 2582–2589. 10.1523/JNEUROSCI.3987-15.2016 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Gentile G, Guterstam A, Brozzoli C, Henrik Ehrsson H. Disintegration of multisensory signals from the real hand reduces default limb self-attribution: An fMRI study. J Neurosci. 2013;33: 13350–13366. 10.1523/JNEUROSCI.1363-13.2013 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Bracci S, Cavina-Pratesi C, Ietswaart M, Caramazza A, Peelen M V. Closely overlapping responses to tools and hands in left lateral occipitotemporal cortex. J Neurophysiol. 2012;107: 1443–1456. 10.1152/jn.00619.2011 [DOI] [PubMed] [Google Scholar]
  • 43.Reddy L, Kanwisher N. Coding of visual objects in the ventral stream. Curr Opin Neurobiol. 2006;16: 408–414. 10.1016/j.conb.2006.06.004 [DOI] [PubMed] [Google Scholar]
  • 44.Gomez J, Barnett M, Grill-Spector K. Extensive childhood experience with Pokémon suggests eccentricity drives organization of visual cortex. Nat Hum Behav. 2019; 1 10.1038/s41562-019-0592-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Lingnau A, Downing PE. The lateral occipitotemporal cortex in action. Trends Cogn Sci. 2015;19: 268–77. 10.1016/j.tics.2015.03.006 [DOI] [PubMed] [Google Scholar]
  • 46.Peelen M V., Downing PE. The neural basis of visual body perception. Nat Rev Neurosci. 2007;8: 636–648. 10.1038/nrn2195 [DOI] [PubMed] [Google Scholar]
  • 47.Downing PE, Jiang Y, Shuman M, Kanwisher N. A cortical area selective for visual processing of the human body. Science. 2001;293: 2470–2473. 10.1126/science.1063414 [DOI] [PubMed] [Google Scholar]
  • 48.Walther A, Nili H, Ejaz N, Alink A, Kriegeskorte N, Diedrichsen J. Reliability of dissimilarity measures for multi-voxel pattern analysis. Neuroimage. 2016;137: 188–200. 10.1016/j.neuroimage.2015.12.012 [DOI] [PubMed] [Google Scholar]
  • 49.Chao LL, Martin A. Representation of Manipulable Man-Made Objects in the Dorsal Stream. Neuroimage. 2000;12: 478–484. 10.1006/nimg.2000.0635 [DOI] [PubMed] [Google Scholar]
  • 50.Wesselink DB, van den Heiligenberg FM, Ejaz N, Dempsey-Jones H, Cardinali L, Tarall-Jozwiak A, et al. Obtaining and maintaining cortical hand representation as evidenced from acquired and congenital handlessness. Elife. 2019;8: 1–19. 10.7554/eLife.37227 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Yarkoni T, Poldrack RA, Nichols TE, Van Essen DC, Wager TD. Large-scale automated synthesis of human functional neuroimaging data. Nat Methods. 2011;8: 665–670. 10.1038/nmeth.1635 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Choi HJ, Zilles K, Mohlberg H, Schleicher A, Fink GR, Armstrong E, et al. Cytoarchitectonic identification and probabilistic mapping of two distinct areas within the anterior ventral bank of the human intraparietal sulcus. J Comp Neurol. 2006;495: 53–69. 10.1002/cne.20849 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Scheperjans F, Eickhoff SB, Hömke L, Mohlberg H, Hermann K, Amunts K, et al. Probabilistic maps, morphometry, and variability of cytoarchitectonic areas in the human superior parietal cortex. Cereb Cortex. 2008;18: 2141–2157. 10.1093/cercor/bhm241 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Hellman RB, Chang E, Tanner J, Helms Tillery SI, Santos VJ. A Robot Hand Testbed Designed for Enhancing Embodiment and Functional Neurorehabilitation of Body Schema in Subjects with Upper Limb Impairment or Loss. Front Hum Neurosci. 2015;9: 26 10.3389/fnhum.2015.00026 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Valle G, Mazzoni A, Iberite F, D’Anna E, Strauss I, Granata G, et al. Biomimetic Intraneural Sensory Feedback Enhances Sensation Naturalness, Tactile Sensitivity, and Manual Dexterity in a Bidirectional Prosthesis. Neuron. 2018;100: 37–45.e7. 10.1016/j.neuron.2018.08.033 [DOI] [PubMed] [Google Scholar]
  • 56.Makin TR, Holmes NP, Ehrsson HH. On the other hand: Dummy hands and peripersonal space. Behav Brain Res. 2008;191: 1–10. 10.1016/j.bbr.2008.02.041 [DOI] [PubMed] [Google Scholar]
  • 57.Ehrsson HH. Multisensory processes in body ownership In: Sathian K, Ramachandran V S, editors. Multisensory Perception: From Laboratory to Clinic. Elsevier; 2020. pp. 179–200. 10.1016/b978-0-12-812492-5.00008-5 [DOI] [Google Scholar]
  • 58.Macdonald S, van den Heiligenberg F, Makin T, Culham J. Videos are more effective than pictures at localizing tool- and hand-selective activation in fMRI. J Vis. 2017;17: 991 10.1167/17.10.991 [DOI] [Google Scholar]
  • 59.Striem-Amit E, Vannuscorps G, Caramazza A. Sensorimotor-independent development of hands and tools selectivity in the visual cortex. Proc Natl Acad Sci U S A. 2017;114: 4787–4792. 10.1073/pnas.1620289114 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Hahamy A, Macdonald SN, van den Heiligenberg FMZ, Kieliba P, Emir U, Malach R, et al. Representation of Multiple Body Parts in the Missing-Hand Territory of Congenital One-Handers. Curr Biol. 2017;27: 1350–1355. 10.1016/j.cub.2017.03.053 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Graziano MSA, Aflalo TN. Mapping behavioral repertoire onto the cortex. Neuron. 2007. pp. 239–251. 10.1016/j.neuron.2007.09.013 [DOI] [PubMed] [Google Scholar]
  • 62.Seger CA, Miller EK. Category Learning in the Brain. 2010. [cited 19 Dec 2018]. 10.1093/brain/awq077 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.Braunlich K, Liu Z, Seger CA. Occipitotemporal Category Representations Are Sensitive to Abstract Category Boundaries Defined by Generalization Demands. J Neurosci. 2017;37: 7631–7642. 10.1523/JNEUROSCI.3825-16.2017 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Op de Beeck HP, Pillet I, Ritchie JB. Factors Determining Where Category-Selective Areas Emerge in Visual Cortex. Trends in Cognitive Sciences. Elsevier Ltd; 2019. pp. 784–797. 10.1016/j.tics.2019.06.006 [DOI] [PubMed] [Google Scholar]
  • 65.Braunlich K, Love BC. Occipitotemporal representations reflect individual differences in conceptual knowledge. J Exp Psychol Gen. 2018; advance online publication. 10.1037/xge0000501 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Harel A. What is special about expertise? Visual expertise reveals the interactive nature of real-world object recognition. Neuropsychologia. 2016;83: 88–99. 10.1016/j.neuropsychologia.2015.06.004 [DOI] [PubMed] [Google Scholar]
  • 67.Martens F, Bulthé J, van Vliet C, Op de Beeck H. Domain-general and domain-specific neural changes underlying visual expertise. Neuroimage. 2018;169: 80–93. 10.1016/j.neuroimage.2017.12.013 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.McGugin RW, Van Gulick AE, Tamber-Rosenau BJ, Ross DA, Gauthier I. Expertise Effects in Face-Selective Areas are Robust to Clutter and Diverted Attention, but not to Competition. Cereb Cortex. 2015;25: 2610–2622. 10.1093/cercor/bhu060 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Bilalić M, Grottenthaler T, Nägele T, Lindig T. The Faces in Radiological Images: Fusiform Face Area Supports Radiological Expertise. Cereb Cortex. 2016;26: 1004–1014. 10.1093/cercor/bhu272 [DOI] [PubMed] [Google Scholar]
  • 70.Ross DA, Tamber-Rosenau BJ, Palmeri TJ, Zhang J, Xu Y, Gauthier I. High-resolution Functional Magnetic Resonance Imaging Reveals Configural Processing of Cars in Right Anterior Fusiform Face Area of Car Experts. J Cogn Neurosci. 2018;30: 973–984. 10.1162/jocn_a_01256 [DOI] [PubMed] [Google Scholar]
  • 71.Tan DW, Schiefer MA, Keith MW, Anderson JR, Tyler J, Tyler DJ. A neural interface provides long-term stable natural touch perception. Sci Transl Med. 2014;6: 257ra138 10.1126/scitranslmed.3008669 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Philip BA, Frey SH. Compensatory Changes Accompanying Chronic Forced Use of the Nondominant Hand by Unilateral Amputees. J Neurosci. 2014;34: 3622–3631. 10.1523/JNEUROSCI.3770-13.2014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Stoeckel MC, Seitz RJ, Buetefisch CM. Congenitally altered motor experience alters somatotopic organization of human primary motor cortex. Proc Natl Acad Sci U S A. 2009;106: 2395–2400. Available: www.pnas.org/cgi/content/full/ [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.Makin TR, Cramer AO, Scholz J, Hahamy A, Henderson Slater D, Tracey I, et al. Deprivation-related and use-dependent plasticity go hand in hand. Elife. 2013;2: e01273. 10.7554/eLife.01273 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Hahamy A, Sotiropoulos SN, Henderson Slater D, Malach R, Johansen-Berg H, Makin TR. Normalisation of brain connectivity through compensatory behaviour, despite congenital hand absence. Elife. 2015;4 10.7554/eLife.04605 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Striem-Amit E, Vannuscorps G, Caramazza A. Plasticity based on compensatory effector use in the association but not primary sensorimotor cortex of people born without hands. Proc Natl Acad Sci U S A. 2018;115: 7801–7806. 10.1073/pnas.1803926115 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Yu XJ, He HJ, Zhang QW, Zhao F, Zee CS, Zhang SZ, et al. Somatotopic reorganization of hand representation in bilateral arm amputees with or without special foot movement skill. Brain Res. 2014;1546: 9–17. 10.1016/j.brainres.2013.12.025 [DOI] [PubMed] [Google Scholar]
  • 78.Mahon BZ, Caramazza A. What drives the organization of object knowledge in the brain? The distributed domain-specific hypothesis. Trends Cogn Sci. 2011;15: 97–103. 10.1016/j.tics.2011.01.004 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Heimler B, Striem-Amit E, Amedi A. Origins of task-specific sensory-independent organization in the visual and auditory brain: neuroscience evidence, open questions and clinical implications. Curr Opin Neurobiol. 2015;35: 169–177. 10.1016/j.conb.2015.09.001 [DOI] [PubMed] [Google Scholar]
  • 80.Hahamy A, Makin TR. Remapping in cerebral and cerebellar cortices is not restricted by somatotopy. J Neurosci. 2019; 2599–18. 10.1523/jneurosci.2599-18.2019 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Peelen M V, Bracci S, Lu X, He C, Caramazza A, Bi Y. Tool Selectivity in Left Occipitotemporal Cortex Develops without Vision. J Cogn Neurosci. 2013. [cited 23 Jan 2019]. 10.1162/jocn_a_00411 [DOI] [PubMed] [Google Scholar]
  • 82.He C, Peelen M V., Han Z, Lin N, Caramazza A, Bi Y. Selectivity for large nonmanipulable objects in scene-selective visual cortex does not require visual experience. Neuroimage. 2013;79: 1–9. 10.1016/j.neuroimage.2013.04.051 [DOI] [PubMed] [Google Scholar]
  • 83.Striem-Amit E, Dakwar O, Reich L, Amedi A. The Large-Scale Organization of “Visual” Streams Emerges Without Visual Experience. Cereb Cortex. 2012;22: 1698–1709. 10.1093/cercor/bhr253 [DOI] [PubMed] [Google Scholar]
  • 84.Kitada R, Yoshihara K, Sasaki AT, Hashiguchi M, Kochiyama T, Sadato N. The brain network underlying the recognition of hand gestures in the blind: The supramodal role of the extrastriate body area. J Neurosci. 2014;34: 10096–10108. 10.1523/JNEUROSCI.0500-14.2014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 85.Striem-Amit E, Amedi A. Visual Cortex Extrastriate Body-Selective Area Activation in Congenitally Blind People “Seeing” by Using Sounds. Curr Biol. 2014;24: 687–692. 10.1016/j.cub.2014.02.010 [DOI] [PubMed] [Google Scholar]
  • 86.Bensmaia SJ, Miller LE. Restoring sensorimotor function through intracortical interfaces: progress and looming challenges. Nat Rev Neurosci. 2014;15: 313–325. 10.1038/nrn3724 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 87.de Oliveira Barata S, Clode D, Taylor J, Elias H. Vine Arm. In: The Alternative Limb Project [Internet]. 2017. Available from: http://www.thealternativelimbproject.com/project/vine/
  • 88.van den Heiligenberg FMZ, Yeung N, Brugger P, Culham JC, Makin TR. Adaptable Categorization of Hands and Tools in Prosthesis Users. Psychol Sci. 2017;28: 395–398. 10.1177/0956797616685869 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Uswatte G, Taub E, Morris D, Light K, Thompson PA. The Motor Activity Log-28: assessing daily use of the hemiparetic arm after stroke. Neurology. 2006;67: 1189–94. 10.1212/01.wnl.0000238164.90657.c2 [DOI] [PubMed] [Google Scholar]
  • 90.Saxe R, Jamal N, Powell L. My body or yours? The effect of visual perspective on cortical body representations. Cereb Cortex. 2006;16: 178–182. 10.1093/cercor/bhi095 [DOI] [PubMed] [Google Scholar]
  • 91.Myers A, Sowden PT. Your hand or mine? The extrastriate body area. Neuroimage. 2008;42: 1669–1677. 10.1016/j.neuroimage.2008.05.045 [DOI] [PubMed] [Google Scholar]
  • 92.Belongie S, Malik J, Puzicha J. Shape Matching and Object Recognition Using Shape Contexts. IEEE Trans Pattern Anal Mach Intell. 2002;24: 509–522. 10.1109/34.993558 [DOI] [Google Scholar]
  • 93.Uğurbil K, Xu J, Auerbach EJ, Moeller S, Vu AT, Duarte-Carvajalino JM, et al. Pushing spatial and temporal resolution for functional and diffusion MRI in the Human Connectome Project. Neuroimage. 2013;80: 80–104. 10.1016/j.neuroimage.2013.05.012 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Greve DN, Fischl B. Accurate and robust brain image alignment using boundary-based registration. Neuroimage. 2009;48: 63–72. 10.1016/j.neuroimage.2009.06.060 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95.Jenkinson M, Smith S. A global optimisation method for robust affine registration of brain images. Med Image Anal. 2001;5: 143–156. 10.1016/s1361-8415(01)00036-6 [DOI] [PubMed] [Google Scholar]
  • 96.Jenkinson M, Bannister P, Brady M, Smith S. Improved Optimization for the Robust and Accurate Linear Registration and Motion Correction of Brain Images. Neuroimage. 2002;17: 825–841. 10.1016/s1053-8119(02)91132-8 [DOI] [PubMed] [Google Scholar]
  • 97.Andersson JLR, Jenkinson M, Smith S. Non-linear registration aka Spatial normalisation FMRIB Technial Report TR07JA2. 2007. [Google Scholar]
  • 98.Andersson JLR, Jenkinson M, Smith S. Non-linear optimisation FMRIB Technial Report TR07JA1. 2007. [Google Scholar]
  • 99.Smith SM. Fast robust automated brain extraction. Hum Brain Mapp. 2002;17: 143–155. 10.1002/hbm.10062 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100.Woolrich MW, Ripley BD, Brady M, Smith SM. Temporal autocorrelation in univariate linear modeling of FMRI data. Neuroimage. 2001;14: 1370–86. 10.1006/nimg.2001.0931 [DOI] [PubMed] [Google Scholar]
  • 101.Desikan RS, Ségonne F, Fischl B, Quinn BT, Dickerson BC, Blacker D, et al. An automated labeling system for subdividing the human cerebral cortex on MRI scans into gyral based regions of interest. Neuroimage. 2006;31: 968–980. 10.1016/j.neuroimage.2006.01.021 [DOI] [PubMed] [Google Scholar]
  • 102.Diedrichsen J, Kriegeskorte N. Representational models: A common framework for understanding encoding, pattern-component, and representational-similarity analysis. Cichy R, editor. PLoS Comput Biol. 2017;13: e1005508. 10.1371/journal.pcbi.1005508 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Nili H, Wingfield C, Walther A, Su L, Marslen-Wilson W, Kriegeskorte N. A Toolbox for Representational Similarity Analysis. PLoS Comput Biol. 2014;10: e1003553. 10.1371/journal.pcbi.1003553 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 104.Wesselink DB, Maimon-Mor RO. RSA toolbox extension for FSL. 2018. [Google Scholar]
  • 105.JASP Team. JASP. 2019.
  • 106.Wetzels R, Matzke D, Lee MD, Rouder JN, Iverson GJ, Wagenmakers EJ. Statistical evidence in experimental psychology: An empirical comparison using 855 t tests. Perspect Psychol Sci. 2011;6: 291–298. 10.1177/1745691611406923 [DOI] [PubMed] [Google Scholar]
  • 107.Dienes Z. Using Bayes to get the most out of non-significant results. Front Psychol. 2014;5: 1–17. 10.3389/fpsyg.2014.00001 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 108.Makin TR, Orban de Xivry JJ. Ten common statistical mistakes to watch out for when writing or reviewing a manuscript. Elife. 2019;8: 1–13. 10.7554/eLife.48175 [DOI] [PMC free article] [PubMed] [Google Scholar]

Decision Letter 0

Gabriel Gasque

15 Jul 2019

Dear Dr Makin,

Thank you for submitting your manuscript entitled "Is an artificial limb embodied as a hand? Brain decoding in prosthetic limb users" for consideration as a Research Article by PLOS Biology.

Your manuscript has now been evaluated by the PLOS Biology editorial staff, as well as by an Academic Editor with relevant expertise, and I am writing to let you know that we would like to send your submission out for external peer review.

Please note, however, that we would like to pursue your manuscript as a Short Report, not as a Full Research Article. Please select this option when resubmitting.

In addition, before we can send your manuscript to reviewers, we need you to complete your submission by providing the metadata that is required for full assessment. To this end, please login to Editorial Manager where you will find the paper in the 'Submissions Needing Revisions' folder on your homepage. Please click 'Revise Submission' from the Action Links and complete all additional questions in the submission questionnaire.

**Important**: Please also see below for further information regarding completing the MDAR reporting checklist. The checklist can be accessed here: https://plos.io/MDARChecklist

Please re-submit your manuscript and the checklist within two working days, i.e., by Jul 17 2019 11:59 PM.

Login to Editorial Manager here: https://www.editorialmanager.com/pbiology

During resubmission, you will be invited to opt-in to posting your pre-review manuscript as a bioRxiv preprint. Visit http://journals.plos.org/plosbiology/s/preprints for full details. If you consent to posting your current manuscript as a preprint, please upload a single Preprint PDF when you re-submit.

Once your full submission is complete, your paper will undergo a series of checks in preparation for peer review. Once your manuscript has passed all checks it will be sent out for review.

Feel free to email us at plosbiology@plos.org if you have any queries relating to your submission.

Kind regards,

Gabriel Gasque, Ph.D.,

Senior Editor

PLOS Biology

==================

INFORMATION REGARDING THE REPORTING CHECKLIST:

PLOS Biology is pleased to support the "minimum reporting standards in the life sciences" initiative (https://osf.io/preprints/metaarxiv/9sm4x/). This effort brings together a number of leading journals and reproducibility experts to develop minimum expectations for reporting information about Materials (including data and code), Design, Analysis and Reporting (MDAR) in published papers. We believe broad alignment on these standards will be to the benefit of authors, reviewers, journals and the wider research community and will help drive better practice in publishing reproducible research.

We are therefore participating in a community pilot involving a small number of life science journals to test the MDAR checklist. The checklist is intended to help authors, reviewers and editors adopt and implement the minimum reporting framework.

IMPORTANT: We have chosen your manuscript to participate in this trial. The relevant documents can be located here:

MDAR reporting checklist (to be filled in by you): https://plos.io/MDARChecklist

**We strongly encourage you to complete the MDAR reporting checklist and return it to us with your full submission, as described above. We would also be very grateful if you could complete this author survey:

https://forms.gle/seEgCrDtM6GLKFGQA

Additional background information:

Interpreting the MDAR Framework: https://plos.io/MDARFramework

Please note that your completed checklist and survey will be shared with the minimum reporting standards working group. However, the working group will not be provided with access to the manuscript or any other confidential information including author identities, manuscript titles or abstracts. Feedback from this process will be used to consider next steps, which might include revisions to the content of the checklist. Data and materials from this initial trial will be publicly shared in September 2019. Data will only be provided in aggregate form and will not be parsed by individual article or by journal, so as to respect the confidentiality of responses.

Please treat the checklist and elaboration as confidential as public release is planned for September 2019.

We would be grateful for any feedback you may have.

Decision Letter 1

Gabriel Gasque

28 Aug 2019

Dear Tamar,

Thank you very much for submitting your manuscript "Is an artificial limb embodied as a hand? Brain decoding in prosthetic limb users" for consideration as a Short Report at PLOS Biology. Your manuscript has been evaluated by the PLOS Biology editors, by an Academic Editor with relevant expertise, and by four independent reviewers. Please accept my apologies for the delay in sending the decision below to you.

In light of the reviews (below), we will not be able to accept the current version of the manuscript, but we would welcome resubmission of a much-revised version that takes into account the reviewers' comments. We cannot make any decision about publication until we have seen the revised manuscript and your response to the reviewers' comments. Your revised manuscript is also likely to be sent for further evaluation by the reviewers.

Your revisions should address the specific points made by each reviewer. Having discussed these comments with the Academic Editor, we think you should pay very special attention to the common concern about using visual areas and passive assessments. For example, is there evidence of similar findings in motor areas? If you choose to send in a revision, this point should be clearly and fully addressed.

Please submit a file detailing your responses to the editorial requests and a point-by-point response to all of the reviewers' comments that indicates the changes you have made to the manuscript. In addition to a clean copy of the manuscript, please upload a 'track-changes' version of your manuscript that specifies the edits made. This should be uploaded as a "Related" file type. You should also cite any additional relevant literature that has been published since the original submission and mention any additional citations in your response.

Please note that, while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

Before you revise your manuscript, please review the following PLOS policy and formatting requirements checklist PDF: http://journals.plos.org/plosbiology/s/file?id=9411/plos-biology-formatting-checklist.pdf. It is helpful if you format your revision according to our requirements - should your paper subsequently be accepted, this will save time at the acceptance stage.

Please note that as a condition of publication PLOS' data policy (http://journals.plos.org/plosbiology/s/data-availability) requires that you make available all data used to draw the conclusions arrived at in your manuscript. If you have not already done so, you must include any data used in your manuscript either in appropriate repositories, within the body of the manuscript, or as supporting information (N.B. this includes any numerical values that were used to generate graphs, histograms etc.). For an example see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5.

For manuscripts submitted on or after 1st July 2019, we require the original, uncropped and minimally adjusted images supporting all blot and gel results reported in an article's figures or Supporting Information files. We will require these files before a manuscript can be accepted so please prepare them now, if you have not already uploaded them. Please carefully read our guidelines for how to prepare and upload this data: https://journals.plos.org/plosbiology/s/figures#loc-blot-and-gel-reporting-requirements.

Upon resubmission, the editors will assess your revision and if the editors and Academic Editor feel that the revised manuscript remains appropriate for the journal, we will send the manuscript for re-review. We aim to consult the same Academic Editor and reviewers for revised manuscripts but may consult others if needed.

We expect to receive your revised manuscript within two months. Please email us (plosbiology@plos.org) to discuss this if you have any questions or concerns, or would like to request an extension. At this stage, your manuscript remains formally under active consideration at our journal; please notify us by email if you do not wish to submit a revision and instead wish to pursue publication elsewhere, so that we may end consideration of the manuscript at PLOS Biology.

When you are ready to submit a revised version of your manuscript, please go to https://www.editorialmanager.com/pbiology/ and log in as an Author. Click the link labelled 'Submissions Needing Revision' where you will find your submission record.

Thank you again for your submission to our journal. We hope that our editorial process has been constructive thus far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.

Sincerely,

Gabriel Gasque, Ph.D.,

Senior Editor

PLOS Biology

*****************************************************

Reviewer remarks:

Reviewer #1: The authors present a study on the fMRI activation in the occipitotemporal cortex (OTC) related to different prosthesis types. They found that prosthesis use led to OTC fMRI activations more similar to those of tools than to hands, and that greater daily-life prosthesis usage correlated with greater prosthesis (tool) categorisation, documenting use-dependent plasticity within the OTC. Overall, the manuscript is well written, the methods clear and the results interesting, but there are a few issues that need to be addressed.

- It is stated that “Two hypotheses were tested: the embodiment hypothesis, assessing whether prosthetic limbs are in fact represented as hands (and not tools); and, the categorization hypothesis, assessing whether a new ‘prosthesis’ category has formed.” It is unclear what predictions these hypotheses make about the presented outcome measures (representational similarity as measured as OTC fMRI activation). Please clarify in the introduction.

- "In other words, prosthesis users do not, on average, ‘embody’ prostheses as hands more than controls." The lack of a statistically significant difference in the hand-similarity index in prosthesis users relative to controls is not a good argument in favor of the equality of this measure, i.e., an argument against the embodiment hypothesis. Please provide an equivalence test to support this statement, or at least a power analysis.

- What was the rationale for choosing to examine BOLD activity in the extrastriate region of the OTC? What about motor cortex representations? Were any other regions examined? What were the findings?

- The lack of a linear relationship between prosthesis use and the hand similarity index is contrasted against the strong linear relationship between prosthesis use and the prosthesis similarity index. The latter relationship doesn't need to be driven by the separation of prosthesis from hand representations, because the former relationship would have reflected this too. This implies that the strong linear relationship between ‘prosthesis use’ and ‘prosthesis similarity index’ is driven by the distance between prosthesis representations. Please disentangle the distance between prosthesis representations from the distance between other prosthesis and hand representations.

- "Therefore, our results show that neural embodiment does not predict successful adoption of wearable technology by the human brain." The authors found that prosthesis use correlates negatively with the prosthesis similarity index (“the degree to which the prostheses representation moved away from their natural categories (hands for cosmetic prostheses and tools for active prostheses) and towards one-another”). Doesn't this imply a negative correlation between neural embodiment (representational separation from hands) and adoption of the technology (prosthesis use)?

- "Successful prosthesis usage was predicted not by embodiment (hand-similarity), but by a more distinct categorical representation of artificial limbs." What do you mean by successful prosthesis use? How did you measure prosthesis success rate? Unclear, please reword or explain.

Reviewer #2: In this study Maimon-Mor and Makin use multivariate pattern analysis of BOLD signals in the occipitotemporal cortex (OTC) to compare the neural representations of visually presented hands and prostheses in prosthesis users and controls. The main result is that prostheses are more clearly represented as their own category in prosthesis users compared to controls. In other words, the OTC can better distinguish between hands and prostheses in the group that uses prostheses. This is an interesting observation that is well worth reporting. The paper is well written, the statistical analyses have been conducted appropriately, and the main result is very clear. The study advances our understanding of the OTC, use-dependent plasticity, and the neural representation of artificial limbs in higher-order visual cortex. However, I think the authors overstate the importance of the present finding with respect to theories of embodiment and the development of advanced prostheses. Other weaknesses are that not all conclusions in the discussion are supported by the data, and the manuscript includes some unnecessary speculations. Further, the finding that a new 'prosthesis' category is formed in the OTC is not entirely novel, as we know from earlier studies that expertise can lead to changes in the neural representations of categories in the OTC. In sum, this is an interesting, high-quality study but, in my opinion, the conceptual advance with respect to our understanding of artificial limb embodiment is somewhat limited. This article might therefore be better suited to a more specialized journal.

Major points.

1. In the discussion, I think the authors oversell their results as evidence against the embodiment hypothesis. The authors do discuss some limitations of their methods and results, but I still think they go too far in their conclusions. The present study does not investigate fronto-parietal-cerebellar regions during actual prosthesis use, so the embodiment theory has not been directly tested, in my opinion. To me it seems reasonable that the OTC in expert prosthesis users should be better at differentiating between prosthetic limbs and real hands compared to controls. But does this really matter for how visual information about the prosthesis is processed by parietal, frontal and cerebellar regions during prosthesis use? Take hand-held tools, for example. Clearly the OTC can differentiate body parts from tools, but tools can be embodied in the sense that some of the same sensorimotor mechanisms used to control real limbs are re-used for effective control of hand-held tools. So, my point is simply that I think the authors should have a more balanced discussion with respect to the significance of the present results for theories of prosthesis embodiment.

2. The authors do discuss some limitations of the study with respect to the embodiment question, most notably the use of 2-D pictures of hands rather than real prostheses and the focus on a single visual area rather than a network of sensorimotor regions. This is all very good, but to me a further limitation is that the authors used hands and prostheses depicted from different visual orientations, and, most problematically, mixed the first-person and third-person perspectives. Does this not cause a problem for the interpretation of the results? Hands (or prostheses) of strangers viewed from a third-person perspective should not be embodied but simply treated as external visual objects. How could this mixing of perspectives have influenced the current results?

3. I think the authors need to work more on the last paragraph of the discussion. I see two problems with this “conclusion paragraph”. The first is that the authors suddenly introduce a new, third interpretation of their data (“appendage represented as a body part”). This seems out of place here in the conclusion paragraph. If this is an important alternative interpretation, which I think it is, then it should be discussed more extensively earlier in the discussion. Also, the authors should more explicitly state how this “appendage interpretation” relates to the two hypotheses outlined in the introduction and how it fits with the results more precisely. But if this idea is merely a speculation, it might be better to drop it altogether. The second problem I had with the “conclusion paragraph” is that the authors argue against using the rubber hand illusion as a tool for prosthesis embodiment research. I do not follow. The present study did not investigate the rubber hand illusion or the sense of ownership of prosthetic limbs, so to me this part reads like an unnecessary speculative overstatement. In sum, I recommend a shorter conclusion paragraph without speculations or the introduction of new concepts and interpretations.

Minor points

4. In my opinion, the statement “In other words, prosthesis users do not, on average, ‘embody’ prostheses as hands more than controls.” is an overstatement. This conclusion is based on a non-significant result from a frequentist statistical test. In other words, all you can say is that you did not find significant evidence against the null hypothesis. But this does not mean that you can conclude that the null hypothesis is true. Please tone down this statement or present results from a Bayesian statistical analysis that support the null hypothesis. (This comment refers to a sentence in the results paragraph with the heading “Prosthesis-like (categorical), and not hand-like (embodied) representation of prostheses in prosthesis users”.)

5. In the methods section (under “Stimuli”), please add a motivation for mixing the presentations of the pictures of hands and prostheses from the first- and third-person perspectives (see also point 2 above). Please also clarify which perspective was used for the own-hand stimuli. Finally, add an explanation for why the shoe condition was not included in the analysis.

6. It would have been interesting to have ratings regarding how much the amputees embodied their prosthesis in everyday life. I assume that, according to the embodiment hypothesis, this should have some relationship to the neural representation of prostheses. If I am not misinformed, most prosthesis users describe their prosthesis more like a tool than an actual hand.

7. In the second discussion paragraph you mention the sensorimotor-related regions in the frontal and parietal cortices, but give no references. Please add a couple of references here.

8. The manuscript lacks page numbers.

Reviewer #3: The authors present an interesting study aiming at characterising how the visual cortex, and more specifically the extrastriate body area, reacts to images of prostheses, tools and hands in prosthesis users and controls. The topic and objective of the paper, understanding how the brain embodies technologies, here a prosthetic hand, is not only conceptually important but also has major implications for rehabilitation, since technological development is not sufficient for successful device adoption and usage. Indeed, to better understand how amputees adopt their prosthesis, one needs to measure embodiment objectively. The authors decided to use an implicit measure of embodiment based on recording the activity of regions in the visual system known to show specific responses to tools and body parts like hands. This study is an important extension of a previously published paper showing that individuals who used their prosthesis more in daily life also showed greater activity in the OTC's hand-selective visual areas when viewing images of prostheses (van den Heiligenberg, Brain, 2018). The main claim of the paper is straightforward: in contrast to what is predicted by an embodied view of brain (re)organisation in amputees, the authors show that the upper-limb prosthesis is represented in the occipital cortex differently than hands. The paper is clearly written, and the study was elegantly designed and expertly executed. The authors should be praised for relying on fMRI data from 32 individuals missing a hand, a sample size larger than typically reported in the literature. Here follow some suggestions that may help improve the study.

The authors set out to explore two alternative hypotheses (embodiment vs categorisation) by mainly contrasting brain activity evoked by prostheses versus hands. However, an important third hypothesis is not fully tested, in my view: that prostheses are represented not like hands but like tools. This third hypothesis presumes that prosthesis representations will anchor on an existing category (tools), which is different from what the authors suggest, namely the creation of a new categorical distinction due to new conceptual (visual? see below) knowledge.

As discussed by the authors, the main finding (as summarised above) could well be explained by visual expertise. Alternatively, since it has been shown that body-selective regions also link to the parietal cortex to represent body and tool actions, one may think that the organisation of this brain region also depends on the use of the prosthesis. At the moment the paper does not disentangle these two alternatives, which would tell us important information about the mechanisms underlying brain reorganisation in amputees and people born without a hand. Would it be possible to find a strategy to disentangle these two alternatives by further analysing the brain representation of one's own prosthesis versus another prosthesis with a similar design and function? In this vein, it might also be interesting to scan the brains of people involved in the rehabilitative training of amputees, since one can assume similar visual expertise without the motor component of use. I also think that this topic (visual vs use-dependent expertise) should be put upfront in the introduction.

The authors focused their study on body-selective regions. Why not conduct similar analyses on more specific ROIs selective to hands or tools? I understand that body-selective regions could be defined independently (univariate analyses) from the conditions of interest undergoing multivariate analyses, which may not be the case for hands/tools. I am, however, wondering whether the authors could use a split-half strategy (or any type of cross-validation technique), or rely on hand- and tool-related ROIs from the literature, to achieve this goal.

In a previous study, the authors found that individuals who used their prosthesis more in daily life also showed greater activity in the OTC's hand-selective visual areas when viewing images of prostheses (van den Heiligenberg, Brain, 2018). Here, they find that when comparing activity levels between controls and prosthesis users within this region, no significant differences were found for any of the image conditions of hands, tools and prostheses. How do these two observations, made on the same dataset, fit together?

A group comparison of the hand-similarity index between controls and prosthesis users showed no significant differences. In addition, users showed significantly greater distances between their own prosthesis and hands. Both results are thought to contrast with the embodiment hypothesis, but the link between these two results is unclear to me.

The prosthesis categorisation effect was found in both congenital and acquired one-handers, suggesting no "sensitive period" for this effect to emerge. However, the analyses relating to the comparison of these two groups could be further developed in the manuscript.

The analyses were implemented a priori, focusing on prosthesis and hand representations in the occipital cortex. Do the authors have speculations about what is happening in other brain regions, like the sensorimotor cortex, which is thought to also activate in relation to action-related objects? Is it possible that a different representational structure supports the use of an artificial limb in those regions?

The study included additional conditions, like dismorphed images, the lower limb, and participants' own shoes (for those who were shown a picture of their own prosthesis). What was the rationale for not analysing these conditions, which could provide important additional insights?

Distance measurements between brain patterns were computed using cross-validated Mahalanobis distances. Is this what is referred to as the linear discriminant contrast (Walther 2016)?

The main hypotheses were tested by a series of group comparisons using t-tests. How were they corrected for multiple comparisons?

Reviewer #4: The manuscript is well written and describes an interesting set of data that challenges a widely-accepted viewpoint in the field, which is that upper limb prostheses are embodied as natural hands. The manuscript offers a new view, which is that prostheses are neurally represented as distinct from both hands and other tools. The manuscript also reports on prosthesis usage data and correlates prosthesis usage to neural representations. The manuscript discusses an important and timely topic, and has implications both for rehabilitation and human-machine interfacing.

Major Points:

1. My primary concern with the study is the underlying assumption that neural activity in visual areas of the brain is an appropriate indicator of embodiment. The study attempts to extend findings about neural responses in the occipitotemporal cortex (OTC) as evidence that prostheses are not embodied as a hand. However, to my knowledge, embodiment is generally discussed in the literature in the context of the body schema, which is a preconscious sensorimotor representation of the body. I would expect then, that neural responses in the sensory and motor cortices would be more appropriate to measure embodiment-related neural representations of prostheses, tools, and hands. Why did the authors choose to focus on OTC rather than any sensorimotor brain regions? I am not convinced that neural responses in visual brain areas are sufficient indicators of embodiment, and thus am not convinced that the conclusions are supported by the data. Please comment.

2. In addition, given that sensorimotor representations of the body involve active control of a limb or tool and dynamic sensory feedback from the limb or tool, why was the experimental task designed to study passive visual experience rather than control or somatosensory feedback of the presented objects? I am not convinced that passively viewing an image of a prosthesis is a sufficient indication of the neural representation of the prosthesis during actual prosthesis usage. Please comment.

3. I believe that the perspective of the body-related images may have impacted the definition of the ROI and the measured neural responses to the hands and tools. One does not typically view one’s own body from a third-person perspective, but the study included both first- and third-person perspective images. What was the rationale for including third-person images when the goal was to assess embodiment, which relates to the sense of self? For the images shown in the third-person, I am concerned that the neural responses recorded might be related to viewing other people’s bodies rather than one’s own body. Were there any differences in neural representations between images in the first- and third-person perspective? This seems to me to be a critical point. Furthermore, in Figure 1, I cannot distinguish which of the example images are in the first-person perspective because they are all floating in a solid background disconnected from bodies. Are the authors certain that the images were interpreted by the participants as being in the first person? Please comment.

4. The survey results regarding prosthesis usage are interesting and valuable. However, I believe that grouping these two metrics, frequency of wear and PAL, may not be appropriate. Users of cosmetic prostheses, for example, may wear their prostheses very frequently, but may not use them to perform any tasks. Although the authors note that the PAL has been validated, they do not present any indication that this combined prosthesis usage score has been validated. Could the authors please provide information about the validation of this combined usage metric? I would also be interested to see the analysis comparing hand-similarity index to prosthesis usage broken down for the two components of the usage metric: i.e. one plot of hand-similarity index to wear frequency and one plot of hand-similarity index to PAL score. Did the authors attempt to decouple the two usage metrics?

5. I found the discussion about visual expertise to be very interesting in relation to the conclusions of this study. It seems to me that the emergence of prosthesis-specific visual expertise in prosthesis users is well supported by the results, and perhaps a better interpretation of the data than a lack of embodiment. The authors discuss prior literature on car experts, who show increased activity in object-selective areas of the OTC compared to non-experts. In this context, the prosthesis users could be considered prosthesis experts, since they have more experience with prostheses than control participants (perhaps except in rare cases). If car-specific regions of the OTC emerge through visual learning, it is reasonable that prosthesis-specific regions could also emerge through visual learning. If there were a ‘prosthesis-specific area’ of the OTC, then it could follow that prosthesis experts would show greater activation of this area than non-experts. This prosthesis-specific region of the OTC could be considered to be the non-hand, non-tool region the authors observed in their fMRI data. It would also make sense that prosthesis usage would correlate with the emergence of the prosthesis-specific region in the OTC, since visual learning would be promoted by increased visual exposure to prostheses (as the prosthesis is used more over time). However, I do not think that this visual learning interpretation implies that prostheses are not embodied, because visual learning could occur independently from sensorimotor learning, and to my understanding, sensorimotor learning is intricately related to embodiment.

Minor Points:

1. The manuscript needs to be revised throughout for typos.

2. I am concerned about the way in which the ROI was defined based on the presentation of headless-bodies versus everyday objects. Why were the bodies headless? Could the unnaturalness or dehumanizing nature of the images of headless-bodies impact arousal or emotion-related centers of the brain?

3. For viewing images of prostheses, was the handedness of the prosthesis images matched to the handedness of the prosthesis user’s limb loss? For example, for a person with an amputation of the right arm, were the observed images of right-handed prostheses? Could the authors please comment on how handedness may have impacted the results?

4. In the introduction, the authors suggest that prosthetic limbs are the “most established form of wearable technology to date.” While I agree that prosthetic limbs are an important category of wearable technology, many other wearable technologies, such as hearing aids, glasses, and watches, are much more widely adopted than prostheses.

5. In the Discussion, the authors note that the categorization hypothesis is “well validated in neuroscience” but do not present any citations to demonstrate prior instances of this categorization hypothesis in the literature. Please provide citations for this prior work.

Decision Letter 2

Gabriel Gasque

4 Feb 2020

Dear Tamar,

Thank you very much for submitting a revised version of your manuscript "Is an artificial limb embodied as a hand? Brain decoding in prosthetic limb users" for consideration as a Short Report at PLOS Biology. This revised version of your manuscript has been evaluated by the PLOS Biology editors, and by the original Academic Editor and reviewers. You will note that reviewer 4, Emily Graczyk, has signed her comments.

In light of the reviews (below), we are positive about your paper and pleased to offer you the opportunity to address the remaining points from the reviewers in a revised version that we anticipate should not take you very long. We will then assess your revised manuscript and your response to the reviewers' comments and we may consult the reviewers again.

We expect to receive your revised manuscript within 1 month.

Please email us (plosbiology@plos.org) if you have any questions or concerns, or would like to request an extension. At this stage, your manuscript remains formally under active consideration at our journal; please notify us by email if you do not intend to submit a revision so that we may end consideration of the manuscript at PLOS Biology.

**IMPORTANT - SUBMITTING YOUR REVISION**

Your revisions should address the specific points made by each reviewer. You will see that while most of these lingering concerns/comments require only textual modifications (more discussion, some clarifications), some need more analyses/data, particularly reviewer 1’s point about Fig S3, reviewer 2’s point 3.1, and reviewer 4’s points 2 and 4. Having discussed these comments with the Academic Editor, we think you should provide what the reviewers have requested, with exception of reviewer 2's point 3.1, which will be your choice.

Because this is a Short Report, please keep the final number of main figures below four.

Please submit the following files along with your revised manuscript:

1. A 'Response to Reviewers' file - this should detail your responses to the editorial requests, present a point-by-point response to all of the reviewers' comments, and indicate the changes made to the manuscript.

*NOTE: In your point by point response to the reviewers, please provide the full context of each review. Do not selectively quote paragraphs or sentences to reply to. The entire set of reviewer comments should be present in full and each specific point should be responded to individually.

You should also cite any additional relevant literature that has been published since the original submission and mention any additional citations in your response.

2. In addition to a clean copy of the manuscript, please also upload a 'track-changes' version of your manuscript that specifies the edits made. This should be uploaded as a "Related" file type.

*Resubmission Checklist*

When you are ready to resubmit your revised manuscript, please refer to this resubmission checklist: https://plos.io/Biology_Checklist

To submit a revised version of your manuscript, please go to https://www.editorialmanager.com/pbiology/ and log in as an Author. Click the link labelled 'Submissions Needing Revision' where you will find your submission record.

Please make sure to read the following important policies and guidelines while preparing your revision:

*Published Peer Review*

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out. Please see here for more details:

https://blogs.plos.org/plos/2019/05/plos-journals-now-open-for-published-peer-review/

*PLOS Data Policy*

Please note that as a condition of publication PLOS' data policy (http://journals.plos.org/plosbiology/s/data-availability) requires that you make available all data used to draw the conclusions arrived at in your manuscript. If you have not already done so, you must include any data used in your manuscript either in appropriate repositories, within the body of the manuscript, or as supporting information (N.B. this includes any numerical values that were used to generate graphs, histograms etc.). For an example see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5

*Blot and Gel Data Policy*

We require the original, uncropped and minimally adjusted images supporting all blot and gel results reported in an article's figures or Supporting Information files. We will require these files before a manuscript can be accepted so please prepare them now, if you have not already uploaded them. Please carefully read our guidelines for how to prepare and upload this data: https://journals.plos.org/plosbiology/s/figures#loc-blot-and-gel-reporting-requirements

*Protocols deposition*

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosbiology/s/submission-guidelines#loc-materials-and-methods

Thank you again for your submission to our journal. We hope that our editorial process has been constructive thus far, and we welcome your feedback at any time. Please don't hesitate to contact us if you have any questions or comments.

Sincerely,

Gabriel Gasque, Ph.D.,

Senior Editor

PLOS Biology

*****************************************************

REVIEWS:

Reviewer #1: The authors have provided extensive replies and significantly expanded on the original manuscript in the first round of corrections. They have convincingly made the claim that their "categorization hypothesis" (that prostheses become their own semantic category) provides a better explanation than the traditional embodiment hypothesis (that prostheses are represented more like hands). At the same time, the selected paradigm does not support complete falsification of the embodiment hypothesis. fMRI activation patterns in the OTC during passive viewing of prostheses may not be the optimal measure to make claims about the embodied representation that would invariably involve widespread brain areas including the motor and somatosensory cortices during actual use of a prosthesis. Therefore, conclusions about the explanatory power of the embodiment hypothesis should be toned down.

A few more comments:

Currently, cosmetic and active prostheses are lumped together in most of the outcome measures. It could be the case that the embodiment hypothesis holds true only for one of them.

Supplementary Figure S3 is greatly appreciated to better understand the individual pairwise distances that make up the complex prosthesis similarity index which underlies the core argument of the manuscript - that the categorization hypothesis provides a better explanation of brain activity patterns than the embodiment hypothesis. Please provide statistical hypothesis tests for each of these displayed pairwise distances, and present this information in the main text. It is quite confusing to interpret the prosthesis similarity index without understanding the distances it is composed of. As the prosthesis similarity index is a key outcome measure, this is of paramount importance.

If the prosthesis similarity index is meant to capture the "degree to which the prostheses representation moved away from their natural categories (hands for cosmetic prostheses and tools for active prostheses) and towards one another", why does it include the pairwise distances hands <-> active prostheses and tools <-> cosmetic prostheses? These do not appear to be directly relevant under the above definition, and they confuse the reader's interpretation of the measure.

Reviewer #2: Excellent revision. The manuscript has been much improved and it will now make a very valuable contribution to the literature. The results are fascinating. I just have a few minor comments.

1. Introduction, page 2. The term "visual embodiment" is new to me. Can you define the term and/or give a reference the first time you use it on page 2?

2. Introduction, page 2. The phrase "visual embodiment has been suggested to be the gateway for the sense of bodily ownership" is jargon. Please provide a better explanation for why visual information is important for embodiment and body perception. From a multisensory perspective, vision is important for localizing and identifying the limbs in space because the signal is spatially more precise compared with other senses (under good viewing conditions). I recommend you cite studies from the multisensory literature.

3. In their response letter, the authors write that some authors have argued that the rubber hand is not embodied at a sensorimotor level during the rubber hand illusion, but rather at the visual level. I disagree with this view and argue that most researchers working on the rubber hand illusion probably describe this phenomenon as a multisensory body illusion rather than a visual illusion. The somatosensory (tactile-proprioceptive) aspects of the illusion are very prominent and the illusion works well without visual feedback of a limb (Ehrsson et al. 2005 J Neurosci; Guterstam et al. 2013 J Cog Neurosci).

4. Page 3 and discussion. Most importantly, when the authors discuss OTC and EBA it is probably relevant to mention that this region responds to the rubber hand illusion, over and above tactile stimulation or the presentation of a visual hand image. The two best citations are probably Limanowski and Blankenburg (2014) Neuroimage (see also Limanowski and Blankenburg 2016 J Neurosci), who used an EBA localizer, and Gentile et al. (2013) J Neurosci, who showed a correlation between LOC activity and the strength of the ownership illusion and also found significant changes in functional connectivity between LOC and IPS (stronger PPI during hand ownership). This literature does not only suggest that OTC/EBA responds to tactile stimulation and has cross-modal properties, but, actually, that this region plays an important role in the key multisensory integration mechanisms related to body ownership/embodiment. I think this point is relevant for the current study.

5. Page 14, first sentence, second paragraph. What is the difference between "neural visual embodiment" and "visual embodiment"? You have not registered "visual embodiment" behaviorally in this study, so we have no data about a possible relationship between visual embodiment and prosthesis use from a behavioral perspective. Please clarify this section.

6. Page 14, fourth paragraph, "other sensorimotor-related regions in the parietal and frontal regions (53)". I recommend that the authors cite a couple of relevant original articles here (e.g. Gentile et al 2013), and a more up-to-date review, for example Ehrsson (2020).

Ehrsson HH. Multisensory processes in body ownership. In: Sathian K, Ramachandran VS, eds. Multisensory Perception: From Laboratory to Clinic. Academic Press: Elsevier; 2020:179-200.

http://www.ehrssonlab.se/pdfs/Ehrsson%20HH%202020%20Multisensory%20processes%20in%20body%20ownership.pdf

Reviewer #3: The authors have done a meticulous job addressing my comments. I think a few minor issues may require further considerations.

The number below relates to my original comments.

3.1. I believe it would be better to formalise the 3rd possibility (that prostheses anchor more into the representation of tools in prosthesis users) from the start, e.g., in the intro. As mentioned by the authors, this hypothesis is 'implicitly' announced and tested, and I see that the result is confusing, as other reviewers mention that prostheses map more like tools while the authors in their response suggest that this is not statistically the case. Also, I think it makes sense to consider this possibility seriously, since one of the main hypotheses for the development of tool-selective regions relates to tool use and intrinsic connectivity between occipito-temporal and parietal regions (e.g., the work of Stefania Bracci). I however let the authors decide if they think this proposed strategy is best suited to present their work.

3.2. I understand that the authors do not want to include these exploratory analyses in the manuscript. It would, however, be nice for the reader to become familiarised, in the discussion, with the idea that the visual expertise hypothesis, which is one potential reason behind the results of the current study, might be formally tested, for instance by involving people highly engaged in the rehabilitative training of amputees.

3.4. I am not sure the point was fully taken. What I was intending to question was the conceptual relationship of the current paper in the context of the Brain paper.

In the abstract of the Brain paper, one can read: "We show that the more one-handers use an artificial limb (prosthesis) in their everyday life, the stronger visual hand-selective areas in the lateral occipitotemporal cortex respond to prosthesis images." or, in the summary section, "our findings show that neurocognitive resources devoted to representing our body can support representation of artificial body parts". This paper therefore suggests that images of prostheses anchor into the visual hand-selective region in prosthesis users. At first sight, this seems at odds with the conclusion of the current study showing that "prosthesis users represented their own prosthesis more dissimilarly to hands, compared to controls". I think that some text unifying the two studies may help more naive readers on the subject to create a coherent core of knowledge in this domain.

Reviewer #4, Emily Graczyk: I have reviewed a previous version of the manuscript. I believe the current version is much improved and I thank the authors for their efforts to thoroughly address the points raised by reviewers. I have a few remaining comments that I would like the authors to consider.

Major points:

1. I thank the authors for clarifying that the focus of the article is on visual embodiment rather than sensorimotor embodiment. I think this clarification alleviates most of my previous major concerns. The authors' explanation of why visual embodiment was more appropriate to test in this study rather than sensorimotor embodiment was also helpful.

As another argument for the relevance of visual embodiment in one-handers, the authors note the over-reliance on visual guidance during normal prosthesis use. Could the authors comment on how this over-reliance on visual feedback when using prostheses could have influenced the results? Is it possible that the over-reliance on visual information could be a factor contributing to the differences in cortical representations of prostheses for prosthesis users compared to controls?

2. On page 8, the authors state "This indicates that using a prosthesis alters one's visual representation of different prostheses types into a distinct category, in between hands and tools…" It appears that the authors assume that the prosthesis category must be in between hands and tools along the hand-tool axis. In a multidimensional space, however, it's possible for the prosthesis category to be off of the hand-tool axis in a higher dimension. The calculations of the hand-similarity index and the prosthesis-similarity index do not appear to necessitate a single axis along which all three categories (hand, tool, prosthesis) must lie. I believe this implicit assumption of a single axis could influence the interpretation of the study by readers and should be removed. As a way to address this point, it would be interesting to see an analysis breaking down the hand similarity index and the prosthesis similarity index into their component distance contributors. For example, how much of the differences between controls and prosthesis users in the prosthesis similarity index are due to the hand-cosmetic distance increasing vs the tool-active prosthesis distance increasing vs the cosmetic-active distance decreasing? Based on the way the prosthesis similarity index is calculated, any one of these distance changes could result in a higher prosthesis similarity index without all three distances necessarily changing. A discussion about the contributions of the inter-condition distances to the similarity indices and the extent to which the prosthesis category is on or off the hand-tool axis would be helpful.

3. The authors discuss the present findings in relation to two of their prior studies that appear to present contradictory or unexpected conclusions. The authors should provide a bit more discussion about how to rectify or interpret these different findings. First, the authors cite a prior study in the introduction showing that greater prosthesis use in daily life correlated with greater activity in OTC. However, the first paragraph of the results indicates that overall activity levels in the selected ROI were not different between prosthesis users and controls. This seems to contradict the previous results cited in the introduction and warrants further discussion. In addition, what are the implications of greater overall activation of OTC with increased prosthesis use on the inter-group comparisons in this study? Second, the authors discuss a prior study showing that cognitive and phenomenological embodiment correlates with prosthesis usage. While this does not contradict the present findings that neural embodiment does not correlate with prosthesis usage, it does suggest that cognitive embodiment is not a consequence of neural embodiment in the visual cortex. Instead, it suggests that cognitive embodiment must arise from neural representations in some other brain area. A bit more discussion about these points would be helpful in interpreting the present work in the context of prior literature.

4. I believe an analysis comparing the data from congenital and acquired one-handers is needed to support the authors' claim that the relationships do not depend on "visual mimicry of the human hand and cause of limb loss" (page 17). It appears that the data is presented separately for congenital and acquired one-handers in Figures 3 and 4, but there do not appear to be any statistical analyses presented that explicitly compare the two groups of one-handers. Were there any differences between congenital and acquired one-handers in hand similarity index or prosthesis similarity index? Were the relationships between prosthesis similarity index and prosthesis usage the same for congenital and acquired amputees?

5. I think the manuscript was improved by the additional analyses focused on ROIs outside of EBA, and I thank the authors for their efforts on this. However, I think the interpretations stated in the section "Prosthesis representation beyond EBA" are stated too strongly. The only significant difference shown in IPS between prosthesis users and controls was for the "own prosthesis" condition. The hand and prosthesis similarity indices did not differ between controls and prosthesis users in the IPS. While I do not think this point weakens the overall study, I think it should be made clear that prosthesis categorization may be weaker in IPS. I think this is further support for future investigations into neural representations of embodiment in other brain areas, as the authors suggest in the discussion.

Minor points:

1. I thank the authors for clarifying why the task was designed to be passive. However, I have a few remaining questions about the images viewed by participants while in the scanner. The authors state that they used generic hand images for the task. Please clarify if the hands were matched in size, gender, and/or skin color to the participants' own hands. I think this detail needs to be explicitly stated in the methods or supplement, so that readers can understand the point the authors make in the discussion regarding the limitations of "generic hand images that are not the participants' own hands".

2. I thank the authors for the additional information about the combined prosthesis usage measure and the additional analyses demonstrating the relationships between the similarity indices and the PAL and wear time individually. In the previous revision, I asked the authors to provide evidence that the combined prosthesis usage measure had been validated. By validation of the metric, I mean is there data demonstrating internal consistency, minimal detectable change, and test-retest reliability of this metric? If this data is available, please indicate where it can be found. If not, please indicate this point in the methods.

3. In the abstract, the sentence beginning with "Moreover, prosthesis users represented their own prosthesis more dissimilarly to hands, compared to controls, challenging…" should be reorganized for clarity. It is difficult to distinguish what exactly was compared.

4. Page 12 - the acronym RSA is introduced and not defined.

5. Where was the prosthesis observation log (POL) score for control participants (shown in Table S1) used in this study? I do not think Table S1 should be removed, but it is not clear whether POL was used in any prosthesis usage analyses.

6. In the results section titled "Prosthesis representation doesn't depend on the users' prosthesis type", other key factors of prosthesis use were not considered when analyzing prosthesis categorization: level of injury (transradial vs transhumeral), duration of time since limb loss, congenital vs acquired, and duration of time using primary prosthesis. Given the potential role of visual learning on the results, the duration of time since starting to use their primary prosthesis may be especially important to investigate in the future.

7. Page 10 - "A group comparison between users of the two prosthesis types revealed no significant effect." It is not clear what metric is being compared between users of cosmetic and active prostheses in this section.

Decision Letter 3

Gabriel Gasque

7 Apr 2020

Dear Dr Makin,

Thank you for submitting your revised Short Report entitled "Is an artificial limb embodied as a hand? Brain decoding in prosthetic limb users" for publication in PLOS Biology. I have now obtained advice from original reviewers 1 and 4 and have discussed their comments with the Academic Editor. You will note that reviewer 1, Surjo Soekadar, has signed his comments.

I'm delighted to let you know that we're now editorially satisfied with your manuscript. However, before we can formally accept your paper and consider it "in press", we also need to ensure that your article conforms to our guidelines. A member of our team will be in touch shortly with a set of requests. As we can't proceed until these requirements are met, your swift response will help prevent delays to publication. Please also make sure to address the data and other policy-related requests noted at the end of this email.

*Copyediting*

Upon acceptance of your article, your final files will be copyedited and typeset into the final PDF. While you will have an opportunity to review these files as proofs, PLOS will only permit corrections to spelling or significant scientific errors. Therefore, please take this final revision time to assess and make any remaining major changes to your manuscript.

NOTE: If Supporting Information files are included with your article, note that these are not copyedited and will be published as they are submitted. Please ensure that these files are legible and of high quality (at least 300 dpi) in an easily accessible file format. For this reason, please be aware that any references listed in an SI file will not be indexed. For more information, see our Supporting Information guidelines:

https://journals.plos.org/plosbiology/s/supporting-information

*Published Peer Review History*

Please note that you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out. Please see here for more details:

https://blogs.plos.org/plos/2019/05/plos-journals-now-open-for-published-peer-review/

*Early Version*

Please note that an uncorrected proof of your manuscript will be published online ahead of the final version, unless you opted out when submitting your manuscript. If, for any reason, you do not want an earlier version of your manuscript published online, uncheck the box. Should you, your institution's press office or the journal office choose to press release your paper, you will automatically be opted out of early publication. We ask that you notify us as soon as possible if you or your institution is planning to press release the article.

*Protocols deposition*

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosbiology/s/submission-guidelines#loc-materials-and-methods

*Submitting Your Revision*

To submit your revision, please go to https://www.editorialmanager.com/pbiology/ and log in as an Author. Click the link labelled 'Submissions Needing Revision' to find your submission record. Your revised submission must include a cover letter, a Response to Reviewers file that provides a detailed response to the reviewers' comments (if applicable), and a track-changes file indicating any changes that you have made to the manuscript.

Please do not hesitate to contact me should you have any questions.

Sincerely,

Gabriel Gasque, Ph.D.,

Senior Editor

PLOS Biology

------------------------------------------------------------------------

ETHICS STATEMENT:

-- Please indicate if the informed consent was written. If not, please explain why.

------------------------------------------------------------------------

DATA POLICY:

You may be aware of the PLOS Data Policy, which requires that all data be made available without restriction: http://journals.plos.org/plosbiology/s/data-availability. For more information, please also see this editorial: http://dx.doi.org/10.1371/journal.pbio.1001797

Note that we do not require all raw data. Rather, we ask for all individual quantitative observations that underlie the data summarized in the figures and results of your paper. For an example see here: http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1001908#s5

These data can be made available in one of the following forms:

1) Supplementary files (e.g., excel). Please ensure that all data files are uploaded as 'Supporting Information' and are invariably referred to (in the manuscript, figure legends, and the Description field when uploading your files) using the following format verbatim: S1 Data, S2 Data, etc. Multiple panels of a single or even several figures can be included as multiple sheets in one excel file that is saved using exactly the following convention: S1_Data.xlsx (using an underscore).

2) Deposition in a publicly available repository. Please also provide the accession code or a reviewer link so that we may view your data before publication.

Regardless of the method selected, please ensure that you provide the individual numerical values that underlie the summary data displayed in the following figure: Figure 2A, 3ABCEF, S1, S2BCD, S3AB, and S5AB.

NOTE: the numerical data provided should include all replicates AND the way in which the plotted mean and errors were derived (it should not present only the mean/average values).

Please also ensure that figure legends in your manuscript include information on where the underlying data can be found and ensure your supplemental data file/s has a legend.

Please ensure that your Data Statement in the submission system accurately describes where your data can be found.

------------------------------------------------------------------------

Reviewer remarks:

Reviewer #1, Surjo R. Soekadar: Makin et al. present the third revision of their manuscript on prosthesis embodiment analyzed in terms of multidimensional fMRI activation patterns in response to visual depictions of prostheses, hands, and tools. The remaining concerns of all reviewers have been sufficiently addressed. The core outcome measure presented by the authors consisted of a complex distance metric measuring the degree to which prosthesis representations moved away from those of hands and tools, and towards each other (a distinct new category) in prosthesis users. Many comments requested that the pairwise distances making up this complex distance metric should be further explained, which the authors have now satisfactorily done. The manuscript is therefore now suitable for publication.

Reviewer #4: I thank the authors for thoroughly addressing all of my comments. I appreciate the addition of Supplementary Table 4 showing the results of a PCA examining the contributions of pairwise-distances on the prosthesis-similarity index. I have no further concerns and recommend the manuscript for publication.

Decision Letter 4

Gabriel Gasque

20 May 2020

Dear Dr Makin,

On behalf of my colleagues and the Academic Editor, Karunesh Ganguly, I am pleased to inform you that we will be delighted to publish your Short Report in PLOS Biology.

The files will now enter our production system. You will receive a copyedited version of the manuscript, along with your figures for a final review. You will be given two business days to review and approve the copyedit. Then, within a week, you will receive a PDF proof of your typeset article. You will have two days to review the PDF and make any final corrections. If there is a chance that you'll be unavailable during the copy editing/proof review period, please provide us with contact details of one of the other authors whom you nominate to handle these stages on your behalf. This will ensure that any requested corrections reach the production department in time for publication.

Early Version

The version of your manuscript submitted at the copyedit stage will be posted online ahead of the final proof version, unless you have already opted out of the process. The date of the early version will be your article's publication date. The final article will be published to the same URL, and all versions of the paper will be accessible to readers.

PRESS

We frequently collaborate with press offices. If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximise its impact. If the press office is planning to promote your findings, we would be grateful if they could coordinate with biologypress@plos.org. If you have not yet opted out of the early version process, we ask that you notify us immediately of any press plans so that we may do so on your behalf.

We also ask that you take this opportunity to read our Embargo Policy regarding the discussion, promotion and media coverage of work that is yet to be published by PLOS. As your manuscript is not yet published, it is bound by the conditions of our Embargo Policy. Please be aware that this policy is in place both to ensure that any press coverage of your article is fully substantiated and to provide a direct link between such coverage and the published work. For full details of our Embargo Policy, please visit http://www.plos.org/about/media-inquiries/embargo-policy/.

Thank you again for submitting your manuscript to PLOS Biology and for your support of Open Access publishing. Please do not hesitate to contact me if I can provide any assistance during the production process.

Kind regards,

Vita Usova

Publication Assistant,

PLOS Biology

on behalf of

Gabriel Gasque,

Senior Editor

PLOS Biology

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Text. Supplementary results: ‘Hand’ and ‘Tool’ ROI analysis.

    Additional analyses conducted on ‘hand’ and ‘tool’ ROIs generated from the meta-analysis tool Neurosynth. ROI, region of interest.

    (DOCX)

    S1 Table. Control participants’ demographics.

    This table was taken from the supplementary material of van den Heiligenberg and colleagues, 2018, which used the same control cohort.

    (XLSX)

    S2 Table. Group comparison of average activity levels.

    Group comparison of average activity levels between controls and prosthesis users within the visual body-selective ROI. Results are shown for both the bilateral ROI (500 voxels) and for each hemisphere separately (250 voxels each). ROI, region of interest.

    (XLSX)

    S3 Table. Confirmatory additional analyses.

    A summary table of analyses of the hand-similarity index and the prosthesis-cluster index: group comparisons (controls versus prosthesis users) and correlations with prosthesis usage. Included are the results reported in the paper (bilateral ROI), results within the same ROI using the raw indices without controlling for the hand–tool distance (bilateral ROI, raw), and results for each hemisphere separately. ROI, region of interest.

    (XLSX)

    S4 Table. PCA of pairwise distances.

    To explore which of the pairwise distances contributed to the observed effect of prosthesis categorisation, we ran a data-driven analysis (PCA) on the 5 distances of the one-handed group. Values in the table are the weights given to each distance within a component. The first component shows a ‘main effect’ of interindividual differences across participants, in which some individuals have overall larger distances than others across all condition pairs. In our calculated indices, we control for this effect by normalising each individual’s selectivity indices by their Hands ↔ Tools distance (see ‘Methods’). The second component explains almost half of the remaining variance (after accounting for the interindividual differences in component 1). In this component, individuals showing greater distances between the active prosthesis and the tool condition also show greater similarity between the active prosthesis and the cosmetic prosthesis conditions. In other words, when the active prosthesis condition moves away from the tool category, it also tends to move closer to the cosmetic prostheses (as can be seen from the high weights and opposite signs of these 2 distances in the second component). This data-driven analysis provides further support for the hypothesised categorical shift of prosthesis representation. PCA, principal component analysis.

    (XLSX)
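    The normalisation and PCA described in the S4 Table caption can be sketched as follows. This is a minimal illustrative example only: the variable names, the random distance values, and the use of plain NumPy SVD for PCA are all assumptions of the sketch, not the authors’ code; the real distance data are available on OSF.

    ```python
    import numpy as np

    # Hypothetical distance matrix: rows = participants, columns = the 5
    # pairwise distances (e.g. Hands<->ActiveProsthesis, Hands<->CosmeticProsthesis,
    # ActiveProsthesis<->Tools, CosmeticProsthesis<->Tools,
    # ActiveProsthesis<->CosmeticProsthesis). Values here are random placeholders.
    rng = np.random.default_rng(0)
    n_participants, n_distances = 26, 5
    dists = rng.uniform(0.5, 2.0, size=(n_participants, n_distances))

    # Normalisation used for the reported indices: divide each participant's
    # distances by their own Hands<->Tools distance (a separate hypothetical column),
    # removing the 'main effect' of overall interindividual distance differences.
    hands_tools = rng.uniform(1.0, 2.5, size=(n_participants, 1))
    dists_norm = dists / hands_tools

    # PCA via SVD on the mean-centred distance matrix.
    centred = dists - dists.mean(axis=0)
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    explained_var = S**2 / np.sum(S**2)  # proportion of variance per component
    loadings = Vt                        # rows = components, cols = distances

    print(explained_var)
    print(loadings[0])  # weights of each distance in the first component
    ```

    The component loadings (rows of `Vt`) play the role of the weights reported in S4 Table; with the real data, the signs and magnitudes of these weights are what support the interpretation given in the caption.
    
    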

    S1 Fig. Group probability maps for visual body-selective ROIs in control participants and prosthesis users.

    All individual visual ROIs were superimposed per group, yielding corresponding probability maps. Warmer colours represent voxels that were included in greater numbers of individual ROIs. Data used to create this figure can be found at https://osf.io/4mw2t/. ROI, region of interest.

    (TIF)

    S2 Fig. Intercategorical shape dissimilarities.

    (A) Two exemplars from the ‘hand’ and ‘active prosthesis’ categories. (B) All exemplars shown to each individual participant were submitted to a visual shape similarity analysis (Belongie and colleagues, 2002), in which intercategorical pairwise shape similarity was assessed. (C) A histogram showing intercategorical similarity from one participant’s shown cosmetic (blue) and active (red) prosthesis exemplars, with respect to hand exemplars (all exemplars are available on https://osf.io/kd2yh/). As demonstrated in this example, these dissimilarity ranges were largely overlapping. (D) This intercategory dissimilarity analysis was repeated for each of the participants (based on the specific prostheses exemplars shown to them), and mean histogram values were averaged. As indicated in the resulting matrix, cosmetic and active prostheses did not show strong differences in similarities, on average. This is likely due to the wide range of exemplars/shapes used in the study data set. Data used to create this figure can be found at https://osf.io/4mw2t/.

    (TIF)

    S3 Fig. Pairwise distances in EBA.

    (A) Pairwise distances between patterns of activations of hands, cosmetic prostheses, active prostheses, and tools. In the labels, ‘↔’ indicates the distance between a pair of conditions. Within the plot, x indicates the group’s mean. (B) Same as panel A, but with each distance standardised by the individual’s distance between hands and tools. (C) A table illustrating the direction of the effect predicted by each index. Data used to create this figure can be found at https://osf.io/4mw2t/. EBA, extrastriate body-selective area.

    (TIF)

    S4 Fig. Neurosynth ‘hand’ and ‘tool’ ROIs.

    Using the association maps for the words ‘hand’ (blue) and ‘tools’ (green), ROIs were defined using all significant voxels within the OTC. These ROIs are projected onto an inflated brain for visualisation. Surface and volume masks can be found at https://osf.io/4mw2t/. OTC, occipitotemporal cortex; ROI, region of interest.

    (TIF)

    S5 Fig. IPS analysis.

    (A) Univariate activations in prosthesis users. The group-level univariate contrast (Active Prosthesis + Cosmetic Prosthesis) > Objects shows that the IPS is also activated. (B) The IPS region of interest was taken from the Juelich Histological Atlas (30% probability of hIP1, hIP2, and hIP3). (C) Hand (left) and tool (right) distances from users’ ‘own’ prosthesis in the IPS. Individual distances were normalised by the controls’ group mean distance, depending on the visual features of the ‘own’ prosthesis (hand-likeness). A value of 1 indicates a hand/tool distance similar to controls. Users showed significantly greater distances between their own prosthesis and hands (t(25) = 10.11, p < 0.001), contrary to the embodiment hypothesis. A significant increase in the distance of the ‘own’ prosthesis from tools was also observed (t(25) = 4.62, p < 0.001). Data used to create this figure can be found at https://osf.io/4mw2t/. hIP, human intraparietal; IPS, intraparietal sulcus.

    (TIF)

    Attachment

    Submitted filename: Rebuttal_drat3.docx

    Attachment

    Submitted filename: Rebuttal_rev2_submitted.docx

    Attachment

    Submitted filename: ResponseToReviewers.docx

    Data Availability Statement

    Full study protocol, key materials, and full clinical/demographic details are available from the Open Science Framework at https://osf.io/kd2yh/. Data used in all the reported analyses and figures are available from the Open Science Framework at https://osf.io/4mw2t/. Full raw data, including each participant’s fMRI BOLD activity values across voxels/conditions/runs for each ROI, will be available upon request.

