Abstract
To examine the neural signatures of language co-activation and control during bilingual spoken word comprehension, Korean-English bilinguals and English monolinguals were asked to make overt or covert semantic relatedness judgments on auditorily presented English word pairs. In two critical conditions, participants heard word pairs consisting of an English-Korean interlingual homophone (e.g., the sound /mu:n/ means “moon” in English and “door” in Korean) as the prime and an English word as the target. In the homophone-related condition, the target (e.g., “lock”) was related to the homophone’s Korean meaning, but not related to the homophone’s English meaning. In the homophone-unrelated condition, the target was unrelated to either the homophone’s Korean meaning or the homophone’s English meaning. In overtly responded situations, ERP results revealed that the size of the N400 effect for homophone-related word pairs in bilinguals correlated positively with the amount of their daily exposure to Korean. In covertly responded situations, ERP results showed a reduced late positive component for homophone-related word pairs in the right hemisphere, and this late positive effect was related to the neural efficiency of suppressing interference in a non-linguistic task. Together, these findings suggest 1) that the degree of language co-activation in bilingual spoken word comprehension is modulated by the amount of daily exposure to the non-target language; and 2) that bilinguals who are less influenced by cross-language activation may also have greater efficiency in suppressing interference in a non-linguistic task.
Keywords: language co-activation, inhibitory control, cross-language competition, ERPs, N400, LPC
1. Introduction
Bilinguals’ two languages have been shown to be simultaneously active during listening, reading, and speaking, even when only one language is explicitly required (e.g., Colomé & Miozzo, 2010; Marian & Spivey, 2003; Schwartz & Kroll, 2006; Thierry & Wu, 2007). This parallel activation has been shown to facilitate lexical access (e.g., Hoshino & Kroll, 2008; Van Hell & Dijkstra, 2002), as well as to interfere with language processing in bilingual comprehension (e.g., Dijkstra, Timmermans, & Schriefers, 2000; Lagrou, Hartsuiker, & Duyck, 2011). To date, research has shown that when bilinguals process visual words, they experience language co-activation and use inhibitory control to resolve competition from the non-target language (Durlik, Szewczyk, Muszyński, & Wodniecka, 2016; Macizo, Bajo, & Martín, 2010; Martín, Macizo, & Bajo 2010). However, it is less clear to what extent bilinguals draw on these same processing mechanisms during spoken word comprehension. Here, we present evidence for the neural signatures of language co-activation and control in bilingual spoken word comprehension.
1.1. Language co-activation in bilingual visual and spoken word comprehension
Interlingual homographs and homophones, words that share their written or spoken forms, respectively, but not their meanings across languages, are often used to investigate language co-activation in bilingual visual and spoken word comprehension. In bilingual visual word comprehension studies, lexical decision times for homographs have been found to differ from those for non-homographs, suggesting that bilinguals activate lexical-semantic information in both languages non-selectively (e.g., De Groot, Delmaar, & Lupker, 2000; Dijkstra, Van Jaarsveld, & Ten Brinke, 1998). In bilingual spoken word comprehension, homophones have been predicted to affect language co-activation differently than homographs. This prediction was based on the reasoning that spoken words, unlike written words, carry phonemic and subphonemic information that might produce a stronger cue to language membership and thus limit cross-linguistic phonological activation in bilinguals (Ju & Luce, 2004). However, when using homophones in an auditory lexical decision task, Lagrou et al. (2011) found that Dutch-English bilinguals showed delayed response times to homophones compared to non-homophones, regardless of the target language of the task. This result parallels effects found with homographs in bilingual visual word comprehension and suggests that lexical-semantic co-activation caused by interlingual homophones is similar to co-activation caused by homographs.
It is possible that the orthographic and typological similarity of the two languages (Dutch and English) used in Lagrou et al. (2011) boosted language co-activation in their study. Previous studies have suggested that orthographic information is activated during bilingual spoken word comprehension (Mishra & Singh, 2014; Veivo & Järvikivi, 2013). It is therefore possible that the observed lexical co-activation may have been partly driven by the orthographic overlap between homophones in the two languages. The current study used interlingual homophones between Korean and English, two languages that do not share a script, to examine whether lexical-semantic co-activation still occurs when orthographic overlap is completely eliminated.
1.2. Inhibitory control in bilingual language processing
To investigate the involvement of inhibitory control in bilingual visual word comprehension, Macizo and colleagues developed a negative priming paradigm in which Spanish-English bilinguals were instructed to make semantic relatedness judgments on English word pairs. In two similar behavioral studies (Macizo, Bajo, & Martín, 2010; Martín, Macizo, & Bajo, 2010), Spanish-English bilinguals showed slower response times to homograph word pairs (e.g., pie-toe, “pie” meaning “foot” in Spanish) when compared to unrelated word pairs (e.g., log-toe). This result indicated that bilinguals co-activated the Spanish meaning of “pie,” which interfered with their semantic judgment in English. More importantly, bilinguals were also slower when judging translation equivalent word pairs (e.g., foot-finger, where “foot” is the English translation of the Spanish word “pie”) following homograph word pairs (e.g., pie-toe) as compared to those following unrelated word pairs (e.g., log-toe). The authors interpreted the prolonged response time on the second word pair as reflecting a recovery process in which participants suppressed the homograph’s Spanish meaning (e.g., “foot” for “pie”) during the presentation of the first word pair (e.g., “pie-toe”) and subsequently had to overcome the inhibition in order to judge the semantic relatedness of the second word pair (e.g., “foot-finger”). These findings are compatible with the Inhibitory Control model (IC model, Green, 1998) and the Bilingual Interactive Activation model (BIA, Dijkstra & Van Heuven, 1998), although the specific loci of the inhibitory processes might differ based on the two models. Because the current study did not aim to determine the specific locus of inhibitory control, the term inhibitory control throughout this paper refers to a domain-general inhibitory control process.
Other studies in bilingual visual word comprehension have provided further evidence of the involvement of inhibitory control and have also revealed the time course and scope of inhibitory control. In a follow-up study to Macizo et al. (2010), Martín et al. (2010) manipulated the intervals between homograph word pairs (e.g., pie-toe) and translation equivalent word pairs (e.g., foot-hand) and found that the inhibitory control effect disappeared between 500 ms and 750 ms. However, a recent study using a similar design suggests that inhibitory processes in bilingual visual word comprehension could last more than 1000 ms, at least when bilinguals have limited proficiency in their L2 (Durlik et al., 2016). This same study also suggests that inhibition may extend beyond the non-target language translation equivalent to the whole semantic category to which the homograph’s meaning in the non-target language belongs. More specifically, Durlik et al. (2016) found that the presentation of word pairs such as “pie-toe” not only slowed down the subsequent processing of “foot” but also delayed the following processing of “hand.” These findings of inhibitory processes in bilingual visual word recognition seem to resemble those in language production, where the effect of inhibition has been considered long-lasting and widespread (Guo, Liu, Misra, & Kroll, 2011; Misra, Guo, Bobb, & Kroll, 2012; Rossi, Newman, Diaz, Guo, & Kroll, in preparation).
While these behavioral studies provide some evidence for the involvement of inhibitory control in bilingual visual word comprehension, it is difficult to assess the immediate impact of inhibition early in processing (i.e., during the initial presentation of homograph word pairs) by measuring prolonged response times in subsequent trials (i.e., translation word pairs). If inhibition occurs during the processing of homograph word pairs, online measures such as ERPs on the initial homograph word pairs alone should be sufficient to capture the inhibitory control process. In fact, Hoshino and Thierry (2012) used ERPs to investigate language co-activation and control in bilingual visual word comprehension. They compared ERPs of word pairs that were related to homographs’ Spanish meanings (e.g., toe-pie, “pie” meaning “foot” in Spanish) with word pairs that were unrelated (e.g., rug-pie). They found that word pairs related in Spanish elicited a smaller N400 than unrelated word pairs, which indicated the activation of homograph meanings in Spanish. However, an attenuated effect on the late positive component (LPC) in the following time window indicated that homograph meanings in the non-target language did not receive further explicit processing. These ERP findings suggest that the activation of homograph meanings is language non-specific and that inhibitory processes may prevent further interference from the non-target language in bilingual visual word comprehension.
In contrast to visual word comprehension, the involvement of inhibitory control in bilingual spoken word comprehension has only been explored in an indirect way. Blumenfeld and Marian (2013) found that bilinguals with better inhibitory control showed increased co-activation during the early stages of spoken word comprehension and decreased co-activation immediately before the selection of a target word. Similarly, better inhibitory control has also been associated with decreased between-language competition among relatively low-proficient bilinguals (Mercier, Pivneva, & Titone, 2014). These findings indicate that individual differences in inhibitory control in a non-linguistic task might be related to the efficiency of resolving interference from the non-target language, and they provide indirect evidence for the possible involvement of inhibitory control in bilingual spoken word recognition. Previous research has emphasized the role that negotiating language competition may play in shaping the bilingual cognitive architecture, mainly during bilingual production (see Kroll, 2008; Kroll & Bialystok, 2013, for reviews). Our aim was to assess the neural evidence for control processes when bilinguals hear one of their languages using a task that explicitly requires inhibitory control, thereby contributing to a comprehensive view of language control in bilingualism across modalities.
1.3. ERP indexes of language co-activation and control
The N400 has been used as a reliable neural index of language co-activation because of its sensitivity to semantic congruency and phonological repetition. Thierry and Wu (2007) asked Chinese-English bilinguals to perform semantic relatedness judgments on English word pairs. Unknown to these bilinguals, the phonological overlap between the Chinese translations of the prime and target (of a word pair) was manipulated. A reduced N400 was found for unrelated word pairs with the phonological overlap in Chinese when compared to other unrelated word pairs without such overlap, indicating the activation of phonological information in the non-target language. Additionally, the N400 effect also indexes the cross-language semantic congruency effect caused by semantic co-activation of interlingual homographs (Hoshino & Thierry, 2012).
With respect to the neural signatures of inhibitory control in language processing, the picture is less clear. While some previous studies have identified the N2 as an index of inhibition during bilingual language production (e.g., Misra, Guo, Bobb, & Kroll, 2012), no specific inhibition-related component (e.g., N2 or P3) has been found in the domain of bilingual language comprehension. Instead, given the results in Hoshino and Thierry (2012), we focused on the late positive component (LPC) as a potential index of the consequence of exerting language inhibition. The LPC has been found in a wide range of studies and is considered to reflect explicit and elaborate processing of stimuli that require additional cognitive resources (e.g., Osterhout & Holcomb, 1993; Paller, Kutas, & McIsaac, 1995; Rugg, Furda, & Lorist, 1988). Hoshino and Thierry (2012) observed an increased LPC only for word pairs that were related within language and not across languages, suggesting that cross-language relatedness did not receive further processing because of language inhibition. Since the current study used a similar design as in Hoshino and Thierry (2012), we also focused on these previously identified N400 and LPC effects.
1.4. The present study
In the present study, we used Korean-English homophones to investigate language co-activation and control during bilingual spoken word comprehension. Korean-English bilinguals and English monolinguals were asked to judge the semantic relatedness of two English words while their EEGs were recorded. In the homophone-related (across language) condition, homophones were paired with a word that is semantically unrelated to the English meaning but related to the Korean meaning of the homophones (e.g., moon - lock, where /mu:n/ means “door” in Korean). In the homophone-unrelated condition, homophones were paired with a word that is semantically unrelated to both English and Korean meanings of the homophones (e.g., moon - tree). The homophone-related and homophone-unrelated conditions used the exact same physical stimuli and required “No” responses; therefore, any neural or response differences between these two conditions can be attributed to their cross-linguistic relatedness. We predicted that the homophone-related (across language) condition would elicit a reduced N400 when compared to the homophone-unrelated condition. Unlike Hoshino and Thierry (2012), the current study used interlingual homophones in both the homophone-related (across language) and the homophone-unrelated conditions. Thus, we predicted that the LPC would be larger for the homophone-related condition than for the homophone-unrelated condition due to greater cognitive demands in resolving cross-linguistic competition.
Bilinguals’ language profile and inhibitory control ability have been shown to influence the extent to which bilinguals co-activate their two languages (see Chen & Marian, 2016; Van Hell & Tanner, 2012, for reviews) and effectively manage competition from the non-target language (Blumenfeld & Marian, 2013; Mercier et al., 2014). Therefore, we examined the effects of individual differences in language profile and in inhibitory control on the neural signatures of language co-activation in bilingual spoken word comprehension. In particular, we predicted that higher language proficiency and greater language exposure in the non-target language would increase the degree of language co-activation. To measure participants’ inhibitory control ability, we employed a non-linguistic, simplified flanker task (Luk et al., 2010, see Figure 1). The flanker task measures several different types of cognitive control process. A facilitation effect, which is indexed by the difference between the congruent and neutral conditions, reflects the extent to which participants benefit from utilizing congruent information. An interference effect, the difference between the incongruent and neutral conditions, indicates the amount of interference participants experience (Luk, et al., 2010). Lastly, the flanker effect can be calculated by comparing the congruent and incongruent conditions and is often used as a measure of cognitive control ability (e.g., Costa, Hernández, & Sebastián-Gallés, 2008). In the current study, we were particularly interested in participants’ ability to resolve interference, as reflected in the difference between the incongruent and the neutral conditions1. Neurally, we focused on the P300, a component which has been used as an index of cognitive effort and attentional resources (for a review see Polich, 2007). In the flanker task, the more difficult incongruent condition often elicits a larger P300 than the easier congruent and neutral conditions that do not generate interference (e.g., Clayson & Larson, 2011; Groom & Cragg, 2015). Furthermore, a smaller P300 effect is linked to higher efficiency in resolving conflicts and interference in the incongruent condition (Pratt, Willoughby, & Swick, 2011; Wu & Thierry, 2013). We predicted that bilinguals’ efficiency in resolving non-linguistic interference (i.e., RT differences or the P300 effect between the incongruent and neutral condition) would influence bilinguals’ performance on the linguistic task.
Figure 1.
Examples of four conditions and the effects measured in the simplified flanker task.
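To make these effect definitions concrete, the following R sketch computes the three reaction-time effects from per-participant condition means; the data frame and values are hypothetical and serve only to illustrate the subtractions described above.

```r
# Hypothetical per-participant mean RTs (ms) for the three responded flanker conditions
rt <- data.frame(
  subject     = 1:4,
  congruent   = c(530, 545, 520, 550),
  incongruent = c(600, 610, 585, 620),
  neutral     = c(555, 560, 540, 565)
)

rt$facilitation <- rt$neutral     - rt$congruent    # benefit from congruent flankers
rt$interference <- rt$incongruent - rt$neutral      # cost of incongruent flankers
rt$flanker      <- rt$incongruent - rt$congruent    # overall flanker effect

colMeans(rt[, c("facilitation", "interference", "flanker")])
```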
2. Results
2.1. Behavioral results of the semantic judgment task
In NoGo conditions (covertly responded situations), the accuracy rates of successfully withholding responses were very high in all conditions (equal to or higher than 99%). In Go conditions (overtly responded situations), only trials to which participants responded correctly were included in the analysis of response times. Accuracies and response times were analyzed for homophones and non-homophones separately, using mixed-effects models as implemented in the lme4 library (version 1.1-11, Bates et al., 2015) in R (version 3.2.3, R Core Team, 2015). Fixed effects included group (bilinguals vs. monolinguals) and relatedness (related vs. unrelated), both of which were contrast coded. The models also included the maximal random effects structure justified by the data using a backward-fitting procedure (Barr, Levy, Scheepers, & Tily, 2013). The descriptive accuracy and response time data are presented in Table 1.
Table 1.
Behavioral results of the Go condition in the semantic judgment task (standard errors are in the parentheses).
| Group | Condition | Accuracy | Response time (ms) |
|---|---|---|---|
| Bilinguals | homophone-related | 86.2% (2.6%) | 1219 (34) |
| Bilinguals | homophone-unrelated | 88.5% (2.0%) | 1226 (36) |
| Bilinguals | control-related | 86.4% (1.3%) | 1093 (29) |
| Bilinguals | control-unrelated | 84.8% (2.7%) | 1189 (35) |
| Monolinguals | homophone-related | 92.3% (1.8%) | 1088 (35) |
| Monolinguals | homophone-unrelated | 92.7% (2.4%) | 1095 (37) |
| Monolinguals | control-related | 88.4% (1.3%) | 999 (30) |
| Monolinguals | control-unrelated | 93.0% (1.6%) | 1087 (36) |
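As a rough illustration of the modeling approach described above, the R sketch below fits contrast-coded accuracy and response time models in lme4 on simulated stand-in data; the column names, the log transformation of response times, and the simplified random-effects structures are assumptions for the example, not the exact final models.

```r
library(lme4)

# Simulated stand-in for the trial-level data (subject, item, group, relatedness, acc, rt)
set.seed(1)
dat <- expand.grid(subject = factor(1:8), item = factor(1:20),
                   relatedness = c("related", "unrelated"))
dat$group <- ifelse(as.numeric(dat$subject) <= 4, "bilingual", "monolingual")
dat$acc   <- rbinom(nrow(dat), 1, 0.9)
dat$rt    <- rlnorm(nrow(dat), meanlog = log(1100), sdlog = 0.2)

# Contrast coding for the two fixed effects
dat$group_c <- ifelse(dat$group == "bilingual", 0.5, -0.5)
dat$rel_c   <- ifelse(dat$relatedness == "related", 0.5, -0.5)

# Accuracy: logistic mixed-effects model (random effects simplified for the example;
# the reported models started from the maximal structure and were backward-fitted)
m_acc <- glmer(acc ~ group_c * rel_c + (1 | subject) + (1 | item),
               data = dat, family = binomial)

# Response times on correct trials only (log-transformed here for illustration)
m_rt <- lmer(log(rt) ~ group_c * rel_c + (1 + rel_c | subject) + (1 | item),
             data = subset(dat, acc == 1))

summary(m_acc)
summary(m_rt)
```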
2.1.1. Non-homophone controls
For accuracy, a main effect of group (β = −.67, SE = .22, Z = −2.99, p < .005) showed that monolinguals (M = 90.7%) were more accurate than bilinguals (M = 85.6%). The interaction between group and relatedness was marginally significant (β = .85, SE = .46, Z = 1.83, p = .07). Further comparisons showed that monolinguals responded to unrelated word pairs (M = 93.0%) more accurately than to related word pairs (M = 88.4%, β = −.95, SE = .46, Z = −2.08, p < .05), while bilinguals did not respond differently across conditions (Z < 1.4, p >.17). In the analysis of response times, both the main effects of relatedness (β = −.09, SE = .02, t = −4.70, p < .001) and group were significant (β = −.1, SE = .04, t = −2.13, p < .05), where related word pairs (M = 1046 ms) were responded to faster than unrelated word pairs (M = 1127 ms), and monolinguals (M = 1038 ms) were overall faster than bilinguals (M = 1134 ms). No other main effects or interactions reached significance.
2.1.2. Homophones
The main effect of group was marginally significant in the analysis of accuracies (β = −.079, SE = .41, Z = −1.92, p = .06) and was significant in the analysis of response times (β = .13, SE = .05, t = 2.69, p < .01), suggesting that monolinguals were more accurate and faster (M = 92.5%, M = 1092 ms) than bilinguals (M = 87.4%, M = 1223 ms).
2.2. ERP Data of the semantic judgment task
The averaged ERPs were generated for each participant by only including trials that were responded to correctly. Early components such as P1, N1, and P2 appeared similar across all conditions. None of the analyses of these early components revealed significant differences between groups or between related and unrelated word pairs (ps > .1). No significant interactions between group and relatedness or between group and response type (Go/NoGo) were found in any of the early time windows (ps > .08).
2.2.1. Non-homophones
A significant main effect of relatedness emerged during the 400–450 ms time window and persisted through all remaining time windows in both Go and NoGo conditions (see Figure 2 and Figure 3). Therefore, we re-calculated the mean amplitude of each condition in the 400–1000 ms time window and submitted it to a 2 (group: monolinguals vs. bilinguals) × 2 (relatedness: related vs. unrelated) × 8 (electrode) ANOVA in both Go and NoGo conditions. In both conditions, the main effect of relatedness was significant (Go: F(1,37) = 51.71, p < .001; NoGo: F(1,37) = 14.21, p = .001). Planned comparisons showed that the main effect of relatedness was significant for both bilinguals (Fs(1,19) > 7.61, ps < .05) and monolinguals (Fs(1,18) > 6.78, ps < .05) in both Go and NoGo conditions.
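For illustration, the sketch below runs the same type of mixed ANOVA on simulated single-subject mean amplitudes from the 400–1000 ms window; the data frame layout and values are hypothetical stand-ins, not the recorded data.

```r
# Simulated stand-in for per-participant mean amplitudes (400-1000 ms) at the
# eight ROI electrodes, in long format
set.seed(2)
erp <- expand.grid(subject     = factor(1:10),
                   relatedness = c("related", "unrelated"),
                   electrode   = factor(1:8))
erp$group <- ifelse(as.numeric(erp$subject) <= 5, "bilingual", "monolingual")
erp$amp   <- rnorm(nrow(erp), mean = ifelse(erp$relatedness == "related", 1, -1))

# Group (between subjects) x relatedness x electrode (within subjects) ANOVA
fit <- aov(amp ~ group * relatedness * electrode +
             Error(subject / (relatedness * electrode)),
           data = erp)
summary(fit)
```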
Figure 2.
ERPs and difference waves elicited by control-related and control-unrelated word pairs in the Go condition.
Figure 3.
ERPs and difference waves elicited by control-related and control-unrelated word pairs in the NoGo condition.
2.2.2. Homophones
2.2.2.1. Homophone Go condition
Neither the main effect of relatedness nor the group × relatedness interaction reached significance in any of the 50 ms time windows between 350 and 1000 ms. Planned comparisons within each group did not reveal any significant effects (Fs < 2.83, ps > .1, see Figure 4).
Figure 4.
ERPs and difference waves elicited by homophone-related and homophone-unrelated word pairs in the Go condition.
Next, we evaluated the influence of language proficiency and daily language exposure (Luk & Bialystok, 2013), as well as individual differences in inhibitory control (Blumenfeld & Marian, 2013; Mercier et al., 2014) on the N400 language co-activation effect. In order to examine the N400 effect, we computed mean amplitude differences (averaged across eight electrodes in the ROI analysis) between the two homophone conditions (subtracting the mean amplitude of homophone related condition from the mean amplitude of the homophone unrelated condition) in the 350–500 ms time window (Hoshino & Thierry, 2012) and 500–1000 ms time window. While neither language proficiency in Korean nor the inhibitory control measure correlated with the N400 effect in either time window (rs < .36, ps > .14), we found that the amount of Korean exposure was positively correlated with the mean amplitude difference of the two homophone conditions (the N400 effect) in the 500–1000 ms time window (r = .47, p = .038, two-tailed, see Table 2). This indicates that increased daily exposure to the non-target native language (Korean) resulted in greater language co-activation in bilinguals (see Figure 5). Importantly, language proficiency in Korean and daily language exposure were not correlated with each other (r = .03, p > .9, see Table 2).
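The correlation analysis can be illustrated as follows: the per-participant N400 effect (unrelated minus related, averaged over the ROI electrodes in the 500–1000 ms window) is correlated with self-reported daily Korean exposure. The values below are simulated stand-ins, not the observed data.

```r
# Simulated stand-ins: one value per bilingual participant
set.seed(3)
n400_effect     <- rnorm(20, mean = 0.5, sd = 1)   # unrelated minus related, in microvolts
korean_exposure <- runif(20, min = 0, max = 100)   # self-reported daily exposure (%)

cor.test(n400_effect, korean_exposure, alternative = "two.sided", method = "pearson")
```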
Table 2.
Correlations among ERP effects in the semantic judgment task, measures of language profile and neural efficiency in the flanker task in Korean-English bilinguals.
| Variables | N400 effect (co-activation) | LPC effect (language control) | Korean proficiency | Korean exposure | P300 effect (inhibitory efficiency) |
|---|---|---|---|---|---|
| N400 effect | – | −0.19 | −0.31 | 0.47* | 0.31 |
| LPC effect |  | – | −0.14 | −0.02 | 0.52* |
| Korean proficiency |  |  | – | 0.03 | −0.03 |
| Korean exposure |  |  |  | – | 0.09 |
| P300 effect |  |  |  |  | – |

* p < .05
Figure 5.
Correlation between the N400 effect and the amount of daily Korean exposure in Korean-English bilinguals.
2.2.2.2. Homophone NoGo condition
A marginally significant main effect of relatedness was found in the 700–750 ms time window, showing that the homophone-unrelated condition elicited a larger positivity than the homophone-related condition (F(1,37) = 4.08, p = .051, see Figure 6). The interaction between group and relatedness was not significant (F < 1, p > .6). Planned comparisons showed that the relatedness effect was significant only in bilinguals (F(1,19) = 4.83, p = .041), but not in monolinguals (F(1,18) < 1, p > .35). Visual inspection of the scalp distribution suggested potential lateralization of this effect during the 600–800 ms time window. Therefore, we added hemisphere as an additional factor and performed a 2 (relatedness: related vs. unrelated) × 2 (group: bilingual vs. monolingual) × 2 (hemisphere: left vs. right) × 3 (electrode: CP1, C3, and P3, vs. CP2, C4, and P4) ANOVA over the 600–800 ms time window. A significant interaction between hemisphere and relatedness emerged (F(1,37) = 9.76, p = .003). Planned comparisons within each language group showed that the hemisphere × relatedness interaction was significant in bilinguals (F(1,19) = 8.25, p = .01), but not in monolinguals (F(1,18) = 2.20, p > .15), and that the three-way interaction between group, hemisphere and relatedness was not significant (F(1,37) = 1.37, p = .25). Further comparisons revealed that bilinguals showed larger positive-going waves for the homophone-unrelated condition than for the homophone-related condition in the right hemisphere (F(1,19) = 6.32, p = .021), but not in the left hemisphere (F(1,19) < 1, p > .6). No difference between the two homophone conditions was found for monolinguals in either the left or right hemisphere (Fs(1,18) < 1, ps > .5). Next, we examined the relationship between this right-lateralized late positive effect (subtracting the mean amplitude of the homophone-unrelated condition from that of the homophone-related condition across three electrodes in the right hemisphere) and individual differences in the amount of Korean exposure, language proficiency and inhibitory control. We found that only the neural index of inhibitory control (i.e., the P300 effect between incongruent and neutral conditions in the flanker task) was correlated with the late positive effect (r = .52, p = .028, two-tailed, see Table 2), showing that a larger P300 difference was associated with a larger late positive effect (see Figure 7).
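For illustration, the sketch below reproduces the structure of this analysis on simulated data: a relatedness × hemisphere × electrode within-subject ANOVA on 600–800 ms mean amplitudes, followed by the correlation between the right-hemisphere late positive effect and the flanker P300 effect. All names and values are hypothetical stand-ins.

```r
# Simulated stand-in for bilinguals' 600-800 ms mean amplitudes
set.seed(4)
lpc <- expand.grid(subject     = factor(1:20),
                   relatedness = c("related", "unrelated"),
                   hemisphere  = c("left", "right"),
                   electrode   = factor(1:3))
lpc$amp <- rnorm(nrow(lpc)) +
  ifelse(lpc$relatedness == "unrelated" & lpc$hemisphere == "right", 1, 0)

# Within-subject ANOVA with hemisphere as an additional factor
fit <- aov(amp ~ relatedness * hemisphere * electrode +
             Error(subject / (relatedness * hemisphere * electrode)), data = lpc)
summary(fit)

# Right-hemisphere late positive effect per participant (related minus unrelated)
rh         <- subset(lpc, hemisphere == "right")
means      <- tapply(rh$amp, list(rh$subject, rh$relatedness), mean)
lpc_effect <- means[, "related"] - means[, "unrelated"]

# Hypothetical flanker P300 effect (incongruent minus neutral) per participant
p300_effect <- rnorm(20)
cor.test(lpc_effect, p300_effect)
```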
Figure 6.
ERPs and difference waves elicited by homophone-related and homophone-unrelated word pairs in the NoGo condition.
Figure 7.
Correlation between the late positive effect in the right hemisphere and the P300 effect (incongruent condition – neutral condition) in the flanker task.
2.3. Semantic rating and translation tasks
To ensure that the participants in the ERP study evaluated the semantic relatedness of the critical word pairs similarly to the participants in the norming study, we collected semantic ratings after the ERP session was completed (on a 0–7 scale, where 0 = not related at all, and 7 = highly related). For bilinguals, the average rating of homophone-related (across language) word pairs (which should be judged as unrelated in English) was 0.22 (SD = 0.027) when evaluated in English, and the average rating of homophone-unrelated word pairs was 0.22 (SD = 0.025). For the English monolinguals, the average rating for homophone-related (across language) word pairs was 0.54 (SD = 0.060) and 0.56 (SD = 0.061) for homophone-unrelated word pairs. Although monolinguals gave slightly higher ratings overall than bilinguals, both groups rated the two homophone conditions (both unrelated when considering only the English meaning of the homophones) as comparably unrelated. Finally, bilingual participants were also asked to translate all the critical English word stimuli into Korean. The average accuracy of the translation task was high (M = 91%, SD = 7%), confirming that bilingual participants knew the English words that were used in the critical conditions and were able to judge the semantic relatedness correctly.
2.4. Behavioral results of the flanker task
The behavioral results of the flanker task are summarized in Table 3. The accuracies for the congruent, incongruent and neutral conditions were high (equal to or higher than 97.5%) for both bilinguals and monolinguals (no further accuracy analyses were conducted because of the lack of variability in these three conditions). In the response time analysis, a 2 (group: bilinguals vs. monolinguals) × 3 (condition: congruent, incongruent, and neutral) mixed repeated-measures ANOVA was performed with only correctly responded trials. The main effect of condition was significant, F(2, 66) = 104.19, p < .001. Further comparisons showed that the congruent condition was faster than the neutral condition (p < .001), and the incongruent condition was slower than the neutral condition (p < .001). However, neither the main effect of group nor the interaction between group and condition was significant (Fs < 1).
Table 3.
Accuracies and response times of the four conditions in the flanker task (standard errors are in parentheses).
| Measure | Group | Congruent | Incongruent | Neutral | NoGo |
|---|---|---|---|---|---|
| Accuracy (% correct) | Bilinguals | 98.3 (0.7) | 97.9 (0.7) | 99.0 (0.4) | 93.0 (1.6) |
|  | Monolinguals | 99.8 (0.7) | 97.5 (0.8) | 99.3 (0.4) | 90.7 (1.8) |
| Response time (ms) | Bilinguals | 537 (17) | 601 (17) | 557 (19) | NA |
|  | Monolinguals | 519 (18) | 580 (17) | 542 (20) | NA |
2.5. ERP data of the flanker task
In the analysis of the P300 effect during the 500–750 ms time window, the main effect of condition was significant (F(2, 66) = 27.03, p < .001). Further comparisons showed that the incongruent condition elicited a larger P300 than both the congruent condition (p < .001) and the neutral condition (p < .001). However, neither the group × condition interaction nor the main effect of group was significant (Fs < 1.98, see Figure 8). We provide a more detailed description of the flanker ERP results in the supplementary materials.
Figure 8.
ERPs and difference waves elicited by congruent, incongruent, and neutral conditions in the flanker task.
3. Discussion
The aim of the current study was to examine the neural signatures of language co-activation and control in bilingual spoken word comprehension. Comparisons were drawn between word pairs that were related within language (English), word pairs that were related across languages (English and Korean), and word pairs that were unrelated either within or across languages. In the comparison of word pairs that were related and unrelated within English, both bilinguals and monolinguals showed the classic semantic congruency effect, as indexed by the N400 effect. In the critical comparison between word pairs that were related across languages and that were unrelated, we found that the amount of language co-activation, as indexed by the size of the N400 effect, increased as bilinguals had more daily exposure to the non-target native language. We also found that the allocation of cognitive resources differed depending on whether cross-language competition was present: word pairs that were related across languages received fewer cognitive resources than word pairs that were not related. Meanwhile, a smaller resource allocation difference (i.e., a smaller effect caused by cross-linguistic competition) was found to positively correlate with bilinguals’ ability to resolve interference in the flanker task. These findings suggest that the degree of language co-activation in bilingual spoken word comprehension is influenced by bilinguals’ daily language experience and that greater neural efficiency in resolving non-linguistic interference may make bilinguals less susceptible to cross-linguistic interference.
We found that greater exposure to the non-target language (native language) in daily life increased the degree of language co-activation in bilinguals. This result is comparable to a previous behavioral study in which bilinguals read sentences in their L2 that contained interlingual homographs. In that study, Elston-Güttler, Gunter, and Kotz (2005) showed that during an English (L2) sentence reading task, German-English bilinguals who were exposed to the German (L1) version of a film (compared to the English version of the same film) prior to the reading task accessed the homographs’ German meanings. According to the Bilingual Interactive Activation model (BIA and BIA+, Dijkstra & Van Heuven, 1998; 2002), the high daily exposure to the non-target language in the current study and brief exposure to the non-target language before the experiment in Elston-Güttler et al. (2005) would both increase the pre-activation level of the non-target language and increase the ease of accessing that language, thereby resulting in greater language co-activation in bilinguals. Taken together, these findings suggest that language co-activation in bilinguals could be very sensitive to fine-grained differences in the bilingual language experience. Thus, it might be more appropriate to consider bilinguals on a continuum of language experience, and future studies should use continuous measures instead of categorical measures to capture subtle individual differences (Luk & Bialystok, 2013; Van Hell & Tanner, 2012).
In the current study, we observed cross-linguistic semantic activation in interlingual homophones that do not overlap orthographically, suggesting that phonological similarity alone is sufficient to activate semantic information in the non-target language. This finding is consistent with the prediction of the Bilingual Language Interaction Network for Comprehension of Speech model (BLINCS, Shook & Marian, 2013). In BLINCS, the incoming auditory input first activates phonemes in both of a bilingual’s languages as the phonological representation is shared across two languages. Activated phonemes are then mapped onto words in each language at the level of phono-lexical representation, and these words subsequently activate both semantic and ortho-lexical information. According to the BLINCS model, the activation of the ortho-lexical representation does not have a direct influence on the access of semantic information, and semantic co-activation occurs as long as two languages share phonological similarity. Consistent with the model, we observed that interlingual homophones with no orthographic overlap still activated semantics in the non-target language.
However, the cross-linguistic semantic relatedness effect in the current study appeared later (i.e., the first visible difference appeared around 500 ms after stimulus onset) than the within-language semantic effect in non-homophone word pairs (i.e., around 400 ms). This finding suggests that a homophone’s meaning in the non-target language might not be immediately accessible in bilingual spoken word comprehension, and is consistent with FitzPatrick and Indefrey (2014). They found that when listening in an L2, a homophone’s meaning in the non-target language (L1) was only available after accessing the meaning in the target language L2. This delayed effect in bilingual spoken word comprehension contrasts with the co-activation of homograph meanings in bilingual visual word comprehension. In Hoshino and Thierry (2012), the cross-linguistic semantic effect (shown in word pairs that were related across languages) appeared in the same 350–500 ms time window as the semantic effect in the target language (shown in word pairs that were related within language). Taken together, these findings indicate a difference in the time course of semantic co-activation triggered by interlingual homophones and homographs. The time course difference of co-activation across auditory and visual modalities is likely caused by different degrees of overlap in interlingual homophones and homographs. Homographs overlap visually completely while homophones can vary in pronunciation because of the phonetic differences between languages (Kang & Guion, 2006). Therefore, it is likely that when hearing interlingual homophones, bilinguals need additional time to match these slightly different auditory inputs onto the phonological representations in the non-target language. Consequently, co-activation of the meanings in the non-target language was delayed. Another possible explanation is that a homophone’s semantic information in a given language is more closely linked to the native phonology (FitzPatrick & Indefrey, 2014); as a result, accessing a homophone’s meaning in the L1 takes more time when it is presented with L2 pronunciation. Future studies can use homophone word pairs presented in Korean to Korean-English bilinguals to test this explanation.
In the time window of 600 to 800 ms in the NoGo condition, we observed an effect that resembles the LPC effect, which is often identified around 500 ms after stimulus onset. However, contrary to our prediction, word pairs that contained cross-language competition elicited a smaller LPC than unrelated word pairs in the right hemisphere. It is not clear why we observed a reduced LPC for word pairs that were related across languages because we expected cross-linguistic activation to increase the cognitive demand, which would result in an increased LPC. One tentative explanation is that bilinguals resolve cross-linguistic competition at the cost of semantic processing, resulting in a smaller LPC for cross-linguistically related word pairs. Critically, the difference between the homophone-related and -unrelated conditions suggests that the LPC is sensitive to the experimental manipulation of whether or not cross-language competition is present. If we consider the word pairs without cross-language competition as a baseline, then the difference between the baseline and the word pairs with cross-language competition can be used as an index of the interference which bilinguals experience. Therefore, a smaller difference between the conditions with and without cross-linguistic competition indicates that bilinguals experienced less influence or interference from cross-linguistic competition.
In the following correlation analysis between the non-linguistic flanker task and the linguistic semantic relatedness task, a smaller LPC effect between homophone related and unrelated word pairs was positively associated with a smaller P300 effect between incongruent and neutral conditions in the flanker task. A smaller P300 effect in the flanker task has been associated with more efficient cognitive processing where fewer cognitive resources are required (Pratt, Willoughby, & Swick, 2011; Wu & Thierry, 2013). Therefore, the correlation between the LPC effect in the semantic judgment task and the P300 effect in the flanker task might indicate a link between the efficiency of suppressing non-linguistic interference and the interference bilinguals experience when they encounter cross-linguistic competition. It is plausible that bilinguals who were more efficient in managing interference in the non-linguistic task also experienced less linguistic interference, resulting in a smaller difference in resource allocation. These findings also indicate a potential link between the cognitive efficiency when facing interference in linguistic and non-linguistic tasks. While a number of studies have found greater cognitive efficiency in non-linguistic processing in bilinguals when compared with monolinguals (e.g., Bialystok, 2006; Bialystok & DePape, 2009; Costa, Hernández, & Sebastián-Gallés, 2008; Luk, Anderson, Craik, Grady, & Bialystok, 2010; Schroeder et al., 2016), and attributed better inhibitory control abilities to bilingual language experience, the robustness of this causal relationship is under debate (see Paap, Johnson, & Sawi, 2015, for different views). The current study provides the first neural evidence demonstrating that cognitive mechanisms might be similar across linguistic and non-linguistic processing, allowing for a potential bi-directional influence between language experience and non-linguistic abilities.
In the current study, we did not find behavioral differences between the homophone-related and homophone-unrelated conditions, which contrasts with bilingual visual word studies in which behavioral differences were observed (Durlik et al., 2016; Macizo et al., 2010; Martín et al., 2010). The current study constructed homophone word pairs in a very similar way to these behavioral studies (homophones with auditory presentation vs. homographs with visual presentation); therefore, a similar pattern of behavioral results was expected. Contrary to our expectations, bilinguals did not show different response times between homophone-related (across languages) and homophone-unrelated conditions. This unexpected finding could be due to differences in simultaneous and sequential presentation across visual and auditory modalities. During the simultaneous presentation, bilinguals may have experienced difficulty resolving the interference because the source of the interference – homographs – was constantly shown on the screen before a decision was made (Durlik et al., 2016; Macizo et al., 2010; Martín et al., 2010). In the sequential presentation, the interval between the two words provided a buffering zone that made semantic interference manageable; therefore, no conflicts were detected by behavioral measures. Nevertheless, because of their excellent temporal resolution, ERP measures were still able to capture the language co-activation effect (see McLaughlin, Osterhout, & Kim, 2004, for a similar dissociation).
Using ERP measures, the current study revealed that the neural signatures of language co-activation and control differ across overtly and covertly responded conditions. In the overtly responded Go condition, an N400 effect (which is an index of language co-activation) was found, and this N400 effect correlated with daily exposure to the non-target language. In the covertly responded NoGo condition, a late positive effect (which is associated with the allocation of cognitive resources) was found, and this effect correlated with cognitive efficiency in the flanker task. The difference between the Go and NoGo conditions might suggest that bilinguals used different strategies for processing cross-linguistically related word pairs, depending on whether overt responses were required or not. In the Go condition, bilinguals were instructed to press the buttons in a timely manner; it is likely that this high-demand task prevented bilinguals from inhibiting the non-target language completely, resulting in language co-activation. In contrast, in the NoGo condition, where the task demand was low, bilinguals had more cognitive resources available for resolving cross-linguistic competition, which successfully prevented language co-activation. This might be the reason why we did not observe an N400 effect. However, when the comparison was made between word pairs with and without cross-language competition, we found that bilinguals allocated cognitive resources differently. Nevertheless, the results in the NoGo condition were different from those of a similar ERP study (Hoshino & Thierry, 2012), in which an N400 language co-activation effect was found. This discrepancy might be due to a difference in the presentation order of word pairs in the two studies. Hoshino and Thierry (2012) presented their homographs as the targets of the word pairs, while we presented homophones as the primes of the word pairs. It is possible that participants in the current study had more time and used more cognitive resources to prepare for and to subsequently resolve language co-activation. Therefore, we observed a component that is associated with allocation of cognitive resources instead of language co-activation.
In conclusion, the current study provides the first ERP evidence for language co-activation and control in bilingual spoken word comprehension. The results suggest that the degree of language co-activation is influenced by bilinguals’ daily language experience, and bilinguals’ neural efficiency when facing interference is comparable across linguistic and non-linguistic processing. These findings reveal that bilingual language processing is sensitive to individual differences in language experience and cognitive abilities.
4. Methods
4.1. Participants
Twenty Korean-English bilinguals and 19 native English monolinguals were included in the data analyses. All participants were right-handed (assessed by the Edinburgh handedness inventory, Oldfield, 1971) and had normal or corrected to normal vision and normal hearing. None of them reported any neurological disorders. Participants gave informed consent prior to the experiment, which was approved by the Institutional Review Board. We included bilinguals who learned Korean as their native language and English as their second language, and monolinguals who learned English as their native language and had a second language proficiency lower than 4 on an 11-point scale (0–10, where 10 indicates native-like fluency). One Korean-English speaker and two English speakers were tested but excluded due to poor quality of the data (i.e., no early components could be identified in the averaged ERPs). Furthermore, one Korean-English speaker and two English speakers were excluded because too few trials remained after artifact rejection and exclusion of incorrect trials (n < 25 per condition).
The remaining 20 Korean-English speakers all started learning Korean at birth and English before the age of 11 (two of them reported acquiring Korean and English simultaneously). All bilingual participants lived in the U.S. at the time of testing. The remaining 19 English speakers were native speakers of English with minimal exposure to a second language, and none of them reported having any knowledge of Korean. All participants completed the Language Experience and Proficiency Questionnaire (LEAP-Q; Marian, Blumenfeld, & Kaushanskaya, 2007), a self-report questionnaire for assessing the language profile of multilingual participants. Participant demographics are summarized in Table 4.
Table 4.
Characteristics of the English monolingual and Korean-English bilingual participants. (Standard deviations are in the parentheses.)
| Measure | Monolinguals | Bilinguals |
|---|---|---|
| Number | 19 (4 males) | 20 (3 males) |
| Age | 22.4 (3.3) | 21.7 (2.9) |
| L1 Proficiency | 9.70 (0.6) | 8.67 (1.2) |
| L2 Proficiency | 1.56 (1.6) | 9.1 (1.2)*** |
| L1 Age of Acquisition | Birth | Birth |
| L2 Age of Acquisition | 11.67 (4.3) | 6.03 (2.5)*** |
| Daily L1 Exposure (%) | 98.3 (3.2) | 30.0 (17.8)*** |
| Daily L2 Exposure (%) | 1.7 (3.2) | 67.7 (18.0)*** |

* p < .05, ** p < .01, *** p < .005
4.2. Materials
4.2.1. Semantic judgment task
Fifty-six English-Korean homophones were selected. One doctoral-level psychologist who teaches English in Korea and two doctoral-level linguistics students who speak both languages were asked to rate the phonological similarity between the English pronunciation and Korean pronunciation of the homophone on a Likert scale from 1 to 10. The average rating of phonological similarity for the 56 homophones was 7.65 (SD = 1.08) out of 10 (see Appendix A for the list of homophones. A more detailed selection procedure is provided in the Supplementary Materials).
From the 56 homophones, four types of word pairs were constructed for the critical conditions (see Table 5). (1) In the homophone-related (across language) condition, each of the 56 homophones (e.g., “moon”, meaning “door” in Korean) was paired with an English target word that was unrelated to the English homophone meaning but was related to the Korean homophone meaning (e.g., “lock”). (2) In the non-homophone control-related (within language) condition, each English target word (e.g., “lock”) that was generated for the homophone-related condition was paired with a new related English word (e.g., “jail”). (3) For the homophone-unrelated condition, homophones were paired with a different target word that was previously paired with another homophone in the homophone-related condition, resulting in word pairs that were not related when considering both English and Korean homophone meanings. This pairing method ensured that the stimuli were exactly the same in the homophone-related and -unrelated conditions. (4) The control-unrelated condition was created in the same way by using the stimuli from the control-related condition. Therefore, each homophone appeared once in the homophone-related condition and once in the homophone-unrelated condition. Each non-homophone control word appeared once in the control-related and once in the control-unrelated condition. The targets were the same across all four conditions. In other words, each homophone and non-homophone control word was presented 4 times throughout the semantic relatedness judgment task, and the targets paired with them were presented 8 times.
Table 5.
Examples of the critical conditions in the semantic relatedness judgment task.
| Critical conditions | Examples | Required responses | Number of trials |
|---|---|---|---|
| homophone-related | moon (door) – lock; soup (forest) – tree | No | 56 |
| homophone-unrelated | moon (door) – tree; soup (forest) – lock | No | 56 |
| control-related | jail – lock; leaf – tree | Yes | 56 |
| control-unrelated | jail – tree; leaf – lock | No | 56 |
Note. Words in the parentheses indicate the Korean homophone meaning.
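The re-pairing scheme behind conditions (3) and (4) can be sketched as follows in R; the word list here is a hypothetical three-item example, and the actual unrelated pairings were additionally normed for unrelatedness (see below).

```r
# Illustrative sketch (hypothetical items) of the re-pairing scheme described
# above: each homophone keeps its cross-language related target in one
# condition and is re-paired with a target that originally belonged to a
# different homophone in the unrelated condition, so the same physical stimuli
# appear in both conditions.
related <- data.frame(prime  = c("moon", "soup", "bee"),
                      target = c("lock", "tree", "rain"))

# Rotate the targets by one position so that no prime keeps its own target
unrelated <- data.frame(prime  = related$prime,
                        target = related$target[c(2:nrow(related), 1)])

related
unrelated
```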
The prime words in the homophone-related and -unrelated conditions and those in the non-homophone control-related and -unrelated conditions were matched for length, lexical frequency, log-frequency, subtitle frequency, number of phonemes, number of syllables, and both orthographic and phonological neighborhood sizes (all ps > .1, see Table 6; lexical properties were obtained from The English Lexicon Project, Balota et al., 2007). The semantic relatedness of the word pairs was computed based on the Nelson Norms (Nelson, McEvoy, & Schreiber, 1998). The average semantic relatedness of the word pairs in the homophone-related condition (when considering the Korean homophone meaning) did not differ from the average semantic relatedness of the control-related condition (0.13 vs. 0.13, p > 0.9). Because no norms existed to assess the degree of unrelatedness between word pairs in the homophone-related condition (when considering the English homophone meaning), homophone-unrelated condition and control-unrelated condition, a norming study was conducted. Each unrelated pair was rated on its semantic relatedness on a scale from 0 (not at all related) to 7 (completely related) by 22 native speakers of English. The average ratings of homophone-related, homophone-unrelated and control-unrelated word pairs were 0.58 (SD = 0.54), 0.59 (SD = 0.48), and 0.72 (SD = 0.54), respectively, and there was no difference among the three conditions (p > .27). In addition to the critical conditions, we generated 224 filler pairs in a similar manner to the critical conditions by selecting an additional 112 English words as filler primes and pairing each with two different filler target words that were selected from an additional set of 56 words. Of the 224 filler pairs, 168 word pairs were semantically related (required a “Yes” response in the semantic judgment task) and 56 word pairs were not semantically related (required a “No” response). Therefore, the number of word pairs requiring “Yes” and “No” responses was equal in the semantic relatedness judgment task.
Table 6.
Lexical characteristics of homophones and control words (standard deviation in parentheses).
| Lexical characteristics | Homophones | Control words |
|---|---|---|
| Length | 3.85 (0.9) | 3.81 (0.7) |
| Frequency (HAL) | 68762.4 (150912) | 45929.8 (101317) |
| Subtitle word frequency | 295.88 (766) | 142.83 (415) |
| Orthographic neighborhood | 8.40 (6.5) | 8.71 (5.0) |
| Phonological neighborhood | 17.67 (9.4) | 18.08 (9.8) |
| Number of phonemes | 2.96 (0.6) | 3.13 (0.6) |
| Number of syllables | 1.19 (0.4) | 1.08 (0.3) |
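As a sketch of the matching check reported above, the code below runs Welch t-tests comparing homophone and control primes on a few lexical properties; the data frame is a simulated stand-in for the English Lexicon Project values, and the log-frequency means are invented for the example.

```r
# Simulated stand-in for the lexical properties of the 56 homophone and
# 56 control primes (length and phoneme values loosely follow Table 6;
# log-frequency values are hypothetical)
set.seed(5)
lex <- data.frame(
  type     = rep(c("homophone", "control"), each = 56),
  length   = c(rnorm(56, 3.85, 0.9), rnorm(56, 3.81, 0.7)),
  log_freq = c(rnorm(56, 9.0, 1.5),  rnorm(56, 8.8, 1.5)),
  phonemes = c(rnorm(56, 2.96, 0.6), rnorm(56, 3.13, 0.6))
)

# Welch t-test p-value for each property
sapply(c("length", "log_freq", "phonemes"),
       function(v) t.test(lex[[v]] ~ lex$type)$p.value)
```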
In order to obtain both behavioral measures and response-free EEGs, all of the word pairs (both critical and filler word pairs) were presented once as a Go condition in which participants were asked to press a yes or no button to indicate their semantic relatedness judgment, and once as a NoGo condition where participants were asked to make silent judgments. Go and NoGo trials were randomly intermixed to maintain participants’ attention (e.g., Hoshino & Thierry, 2012; Orgs, Lange, Dombrowski, & Heil, 2008). While interspersing Go and NoGo conditions potentially risked increasing the cognitive load of the task, this concern was mitigated by our experimental design in which the comparison occurred only within the Go or NoGo condition. If the additional cognitive load influences participants’ performance, it should exert similar effects across all the Go and NoGo conditions.
To minimize repetition effects, the materials were divided into two equivalent sets so that no homophone words and no control words appeared in both related and unrelated conditions within the same set. In the first half of the semantic relatedness judgment task, one set of word pairs was presented in the Go condition and the other set of the word pairs was presented in the NoGo condition. In the second half of the semantic task, the two sets of word pairs were switched for Go and NoGo presentation; words previously presented in the Go condition were now presented in the NoGo condition and vice versa. There were four lists of word pairs in each half of the semantic task, and each unique word only appeared once in each list. Within each list, word pairs were presented in a randomized order. The number of word pairs from each condition, the number of word pairs requiring a Yes or No response, and the number of Go/NoGo responses within each block were balanced. The order of a set being presented first in the Go condition or first in the NoGo condition and the block order within a set were counterbalanced across participants.
4.2.2. Auditory recordings
Auditory stimuli were produced by a female native speaker of American English using a Sennheiser PC360 microphone headset in a quiet room. Recordings were digitized as WAV files at a sampling rate of 44.1 kHz using the audio-editing software Audacity 2.0.6 (http://audacity.sourceforge.net). Individual stimuli were minimally processed to remove clicks using PRAAT (Boersma & Weenink, 2004) and were then run through a dynamic compressor in Audacity to reduce the range of the loudest sounds and bring them closer to the average.
4.2.3. The flanker task
The flanker task included four conditions (adapted from Luk et al., 2010, see Figure 1). The participants were asked to respond to the center red arrow. In the congruent condition, the center arrow was surrounded by four arrows pointing in the same direction. In the incongruent condition, the center arrow was flanked by four arrows pointing in the opposite direction. In the neutral trials, the flanking stimuli were diamonds providing no directional information. Finally, in the no-go condition where the center arrow was surrounded by Xs, participants were asked to suppress their responses. There were 60 trials in each condition (30 trials with center arrow facing left and 30 trials with center arrow facing right), resulting in 240 trials in total. The flanker task was divided into two equivalent sets with the same number of trials from each condition. Within each set, trials of different conditions were presented randomly.
4.3. Procedure
After informed consent was obtained, participants were prepared for EEG testing. Each participant was tested individually in a quiet room. Stimuli were presented using MATLAB (version 8.2, The MathWorks Inc., Natick, MA, 2013) with PsychToolBox 3.0 (Brainard, 1997) on a Dell PC. In the semantic relatedness judgment task, auditory sound files of words were played via two magnetically shielded speakers. Each trial started with a fixation cross of 400 ms, followed by the first word while the fixation cross stayed on the screen. After a 400 ms interstimulus interval, the second word was played with a black or a red fixation cross on the screen. In the event of a black fixation cross, participants were asked to make their responses as accurately and quickly as possible by pressing one of two buttons on a two-handed game controller. In the event of a red fixation cross, participants were instructed to make a relatedness judgment silently. The black or red fixation cross disappeared after 1500 ms regardless of when the participants made their judgments. The inter-trial interval was 1000 ms (see Figure 9). The corresponding hand for “Yes” and “No” was counterbalanced across participants. A practice block of 12 word pairs that were not used in the main experiment was given before the task, and the same 12 word pairs were repeated a second time for eye blink control practice. Participants were offered a break after every 96 trials.
Figure 9.
A schematic representation of the procedure in the Semantic Relatedness Judgement Task.
After the semantic relatedness judgment task, participants completed the flanker task. In the flanker task, each trial started with a fixation of 500 ms, followed by a blank screen for 200 ms. The arrow and its flanking stimuli were presented for a maximum duration of 1500 ms and disappeared immediately after a response was made. The inter-trial interval was 1000 ms. After the EEG session of the semantic relatedness judgment task and the flanker task, participants rated all of the critical word pairs on a 0–7 scale, where 0 meant “not related at all” and 7 meant “highly related.” Korean-English bilinguals also completed a translation task on the critical stimuli of the semantic judgment task. All participants then completed the Language Experience and Proficiency Questionnaire (LEAP-Q; Marian et al., 2007) and the NIH Toolbox Cognition Battery (NTCB, https://www.assessmentcenter.net/). The entire experimental session lasted approximately 3.5–4.5 hours, and all participants received monetary compensation for their participation.
4.4. Electroencephalogram recording and processing
The electroencephalogram was recorded from 30 Ag/AgCl active electrodes (Brain Vision actiCHamp and PyCorder, Brain Vision LLC) placed on the scalp according to the extended 10–20 system (Pivik et al., 1993). Vertical and horizontal electrooculograms were recorded from two additional electrodes placed below the left eye and at the outer corner of the right eye. All electrodes were referenced online to the left mastoid. Electrode impedances were kept below 15 kΩ. All channels were amplified with a band pass of 0.01–100 Hz at a sampling rate of 500 Hz. Offline, EEG data were preprocessed using a combination of EEGLAB (version 13.3.2, Delorme & Makeig, 2004) and ERPLAB (version 4.0.3.1, Lopez-Calderon & Luck, 2014) under MATLAB (version 8.2, The MathWorks Inc., Natick, MA, 2013). After independent component analysis was used to remove components related to eye blinks (ICA function in EEGLAB, Delorme & Makeig, 2004), the continuous EEG was band-pass filtered at 0.05–70 Hz (for statistical analyses) and re-referenced to the averaged mastoids. After re-referencing, the EEG was segmented into epochs of 1200 ms, starting 200 ms before stimulus onset, and each epoch was baseline corrected using the 200 ms pre-stimulus interval. Finally, epochs containing artifacts were automatically discarded when amplitudes exceeded a 100 µV threshold within a 200 ms moving window advancing in 50 ms steps, or when the voltage exceeded ±120 µV in any channel.
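The offline preprocessing was carried out in EEGLAB/ERPLAB under MATLAB; as a rough equivalent, the Python/MNE sketch below walks through the same sequence of steps (ICA-based blink removal, 0.05–70 Hz band pass, mastoid re-reference, epoching from −200 to 1000 ms with baseline correction and amplitude-based rejection). The file name, EOG and mastoid channel labels, and ICA settings are assumptions, and MNE's peak-to-peak rejection criterion only approximates ERPLAB's moving-window test.

```python
import mne

# Assumed inputs: an MNE-readable BrainVision recording with an EOG channel
# named 'VEOG' and mastoid channels 'M1'/'M2' (labels are assumptions).
raw = mne.io.read_raw_brainvision("participant01.vhdr", preload=True)

# 1) ICA-based removal of eye-blink components (cf. runica in EEGLAB).
ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw.copy().filter(l_freq=1.0, h_freq=None))   # high-pass copy for stable ICA
eog_inds, _ = ica.find_bads_eog(raw, ch_name="VEOG")
ica.exclude = eog_inds
ica.apply(raw)

# 2) Band-pass filter 0.05-70 Hz and re-reference to the averaged mastoids.
raw.filter(l_freq=0.05, h_freq=70.0)
raw.set_eeg_reference(ref_channels=["M1", "M2"])

# 3) Epoch from -200 to 1000 ms around stimulus onset, baseline-correct on the
#    200 ms pre-stimulus interval, and reject epochs with excessive amplitudes.
events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=1.0, baseline=(-0.2, 0.0),
                    reject=dict(eeg=120e-6), preload=True)
```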
4.4.1. Semantic relatedness judgment task
After excluding trials with incorrect responses or artifacts, the percentage of remaining trials in each condition was calculated: homophone-related (85%), homophone-unrelated (85%), control-related (92%), and control-unrelated (93%) in the Go condition, and homophone-related (84%), homophone-unrelated (82%), control-related (92%), and control-unrelated (94%) in the NoGo condition. For all participants included in the final analysis, at least 25 trials remained per condition. Average ERPs were generated for each participant, electrode, and experimental condition. Averaged ERPs were low-pass filtered at 30 Hz for figure plotting only.
Mean ERP amplitudes in the semantic task were calculated in every 50 ms time window between stimulus onset and 1000 ms. Early effects (e.g., P1, N1, and P2) were analyzed with ANOVAs including relatedness (2: related vs. unrelated) × response type (2: Go vs. NoGo) × electrode (28) as within-subjects factors and group (2: bilinguals vs. monolinguals) as a between-subjects factor, in every 50 ms time window from stimulus onset to 350 ms. For the N400 and LPC effects, we selected 8 centro-parietal electrodes (C3/4, CP1/2, P3/4, Cz, and Pz) for ROI analyses based on previous literature (Hoshino & Thierry, 2012; Kuipers & Thierry, 2010). ANOVAs with relatedness (2: related vs. unrelated), response type (2: Go vs. NoGo), and electrode (8) as within-subjects factors and group (2: bilinguals vs. monolinguals) as a between-subjects factor were performed in every 50 ms window from 350 ms to 1000 ms. For both the early components and the N400 and LPC effects, ANOVAs were performed separately for the homophone conditions and the non-homophone control conditions². Further, considering the different cognitive demands in the Go and NoGo conditions, we also performed ANOVAs separately for the Go and NoGo conditions, with relatedness (2: related vs. unrelated) and electrode (28) as within-subjects factors and group (2: bilinguals vs. monolinguals) as a between-subjects factor, in all time windows. Because differences in language co-activation effects between bilinguals and monolinguals have been observed in several previous ERP studies (Hoshino & Thierry, 2012; Thierry & Wu, 2007, 2010), planned comparisons within each group were conducted regardless of the significance of the relatedness × group interaction³.
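As a concrete illustration of the windowed ROI analysis, the Python sketch below averages amplitudes within each 50 ms window over the ROI electrodes and runs a relatedness × response type repeated-measures ANOVA per window for a single group, as in the within-group comparisons described above. The data-frame column names and the synthetic demo data are assumptions; the reported analyses were run in the authors' own pipeline, which also included the group and electrode factors.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

ROI = ["C3", "C4", "CP1", "CP2", "P3", "P4", "Cz", "Pz"]

def windowed_anovas(df, start=350, stop=1000, width=50):
    """df: long-format amplitudes with columns
    ['subject', 'relatedness', 'response_type', 'electrode', 'time_ms', 'uv'].
    Runs a relatedness x response_type repeated-measures ANOVA on ROI mean
    amplitude in each 50 ms window (one group at a time)."""
    results = {}
    roi = df[df["electrode"].isin(ROI)]
    for t0 in range(start, stop, width):
        win = roi[(roi["time_ms"] >= t0) & (roi["time_ms"] < t0 + width)]
        cell_means = win.groupby(["subject", "relatedness", "response_type"],
                                 as_index=False)["uv"].mean()
        aov = AnovaRM(cell_means, depvar="uv", subject="subject",
                      within=["relatedness", "response_type"]).fit()
        results[(t0, t0 + width)] = aov.anova_table
    return results

# Minimal synthetic example (random data), just to show the expected format.
rng = np.random.default_rng(0)
demo = pd.DataFrame([
    dict(subject=s, relatedness=r, response_type=g, electrode=e, time_ms=t,
         uv=rng.normal())
    for s in range(1, 13) for r in ["related", "unrelated"]
    for g in ["Go", "NoGo"] for e in ROI for t in range(0, 1000, 10)
])
print(windowed_anovas(demo)[(350, 400)])
```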
4.4.2. The flanker task
Data from two bilinguals and one monolingual were excluded due to an insufficient number of trials (n < 25 per condition). An additional monolingual was excluded because of poor data quality. Data from 18 bilinguals (out of 20) and 17 monolinguals (out of 19) were included in the final flanker analysis. The efficiency of resolving interference was measured by the P300 effect, obtained by subtracting the mean amplitudes of the neutral condition from those of the incongruent condition in the ROI (8 electrodes: C3/4, CP1/2, P3/4, Cz, and Pz) during the 500–750 ms time window (Wu & Thierry, 2013). In this time window, an ANOVA with group (2: bilinguals vs. monolinguals) as a between-subjects factor and condition (3: congruent, incongruent, and neutral) and electrode (8) as within-subjects factors was also performed.
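The P300-based interference measure reduces to a subtraction of condition means; the short Python sketch below shows that computation for a hypothetical long-format data frame of per-subject flanker ERP amplitudes (the column names are assumptions).

```python
import pandas as pd

ROI = ["C3", "C4", "CP1", "CP2", "P3", "P4", "Cz", "Pz"]

def p300_interference_effect(df):
    """df: long-format flanker ERP amplitudes with columns
    ['subject', 'condition', 'electrode', 'time_ms', 'uv'].
    Returns one value per subject: mean(incongruent) - mean(neutral)
    over the ROI electrodes in the 500-750 ms window."""
    win = df[(df["time_ms"] >= 500) & (df["time_ms"] < 750) &
             (df["electrode"].isin(ROI))]
    means = win.groupby(["subject", "condition"])["uv"].mean().unstack("condition")
    return means["incongruent"] - means["neutral"]
```

A more negative (or less positive) value on this difference score indicates a smaller P300 interference effect, i.e., greater neural efficiency in suppressing interference.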
Highlights.
Language co-activation in spoken word comprehension can be indexed by N400 effects.
The presence of cross-language competition influences allocation of cognitive resources.
There is a relationship between resolving linguistic and non-linguistic competition.
Language co-activation increases as exposure to the non-target language increases.
Acknowledgments
The authors thank Dr. Tuan Q. Lam, Dr. Sunjoo Chung, Jiyong Kim, and Haram Kim, as well as two undergraduate research assistants Peter Kwak and Jae-Ryoung Lee for their help with stimulus creation and subject recruitment. We also thank the members of the Northwestern University Bilingualism and Psycholinguistics Research Group for helpful comments and input. This project was supported by grant NICHD R01 HD059858 to Viorica Marian.
Appendix A
Word pairs in each critical condition.
homophone-related prime (Korean meaning in parentheses) | homophone-related target | homophone-unrelated prime | homophone-unrelated target | control-related prime | control-related target | control-unrelated prime | control-unrelated target
---|---|---|---|---|---|---|---
bull (fire) | hot | god | pencil | cool | hot | boy | number |
ill (work) | place | talk | bacon | room | place | soap | air |
talk (chin) | bone | ill | shoe | teeth | bone | rack | cow |
god (hat) | coat | bull | coat | fur | coat | rod | pen |
book (drum) | stick | abbey | short | rod | stick | fur | nerd |
oat (clothes) | hanger | tall | warm | rack | hanger | teeth | school |
abbey (father) | man | book | water | boy | man | cool | spoon |
tall (hair) | wash | oat | death | soap | wash | room | death |
pay (lung) | air | duck | sweet | fan | air | life | place |
jug (enemy) | war | goal | skin | fight | war | lake | pencil |
goal (valley) | river | jug | pen | flood | river | smart | rose |
duck (virtue) | truth | pay | brick | liar | truth | wipe | truth |
huge (toilet paper) | tissue | arm | school | wipe | tissue | liar | tissue |
arm (cancer) | death | huge | parent | life | death | fan | wash |
easy (intellect) | nerd | autumn | necklace | smart | nerd | flood | necklace |
mat (taste) | sweet | tongue | ocean | cake | sweet | rice | tree |
tongue (empty) | nest | mat | sick | eagle | nest | fee | brick |
up (career) | school | meal | hanger | fee | school | eagle | warm |
meal (wheat) | field | up | paint | rice | field | cake | paint |
daisy (pig) | bacon | bar | air | ham | bacon | art | ocean |
autumn (ice) | water | easy | thread | lake | water | fight | new |
bar (foot) | shoe | daisy | truth | box | shoe | cold | nest |
evil (blanket) | warm | boot | music | cold | warm | box | hanger |
boot (brush) | paint | evil | under | art | paint | ham | rope |
choke (tip) | pencil | him | number | draw | pencil | idea | lock |
soup (forest) | tree | jeep | kiss | leaf | tree | child | bacon |
him (strength) | energy | choke | river | power | energy | inch | lobe |
jeep (house) | brick | soup | bone | wall | brick | dance | sick |
key (height) | short | yet | beach | inch | short | power | beach |
yet (old) | new | key | spoon | idea | new | draw | thread |
sorry (sound) | music | deck | fruit | dance | music | wall | fruit |
moon (door) | lock | cookie | lock | jail | lock | fork | hot |
cookie (flag) | pole | moon | lobe | fish | pole | pearl | energy |
oak (jade) | necklace | car | bell | pearl | necklace | fish | bell |
deck (home) | parent | sorry | field | child | parent | leaf | under |
car (knife) | spoon | oak | cloud | fork | spoon | jail | water |
chill (seven) | number | jar | positive | age | number | sew | positive |
bowl (cheek) | kiss | mall | tree | lip | kiss | low | sweet |
jar (ruler) | pen | chill | pole | ink | pen | tone | coat |
top (tower) | bell | panel | feather | tone | bell | ink | short |
sun (line) | rope | top | man | hang | rope | age | cloud |
panel (needle) | thread | goat | war | sew | thread | fog | war |
gray (yes) | positive | gray | energy | sure | positive | pillow | river |
goat (flower) | rose | bee | stick | wine | rose | wine | feather |
say (bird) | feather | bowl | nest | pillow | feather | peach | shoe |
mall (horse) | cow | meat | swear | beef | cow | beef | swear |
bay (pear) | fruit | say | rock | peach | fruit | sure | man |
bee (rain) | cloud | weigh | hot | fog | cloud | hard | skin |
pea (blood) | skin | bay | rope | burn | skin | lip | field |
meat (bottom) | under | toe | place | low | under | hang | kiss |
sum (island) | ocean | sum | cow | boat | ocean | vow | bone |
toe (vomit) | sick | pea | rose | flu | sick | wave | pole |
weigh (outer ear) | lobe | sun | tissue | ear | lobe | flu | music |
yolk (curse) | swear | dole | nerd | vow | swear | ear | stick |
hay (sun) | beach | yolk | wash | wave | beach | boat | parent |
dole (stone) | rock | hay | new | hard | rock | burn | rock |
Footnotes
1. The difference between the incongruent and the neutral conditions may be a better measure of inhibitory control than the flanker effect (i.e., the difference between the congruent and the incongruent conditions) because the flanker effect conflates facilitation of congruent information and resistance to interference from incongruent information (see Schroeder, Marian, Shook, & Bartolotti, 2016, for a discussion).
2. We performed separate analyses for homophone and non-homophone word pairs because the experimental design was not balanced between the two word types: all word pairs in the homophone conditions required “No” responses, whereas half of the non-homophone pairs required “Yes” responses and the other half required “No” responses. Significant main effects of word type (homophone vs. non-homophone control) therefore do not reflect the language co-activation effect of homophones, but rather the overall difference in semantic relatedness.
3. Significant pairwise comparisons can be masked by the failure to reject the null hypothesis on an F test, which is referred to as nonconsonance (Gabriel, 1969; Hancock & Klockars, 1996).
References
- Balota DA, Yap MJ, Cortese MJ, Hutchison KA, Kessler B, Loftis B, Neely JH, Nelson DL, Simpson GB, Treiman R. The English Lexicon Project. Behavior Research Methods. 2007;39:445–459. doi: 10.3758/bf03193014.
- Barr DJ, Levy R, Scheepers C, Tily HJ. Random effects structure for confirmatory hypothesis testing: Keep it maximal. Journal of Memory and Language. 2013;68(3):255–278. doi: 10.1016/j.jml.2012.11.001.
- Bates D, Mächler M, Bolker B, Walker S. Fitting linear mixed-effects models using lme4. Journal of Statistical Software. 2015;67(1):1–48.
- Bialystok E. Effect of bilingualism and computer video game experience on the Simon task. Canadian Journal of Experimental Psychology/Revue canadienne de psychologie expérimentale. 2006;60(1):68. doi: 10.1037/cjep2006008.
- Bialystok E, DePape AM. Musical expertise, bilingualism, and executive functioning. Journal of Experimental Psychology: Human Perception and Performance. 2009;35(2):565. doi: 10.1037/a0012735.
- Blumenfeld HK, Marian V. Parallel language activation and cognitive control during spoken word recognition in bilinguals. Journal of Cognitive Psychology. 2013;25(5):547–567. doi: 10.1080/20445911.2013.812093.
- Boersma P, Weenink D. Praat, Version 4.2.09. Amsterdam: Institute of Phonetic Sciences; 2004.
- Brainard DH. The Psychophysics Toolbox. Spatial Vision. 1997;10:433–436.
- Chen P, Marian V. Bilingual spoken word recognition. In: Gaskell MG, Mirković J, editors. Speech Perception and Spoken Word Recognition. Psychology Press; 2016.
- Clayson PE, Larson MJ. Conflict adaptation and sequential trial effects: Support for the conflict monitoring theory. Neuropsychologia. 2011;49(7):1953–1961. doi: 10.1016/j.neuropsychologia.2011.03.023.
- Colomé À, Miozzo M. Which words are activated during bilingual word production? Journal of Experimental Psychology: Learning, Memory, and Cognition. 2010;36(1):96. doi: 10.1037/a0017677.
- Costa A, Hernández M, Sebastián-Gallés N. Bilingualism aids conflict resolution: Evidence from the ANT task. Cognition. 2008;106(1):59–86. doi: 10.1016/j.cognition.2006.12.013.
- De Groot AM, Delmaar P, Lupker SJ. The processing of interlexical homographs in translation recognition and lexical decision: Support for non-selective access to bilingual memory. The Quarterly Journal of Experimental Psychology: Section A. 2000;53(2):397–428. doi: 10.1080/713755891.
- Delorme A, Makeig S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods. 2004;134(1):9–21. doi: 10.1016/j.jneumeth.2003.10.009.
- Dijkstra T, Timmermans M, Schriefers H. On being blinded by your other language: Effects of task demands on interlingual homograph recognition. Journal of Memory and Language. 2000;42(4):445–464.
- Dijkstra T, Van Heuven WJB. The BIA model and bilingual word recognition. In: Grainger J, Jacobs AM, editors. Localist Connectionist Approaches to Human Cognition. Mahwah, NJ: Lawrence Erlbaum Associates; 1998. pp. 189–225.
- Dijkstra T, Van Heuven WJB. The architecture of the bilingual word recognition system: From identification to decision. Bilingualism: Language and Cognition. 2002;5(3):175–197.
- Dijkstra T, Van Jaarsveld H, Ten Brinke S. Interlingual homograph recognition: Effects of task demands and language intermixing. Bilingualism: Language and Cognition. 1998;1:51–66.
- Durlik J, Szewczyk J, Muszyński M, Wodniecka Z. Interference and inhibition in bilingual language comprehension: Evidence from Polish-English interlingual homographs. PLoS ONE. 2016;11(3):e0151430. doi: 10.1371/journal.pone.0151430.
- Elston-Güttler KE, Gunter TC, Kotz SA. Zooming into L2: Global language context and adjustment affect processing of interlingual homographs in sentences. Cognitive Brain Research. 2005;25(1):57–70. doi: 10.1016/j.cogbrainres.2005.04.007.
- FitzPatrick I, Indefrey P. Head start for target language in bilingual listening. Brain Research. 2014;1542:111–130. doi: 10.1016/j.brainres.2013.10.014.
- Gabriel KR. Simultaneous test procedures—some theory of multiple comparisons. Annals of Mathematical Statistics. 1969:224–250.
- Green DW. Mental control of the bilingual lexico-semantic system. Bilingualism: Language and Cognition. 1998;1(2):67–81.
- Groom MJ, Cragg L. Differential modulation of the N2 and P3 event-related potentials by response conflict and inhibition. Brain and Cognition. 2015;97:1–9. doi: 10.1016/j.bandc.2015.04.004.
- Guo T, Liu H, Misra M, Kroll JF. Local and global inhibition in bilingual word production: fMRI evidence from Chinese–English bilinguals. NeuroImage. 2011;56(4):2300–2309. doi: 10.1016/j.neuroimage.2011.03.049.
- Hancock GR, Klockars AJ. The quest for alpha: Developments in multiple comparison procedures in the quarter century since Games (1971). Review of Educational Research. 1996;66(3):269–306. doi: 10.3102/00346543066003269.
- Hoshino N, Kroll JF. Cognate effects in picture naming: Does cross-language activation survive a change of script? Cognition. 2008;106(1):501–511. doi: 10.1016/j.cognition.2007.02.001.
- Hoshino N, Thierry G. Do Spanish-English bilinguals have their fingers in two pies—or is it their toes? An electrophysiological investigation of semantic access in bilinguals. Frontiers in Psychology. 2012;3:52–7. doi: 10.3389/fpsyg.2012.00009.
- Ju M, Luce PA. Falling on sensitive ears: Constraints on bilingual lexical activation. Psychological Science. 2004;15(5):314–318. doi: 10.1111/j.0956-7976.2004.00675.x.
- Kang KH, Guion SG. Phonological systems in bilinguals: Age of learning effects on the stop consonant systems of Korean-English bilinguals. The Journal of the Acoustical Society of America. 2006;119(3):1672–1683. doi: 10.1121/1.2166607.
- Kim J, Davis C. Task effects in masked cross-script translation and phonological priming. Journal of Memory and Language. 2003;49(4):484–499.
- Kroll JF. Juggling two languages in one mind. Psychological Science Agenda, APA. 2008;22(1).
- Kroll JF, Bialystok E. Understanding the consequences of bilingualism for language processing and cognition. Journal of Cognitive Psychology. 2013;25(5):497–514. doi: 10.1080/20445911.2013.799170.
- Kuipers JR, Thierry G. Event-related brain potentials reveal the time-course of language change detection in early bilinguals. NeuroImage. 2010;50(4):1633–1638. doi: 10.1016/j.neuroimage.2010.01.076.
- Lagrou E, Hartsuiker RJ, Duyck W. Knowledge of a second language influences auditory word recognition in the native language. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2011;37(4):952. doi: 10.1037/a0023217.
- Lopez-Calderon J, Luck SJ. ERPLAB: An open-source toolbox for the analysis of event-related potentials. Frontiers in Human Neuroscience. 2014;8(4):1–14. doi: 10.3389/fnhum.2014.00213.
- Luk G, Anderson JA, Craik FI, Grady C, Bialystok E. Distinct neural correlates for two types of inhibition in bilinguals: Response inhibition versus interference suppression. Brain and Cognition. 2010;74(3):347–357. doi: 10.1016/j.bandc.2010.09.004.
- Luk G, Bialystok E. Bilingualism is not a categorical variable: Interaction between language proficiency and usage. Journal of Cognitive Psychology. 2013;25(5):605–621. doi: 10.1080/20445911.2013.795574.
- Macizo P, Bajo T, Martín MC. Inhibitory processes in bilingual language comprehension: Evidence from Spanish-English interlexical homographs. Journal of Memory and Language. 2010;63(2):232–244.
- Marian V, Blumenfeld HK, Kaushanskaya M. The Language Experience and Proficiency Questionnaire (LEAP-Q): Assessing language profiles in bilinguals and multilinguals. Journal of Speech, Language, and Hearing Research. 2007;50(4):940. doi: 10.1044/1092-4388(2007/067).
- Marian V, Spivey M. Competing activation in bilingual language processing: Within- and between-language competition. Bilingualism: Language and Cognition. 2003;6(2):97–115.
- Martín MC, Macizo P, Bajo T. Time course of inhibitory processes in bilingual language processing. British Journal of Psychology. 2010;101(4):679–693. doi: 10.1348/000712609X480571.
- McLaughlin J, Osterhout L, Kim A. Neural correlates of second-language word learning: Minimal instruction produces rapid change. Nature Neuroscience. 2004;7(7):703–704. doi: 10.1038/nn1264.
- Mercier J, Pivneva I, Titone D. Individual differences in inhibitory control relate to bilingual spoken word processing. Bilingualism: Language and Cognition. 2014;17(1):89–117.
- Mishra RK, Singh N. Language non-selective activation of orthography during spoken word processing in Hindi-English sequential bilinguals: An eye tracking visual world study. Reading and Writing. 2014;27(1):129–151.
- Misra M, Guo T, Bobb SC, Kroll JF. When bilinguals choose a single word to speak: Electrophysiological evidence for inhibition of the native language. Journal of Memory and Language. 2012;67(1):224–237. doi: 10.1016/j.jml.2012.05.001.
- Nelson DL, McEvoy CL, Schreiber TA. The University of South Florida word association, rhyme, and word fragment norms. 1998. http://www.usf.edu/FreeAssociation. doi: 10.3758/bf03195588.
- Oldfield RC. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia. 1971;9(1):97–113. doi: 10.1016/0028-3932(71)90067-4.
- Osterhout L, Holcomb PJ. Event-related potentials and syntactic anomaly: Evidence of anomaly detection during the perception of continuous speech. Language and Cognitive Processes. 1993;8(4):413–437.
- Paap KR, Johnson HA, Sawi O. Bilingual advantages in executive functioning either do not exist or are restricted to very specific and undetermined circumstances. Cortex. 2015;69:265–278. doi: 10.1016/j.cortex.2015.04.014.
- Paller KA, Kutas M, McIsaac HK. Monitoring conscious recollection via the electrical activity of the brain. Psychological Science. 1995;6(2):107–111.
- Pivik R, Broughton R, Coppola R, Davidson R, Fox N, Nuwer M. Guidelines for the recording and quantitative analysis of electroencephalographic activity in research contexts. Psychophysiology. 1993;30(6):547–558. doi: 10.1111/j.1469-8986.1993.tb02081.x.
- Polich J. Updating P300: An integrative theory of P3a and P3b. Clinical Neurophysiology. 2007;118(10):2128–2148. doi: 10.1016/j.clinph.2007.04.019.
- Pratt N, Willoughby A, Swick D. Effects of working memory load on visual selective attention: Behavioral and electrophysiological evidence. Frontiers in Human Neuroscience. 2011;5:57. doi: 10.3389/fnhum.2011.00057.
- R Core Team. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2015. https://www.R-project.org/
- Rossi E, Newman S, Diaz M, Guo T, Kroll JF. There are no mental firewalls: fMRI evidence for global inhibition of the native language in bilingual speech (in preparation).
- Rugg MD, Furda J, Lorist M. The effects of task on the modulation of event-related potentials by word repetition. Psychophysiology. 1988;25(1):55–63. doi: 10.1111/j.1469-8986.1988.tb00958.x.
- Schacht A, Sommer W. Time course and task dependence of emotion effects in word processing. Cognitive, Affective, & Behavioral Neuroscience. 2009;9(1):28–43. doi: 10.3758/CABN.9.1.28.
- Schroeder SR, Marian V, Shook A, Bartolotti J. Bilingualism and musicianship enhance cognitive control. Neural Plasticity. 2016. doi: 10.1155/2016/4058620.
- Schwartz AI, Kroll JF. Bilingual lexical activation in sentence context. Journal of Memory and Language. 2006;55(2):197–212.
- Shook A, Marian V. The bilingual language interaction network for comprehension of speech. Bilingualism: Language and Cognition. 2013;16(2):304–324. doi: 10.1017/S1366728912000466.
- Thierry G, Wu YJ. Brain potentials reveal unconscious translation during foreign-language comprehension. Proceedings of the National Academy of Sciences. 2007;104(30):12530–12535. doi: 10.1073/pnas.0609927104.
- Van Hell JG, Dijkstra T. Foreign language knowledge can influence native language performance in exclusively native contexts. Psychonomic Bulletin & Review. 2002;9(4):780–789. doi: 10.3758/bf03196335.
- Van Hell JG, Tanner D. Second language proficiency and cross-language lexical activation. Language Learning. 2012;62(s2):148–171.
- Veivo O, Järvikivi J. Proficiency modulates early orthographic and phonological processing in L2 spoken word recognition. Bilingualism: Language and Cognition. 2013;16(4):864–883.
- Wu YJ, Thierry G. Fast modulation of executive function by language context in bilinguals. The Journal of Neuroscience. 2013;33(33):13533–13537. doi: 10.1523/JNEUROSCI.4760-12.2013.