Heliyon. 2024 Aug 13;10(16):e36251. doi: 10.1016/j.heliyon.2024.e36251

Table 2.

Variables, items, and Likert-scale response options collected in “General Japanese citizens’ perception of emotional AI technologies,” a national survey of the Japanese population.

Emotional AI in schools
Schools in some countries are employing companies to install cameras and artificial intelligence in classrooms to track students' facial expressions and try to infer their emotional states and attention levels. The aim is to tailor teaching approaches by identifying whether some students are struggling with class material or whether others need to be challenged more. It also aims to gauge students' attention, helping teachers to monitor and record in-class attention levels.
Variables Statement Scale
AttitudeEAIschool I would be comfortable with schools using emotion and attention monitoring in this way. 1 (strongly disagree) to 5 (strongly agree)
BiasConcern I would be concerned that the emotion recognition software would not work consistently across children of different genders, ethnicities, ages, and disabilities. Some children could end up misclassified, and so get inappropriately tailored teaching or punishment. 1 (strongly disagree) to 5 (strongly agree)
DataMisuseConcern I would be concerned about what happens to the emotional data about the child, and whether it might be used against the child in some way (now or in the future). 1 (strongly disagree) to 5 (strongly agree)
DystopianConcern This sort of emotional monitoring would feel dystopian. Children could worry about being judged by machines on their facial expressions at school. 1 (strongly disagree) to 5 (strongly agree)
Knowledge I have a basic understanding of the emotion-sensing technologies involved in such educational practices and their uses. 1 (strongly disagree) to 5 (strongly agree)
SafetyUtility I consider that this use of emotion-sensing AI systems improves the safety of the school. 1 (strongly disagree) to 5 (strongly agree)
AccuracyConcern I am concerned about the overall accuracy of such emotional AI systems. 1 (strongly disagree) to 5 (strongly agree)
TrustGov I think the government will be capable of providing sufficient regulations for such uses of emotional AI technologies. 1 (strongly disagree) to 5 (strongly agree)
TrustPrivate I trust companies to regulate themselves, ensuring that their technology will not result in racial, gender or age bias/discrimination and privacy harm. 1 (strongly disagree) to 5 (strongly agree)
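For readers who want to work with this codebook programmatically, the block below is a minimal Python sketch of how the nine school-scenario variables and their 1–5 Likert responses could be represented and validated. The dictionary keys mirror the variable names in Table 2, but the record layout and the `validate_response` helper are illustrative assumptions, not part of the survey instrument.

```python
# Hypothetical codebook for the "Emotional AI in schools" block of Table 2.
# Variable names follow the table; item texts are abbreviated for brevity.
SCHOOL_ITEMS = {
    "AttitudeEAIschool": "Comfortable with schools using emotion and attention monitoring.",
    "BiasConcern": "Concerned software would not work consistently across children.",
    "DataMisuseConcern": "Concerned about what happens to the child's emotional data.",
    "DystopianConcern": "This sort of emotional monitoring would feel dystopian.",
    "Knowledge": "Basic understanding of the emotion-sensing technologies involved.",
    "SafetyUtility": "This use of emotion-sensing AI improves the safety of the school.",
    "AccuracyConcern": "Concerned about the overall accuracy of such systems.",
    "TrustGov": "Government will be capable of providing sufficient regulations.",
    "TrustPrivate": "Trust companies to regulate themselves.",
}

LIKERT_MIN, LIKERT_MAX = 1, 5  # 1 = strongly disagree ... 5 = strongly agree

def validate_response(record: dict) -> list[str]:
    """Return the names of any items that are missing or outside the 1-5 range."""
    problems = []
    for var in SCHOOL_ITEMS:
        value = record.get(var)
        if not isinstance(value, int) or not LIKERT_MIN <= value <= LIKERT_MAX:
            problems.append(var)
    return problems
```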
Emotional AI Toys
This question is about interactive toys for children up to 12 years old. Toymakers are interested in building toys capable of basic conversation, so that they can increasingly understand and derive meaning from children's speech. These toys would also try to interpret emotion in children's speech through tone of voice, so that the toy can respond appropriately by adapting play activities or trying to cheer children up when they are sad.
Variables Items Scale
AttitudeEAIToys1 1. I would be comfortable with this as it sounds like fun. I wish I had toys like this when I was younger. 1 (strongly disagree) to 5 (strongly agree)
UndueInfluence 2. I would have concerns about what the toy is saying to the child, how it is handling conversation with the child, and maybe even what it is advising the child to do or think. 1 (strongly disagree) to 5 (strongly agree)
DataManageConcern 3. I would have concerns about where the emotion data about conversations would go and who could access it, e.g. advertisers trying to sell the child more toys. 1 (strongly disagree) to 5 (strongly agree)
OKAliveIlusion 4. I am comfortable with the idea that a young child might perceive the toy's artificial personality as something that is conscious or alive. 1 (strongly disagree) to 5 (strongly agree)
PrivacyConcern 5. I consider this practice too much scrutiny of my child's emotions. 1 (strongly disagree) to 5 (strongly agree)
Knowledge 6. I have a basic understanding of the emotion-sensing technologies involved in such toys. 1 (strongly disagree) to 5 (strongly agree)
AccuracyConcern 7. I am concerned about the overall accuracy of such emotional AI systems. 1 (strongly disagree) to 5 (strongly agree)
BiasConcern 8. I would be concerned that the emotion recognition software would not work consistently across children of different genders, ethnicities, ages, and disabilities. 1 (strongly disagree) to 5 (strongly agree)
TrustGov 9. I think the government will be capable of providing sufficient regulations for such uses of emotional AI technologies. 1 (strongly disagree) to 5 (strongly agree)
TrustPrivate 10. I trust companies to regulate themselves, ensuring that their technology will not result in racial, gender or age bias/discrimination and privacy harm. 1 (strongly disagree) to 5 (strongly agree)
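As a worked example of how multi-item Likert blocks like these are commonly checked for internal consistency, the sketch below computes Cronbach's alpha over the five concern items in the toys block (UndueInfluence, DataManageConcern, PrivacyConcern, AccuracyConcern, BiasConcern). The function is a standard textbook formula; the simulated response matrix is a placeholder, and this is not the paper's own analysis.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of 1-5 Likert scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item variance across respondents
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 200 simulated respondents answering the five concern items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 5))
print(f"alpha = {cronbach_alpha(responses):.3f}")  # near 0 here, since the fake data are uncorrelated
```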