Abstract
This study aimed to explore the association between users’ cognitive function and usability as reported by the evaluator. A cross-sectional study was conducted with a community-based sample. Data on participants’ age, sex, education, sleep quantity, subjective memory complaints, and cognitive function were collected. A usability session was conducted to evaluate a digital solution called Brain on Track. Independent linear-regression analyses were used to explore univariable and multivariable associations between the evaluator-reported usability assessment and the users’ cognitive function, age, sex, education, sleep quantity, and subjective memory complaints. A total of 238 participants entered this study, of whom 161 (67.6%) were female, and the mean age was 42 (SD 12.9) years. All variables (age, education, sleep quantity, subjective memory complaints and cognitive function) except sex were significantly associated with evaluator-reported usability in the univariable analysis (p < 0.05). Cognitive function, age, education, and subjective memory complaints remained significant in the multivariable model (F = 38.87, p < 0.001), with an adjusted R2 of 0.391. Cognition scores alone showed an adjusted R2 of 0.288. This work suggests that cognitive function impacts evaluator-reported usability, alongside other user characteristics, and needs to be considered in usability evaluation.
Subject terms: Information technology, Computer science
Introduction
The ISO 9241-11 standard defines usability as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use1. Therefore, usability is dependent on the context of use, i.e., the extent to which a system is considered usable depends on the specific circumstances in which the digital technology is being used1. The context of use includes the users (and their characteristics), the tasks being performed with the digital solution, the equipment (hardware and software), and the physical and social environment where the digital solution is being used1.
Usability evaluation is a critical part of the development process of any digital solution and must be considered to obtain a fully functional system, improve acceptability, increase user satisfaction and improve the digital solution’s reliability2,3. Usability evaluation can take place at any point in a digital solution’s lifecycle and should involve both real users and usability experts throughout the process. For both, a variety of methods is available that should be applied rigorously to collect accurate usability data enabling designers and developers to improve the digital solution4–6.
Despite the general assumption that users’ characteristics affect the results of usability evaluation, there is a lack of in-depth knowledge on which specific characteristics should be considered7. User characteristics reported in the literature as conditioning the interaction with digital solutions include age8–11, education9, digital literacy9,12–15 and physical ability16. Another user characteristic likely to affect usability is cognitive function, including the ability to understand, learn, be attentive, memorise, or concentrate8. All of these are required when interacting with any digital solution17,18 and are therefore likely to affect both the ability to use a digital solution and the experience of doing so. To the best of our knowledge, no study has previously explored the association between cognitive function and usability in cognitively unimpaired individuals. However, previous studies involving individuals with mild cognitive impairment and dementia have shown that usability measures were poorer in these individuals (they completed fewer tasks, took longer to complete them or needed more help) when compared with older adults without cognitive impairment19,20.
These findings suggest that cognitive functioning might influence evaluator-reported usability, which is based on the performance of the individual while interacting with the digital solution during the usability evaluation. Therefore, this study aims to explore the association between cognitive function and usability reported by the evaluator in a community-dwelling sample of cognitively unimpaired adults.
Methods
This is a cross-sectional study approved by the Ethics Committee of the Centro Hospitalar de Entre o Douro e Vouga, EPE, Portugal (process number CA-0114/16-0c, date 25 Jan 2016). All participants signed an informed consent prior to entering the study, and all methods were performed in accordance with the relevant guidelines and regulations.
Participants and inclusion and exclusion criteria
A community-based sample of residents of the region of Águeda, Aveiro, Portugal was involved in the study. To recruit participants, the study was publicized in the local media, in local dissemination actions and through leaflets placed at public establishments. In addition, institutions linked to the public administration, industry, education, and the social sector of the municipality were contacted, and their employees and customers were invited to participate in the study. To be included, participants had to be 18 years old or older, be able to use a computer independently and be able to read in Portuguese. Participants were excluded if they presented a cognitive deficit, as assessed by the Montreal Cognitive Assessment (MoCA) and described below in the Procedures section.
Procedures
Participants who agreed to enter the study were asked to provide data regarding age, sex, years of formal education, sleep quantity, subjective memory complaints, and cognitive function.
Sleep quantity was evaluated by self-report of the average number of hours of sleep per night over the last month.
Subjective memory complaints were assessed using the Subjective Memory Complaints Scale (SMC), which consists of ten items21. Each item is scored from zero (absence of complaints) up to an item-specific maximum of one, two or three points (maximum complaints), depending on the severity of the complaint, so the total score ranges from zero to 21 points. The European Portuguese version of this scale is valid and reliable22. Values greater than or equal to four points indicate significant memory complaints, and values less than or equal to three indicate no relevant memory complaints22.
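The SMC scoring described above can be sketched as follows. This is an illustrative sketch only: the function names are ours, and the per-item maxima shown in the test data are assumptions (the scale assigns each of the 10 items a maximum of 1, 2 or 3 points so the total tops out at 21).

```python
def smc_total(item_scores):
    """Total SMC score: the sum of the 10 item scores.

    Each item ranges from 0 (no complaint) up to its item-specific
    maximum (1, 2 or 3 points, depending on the item)."""
    return sum(item_scores)


def has_significant_complaints(total):
    """Cutoff from the European Portuguese validation: a total of 4 or
    more points indicates significant memory complaints; 3 or fewer
    indicates no relevant complaints."""
    return total >= 4
```

For example, a participant with no complaints on any item scores 0 and falls below the cutoff, while any total of 4 or more is flagged.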
Cognitive function was assessed with the MoCA, which evaluates different cognitive domains, namely executive function; visuospatial capacity; memory; attention, concentration and working memory; language; and temporal and spatial orientation. It has been shown to have high sensitivity (90%) and specificity (87%) for detecting mild cognitive impairment (MCI)23. It is validated for the Portuguese population24, and the total score ranges from 0 to 30 points, with higher scores representing better cognitive performance23. The cutoff points for cognitive deficit vary between 16 and 27 depending on the participant’s age and education level, as defined by the Portuguese normative data25: scores falling more than 1, 1.5 or 2 SDs below the age- and education-adjusted normative mean can be considered cutoff points for possible cognitive impairment, and an SD of 1.5 was used in this study25. The MoCA was selected because it is a highly sensitive tool for early detection of MCI, is widely used in research and clinical practice, is validated for Portugal, and has normative data for the Portuguese population. It was used both to screen for inclusion and to characterize the cognitive function of participants who entered the study.
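The screening rule above (a score more than 1.5 SDs below the stratum's normative mean counts as a possible deficit) can be expressed as a small sketch. The normative mean and SD in the example are placeholder values, not the actual Portuguese normative data, and the function names are ours.

```python
def moca_cutoff(normative_mean, normative_sd, n_sds=1.5):
    """Cutoff score for a given age/education stratum: the normative
    mean minus n_sds standard deviations (1.5 in this study)."""
    return normative_mean - n_sds * normative_sd


def screens_for_deficit(score, normative_mean, normative_sd, n_sds=1.5):
    """True when the observed MoCA score falls below the stratum cutoff."""
    return score < moca_cutoff(normative_mean, normative_sd, n_sds)
```

With a hypothetical stratum mean of 26 and SD of 2, the cutoff is 23, so a score of 22 would screen positive and a score of 24 would not.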
Usability evaluation
Usability evaluation took place in a single session. Participants were asked to use a digital solution called Brain on Track, a computerized, web-based, self-administered test intended for cognitive testing that allows monitoring of the cognitive performance of users20,21. This solution has gone through a demanding process of testing and validation and has been on the market since 2013.
The usability session consisted of three parts:
Pre-test—The evaluator explained all the procedures of the study.
Test—The evaluator explained how to use the digital solution Brain on Track and the participant interacted freely with it for 25 min. The evaluator helped the participant whenever the participant had doubts about any feature of the system and observed the interaction.
Post-test—The evaluator completed the ICF-based Usability Scale I (ICF-US I)26, a usability rating scale based on the evaluator’s opinion26. This scale was used because usability rated from the evaluator’s perspective showed a higher correlation with indicators of user performance than usability rated from the user’s perspective2. The ICF-US I comprises 10 items associated with different usability principles that are classified as facilitators or barriers, and is based on the classification of environmental factors of the International Classification of Functioning, Disability and Health (ICF)27,28. Each item is scored from − 3 (barrier) to 3 (facilitator) or marked not applicable (NA). For scoring purposes, the score of an item classified as NA is replaced by the mean value of the remaining items, rounded to the nearest integer. The final ICF-US I score is the sum of all individual item scores, up to a maximum of 30. Values above 10 indicate good usability and values below 10 indicate poor usability. A total of four evaluators with at least 2 years of experience in usability evaluation participated in the study, but each participant was observed by one evaluator only.
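The ICF-US I scoring rule described above (NA items imputed with the rounded mean of the answered items, then summed against the cutoff of 10) can be sketched as follows. Function names are ours; the scale authors do not specify a tie-breaking rule for half-point means, so Python's built-in `round` (which rounds halves to the nearest even integer) is shown as one possible convention.

```python
def icf_us_total(items):
    """Total ICF-US I score.

    items: list of 10 values, each an int in -3..3 (barrier..facilitator)
    or None for "not applicable" (NA). NA items are replaced by the mean
    of the answered items, rounded to the nearest integer, then all 10
    values are summed (maximum possible total: 30)."""
    answered = [v for v in items if v is not None]
    replacement = round(sum(answered) / len(answered))
    return sum(replacement if v is None else v for v in items)


def indicates_good_usability(total):
    """Values above 10 indicate good usability; below 10, poor usability."""
    return total > 10
```

For instance, nine items scored 2 with one NA gives a replacement of 2 and a total of 20, comfortably above the cutoff.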
Statistical analysis
All data analyses were performed using SPSS 28.0 for Windows (SPSS Inc, Chicago, IL). Mean and standard deviation were used to describe continuous variables, and counts and proportions to describe categorical variables. To identify factors associated with evaluator-reported usability, independent linear-regression analyses were used to explore univariable and multivariable associations between the independent variables (age, sex, education, sleep quantity, subjective memory complaints and MoCA scores) and usability based on the evaluator’s opinion as the dependent variable. Normality was checked by visual inspection of the plots and no major violations were noted for the independent variables. Variables with p ≤ 0.10 in the univariable analysis entered the multivariable model, except for sex and age, which were forced into the multivariable model regardless. The multivariable analyses were performed using the enter, stepwise, backward and forward methods; as the adjusted R-squared of the models varied by less than 0.01, the results of the stepwise model are reported. The variables that entered the models were checked for multicollinearity using a variance inflation factor (VIF) ≤ 5 and the respective tolerance value (> 0.2). The level of significance was set at p < 0.05.
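The analysis pipeline above (OLS regression, adjusted R², and VIF-based multicollinearity checks) can be reproduced outside SPSS. The sketch below uses NumPy on synthetic data, since the study data are not public; variable names and the simulated effect sizes are illustrative assumptions, not the study's values.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with an intercept; returns (beta, R^2)."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    return beta, r2

def adjusted_r2(r2, n, p):
    """Adjusted R^2 for n observations and p predictors."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

def vifs(X):
    """Variance inflation factors: VIF_j = 1 / (1 - R^2_j), where R^2_j
    comes from regressing predictor j on the remaining predictors."""
    return [1.0 / (1.0 - ols(np.delete(X, j, axis=1), X[:, j])[1])
            for j in range(X.shape[1])]

# Synthetic illustration (not the study data): an evaluator-reported
# usability score driven by two correlated predictors.
rng = np.random.default_rng(42)
n = 238
moca = rng.normal(27.2, 2.1, n)
education = 0.8 * moca + rng.normal(0.0, 3.0, n)
usability = 0.9 * moca + 0.4 * education + rng.normal(0.0, 4.0, n)

X = np.column_stack([moca, education])
beta, r2 = ols(X, usability)
print("adjusted R^2:", round(adjusted_r2(r2, n, X.shape[1]), 3))
print("max VIF:", round(max(vifs(X)), 2))   # flag if > 5, per the study's rule
```

Tolerance, as used in the methods, is simply the reciprocal of the VIF, so the study's "VIF ≤ 5 and tolerance > 0.2" criteria are the same check stated two ways.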
Results
A total of 238 participants entered this study, of whom 161 (67.6%) were female. The mean age was 42 (SD 12.9) years (range 20–76 years). Sample characteristics are presented in Table 1.
Table 1. Sample characteristics (n = 238).

| Variable | Value |
| --- | --- |
| Sex, n (%) |  |
| Female | 161 (67.6) |
| Male | 77 (32.4) |
| Age, mean (± SD) | 42 (± 12.9) |
| Years of formal education, mean (± SD) | 14 (± 3.9) |
| Hours of sleep, mean (± SD) | 6.8 (± 1.1) |
| SMC total (0–21), mean (± SD) | 3.9 (± 2.8) |
| MoCA total (0–30), mean (± SD) | 27.17 (± 2.1) |
The mean (± SD) score of the ICF-US I was 25.38 (± 6.89), indicating that the digital solution Brain on Track was considered a strong facilitator. Of the 238 participants, 230 (96.6%) scored above the cutoff (10 points) for considering the application a facilitator. The remaining 8 participants scored the digital solution below the cutoff, indicating that they considered the application a barrier.
Univariable associations with evaluator reported usability
Education, sleep quantity, subjective memory complaints and cognitive function were all significantly associated with evaluator-reported usability in the univariable analysis (Table 2).
Table 2. Univariable associations with evaluator-reported usability.

| Independent variable | Unstandardized coefficient (95% CI) | Standardized coefficient | p |
| --- | --- | --- | --- |
| Education (years) | 0.85 (0.64; 1.05) | 0.47 | < 0.001 |
| Sleep quantity (hours) | 1.21 (0.45; 1.97) | 0.20 | 0.002 |
| Subjective memory complaints | − 0.51 (− 0.81; − 0.20) | − 0.21 | 0.001 |
| MoCA scores | 1.77 (1.42; 2.12) | 0.18 | < 0.001 |
Multivariable associations with evaluator reported usability
Education, sleep quantity, subjective memory complaints, MoCA scores, sex and age were all included in the multivariable analysis as independent variables. Of these, age, education, subjective memory complaints and cognitive function remained significant in the multivariable model (F = 38.87, p < 0.001), with an adjusted R2 of 0.391 (Table 3). MoCA scores alone showed an adjusted R2 of 0.288.
Table 3. Multivariable associations with evaluator-reported usability.

| Model | Unstandardized coefficient (95% CI) | Standardized coefficient | p |
| --- | --- | --- | --- |
| Constant | 2.36 (− 9.67; 14.38) |  |  |
| MoCA score | 0.88 (0.44; 1.32) | 0.27 | < 0.001 |
| Age | − 0.13 (− 0.19; − 0.07) | − 0.24 | < 0.001 |
| Education | 0.42 (0.20; 0.64) | 0.23 | < 0.001 |
| Subjective memory complaints | − 0.35 (− 0.60; − 0.10) | − 0.15 | 0.006 |
Discussion
This study explored the association between cognitive function and usability reported by the evaluator in a community-based sample, and the results suggest that cognitive function impacts evaluator-reported usability, alongside age, education and subjective memory complaints.
Cognitive function accounted for most of the variance in usability explained by the multivariable model, which in total explained around 39% of the usability variance. The remaining 61% may be explained by the intrinsic usability of the system and by other variables that are yet to be studied.
Studies have suggested that using digital solutions, such as personal computers29 or tablets30, helps maintain or enhance cognitive functions such as memory, processing speed and attention. We are not aware of previous studies exploring the association between cognitive function and perceived usability, whether from the evaluator’s or the user’s perspective. However, for a user to be efficient in the use of a certain digital solution, several cognitive functions are involved, including concentration, attention, and memory31. Conceivably, the better the user's ability in these areas, the easier the interaction with this type of technology will be. Conversely, users with lower cognitive ability may have more difficulty interacting with complex digital solutions. The findings of the present study suggest that assessing users’ cognitive function in usability studies is relevant, and they raise important questions, namely: to what extent is it necessary to differentiate interfaces according to the cognitive skills of the users? Another important issue is the role of training, as the association between cognitive function and usability might decrease with continued use, but individuals with lower cognitive skills might require longer periods of training to be able to use a digital solution effectively and efficiently. The results of this study also reinforce the importance of applying usability principles in the design of digital solutions, such as minimizing memory load (users should be able to recognize rather than recall) and using a minimalist design to avoid distractions and favor concentration32,33.
It is important to note that this study did not include individuals with a previous diagnosis of mental or neurological diseases, but people living in the community with no identified cognitive impairment.
A limitation of the present study is that digital literacy was not considered, although it is one of the variables mentioned in the literature as influencing usability10,34. This variable is difficult to assess comprehensively, as it is not just about being proficient in the use of technology but also about its ethical and responsible use, and it is related to social and cultural aspects35. Another limitation is that neither user self-perceived usability nor objective measures of usability were evaluated, and it would be interesting to explore whether they are also associated with cognitive function. The ICF-US I was used to study the association with usability reported by the evaluator, as it is a reliable tool that overcomes the difficulty, reported in the literature, that the opinions of users collected through self-perceived generic usability scales do not fully reflect users’ performance26,36.
Conclusion
Cognitive function seems to be associated with the usability of digital solutions and to influence human-technology interaction. The results of this study suggest that cognition has an even more pronounced influence on usability than other user characteristics such as age or education.
Acknowledgements
This study was partially supported by the Águeda City Council as part of a community cognitive screening program.
Author contributions
A.I.M., V.T.C., J.P. and N.P.R. designed the research, A.I.M. conducted the data collection, A.G.S. and A.I.M. analyzed and interpreted the results, A.I.M., A.G.S., V.T.C., J.P. and N.P.R. discussed the results and A.I.M., A.G.S. and N.P.R. wrote the manuscript. All authors reviewed the final document.
Data availability
All data needed to evaluate the conclusions in the paper are present in the paper. Additional data related to this paper may be requested from the authors.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.International Organization for Standardization [ISO]. Ergonomics of human system interaction—Part 210: Human-centred design for interactive systems (ISO 9241-210:2019). (2019).
- 2. Martins AI, Queirós A, Rocha NP. Validation of a usability assessment instrument according to the evaluators’ perspective about the users’ performance. Univ. Access Inf. Soc. 2020;19:515–525. doi: 10.1007/s10209-019-00659-w.
- 3. Nunes IL. Ergonomics & usability—Key factors in knowledge society. Enterp. Work Innov. Stud. 2006;2:87–94.
- 4. Dix A, Finlay J, Abowd G, Beale R. Human–Computer Interaction (Prentice Hall, 2004).
- 5. Martins AI, Queirós A, Silva AG, Rocha NP. Usability evaluation methods: A systematic review. Hum. Factors Softw. Dev. Des. 2015. doi: 10.4018/978-1-4666-6485-2.ch013.
- 6. Petrie H, Bevan N. The evaluation of accessibility, usability, and user experience. In The Universal Access Handbook (ed. Stepanidis C) 1–16 (CRC Press, 2009). doi: 10.1201/9781420064995-c20.
- 7. Borsci S, et al. Designing medical technology for resilience: Integrating health economics and human factors approaches. Expert Rev. Med. Dev. 2017;15:15–26. doi: 10.1080/17434440.2018.1418661.
- 8. Chaniaud N, Megalakaki O, Capo S, Loup-Escande E. Effects of user characteristics on the usability of a home-connected medical device (Smart Angel) for ambulatory monitoring: Usability study. JMIR Hum. Factors. 2021;8:e24846. doi: 10.2196/24846.
- 9. Kim HH, et al. User-dependent usability and feasibility of a swallowing training mHealth app for older adults: Mixed methods pilot study. JMIR Mhealth Uhealth. 2020;8:e19585. doi: 10.2196/19585.
- 10. Sparkes J, Valaitis R, McKibbon A. A usability study of patients setting up a cardiac event loop recorder and BlackBerry gateway for remote monitoring at home. Telemed. E-Health. 2012;18:484–490. doi: 10.1089/tmj.2011.0230.
- 11. Chadwick-Dias A, et al. Web usability and age. In Proceedings of the 2003 Conference on Universal Usability—CUU ’03, 30 (ACM Press, 2003). doi: 10.1145/957205.957212.
- 12. Kaufman DR, et al. Usability in the real world: Assessing medical information technologies in patients’ homes. J. Biomed. Inform. 2003;36:45–60. doi: 10.1016/S1532-0464(03)00056-X.
- 13. Voncken-Brewster V, et al. Usability evaluation of an online, tailored self-management intervention for chronic obstructive pulmonary disease patients incorporating behavior change techniques. JMIR Res. Protoc. 2013;2:e3. doi: 10.2196/resprot.2246.
- 14. Kastner M, Lottridge D, Marquez C, Newton D, Straus SE. Usability evaluation of a clinical decision support tool for osteoporosis disease management. Implement. Sci. 2010;5:1–12. doi: 10.1186/1748-5908-5-96.
- 15. Bhuiyan M, Zaman A, Miraz MH. Usability evaluation of a mobile application in extraordinary environment for extraordinary people (2017).
- 16. Matz K. User requirements: Understanding your users’ characteristics. Architecting Usability. http://architectingusability.com/2012/06/15/user-requirements-understanding-your-users-characteristics/ (2012).
- 17. Wu Y-H, Lewis M, Rigaud A-S. Cognitive function and digital device use in older adults attending a memory clinic. Gerontol. Geriatr. Med. 2019;5:2333721419844886. doi: 10.1177/2333721419844886.
- 18. Krug S. Don’t Make Me Think (Pearson Education, US).
- 19. Contreras-Somoza LM, et al. Usability and user experience of cognitive intervention technologies for elderly people with MCI or dementia: A systematic review. Front. Psychol. 2021;12:1401. doi: 10.3389/fpsyg.2021.636116.
- 20. de Frias CM, Dixon RA, Strauss E. Characterizing executive functioning in older special populations: From cognitively elite to cognitively impaired. Neuropsychology. 2009;23:778–791. doi: 10.1037/a0016743.
- 21. Schmand B, Jonker C, Hooijer C, Lindeboom J. Subjective memory complaints may announce dementia. Neurology. 1996;46:121–125. doi: 10.1212/WNL.46.1.121.
- 22. Ginó S, et al. Escala de Queixas de Memória. In Escalas e testes na demência (eds Mendonça A & Guerreiro M) 117–120 (GEECD, 2007).
- 23. Nasreddine ZS, et al. The Montreal Cognitive Assessment, MoCA: A brief screening tool for mild cognitive impairment. J. Am. Geriatr. Soc. 2005;53:695–699. doi: 10.1111/j.1532-5415.2005.53221.x.
- 24. Freitas S, Simões M, Martins C. Estudos de adaptação do Montreal Cognitive Assessment (MoCA) para a população portuguesa. Avaliação Psicológica (2010).
- 25. Freitas S, Simões MR, Alves L, Santana I. Montreal Cognitive Assessment (MoCA): Normative study for the Portuguese population. J. Clin. Exp. Neuropsychol. 2011;33:989–996. doi: 10.1080/13803395.2011.589374.
- 26. Martins AI, Queirós A, Silva AG, Rocha NP. ICF based Usability Scale: Evaluating usability according to the evaluators’ perspective about the users’ performance. In Proceedings of the 7th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion—DSAI 2016, 378–383 (ACM Press, 2016). doi: 10.1145/3019943.3019997.
- 27. WHO. International Classification of Functioning, Disability and Health (2001).
- 28. Martins AI, Queirós A, Cerqueira M, Rocha N, Teixeira A. The International Classification of Functioning, Disability and Health as a conceptual model for the evaluation of environmental factors. Proc. Comput. Sci. 2012;14:293–300. doi: 10.1016/j.procs.2012.10.033.
- 29. Bordone V, Scherbov S, Steiber N. Smarter every day: The deceleration of population ageing in terms of cognition. Intelligence. 2015;52:90–96. doi: 10.1016/j.intell.2015.07.005.
- 30. Chan MY, Haber S, Drew LM, Park DC. Training older adults to use tablet computers: Does it enhance cognitive function? Gerontologist. 2016;56:475–484. doi: 10.1093/geront/gnu057.
- 31. Small GW, et al. Brain health consequences of digital technology use. Dialog. Clin. Neurosci. 2020;22:179. doi: 10.31887/DCNS.2020.22.2/gsmall.
- 32. Nielsen J. Usability Engineering (Academic Press, 1993).
- 33. Lillemaa M, Mazumder FK, Das UK. Usability guidelines for usable user interface.
- 34. Georgsson M, Staggers N. Quantifying usability: An evaluation of a diabetes mHealth system on effectiveness, efficiency, and satisfaction metrics with associated user characteristics. J. Am. Med. Inform. Assoc. 2015;23:5–11. doi: 10.1093/jamia/ocv099.
- 35. Burnett C, Merchant G. Is there a space for critical literacy in the context of social media? Engl. Teach. Pract. Crit. 2011;10:41–57.
- 36. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int. J. Hum. Comput. Interact. 2008;24:574–594. doi: 10.1080/10447310802205776.