Abstract
Objective
Nursing work in ophthalmic wards is demanding and fast-paced, and many researchers have tried to introduce artificial intelligence (AI) technology to support nurses in their tasks. This study aims to develop an intelligent assistant system for ophthalmic ward nurses using augmented reality (AR) and AI technology, and to evaluate the system's usability and acceptability in supporting nurses' clinical work.
Methods
Using AR technology within a deep learning framework, the system management, function, and interface modules were implemented with acoustic recognition, voice interaction, and image recognition technologies. The result was an intelligent assistance system with functions such as patient face recognition, automatic information matching, and nursing work management. Ophthalmic day ward nurses were invited to complete the System Usability Scale (SUS), with the AR-based intelligent assistance system (AR-IAS) as the experimental group and the existing personal digital assistant (PDA) system as the control group. Scores on the three subscales of the usability scale (learnability, efficiency, and satisfaction) were compared between groups, and the clinical usability score of the AR-IAS system was calculated.
Results
This study showed that the AR-IAS and the PDA systems had learnability subscale scores of 22.50/30.00 and 21.00/30.00, respectively; efficiency subscale scores of 29.67/40.00 and 28.67/40.00, respectively; and satisfaction subscale scores of 23.67/30.00 and 23.17/30.00, respectively. The overall usability score of the AR-IAS system was 75.83/100.00.
Conclusion
Based on the SUS results, the AR-IAS system developed using AR and AI technology has good overall usability and is acceptable to clinical nurses. It is suitable for ophthalmic nursing tasks and warrants clinical promotion and further research.
Keywords: Digital health, technology, medicine, augmented reality, artificial intelligence, ophthalmic nursing, system usability scale
Introduction
Because ophthalmic surgery is characteristically “short, frequent, and fast,” ophthalmic day surgery wards have become a growing trend. Large numbers of surgical patients can therefore gather in a short period of time, posing great challenges for ophthalmic ward nursing1 and placing higher demands on the nursing quality and safety delivered by day surgery ward nurses. Nursing work in the day surgery ward mainly involves patient information verification, surgical information verification, execution of medical orders, health education, patient surgery transfer handover, and medical record writing. To address these challenges, the government has introduced many supporting policies, and many hospitals have developed response plans and adopted new technologies and tools; for example, mobile handheld personal digital assistant (PDA) terminals and mobile nursing carts have improved day surgery ward nursing work. Our hospital equipped its day surgery ward with a mobile nursing cart when the service opened; however, because the cart occupies considerable space and is awkward to move, we also introduced PDA devices in 2019. Some problems remain in clinical use. Although handheld PDAs are relatively portable, they tie up nurses' hands: nurses must repeatedly take the PDA out of their pockets during nursing work, which not only complicates operations and increases the frequency of hand hygiene but also raises the risk of nosocomial infection. Given these issues, our hospital is exploring new methods that can free up nurses' hands while assisting them in their work. We are using augmented reality (AR) and artificial intelligence (AI) technologies to build an intelligent assistant system for ophthalmic nursing. We hope this system will assist nurses with clinical nursing in the ophthalmology ward, reduce their workload, and improve the quality and safety of nursing work. Evaluating the system's usability will help us gauge nurses' adoption and acceptance of the system, and through this evaluation we can continuously improve it.
AR is an emerging digital technology that combines virtual information with the real world through computer image processing, allowing users to experience an enhanced or altered reality. In recent years, AR has been applied successfully in fields such as industry, agriculture, healthcare, the military, and education.2–6 In healthcare, AR technology is used primarily for surgical training, evaluation, and guidance,7–9 and has greatly improved the efficiency and accuracy of medical diagnosis and treatment. AR is also widely utilized in surgical procedures, image analysis, remote consultations, and medical education.10–12 In nursing, research on AR technology both domestically and internationally has focused mainly on teaching and experimental settings; relevant studies have shown that AR holds important value for changing nursing teaching methods by stimulating students' innovative thinking while avoiding the risks of actual experiments.13–16 AI is an emerging technological science that studies and develops theories, methods, technologies, and application systems for simulating, extending, and expanding human intelligence; research in this field includes robotics, speech recognition, image recognition, natural language processing, and expert systems. In ophthalmic medical nursing, AI has already been widely applied in nursing robots, optical coherence tomography (OCT) image interpretation, and the construction of specialty disease knowledge bases.17–21
To evaluate the usability of our system, appropriate measurement tools are needed. Several validated subjective scales are available for evaluating system usability: Lund proposed the Usefulness, Satisfaction, and Ease of Use (USE) scale; Tullis proposed the Questionnaire for User Interaction Satisfaction (QUIS); Finstad proposed the Usability Metric for User Experience (UMUX); and Lewis proposed the After-Scenario Questionnaire (ASQ) and the Post-Study System Usability Questionnaire (PSSUQ).22–27 Among them, the System Usability Scale (SUS) proposed by Brooke28 is one of the most widely used and accepted measures and is commonly employed to evaluate the usability of medical innovations.29,30 The SUS scientifically quantifies user experience and measures the overall usability of a product or system after users complete a series of task scenarios. The International Organization for Standardization's ISO 9241-11 standard defines usability in terms of learnability, efficiency, and user satisfaction when specific users use products for specific purposes in specific environments.31 The SUS quickly and easily captures users' subjective evaluation of system usability,32 with results immediately scored on a single scale ranging from 0 to 100. Isgin-Atici and others33–36 have also used the SUS to evaluate systems such as rehabilitation tools and breast cancer applications. In this study, we adopt the SUS to evaluate the usability of an ophthalmic nursing intelligent assistant system based on AR and AI technology.
Based on research and applications both domestically and internationally, there has been no report on the actual implementation of AR technology in clinical nursing work in day surgery wards. This project is therefore significant in exploring the application of AR technology in clinical nursing. The objective of this article is to describe the design and implementation of the system and to evaluate its usability in clinical nursing work and its acceptability to nurses.
Methods
Design and implementation of the system
As a wearable device, AR glasses offer excellent mobility and flexibility. Nurses need to interact with the system while using it; therefore, 5G or full wireless Wi-Fi coverage throughout the ward is necessary to provide fundamental support for system operation. By combining machine vision and auditory AI algorithms, the AR glasses enable nurses to accurately verify patient identity, access surgical information, retrieve medical orders, and check education status while performing various nursing tasks. The system can also automatically record real-time audio-visual information for each nursing operation and execution process to achieve closed-loop management. In this way, the AR glasses help nurses improve the accuracy and efficiency of patient care, reduce the pressure of clinical nursing work, and improve hospital nursing quality and safety.
In this study, we utilized free software components to design and construct the architecture of the ophthalmic nursing intelligent assistance system. The AR glasses were Vuzix M400 smart glasses, which feature the Qualcomm Snapdragon XR1 platform, 64 GB of internal flash memory, and an 8-core Arm-based CPU. The AR nursing system was developed on the Unity 3D platform, based on the Spring, Spring Boot, and Spring Cloud frameworks. Project management followed the Scrum methodology; development used the Android Studio integrated development environment, with Java and C# as the programming languages, to complete the AR nursing system and its integration with the Hospital Information System (HIS) and Electronic Medical Record (EMR) system. The system's overall usability, learnability, efficiency, and satisfaction were evaluated with the SUS, and its clinical usability was then assessed from the SUS results.
System architecture
Owing to the excellent mobility of the AR glasses, 5G or full wireless Wi-Fi coverage throughout the ward is essential to support system operation. The system comprises three main modules: system management, system functions, and system interfaces. Notably, only the system function module requires the AR glasses; the other modules do not. The system architecture is shown in Figure 1.
Figure 1.
Architecture diagram of augmented reality-based intelligent assistance system (AR-IAS).
System management
The management of this system covers several key aspects, including the configuration of initialization parameters, department and user management, and the setting and querying of educational outline templates, as well as statistical analysis of system usage and management of system log services. During initialization, user information, including voiceprint data, is collected for login verification, together with the corresponding permissions and self-selected items; this covers the management of users, roles, and permissions. The statistical report function plays a crucial role in querying the data generated within the AR system, facilitating better analysis of system data and improvement of nursing work. Currently implemented query functions include monitoring medical order execution status, verifying surgical information, providing education reminders, and documenting patient surgical transport handover records.
System function
This module is the central component of the system, enabling nurses to carry out various nursing tasks efficiently. To suit the characteristics of the device, the AR login system offers two authentication options: voiceprint recognition or scanning the nurse's personal QR code. Interaction with the system is conducted primarily through voice commands, allowing nurses to issue instructions without physically touching the device. At present, the system supports four key scenarios: patient identification and verification of surgical information, verification and execution of medical orders, provision of health education, and management of patient surgery transfer handover.
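Voice interaction of this kind essentially reduces to mapping each recognized utterance to a handler for one of the four scenarios. The sketch below illustrates the idea in Python; the command vocabulary and function names are illustrative assumptions, not the actual AR-IAS implementation.

```python
# Minimal sketch of voice-command dispatch. The spoken vocabulary and
# handler names are assumptions for illustration only.

def identify_patient():
    print("Patient identification and surgical information verification")

def execute_orders():
    print("Medical order verification and execution")

def patient_education():
    print("Preoperative health education")

def transfer_handover():
    print("Patient surgery transfer handover")

COMMANDS = {
    "one": identify_patient,
    "two": execute_orders,
    "three": patient_education,
    "four": transfer_handover,
}

def dispatch(recognized_text: str) -> None:
    """Route a recognized utterance to the matching function."""
    handler = COMMANDS.get(recognized_text.strip().lower())
    if handler is None:
        print("Command not recognized; please repeat.")
    else:
        handler()

dispatch("two")  # -> Medical order verification and execution
```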
System interface
The system is integrated with the hospital's HIS and EMR systems, allowing intelligent verification of patient information, surgical information, and medical orders. We have also reserved interfaces for integration with other systems, such as the Laboratory Information System (LIS) and the Picture Archiving and Communication System (PACS); integrating with these systems only requires following the defined interface.
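As an illustration of what such a reserved, interface-driven integration might look like, the following sketch calls a hypothetical REST endpoint; the base URL, paths, and payload fields are assumptions for illustration and not the hospital's actual API.

```python
# Hypothetical example of the reserved-interface integration style.
# Endpoint paths, fields, and the base URL are assumptions, not the
# hospital's actual API.
import requests

BASE_URL = "https://his.example.org/api"  # placeholder address

def fetch_pending_orders(patient_id: str) -> list[dict]:
    """Query a patient's unexecuted medical orders from the HIS."""
    resp = requests.get(
        f"{BASE_URL}/patients/{patient_id}/orders",
        params={"status": "pending"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()

def mark_order_completed(order_id: str, nurse_id: str) -> None:
    """Write an order-completion event (with e-signature) back to the EMR."""
    resp = requests.post(
        f"{BASE_URL}/orders/{order_id}/complete",
        json={"executed_by": nurse_id},
        timeout=5,
    )
    resp.raise_for_status()
```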
System usage process and function introduction
First, the nurse puts on AR glasses and reads the numbers prompted by the system. The system then compares these numbers with the voiceprint data stored in the database. If a match is found, the nurse is granted access to the system (see Figure 2). Subsequently, nurses can interact with the system using voice commands and read out specific numbers associated with different functions to navigate through various functional interfaces based on their work requirements (see Figure 3).
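Conceptually, the voiceprint check compares a speaker embedding of the live audio (the nurse reading the prompted digits) with the embedding enrolled at initialization. A minimal sketch, assuming a generic speaker encoder (`embed_voice` is a hypothetical stand-in) and an assumed similarity threshold:

```python
# Sketch of voiceprint matching by cosine similarity of speaker
# embeddings. `embed_voice` stands in for whatever speaker encoder the
# system actually uses; the 0.75 threshold is an assumed value.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_login(live_audio, enrolled: np.ndarray, embed_voice,
                 threshold: float = 0.75) -> bool:
    """Grant access only if the live voiceprint matches the enrolled one."""
    live = embed_voice(live_audio)  # hypothetical speaker encoder
    return cosine_similarity(live, enrolled) >= threshold
```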
When using the patient identification and surgical information verification function, the nurse scans the patient's face after entering the interface (see Figure 4), and the system matches it against the patient information in the database to obtain details such as the patient's name, hospitalization number, surgery name, and anesthesia method. If the match succeeds (see Figure 5(a)), the patient and surgery information is displayed on the left side of the screen, and the nurse can perform patient identification and surgical information verification; otherwise, a red warning sign is displayed on the right side of the screen to alert the nurse to an abnormal patient information match (see Figure 5(b)).
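One possible way to implement such face matching is with the open-source `face_recognition` Python library; the paper does not specify the system's actual face-recognition stack, so treat the following as an assumption.

```python
# Face matching against enrolled patient encodings, using the open-source
# `face_recognition` library (an assumption; the actual AR-IAS stack is
# not published).
import face_recognition

def match_patient(camera_frame, enrolled):
    """Return the patient record whose enrolled encoding best matches the
    face in `camera_frame`, or None if no acceptable match exists.

    `enrolled` is a list of (patient_record, 128-d face encoding) pairs.
    """
    encodings = face_recognition.face_encodings(camera_frame)
    if not encodings:
        return None  # no face detected in the frame
    live = encodings[0]
    known = [enc for _, enc in enrolled]
    distances = face_recognition.face_distance(known, live)
    best = distances.argmin()
    # 0.6 is the library's conventional tolerance; lower is stricter.
    return enrolled[best][0] if distances[best] <= 0.6 else None
```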
After the “Medical Orders” function is selected, the AR glasses display a list of the patient's unexecuted medical orders (see Figure 5(c)). The nurse only needs to say “completed” after executing an order, and the temporary order execution information in the patient's electronic medical record is updated accordingly. During preoperative preparations such as flushing the conjunctival sac or dilating the pupil, the AR glasses can intelligently assist the nurse in checking the left or right eye to ensure accurate execution of medical orders.
When the “Patient education” function is selected, nurses first scan the patient's face so the system can determine, based on whether education has already been provided, which health education content should be delivered. The results are displayed on the screen as shown in Figure 5(d). Once a patient completes an educational activity, the system synchronously updates the disease nursing record with the details of the health education, including who provided it, for future reference.
After selecting the “Patient Surgical Transport Handover Management” function, when the patient leaves for the operating room, the day ward nurse scans the patient's face, verifies the patient's identity, and updates the EMR. When the patient returns from the operating room to the ward, the face is scanned again, the identity is verified, and the EMR is updated once more. After each function is completed, nurses can also issue a “return” command to go back to the previous menu level automatically.
Figure 2.
System usage process chart.
Figure 3.
Nurse login system.
Figure 4.
Nurse scanning patient's face.
Figure 5.
System interface screenshot. (a) Patient matching success interface. (b) Patient matching failure interface. (c) Patient order execution interface. (d) Patient education interface.
Patient identification and surgical information verification module
The function of this module includes the recognition and verification of patient information, as well as verification of the surgical eye, surgical method, surgical site identification, and anesthesia method. The majority of surgical patients in our hospital are cataract patients. With the growing demand for visual quality of life, not only are various types of intraocular lenses emerging, but precise femtosecond cataract surgery is also becoming more popular, so the surgical method and lens type differ from patient to patient. Before surgery, the doctor marks the surgical method and eye for each patient. For example, for right-eye astigmatism intraocular lens implantation, the surgeon uses a skin marker pen to draw a circle 1 cm in diameter approximately 1 cm above the right brow bone, makes an astigmatism mark on the cornea, and then ticks inside the circle to form a distinctive symbol (a circled tick). For femtosecond laser-assisted cataract surgery, a different symbol is drawn for differentiation, while other routine cataract surgeries are marked with a plain circle (○) directly above the operative eye. The AR glasses apply AI deep learning to labeled images of these different surgical marks to verify the patient's surgical approach. When identifying and verifying patient information, nurses wear the AR glasses, scan the patient's face, and obtain patient information through interaction with the HIS and EMR systems; the patient's identity is identified and verified. After that, the patient's surgical eye, surgical method, anesthesia method, and surgical site identification are automatically verified. If verification fails, an alarm is triggered, and the nurse receives the alarm signal and conducts detailed analysis and handling.
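At inference time, this verification step reduces to image classification. A minimal PyTorch sketch follows; the class labels, input size, and model architecture are assumptions, since the paper states only that a deep model was trained on labeled images of the marks.

```python
# Illustrative inference step for recognizing the preoperative skin marks
# (circled tick, femtosecond symbol, plain circle). Class order, input
# size, and the model itself are assumptions for illustration.
import torch
from torchvision import transforms
from PIL import Image

CLASSES = ["astigmatism_iol", "femtosecond", "routine"]  # assumed labels

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def classify_mark(image_path: str, model: torch.nn.Module) -> str:
    """Return the predicted surgical-mark class for one cropped image."""
    model.eval()
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return CLASSES[int(logits.argmax(dim=1))]
```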
Medical order verification and execution module
Wearing the AR glasses, the nurse selects this function by voice to retrieve the patient's pending medical orders and executes them after verifying the order information. After execution, the nurse simply says “execution completed”; the system immediately interacts with the EMR to update the order status in the patient's electronic medical record to “Completed” and records the electronic signature of the logged-in nurse in the AR intelligent nursing system.
Preoperative education module
This module is a crucial component of the system, with nurses playing a vital role in patient education. Day surgery patients arrive at the hospital according to their scheduled surgery times, and surgeries commence one after another following preoperative preparations. Whereas preoperative education is typically conducted centrally in regular wards, day wards must deliver it in batches, increasing nurses' workload and the risk of omissions, so real-time recording of completed preoperative education is of utmost importance. During preoperative education, nurses can wear the AR glasses to scan the patient's face and check whether the patient has received the necessary education. For patients who have not yet been educated, the AR system provides an educational outline for guidance. Once the education is completed, the nurse only needs to say “execution completed,” and the system immediately interacts with the EMR, ensuring that the completion of preoperative education is recorded in the patient's EMR.
Patient surgery transport handover module
Patient surgery transport handover check is a critical aspect of patient safety and a key responsibility of ward nursing staff. Its primary objective is to guarantee the accuracy of patient identification and ensure the safety of handover procedures. With the utilization of the AR intelligent nursing system, nurses can scan the patient's face to accurately identify them, verify the details of the surgery handover, and ensure patient safety throughout the entire surgical transportation handover process. Furthermore, this system has the capability to record real-time location movements of patients and accurately update their information in electronic medical records. This ensures that all relevant details regarding when patients are transported to the operating room or back to the ward are precisely recorded in surgical handover forms.
All data generated by these operations are recorded in a database that the head nurse can access via an office computer through the system interface functions for querying, statistics, and analysis. This helps improve management and enhance nursing quality in ward work.
System usability analysis
System usability scale
The SUS is a validated instrument for measuring system usability and user satisfaction.28 It provides an overall assessment of usability through a questionnaire of 10 items: odd-numbered items are positive statements, and even-numbered items are negative statements. Subscale 1 includes items 4, 5, and 10, measuring “Learnability”; subscale 2 comprises items 2, 3, 7, and 8, measuring “Use Efficiency”; and subscale 3 consists of items 1, 6, and 9, assessing “Satisfaction.” Each statement is rated on a five-point Likert scale from one (strongly disagree) to five (strongly agree). For odd-numbered items, one is subtracted from the score (e.g. if item 1 is scored four, the adjusted score is three); for even-numbered items, the score is subtracted from five (e.g. if item 2 is scored three, the adjusted score is two). The adjusted scores are summed and multiplied by 2.5 to yield the SUS usability score. According to the SUS evaluation criteria, scores above 70 indicate acceptable usability, scores above 85 indicate excellent usability, and scores above 90 indicate truly exceptional usability.30 Bangor's large-sample study in 2008 reported a reliability coefficient of 0.91 for the SUS.37 Multiple empirical studies have demonstrated that the SUS questionnaire is effective for evaluation purposes and requires only a small sample size.
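The scoring rule translates directly into code. The short sketch below implements it exactly as described; the example response vector is illustrative only.

```python
# Direct implementation of the SUS scoring rule described above.
def sus_score(responses):
    """responses: list of 10 Likert ratings (1-5), item 1 first."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd: r-1; even: 5-r
    return total * 2.5

# Example with an illustrative response pattern.
print(sus_score([4, 3, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 72.5
```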
Sample size calculation
When selecting participants for system usability testing, several factors should be considered, including application coverage, the target user group, the test content, and the objectives. Research by Virzi and Nielsen suggested that as few as 5 participants can reveal approximately 80% of a product's usability problems.38,39 Lewis, however, argued that the relationship between the number of participants and the rate of problem identification follows a curve:40,41 5 participants identify only about 55% of usability problems, 10 participants about 80%, and 15 participants close to 100%. Because this study's AR ophthalmic nursing intelligent assistance system was deployed in the day surgery ward, all 15 ward nurses were involved in the testing process.
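These estimates follow the standard problem-discovery model, in which the proportion of problems found by n participants is 1 − (1 − p)^n for a mean per-participant detection probability p. A quick sketch (p = 0.31 is an average rate often cited by Nielsen, used here as an assumption; Lewis's lower per-user rates yield the flatter curve above):

```python
# Problem-discovery model commonly used in this literature: the
# proportion of usability problems found by n participants is
# 1 - (1 - p)^n, where p is the mean per-participant detection rate.
def proportion_found(n: int, p: float = 0.31) -> float:
    return 1 - (1 - p) ** n

for n in (5, 10, 15):
    print(n, round(proportion_found(n), 2))
# -> 5 0.84, 10 0.98, 15 1.0 with the assumed p = 0.31
```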
Permits and ethical approval
This study required the collection of patients' facial information for identity verification and of surgical site marks for AI learning and training, as well as the ward nurses' SUS responses. Participating patients therefore signed the “Patient Informed Consent Form,” and nurses signed the “Subject Informed Consent Form.” The study was approved by the Ethics Committee of the Eye Hospital Affiliated to Wenzhou Medical University (approval number H2023-020-K-17-02).
Data collection and statistical analysis
After a 3-month trial period, we collected 15 valid datasets. Each nurse completed the informed consent form and the SUS questionnaire, which also gathered personal information such as age, education level, and whether they wore glasses. We used IBM SPSS Statistics 24 for data analysis and described demographic information using proportions. The paired t-test and the Wilcoxon signed-rank test were used to analyze differences in the three dimensions of the usability scale between the experimental and control groups. The independent-samples t-test and the Mann-Whitney U test were used to analyze differences in system usability by education level and glasses wearing. Additionally, linear regression and a generalized linear model were used to investigate the effect of age on system usability.
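For reference, the same family of tests can be reproduced with SciPy in place of SPSS. The sketch below uses synthetic placeholder arrays, not the study data.

```python
# Sketch of the statistical comparisons described above using SciPy.
# All arrays below are synthetic placeholders, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ar_ias = rng.normal(75, 10, size=15)  # placeholder SUS scores, AR-IAS
pda = rng.normal(72, 10, size=15)     # placeholder SUS scores, PDA

# Paired comparisons between the two systems on the same 15 nurses:
t, p = stats.ttest_rel(ar_ias, pda)       # paired t-test
w, p_w = stats.wilcoxon(ar_ias, pda)      # Wilcoxon signed-rank test

# Between-group comparisons (e.g. glasses vs no glasses):
g1, g2 = ar_ias[:8], ar_ias[8:]
t2, p2 = stats.ttest_ind(g1, g2)          # independent-samples t-test
u, p_u = stats.mannwhitneyu(g1, g2)       # Mann-Whitney U test

# Simple linear regression of SUS score on age:
age = rng.integers(25, 46, size=15)
slope, intercept, r, p_r, se = stats.linregress(age, ar_ias)
```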
Results
Participant description
There were 15 female participants in this study, with an average age of 35.47 ± 5.88 years. Seven participants (46.7%) had a junior college degree and 8 (53.3%) had a bachelor's degree; 8 participants (53.3%) wore glasses and 7 (46.7%) did not. These data are presented in Table 1. During the system trial period, no adverse events were reported, and no participants withdrew from the study for any reason.
Table 1.
Basic information of participants.
| Characteristic | | Values |
|---|---|---|
| Age (mean ± SD) (years) | | 35.47 ± 5.88 |
| Educational level (N (%)) | Junior college | 7 (46.7) |
| | Bachelor | 8 (53.3) |
| Glasses (N (%)) | Wearing glasses | 8 (53.3) |
| | Not wearing glasses | 7 (46.7) |
SD: standard deviation.
Results of the AR-IAS and PDA system usability scales
The SUS survey results for the AR-IAS and PDA systems are presented in Table 2. The statistical results for the learnability, efficiency, and satisfaction subscales of both systems are presented in Table 3.
Table 2.
Descriptive statistics for each item of the SUS questionnaire for AR-IAS and PDA.
| No. | SUS item | AR-IAS Mean (SD) | PDA Mean (SD) |
|---|---|---|---|
| 1 | I think that I would like to use the AR-IAS/PDA frequently | 3.40 (0.61) | 3.40 (0.49) |
| 2 | I found the AR-IAS/PDA unnecessarily complex | 2.93 (0.93) | 2.67 (1.01) |
| 3 | I thought the AR-IAS/PDA was easy to use | 2.67 (0.94) | 2.53 (1.15) |
| 4 | I think that I would need the support of a technical person to be able to use the AR-IAS/PDA | 3.07 (0.85) | 3.07 (1.06) |
| 5 | I found the various functions in the AR-IAS/PDA were well integrated | 2.60 (0.95) | 2.73 (0.68) |
| 6 | I thought there was too much inconsistency in the AR-IAS/PDA | 2.93 (0.77) | 2.87 (1.02) |
| 7 | I would imagine that most people would learn to use the AR-IAS/PDA very quickly | 3.53 (0.62) | 3.13 (0.88) |
| 8 | I found the AR-IAS/PDA very cumbersome to use | 2.73 (1.06) | 3.13 (0.81) |
| 9 | I felt very confident using the AR-IAS/PDA system | 3.13 (0.72) | 3.00 (0.82) |
| 10 | I needed to learn a lot of things before I could get going with the AR-IAS/PDA | 3.33 (0.60) | 2.60 (0.80) |
| | SUS value | 75.83 (10.72) | 72.83 (10.97) |
AR-IAS: augmented reality-based intelligent assistance system; PDA: personal digital assistant; SD: standard deviation; SUS: System Usability Scale.
Table 3.
Statistical results of AR-IAS and PDA subscales.
| Group | Learnability Mean (SD) | t | p | Efficiency Median (IQR) | Z | p | Satisfaction Mean (SD) | t | p | SUS Mean (SD) | t | p |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| AR-IAS | 22.50 (4.53) | 1.86 | 0.07 | 2.77 (2.00–4.00) | −0.879 | 0.379 | 23.67 (3.52) | 0.11 | 0.91 | 75.83 (10.72) | 1.759 | 0.08 |
| PDA | 21.00 (4.20) | | | 2.93 (2.00–4.00) | | | 23.17 (4.67) | | | 72.83 (10.97) | | |
SUS, learnability, satisfaction: paired t-test; efficiency: Wilcoxon signed-rank test (p < 0.05 is statistically significant).
AR-IAS: augmented reality-based intelligent assistance system; IQR: interquartile range; PDA: personal digital assistant; SD: standard deviation; SUS: System Usability Scale.
AR-IAS usability analysis
The impact of education level and glasses wearing on the overall usability, learnability, efficiency, and satisfaction of the AR-IAS system was analyzed with the independent-samples t-test and the Mann-Whitney U test; the results are presented in Tables 4 and 5. The results of the analysis of the impact of age on the AR-IAS usability scale are presented in Table 6.
Table 4.
Significance test of the impact of wearing glasses on system usability.
| Measure | Total | Not wearing glasses | Wearing glasses | Value / p |
|---|---|---|---|---|
| SUS, Mean (SD) | 75.83 (10.72) | 72.14 (8.83) | 79.06 (11.72) | −1.30 / 0.22 |
| Learnability, Mean (SD) | 59.17 (8.80) | 55.36 (5.62) | 62.50 (10.02) | −1.73 / 0.58 |
| Efficiency, Median (IQR) | 81.25 (68.75–81.25) | 68.75 (65.62–84.38) | 81.25 (78.12–81.25) | 23.50 / 0.63 |
| Satisfaction, Mean (SD) | 56.25 (11.33) | 54.46 (9.35) | 57.81 (13.26) | −0.57 / 0.11 |
SUS, learnability, satisfaction: independent sample t-test; efficiency: Mann-Whitney U test (p < 0.05 is statistically significant).
IQR: interquartile range; SD: standard deviation; SUS: System Usability Scale.
Table 5.
The significance test of the impact of education level on system usability.
| Measure | Total | Junior college | Bachelor | Value / p |
|---|---|---|---|---|
| SUS, Mean (SD) | 75.83 (10.72) | 76.79 (11.34) | 75.00 (10.86) | 0.31 / 0.76 |
| Learnability, Mean (SD) | 59.17 (8.80) | 58.93 (11.89) | 59.38 (5.79) | −0.09 / 0.93 |
| Efficiency, Median (IQR) | 81.25 (68.75–81.25) | 81.25 (71.88–81.25) | 75.00 (68.75–82.81) | 29.00 / 0.95 |
| Satisfaction, Mean (SD) | 56.25 (11.33) | 57.14 (9.83) | 55.47 (13.13) | 0.28 / 0.78 |
SUS, learnability, satisfaction: independent sample t-test; efficiency: Mann-Whitney U test (p < 0.05 is statistically significant).
IQR: interquartile range; SD: standard deviation; SUS: System Usability Scale.
Table 6.
Regression analysis of age on SUS results.
| Measure | B | SE | t | p |
|---|---|---|---|---|
| SUS | −0.25 | 0.49 | −0.91 | 0.38 |
| Learnability | −0.30 | 0.51 | −1.14 | 0.28 |
| Efficiency | 0.04 | 0.66 | 0.14 | 0.89 |
| Satisfaction | −0.42 | 0.38 | −1.65 | 0.12 |
SUS, learnability, satisfaction: linear regression; efficiency: generalized linear model (p < 0.05 is statistically significant).
SUS: System Usability Scale.
Discussion
With the rapid advancement of technology, extended reality (XR) has emerged as a groundbreaking leap in computing power, connectivity, and display capabilities, and it is being deployed in an increasingly diverse range of scenarios. XR includes virtual reality (VR), augmented reality (AR), and mixed reality (MR),40 and uses computer technology and wearable devices to create an environment combining the real and the virtual with which humans can interact. VR integrates real-world components into a virtual space, enabling users to experience a sense of reality, and some researchers have achieved remarkable success in clinical medical settings by using VR for surgical simulation training and anatomical research. MR, on the other hand, merges real and virtual spaces to support interaction between the two realms,42 and finds numerous applications across clinical medicine. However, because VR scenes lack realism, users find it difficult to immerse themselves and may experience health problems such as dizziness or nausea during prolonged use;43,44 VR is therefore unsuitable for wear during nursing operations in hospitals. In addition, compared with AR glasses, MR devices are larger, heavier, and more expensive; they require supporting physical hardware during use, which limits their portability, and switching between the real and the virtual world is sometimes not fast enough. They therefore cannot yet be applied well to clinical work in mobile environments.45
AR, in general, refers to the real-time integration of additional information or graphic elements into the user's environment, with a primary focus on enhancing the real world rather than creating a completely artificial one.46,47 AR technology combines real and virtual environments through camera-equipped devices: by analyzing camera video and images with image analysis technology, virtual content is overlaid onto the real-world environment for interactive experiences. As a branch of virtual reality technology, AR plays a crucial role.48,49 With AR, users see not only the real world but also computer-generated virtual content superimposed on objects in the real world. AR glasses are typically lightweight and support wireless networks, making them convenient to carry.50 Based on these features, we developed an intelligent assistance system for ophthalmic care using speech recognition and image recognition technologies built on AR and AI, an innovation in nursing assistance tools.
Usability of the AR-IAS system
The use of AR glasses in nursing work has changed the traditional approach to caregiving. Especially for nurses who already wear framed glasses, the question arises of whether they can accept wearing an additional pair of AR glasses. Furthermore, whether older nurses or those with lower educational backgrounds can embrace this novel, disruptive approach to traditional nursing methods is a concern we addressed during the experimental design. These factors fundamentally influence the acceptance and utilization of the AR-IAS system in clinical settings.
Comparing the usability scales of the two systems, the PDA system's SUS score was 72.83 ± 10.97, above the cutoff score of 72.6; by the SUS evaluation criteria, the PDA system is rated B-. The SUS score of the ophthalmic nursing intelligent assistance system developed in this study was 75.83 ± 10.72, above the cutoff score of 75. According to the SUS evaluation criteria, the AR-IAS system therefore has good usability and a satisfactory user experience, although there is still room for improvement; it is rated B and has better usability than 75% of similar products.30 Statistical analysis showed no significant difference in system usability between nurses with different education levels or between those who did and did not wear glasses (p > 0.05), indicating that education level did not affect operation of the system and that wearing glasses did not cause discomfort when using the AR glasses. Age likewise had no statistically significant effect on system usability (p > 0.05), indicating that the AR-IAS system can assist nurses of all ages in nursing operations.
The AR-IAS system scored higher than the PDA system in terms of learnability, possibly due to its simpler operation compared to the PDA system and the use of voice commands, which made it easier to learn and master. In terms of efficiency, the AR-IAS system scored slightly higher than the PDA system, indicating that using the AR-IAS system in clinical settings does not compromise nursing efficiency. The satisfaction scores for both systems were very similar, indicating that nurses were generally satisfied with both systems during actual use. There was no significant difference in overall usability scores between the two systems, and no statistical differences were observed in any dimension. In other words, both the AR-IAS and the PDA systems are capable of meeting the needs of clinical nursing effectively.
Clinical application of the AR-IAS system
In clinical nursing work, a large volume of repetitive tasks can easily lead to fatigue and burnout among nurses, and habitual thinking can result in inadequate verification during nursing operations, potentially leading to adverse nursing events. An intelligent assistance system can help nurses with patient identification, medical order execution, education reminders, and patient surgery transport handover through multisensory interactions such as vision and hearing.
AR glasses assist nurses in patient identification and surgical information verification, increasing the accuracy of patient and surgical information, enhancing the safety of medical quality, and reducing the risk of medical accidents. The system assists nurses in verifying patient information and medical orders, improving the accuracy of order execution and the timeliness of medical record writing. Simultaneously, it reduces the frequent need for nurses to perform hand hygiene, thus reducing the occurrence of nosocomial infections. The use of educational modules can reduce patient omissions during preoperative education. The system assists nurses in verifying information during patient surgery transportation handover, ensuring the correctness of patient identity, surgical site, and surgical method.
The implementation of these functions has alleviated nurses' work pressure to some extent and safeguarded nursing quality and patient safety. Studying the system's usability provides a basis for future iterative development and maintenance, which in turn improves the efficiency and quality of ophthalmic nursing work and provides a useful reference for data-enabled healthcare. To our knowledge, this is the first study to explore the development and usability of an ophthalmic nursing intelligent assistance system based on AR and AI technology.
Limitations
Although this study achieved good results, it has some limitations. First, the system places high demands on wireless network performance within its area of use: where wireless coverage is incomplete or the signal is weak, the system response may be delayed by approximately 5 s. Second, the battery life of the AR glasses is unsatisfactory; they cannot be used continuously throughout the nursing process and must be charged during use, which inconveniences nurses.
Conclusion and next steps
Based on the SUS survey results, the AR-IAS system developed using AR and AI technology demonstrates good usability overall and is capable of meeting the needs of clinical nursing. However, there are also some limitations in practical use. With the continuous development of AR technology and the ongoing optimization and equipment updates of the AR-IAS system, we believe that the system will become increasingly intelligent, efficient, and user-friendly.
In the next stage, we will continue to explore the assistive effect of the AR-IAS system on the quality of nursing care and the safety of patient diagnosis and treatment in clinical practice, and will study the impact of AR glasses on users' visual function and their use in rehabilitation training after strabismus surgery.
Acknowledgements
The authors thank all patients and nurses who participated in this study and Mr Wang Bin'an (Hangzhou Chagine Technology Co., Ltd) for providing guidance and help.
Footnotes
Contributorship: All authors contributed to the execution of the study and approved the final version of the submitted paper. CKH and LJZ drafted this manuscript. HL was responsible for quality control of this study. WJZ, CKH, and NT participated in the development and research execution of the application. XFH, YYS, and HLL conducted data collection and analysis. YXG was responsible for the design, execution, and manuscript revision of the research.
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethical approval: This study was approved by the ethics committee of Eye hospital of Wenzhou Medical University (Reference No. H2023–020-K-17–02) and adhered to the tenets of the Declaration of Helsinki.
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Wenzhou Association for Science and Technology, Nursing Special Project of Eye Hospital of Wenzhou Medical University (grant number No.jczc73, No.YNHL2302104).
ORCID iD: Yingxuan Guo https://orcid.org/0000-0001-7736-189X
References
- 1. Ward U, Nilsson UG. Acupuncture for postoperative pain in day surgery patients undergoing arthroscopic shoulder surgery. Clin Nurs Res 2013; 22: 130–136.
- 2. Hiranaka T, Fujishiro T, Hida Y, et al. Augmented reality: the use of the PicoLinker smart glasses improves wire insertion under fluoroscopy. World J Orthop 2017; 8: 891–894.
- 3. Hanna MG, Ahmed I, Nine J, et al. Augmented reality technology using Microsoft HoloLens in anatomic pathology. Arch Pathol Lab Med 2018; 142: 638–644.
- 4. Uruthiralingam U, Rea PM. Augmented and virtual reality in anatomical education—a systematic review. Adv Exp Med Biol 2020; 1235: 89–101.
- 5. Xie J, Chai JJK, O'Sullivan C, et al. Trends of augmented reality for agri-food applications. Sensors (Basel) 2022; 22: 8333.
- 6. Reiner AJ, Vasquez HM, Jamieson GA, et al. Comparing an augmented reality navigation display to an electronic map for military reconnaissance. Ergonomics 2022; 65: 78–90.
- 7. Dickey RM, Srikishen N, Lipshultz LI, et al. Augmented reality assisted surgery: a urologic training tool. Asian J Androl 2016; 18: 732–734.
- 8. Prada F, Del BM, Casali C, et al. Intraoperative navigated angiosonography for skull base tumor surgery. World Neurosurg 2015; 84: 1699–1707.
- 9. Hassan AE, Desai SK, Georgiadis AL, et al. Augmented reality enhanced tele-proctoring platform to intraoperatively support a neuro-endovascular surgery fellow. Interv Neuroradiol 2022; 28: 277–282.
- 10. Haji Z, Arif A, Jamal S, et al. Augmented reality in clinical dental training and education. J Pak Med Assoc 2021; 1: S42–S48.
- 11. Urlings J, Sezer S, Ter Laan M, et al. The role and effectiveness of augmented reality in patient education: a systematic review of the literature. Patient Educ Couns 2022; 105: 1917–1927.
- 12. Sutherland J, Belec J, Sheikh A, et al. Applying modern virtual and augmented reality technologies to medical images and models. J Digit Imaging 2019; 32: 38–53.
- 13. Mendez K, Piasecki R, Hudson K, et al. Virtual and augmented reality: implications for the future of nursing education. Nurse Educ Today 2020; 93: 104531.
- 14. Quqandi E, Joy M, Drumm I, et al. Augmented reality in supporting healthcare and nursing independent learning: narrative review. Comput Inform Nurs 2023; 41: 281–291.
- 15. McCafferty KL, Flott B, Hadenfeldt C. Using augmented reality to foster clinical readiness and critical thinking in nursing education. Nurs Educ Perspect 2022; 43: 181–183.
- 16. Chen PJ, Liou WK. The effects of an augmented reality application developed for paediatric first aid training on the knowledge and skill levels of nursing students: an experimental controlled study. Nurse Educ Today 2023; 120: 105629.
- 17. Wang S, Tang HL, Al turk LI, et al. Localizing microaneurysms in fundus images through singular spectrum analysis. IEEE Trans Biomed Eng 2017; 64: 990–1002.
- 18. Hassan T, Usman Akram MU, Hassan B, et al. Automated segmentation of subretinal layers for the detection of macular edema. Appl Opt 2016; 55: 454–461.
- 19. ElTanboly A, Ismail M, Shalaby A, et al. A computer-aided diagnostic system for detecting diabetic retinopathy in optical coherence tomography images. Med Phys 2017; 44: 914–923.
- 20. Biswas S, Logan NS, Davies LN, et al. Assessing the utility of ChatGPT as an artificial intelligence-based large language model for information to answer questions on myopia. Ophthalmic Physiol Opt 2023; 43: 1562–1570.
- 21. Biswas S, Davies LN, Sheppard AL, et al. Utility of artificial intelligence-based large language models in ophthalmic care. Ophthalmic Physiol Opt 2024; 44: 641–671.
- 22. Lewis JR. An after-scenario questionnaire for usability studies: psychometric evaluation over three trials. ACM SIGCHI Bull 1991; 23: 79.
- 23. Lewis JR. The system usability scale: past, present, and future. Int J Hum Comput Interact 2018; 34: 577–590.
- 24. Finstad K. The usability metric for user experience. Interact Comput 2010; 22: 323–327.
- 25. Lewis JR. Psychometric evaluation of the PSSUQ using data from five years of usability studies. Int J Hum Comput Interact 2002; 14: 463–488.
- 26. Tullis TS, Stetson JN. A comparison of questionnaires for assessing website usability. Usability Prof Assoc Conf 2004: 1–12.
- 27. Lund A. Measuring usability with the USE questionnaire. Usability Interface 2001; 8: 3–6.
- 28. Brooke J. SUS—a quick and dirty usability scale. In: Jordan PW, Thomas B, Weerdmeester BA, McClelland AL (eds) Usability evaluation in industry. London, England: Taylor and Francis, 1996, pp.189–194.
- 29. Almasi S, Bahaadinbeigy K, Ahmadi H, et al. Usability evaluation of dashboards: a systematic literature review of tools. Biomed Res Int 2023; 2023: 1–11.
- 30. Brooke J. SUS: a retrospective. J Usability Stud 2013; 8: 29–40.
- 31. Borsci S, Federici S, Lauriola M. On the dimensionality of the system usability scale: a test of alternative measurement models. Cogn Process 2009; 10: 193–197.
- 32. Zviran M, Glezer C, Avni I. User satisfaction from commercial web sites: the effect of design and use. Inform Manage 2006; 43: 157–178.
- 33. Stoyanov SR, Hides L, Kavanagh DJ, et al. Mobile app rating scale: a new tool for assessing the quality of health mobile apps. JMIR Mhealth Uhealth 2015; 3: e3422.
- 34. Isgin-Atici K, Ozkan A, Celikcan U, et al. Usability study of a novel tool: the virtual cafeteria in nutrition education. J Nutr Educ Behav 2020; 52: 1058–1065.
- 35. Karajizadeh M, et al. Usability of venous thromboembolism prophylaxis recommender system. Stud Health Technol Inform 2022; 289: 220–223.
- 36. Cruz FOAM, Vilela RA, Ferreira EB, et al. Evidence on the use of mobile apps during the treatment of breast cancer: systematic review. JMIR Mhealth Uhealth 2019; 7: e13245.
- 37. Nielsen J. Usability engineering. San Francisco: Morgan Kaufmann, 1993.
- 38. Andrews C, Southworth MK, Silva JNA, et al. Extended reality in medical practice. Curr Treat Options Cardiovasc Med 2019; 21: 18.
- 39. Lewis JR. Sample sizes for usability studies: additional considerations. Hum Factors 1994; 36: 368–378.
- 40. Ong CW, Tan MCJ, Lam M, et al. Applications of extended reality in ophthalmology: systematic review. J Med Internet Res 2021; 23: e24152.
- 41. Ribaupierre S, Eagleson R. Editorial: challenges for the usability of AR and VR for clinical neurosurgical procedures. Healthc Technol Lett 2017; 4: 151.
- 42. Fertleman C, Aubugeau-Williams P, Sher C, et al. A discussion of virtual reality as a new tool for training healthcare professionals. Front Public Health 2018; 6: 44.
- 43. Smith RT, Clarke TJ, Mayer W, et al. Mixed reality interaction and presentation techniques for medical visualisations. Adv Exp Med Biol 2020; 1260: 123–139.
- 44. Yeung AWK, Tosevska A, Klager E, et al. Virtual and augmented reality applications in medicine: analysis of the scientific literature. J Med Internet Res 2021; 23: e25499.
- 45. Venkatesan M, Mohan H, Ryan JR, et al. Virtual and augmented reality for biomedical applications. Cell Rep Med 2021; 2: 100348.
- 46. Wang X, Zhang F, Liu Y. Augmented reality technology. Sci Technol Rev 2018; 36: 75–83.
- 47. Layona R, Yulianto B, Tunardi Y. Web based augmented reality for human body anatomy learning. Proc Comput Sci 2018; 135: 457–464.
- 48. Gallos P, Georgiadis C, Liaskos J, et al. Augmented reality glasses and head-mounted display devices in healthcare. Stud Health Technol Inform 2018; 251: 82–85.
- 49. Bangor A, Kortum PT, Miller JT. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud 2009; 4: 114–123.
- 50. Virzi R. Refining the test phase of the usability evaluation: how many subjects is enough? Hum Factors 1992; 34: 457–468.





