Abstract
AI-powered smart glasses are emerging as a highly promising advancement in the field of digital health management, owing to their capabilities in real-time monitoring, chronic disease management, and personalized treatment planning. To comprehensively understand the current state of development, we systematically searched multiple databases, including Web of Science, PubMed, and IEEE Xplore, to collect relevant literature. This paper provides a systematic analysis of the current applications of smart glasses in healthcare, focusing on their potential benefits and limitations. Key issues discussed include user engagement, treatment adherence, data privacy, standardization, battery efficiency, clinical validation, and medical ethics. Our findings suggest that, supported by emerging clinical evidence, smart glasses have demonstrated significant improvements in areas such as assisted medical services, health management, anxiety alleviation in children, and telemedicine. By integrating multi-modal sensors, these devices are capable of accurately tracking certain physiological indicators and synchronizing real-time visual input, thereby enhancing the accuracy and timeliness of health interventions and medical services. Notably, some cutting-edge smart glasses have adopted advanced artificial intelligence algorithms, particularly large language models (LLMs) with context awareness and human-like interaction capabilities. These AI-powered glasses can offer real-time, personalized dietary and health management recommendations tailored to users’ daily life scenarios. Building on these findings, this study further proposes a conceptual framework for proactive health management using smart glasses and explores future directions in technological development and practical applications. 
Overall, AI-enhanced smart glasses show great potential as a critical interface between healthcare providers and patients, poised to play a vital role in the future of personalized medicine and continuous health management.
Subject terms: Information technology, Lifestyle modification, Preventive medicine
Introduction
Wearable devices, broadly defined as technologies designed to be worn on or attached to the human body, embody a practical realization of advanced wearable technology. These devices function as seamless extensions of personal space, operating under the user’s control to facilitate continuous interaction and functionality1,2. In recent years, with the advancement of artificial intelligence (AI) technologies, wearable devices have become an essential tool in health management. Google’s Verily Life Sciences has launched the four-year Project Baseline3, which aims to recruit 10,000 participants to integrate data from wearable devices and genetic testing, with the goal of predicting emergencies such as strokes and seizures. Similarly, the All of Us research initiative by the National Institutes of Health (NIH) also utilizes wearable devices to collect physiological data, advancing precision medicine4. Both studies leverage real-time health data collection via wearables to enable personalized interventions and early detection. Among mainstream wearable devices, including smartwatches, fitness trackers, wearable cameras, and medical wearables, smart glasses have emerged as a particularly promising innovation, especially within the domain of health management5,6. Unlike other types of wearables, AI smart glasses integrate advanced sensors and algorithms to monitor physiological data in real time, providing instant feedback and health recommendations7. Meanwhile, the high speed and low latency of 5G networks, combined with the capabilities of edge computing, enable real-time health monitoring and immediate feedback, greatly enhancing the efficiency and security of telemedicine8.
The integration of miniaturization, sensor technology, and artificial intelligence (AI) has revolutionized the landscape of wearable devices, positioning AI-powered smart glasses as a cornerstone innovation in health management. These advanced wearables combine compact design with portability, addressing the escalating global demand for continuous, real-world health monitoring and personalized care. The ongoing development of this technology is expected to significantly advance real-time health surveillance, preventive medicine, precision healthcare, and tailored interventions9.
Globally, the challenges in managing chronic diseases are pronounced10, characterized by fragmented service delivery networks, insufficient primary care infrastructure, inefficient funding mechanisms, and underdeveloped health information systems. These systemic issues impede the effectiveness of health management programs and contribute to inequities in access to timely and quality care. In resource-constrained settings, these barriers are even more pronounced, delaying early intervention and exacerbating health disparities11–13.
AI-powered smart glasses offer a transformative solution to these challenges14. Integrated AI algorithms can process real-time physiological data to provide users with personalized health recommendations, early warnings of potential issues, and actionable insights that emphasize prevention and prediction in health management15. The lightweight and unobtrusive nature of smart glasses ensures they can be seamlessly integrated into daily life, allowing for constant health tracking without disrupting routine activities16. Moreover, their potential in advancing remote diagnostics and treatment is significant. Smart glasses can facilitate telemedicine, enabling healthcare providers to remotely assess patients and deliver timely interventions, which is especially critical in addressing healthcare disparities in underserved regions worldwide17.
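The prevention-and-prediction logic described here can be illustrated with a minimal sketch of a rule-based early-warning check over streaming vitals. The thresholds, metric names, and alert format below are hypothetical placeholders for illustration only, not clinical guidance or any vendor’s actual implementation; real devices would rely on clinically validated, personalized models:

```python
# Minimal sketch of a rule-based early-warning check over streaming vitals.
# Thresholds and metric names are illustrative assumptions, not clinical values.

ALERT_RULES = {
    "heart_rate": (50, 110),  # acceptable range, beats per minute
    "spo2": (94, 100),        # acceptable range, percent oxygen saturation
}

def check_reading(reading: dict) -> list[str]:
    """Return human-readable warnings for one sensor reading."""
    warnings = []
    for metric, (low, high) in ALERT_RULES.items():
        value = reading.get(metric)
        if value is None:
            continue  # this sensor did not report the metric
        if not (low <= value <= high):
            warnings.append(f"{metric}={value} outside [{low}, {high}]")
    return warnings

# Example: a reading with low blood-oxygen saturation triggers one warning
print(check_reading({"heart_rate": 72, "spo2": 91}))
```

In practice, such rules would be tailored per user and combined with trend analysis rather than fixed population-wide cutoffs.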
Therefore, this paper focuses on the application of AI-powered smart glasses in health management, systematically reviewing their key technological pathways, clinical application scenarios, and future development trends. It emphasizes their strategic value in advancing global health equity and enabling precision health management.
AI-powered smart glasses, as an advanced evolution of wearable technology, increasingly incorporate a range of cutting-edge technologies, including Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR), to deliver immersive and interactive user experiences. The distinctive characteristics and synergistic advantages of these technologies have led to their widespread adoption across application scenarios such as entertainment, gaming, education, and professional training. For instance, combining VR technology with educational theory has proven feasible: VR can significantly enhance students’ participation and learning outcomes, particularly in experimental or practical courses, thereby improving the educational experience18. This combination holds considerable development potential, transforming traditional ways of learning and pointing to new directions for future medical education practice19. In addition, their exceptional capability for real-time health monitoring establishes smart glasses as a transformative tool in modern healthcare applications20. To better understand the historical development and current technological status of smart glasses, Fig. 1 traces their evolution from early prototypes to modern products. It is noteworthy that while various types of smart glasses are available on the market, our research specifically focuses on advanced models integrated with large language models (LLMs), augmented reality (AR), and other key functionalities, as they represent the forefront of technological advancement and show immense potential in healthcare. For instance, MYVU AR glasses and Rokid glasses not only support LLMs but also feature image object detection capabilities, enabling them to perform highly specialized tasks in complex medical environments.
Fig. 1. Historical development of smart glasses.
Smart glasses are developing rapidly, and their forms are becoming increasingly diversified. This timeline highlights selected representative smart glasses products based on technological innovation and market influence; it does not include all existing products.
During the Prototype Phase, the inception of smart glasses can be traced back to the pioneering efforts of Professor Steve Mann, widely acknowledged as the “Father of Wearable Computing.” Mann’s groundbreaking work during this era produced early prototypes that primarily integrated Augmented Reality (AR) functionalities, laying the foundation for subsequent advances in smart eyewear technology. One of Mann’s key contributions was the EyeTap apparatus, which achieved a milestone by capturing the wearer’s visual perspective while processing and overlaying digital information. Although the EyeTap represented a notable technological leap, its bulky design and the experimental state of the underlying technology confined its use mainly to academic research within laboratory settings rather than widespread adoption in daily life.
In the Commercialization Initiation Phase (2010–2015), advancements in hardware and technology facilitated the emergence of simpler head-up displays (HUDs) in the early 2010s21, which gradually found applications in military, industrial, and scientific research fields. In 2012, Google launched the Google Glass Explorer Edition, marking the beginning of modern smart glasses with AR features. By 2013, Google Glass22 incorporated a camera, touch-pad, and voice commands, offering users an entirely new way to access information. However, due to concerns over privacy, social acceptance, and battery life, Google Glass did not achieve commercial success but provided valuable lessons for future innovations. Meanwhile, Japan’s NTT Docomo demonstrated AR Walker in 201023, focusing on AR functionalities which laid the groundwork for subsequent developments. Although smart glasses had not yet been widely applied to health monitoring or healthcare, there were preliminary attempts. For instance, Google Glass was used to assist doctors in surgical guidance and teleconsultation22. Additionally, it helped children with autism recognize facial expressions24, indicating its potential in education25 and healthcare14.
The Diversification and AI Integration Phase (2016–2020) witnessed the maturation of smart glasses, introducing more diverse application scenarios and technological breakthroughs. Commencing in 2016, this era saw Snap Inc., the parent company of Snapchat, launch the Spectacles series, captivating a younger audience with its distinctive video recording capabilities. Additionally, the release of the Vuzix Blade in 2018, which integrated the Alexa voice assistant, further propelled consumer adoption by enhancing user interaction and functionality. The collaboration between Ray-Ban and Meta produced the Stories smart glasses, popular for photography, videography, and audio playback, designed to seamlessly integrate into daily life. Microsoft’s HoloLens26, launched in 2016, combined mixed reality (MR) technology, targeting industrial and professional applications. In other parts of Asia, Samsung Electronics actively participated in the competition, launching Galaxy Glass to connect smartphones with personal wearables. Sony focused on developing specialized smart glasses for industries like medical training and remote collaboration. During this phase, smart glasses saw significant growth in healthcare applications. Microsoft HoloLens has been utilized in surgical training, enhancing doctors’ learning efficiency and operational accuracy through three-dimensional views. Moreover, smart glasses entered public health sectors, such as supporting fever screening at Liangzhu Museum and Liangzhu Ancient City Ruins in Hangzhou, China, using non-contact temperature detection methods27. This approach reduced virus transmission risks while improving operational efficiency.
The New Era of Smart Glasses as AI Terminals (Since 2021) has been characterized by the integration of advanced AI and machine learning technologies, enabling the development of increasingly sophisticated models28. Rokid Glasses exemplify this trend with transparent lenses, stylish design, and built-in displays, providing real-time translation and navigation services via AI vision perception, greatly enhancing the user experience. XREAL Air AR glasses have gained popularity for their immersive virtual screen experiences29. Tech giants in the U.S., including Amazon and Meta Platforms (formerly Facebook), have increased investment in smart glasses, releasing advanced and user-friendly products. Amazon’s Echo Frames30 allow users to access various information and services through simple voice commands. Meta’s Ray-Ban Stories emphasize social sharing, enabling users to easily capture memorable moments31. In China, Baidu is slated to launch its Xiaodu AI Glasses in the first half of 2025, marking a significant milestone as they are touted as “the world’s first AI glasses equipped with a native Chinese model32.” Designed by Baidu’s hardware division, Xiaodu Technology, these glasses showcase Baidu’s advanced Ernie language model, aiming to integrate AI into wearable technology tailored for Chinese-speaking users. Weighing a mere 45 grams, the Xiaodu AI Glasses feature a 16MP ultra-wide-angle camera and AI image stabilization technology, supporting smooth and stable first-person perspective photography. Additionally, the glasses are equipped with a four-microphone array for effective sound capture and open-back leak-proof speakers for clear audio, providing users with an immersive, hands-free experience. The glasses support core functions including real-time Q&A, calorie recognition, object recognition through an encyclopedia feature, audio-visual translation, and smart reminders.
Notably, since 2021, smart glasses have increasingly integrated AI technology to offer personalized user experiences. In 2022, Google revived its smart glasses project, focusing on real-time translation and health monitoring. That same year, Large Language Models like ChatGPT were introduced into smart glasses, providing real-time interaction and support for complex tasks33,34.
The evolution of smart glasses reflects the close cooperation and intense competition among different countries and regions under globalization. Each country leverages its own technological strengths and development needs to jointly advance this emerging technology, forming a vibrant international market landscape. As technology continues to evolve and societal acceptance of smart wearable devices grows, smart glasses are expected to become an indispensable part of everyday life, transforming how we interact with the world.
The development of AI-powered smart glasses is characterized by rapid technological advancements and a growing diversification of applications. These devices integrate cutting-edge sensor technologies with AI algorithms, including deep learning and computer vision, to significantly enhance their data processing and analytical capabilities.
The rapid expansion of the global AI industry underscores the burgeoning growth of AI-powered smart glasses35–37. Notably, leading nations such as the United States and China38 are at the forefront in terms of enterprise density and investment scale. These nations are instrumental in fostering innovation and propelling the market adoption of AI-driven products, including smart glasses. A prime example is the Ray-Ban Meta smart glasses, a collaborative venture between Meta and Ray-Ban. These glasses have achieved significant market success, with over 1 million units sold by May 202439. However, despite the swift pace of technological advancements, several challenges persist. Key issues include:
Data privacy concerns: the continuous collection and processing of sensitive health data necessitate robust privacy safeguards. As smart glasses become more integrated into daily life, ensuring the secure handling of personal data is paramount.
User experience optimization: ensuring seamless interaction and comfort for diverse user demographics remains an ongoing challenge. The design and functionality of smart glasses must accommodate a wide range of user needs and preferences to enhance adoption.
Lack of standardized frameworks: the absence of universally accepted technical standards impedes interoperability and widespread adoption. Standardization is crucial for the development of a cohesive ecosystem that supports the seamless integration of smart glasses with other devices and systems.
Despite these challenges, the maturation of foundational technologies and the expansion of application scenarios are unlocking significant market potential for AI-powered smart glasses40. As the industry evolves, these devices are poised to become a driving force of innovation, translating technological breakthroughs into practical solutions across the healthcare, industrial, and consumer sectors.
This study makes several contributions to the field of digital health management. First, to the best of our knowledge, it represents the first systematic review focusing on the application of AI-powered smart glasses in health management. Following the PRISMA guidelines, we conducted a Systematic Literature Review (SLR)41 to comprehensively examine recent advancements in AI-driven smart glasses, with a particular focus on their integration into digital health management systems aimed at facilitating proactive healthcare interventions. Second, through an in-depth analysis of the existing literature, we explored the core strengths and challenges associated with smart glasses in healthcare contexts. Key issues addressed include user engagement, treatment adherence, data privacy protection, standardization, battery efficiency, clinical validation, and medical ethics. Based on these findings, we propose corresponding strategies for improvement and future research directions. Third, we highlight the potential of AI-enabled smart glasses to enhance clinical workflows, increase patient engagement, and improve treatment compliance. These findings demonstrate the unique value of this technology in advancing personalized medicine and precision health management.
The remainder of this paper is organized as follows: The “Results” section summarizes the key findings of our review, analyzes the strengths and limitations of AI smart glasses across diverse healthcare application scenarios, and proposes a conceptual framework for an active health management platform based on smart glasses. The “Discussion” section further explores the practical implications of these findings for clinical practice, emphasizing the role of smart glasses in real-time health monitoring and individualized treatment planning, while also discussing current research challenges and potential future directions. The “Methods” section details the literature search and evaluation methodology, including the database retrieval strategies and the inclusion and exclusion criteria applied.
Results
The results of the search and selection process
A total of 863 publications were retrieved, of which 101 studies meeting the inclusion criteria were selected for systematic review. Based on the research content, the included literature was categorized into the following themes: health management (16/101), stress relief and psychological intervention (15/101), assistance in clinical surgery (23/101), tools for supporting clinical diagnosis and treatment (32/101), and telemedicine services, including telemedical education and remote diagnosis and treatment (30/101). Some of the literature covers multiple content areas. In addition, a portion of the literature focused on the ethical challenges and limitations faced by smart glasses in medical applications42–46, providing valuable references for future technological advancements and clinical integration.
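As a quick consistency check on the counts above, the per-theme tallies can be summed directly; because some studies span multiple themes, the theme labels total more than the 101 included studies:

```python
# Tally of included studies by theme (counts taken from the review above).
# Totals exceed 101 because some studies cover multiple themes.
theme_counts = {
    "health management": 16,
    "stress relief and psychological intervention": 15,
    "assistance in clinical surgery": 23,
    "clinical diagnosis and treatment support": 32,
    "telemedicine services": 30,
}

included = 101
label_total = sum(theme_counts.values())
print(label_total)              # 116 theme labels across 101 studies
print(label_total - included)   # at least 15 multi-theme assignments

for theme, n in theme_counts.items():
    print(f"{theme}: {n}/{included} = {n / included:.1%}")
```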
Figure 2 illustrates our systematic search strategy and result flowchart using the PRISMA framework, detailing the process from the identification of records through database searching to the final inclusion of studies after screening and eligibility assessment.
Fig. 2. Systematic search strategy and result flowchart using the PRISMA systematic review tool.
This diagram illustrates the complete screening process222.
Technical basis and research status of smart glasses
Smart glasses represent an advanced form of wearable computing technology designed to be worn on the head or as part of eyewear. These devices typically integrate a transparent display that overlays digital information onto the wearer’s field of vision, augmenting the physical world with real-time data47. The composition of smart glasses can vary widely but generally includes components such as microprocessors, sensors, cameras, connectivity modules, and user interface elements. Depending on their design and intended use, smart glasses can be classified into several categories based on their primary function or underlying technology:
Mixed reality (MR) glasses combine elements of both the real and virtual worlds to create new environments where physical and digital objects coexist and interact.
Virtual reality (VR) glasses fully immerse the user in a simulated environment, blocking out the real world entirely.
Augmented reality (AR) glasses overlay computer-generated information on top of the user’s view of the real world, enhancing the perception of reality without fully replacing it.
AI-powered glasses integrate artificial intelligence to provide context-aware assistance, predictive analytics, and personalized experiences.
Seamless integration is the standout feature of MR glasses, enabling users to interact with both real and digital objects simultaneously. This capability sets MR apart for applications requiring complex interactions, such as remote collaboration, medical training, industrial maintenance, and education. Compared to VR, MR offers greater endurance together with medium fashionability and portability, making it suitable for extended use in various environments. Devices like the Microsoft HoloLens 2 and Meta Quest Pro are equipped with precision tracking systems that ensure stable performance without obstructing daily activities.
Total immersion characterizes VR glasses, which provide an unparalleled experience by replacing the user’s view of the physical world with a simulated one. While they score lower on portability and fashionability due to their heavy weight, VR devices excel in delivering high-resolution displays and powerful graphics processing. This makes them ideal for entertainment, gaming, education, and training scenarios where complete isolation from the external environment is beneficial. In addition, products like Meta Quest and HTC Vive prioritize sensory engagement over mobility, offering lifelike simulations that can be used for everything from architectural walkthroughs to therapeutic treatments.
Contextual enhancement without displacement is the forte of AR glasses, which overlay information onto the real world without fully replacing it. AR glasses strike a balance between functionality and wearability, with medium weight and portability that do not impede daily activities. They are well-suited for practical applications such as navigation, translation, entertainment, and photography. Devices like Microsoft HoloLens and Magic Leap offer valuable context-aware data that enhances situational awareness and decision-making. Unlike VR, AR maintains a connection to the physical world, ensuring continuous interaction while providing supplementary information.
Table 1 provides a detailed comparison of different types of smart glasses, highlighting the unique attributes and functionalities associated with each category.
Table 1.
Comparative analysis of smart glasses categories based on functional attributes and application domains
| Attribute | MR | VR | AR | AI |
|---|---|---|---|---|
| Fashionability | Medium | Low | Medium | High |
| Portability | Medium | Low | Medium | High |
| Weight | Medium | Heavy | Medium | Light |
| Application | Remote Collaboration, Medical, Industrial, Education | Entertainment, Gaming, Education, Training | Navigation, Translation, Entertainment, Photography | Healthcare, Finance, Transportation, Services, Entertainment |
| Endurance | High | Low | Medium | High |
This table provides a comparative analysis of different categories of smart glasses based on their functional attributes and application domains, highlighting the variability in characteristics such as fashionability, portability, weight, and endurance, which collectively determine their suitability and user preferences across specific use cases.
Intelligent assistance through context-aware computing defines AI-powered glasses, which provide personalized support using machine learning algorithms. These glasses excel in sectors like healthcare, finance, transportation, services, and entertainment, thanks to their high fashionability, portability, and light weight. Products like the Ray-Ban Meta integrate seamlessly into everyday life, offering predictive analytics and real-time recommendations based on environmental cues and historical data. The key advantage of AI-powered glasses is their ability to enhance human performance with smart, anticipatory guidance, setting them apart from other types of smart glasses that may not offer the same level of personalized assistance.
Common smart glasses models on the market
Table 2 offers a comprehensive comparison of smart glasses models currently available on the market, analyzed across key characteristics, including support for large language models (LLMs), optical technology, voice recognition, image object detection, AR/VR capabilities, refresh rate, resolution, battery capacity, sensors, operating system, and applications. The information comes from the official websites of each brand.
Table 2.
Analysis of current market smart glasses models based on key technological and functional characteristics
| Brand/Model | LLM | Optics | Voice recognition | Image object detection | AR/VR | Refresh rate | Resolution | Battery capacity | Sensor | OS | Application |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Rokid Max AR Glasses175 | / | Yes | Yes | / | AR | 120 Hz | 3840 × 1080 | / | Enhanced 9-axis (gyroscope, accelerometer, magnetometer) sensor fusion scheme, distance sensor, etc. | / | Cinematic experience of giant-screen theaters, high-speed competitive gaming, multi-screen cloud-based office work, etc. |
| Ray-Ban Meta176 | Yes | / | Yes | Yes | / | / | / | 160 mAh | ambient light sensor, captouch sensor, etc. | / | Conversational assistant, photographing, video recording, sending messages, translation, object recognition, etc. |
| RayNeo Air3177 | / | Yes | / | / | AR | 120 Hz | 3840 × 1080 | / | Accelerometers, Gyroscope, Distance sensor, Geomagnetic sensors, etc. | / | Movie entertainment, multi-screen work, games, and office assistance, etc. |
| Mijia smart audio glasses pilot178 | / | / | Yes | / | / | / | / | 2 × 122 mAh | Touch Sensor, etc. | / | Audio playback, voice assistant, etc. |
| Xiaomi glasses camera179 | / | Yes | Yes | Yes | AR | / | / | 1020 mAh | Light Sensor, etc. | / | Periscope double camera, rapid capture, etc. |
| MYVU Discovery AR glasses180 | Yes | Yes | Yes | Yes | AR | / | 1280 × 480 | 183 mAh | Accelerometer, Gyroscope, Magnetometer, Wearable Sensor, etc. | Flyme AR | FlymeAI large language model, etc. |
| HUAWEI Vision Glass181 | / | Yes | Yes | / | / | 60 Hz | 3840 × 1080 | / | Accelerometer, Gyroscope, etc. | / | Portable assistant, intelligent cinematic experience, etc. |
| HUAWEI GENTLE MONSTER Eyewear II182 | / | / | Yes | / | / | / | / | 85 mAh | Vibration sensing sensor, Pressure sensing sensor, Sliding sensing sensor, etc. | / | Intelligent voice reminder and new intelligent interaction, etc. |
| XREAL Air 2183 | / | Yes | / | / | AR | 120 Hz | 3840 × 1080 | / | Gyroscope, Accelerometer, Magnetometer, etc. | / | Movie viewing experience, game interaction, and office assistance, etc. |
| Lucyd Glasses184 | Yes | / | Yes | / | / | / | / | / | Touch sensor, etc. | / | Music, phone calls, voice assistants, access to ChatGPT, Information, Instruction, and Communication, etc. |
| Amazon Echo Frames (Gen 3)185 | Yes | / | Yes | / | / | / | / | / | Touch Sensor, Accelerometer, etc. | / | Alexa voice assistant, music, podcasts, calls, smart notification filtering, etc. |
This analysis evaluates contemporary smart glasses models, focusing on key technological features like AR, object detection, and voice interaction, as well as their applications in entertainment, assistance, and productivity. Innovations such as advanced optics and AI-driven functionalities highlight the diversity and potential of these devices. The slash (/) in the table represents that the information was not mentioned in the publicly available sources consulted by the research team.
Current smart glasses on the market have made significant technological and functional breakthroughs, but several shortcomings still limit their widespread use. Many models offer relatively simple functions, mainly basic voice reminders and audio playback, and lack advanced AR/VR capabilities, image recognition, or intelligent interaction; strengthening AR/VR integration would improve their applicability across scenarios, especially for intelligent interaction. Battery life also remains a concern: some products have small battery capacities that limit long-term use, and increasing capacity, ideally above 400 mAh, would spare users frequent charging during extended wear. Display quality likewise needs improvement, as some products offer resolutions unsuitable for detailed AR applications or high-definition content; raising resolution to at least 1920 × 1080 would provide the clarity required for AR/VR and high-quality video. In terms of interaction, many smart glasses lack voice recognition or gesture control and rely mainly on physical buttons or limited voice commands; integrating more capable voice assistants and gesture control would enhance convenience. Additionally, although some brands offer AR capabilities, their visual effects, refresh rates, and immersive experiences still have room for improvement; many products lack a comprehensive AR experience, so optimizing AR display effects and increasing the refresh rate to over 120 Hz would significantly improve immersion and the overall experience.
Lastly, some brands focus too much on basic features like voice assistants, music, and calls, without fully expanding on high-end applications like AR, object recognition, and translation. Therefore, it is recommended to innovate in the area of feature expansion, adding functionalities like health monitoring, real-time translation, and object recognition to meet the needs of various user groups. In conclusion, while current smart glasses products have made breakthroughs in certain fields, to truly expand the market and enhance user experience, further improvements and optimizations are needed in areas such as battery life, display quality, interactive features, and diverse applications.
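The improvement targets discussed above (battery capacity above 400 mAh, resolution of at least 1920 × 1080, refresh rate of at least 120 Hz) can be expressed as a simple spec check. The sketch below encodes two example entries from Table 2; `None` marks values not publicly specified, and the field names are our own illustrative choices rather than any vendor’s schema:

```python
# Spec check against the improvement targets discussed above.
# Entries mirror Table 2; None means the value is not publicly specified.

TARGETS = {"battery_mah": 400, "resolution_px": (1920, 1080), "refresh_hz": 120}

models = {
    "Rokid Max AR Glasses": {
        "battery_mah": None, "resolution_px": (3840, 1080), "refresh_hz": 120},
    "Ray-Ban Meta": {
        "battery_mah": 160, "resolution_px": None, "refresh_hz": None},
}

def unmet_targets(spec: dict) -> list[str]:
    """List targets a model verifiably fails; unspecified fields are skipped."""
    failures = []
    if spec["battery_mah"] is not None and spec["battery_mah"] < TARGETS["battery_mah"]:
        failures.append("battery")
    res = spec["resolution_px"]
    if res is not None and (res[0] < TARGETS["resolution_px"][0]
                            or res[1] < TARGETS["resolution_px"][1]):
        failures.append("resolution")
    if spec["refresh_hz"] is not None and spec["refresh_hz"] < TARGETS["refresh_hz"]:
        failures.append("refresh rate")
    return failures

for name, spec in models.items():
    print(name, "fails:", unmet_targets(spec))
```

Under these assumptions, the Ray-Ban Meta fails only the battery target, while the Rokid Max meets every target it publicly specifies.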
The core technologies of smart glasses are organized into multiple modules, encompassing both hardware and software components that together realize the device's intelligent functionalities. As delineated in Table 3, the hardware module forms the fundamental basis of smart glasses' functionality, incorporating essential components such as display systems, sensing technologies, processing units, and interaction interfaces.
Table 3.
Details of the hardware module
| Submodules | Module details |
|---|---|
| Display technology186 | Display technologies: Micro LED displays, optical waveguides, and LCD displays are utilized to enable AR overlay functions, enhancing the visualization of digital information in the real world. Projection technology: Lasers or micromirror arrays are employed to project images directly onto the retina or spectacle lenses, creating immersive visual experiences. |
| Sensing hardware187 | Camera: Employed for environmental visual perception, facial recognition, object tracking, and gesture recognition, enabling advanced user interaction and situational awareness. Sensors: Include technologies such as inertial measurement units (IMUs), GPS, and accelerometers, which facilitate motion capture and positional tracking. Microphone and speaker: Support voice command input and audio output, enabling hands-free interaction and seamless auditory communication. |
| Processing unit95 | AI chips: For example, the Qualcomm Snapdragon platform, which processes visual, auditory, and multi-modal inputs in real time, enabling intelligent decision-making and adaptive functionality. Storage and battery modules: Provide essential support for data processing and ensure continuous operation, facilitating extended use and seamless performance. |
| Interactive devices188 | Trackpad: Enables users to interact with the interface by swiping their fingers, providing intuitive control. Eye-tracking devices: Utilize infrared light or cameras to monitor eye movements, enabling gaze-based control and enhancing user interaction. |
This hardware module analysis details the core components and functionalities of smart glasses, emphasizing display technologies like Micro LED and projection methods for immersive visuals. It highlights the role of cameras, sensors, and audio devices in enabling advanced interaction, while AI chips and storage modules ensure real-time processing and extended usability.
As delineated in Table 3, the hardware module encompasses several sophisticated submodules pivotal to advanced augmented reality systems. The display technology submodule adopts Micro LED displays, optical waveguides, and LCD displays, facilitating superior AR overlay capabilities that enhance the visualization of digital information within the physical world; complementing this, projection technology leverages lasers or micromirror arrays to project images directly onto the retina or spectacle lenses, engendering deeply immersive visual experiences. The sensing hardware submodule comprises cameras for environmental visual perception, facial recognition, object tracking, and gesture recognition, enabling advanced user interaction and situational awareness, together with sensors such as inertial measurement units (IMUs), GPS, and accelerometers that facilitate motion capture and positional tracking. The processing unit submodule comprises AI chips, such as the Qualcomm Snapdragon platform, which process visual, auditory, and multi-modal inputs in real time, enabling intelligent decision-making and adaptive functionality, while storage and battery modules provide essential support for data processing and ensure continuous operation. The interactive devices submodule includes trackpads for intuitive control and eye-tracking devices that utilize infrared light or cameras to monitor eye movements, enabling gaze-based control and enhancing user interaction.
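The motion-tracking role of the IMU can be made concrete with a minimal sketch. A complementary filter is one common way to fuse gyroscope readings (smooth but drift-prone) with accelerometer readings (noisy but drift-free) into a stable head-orientation estimate; the sensor values, axis convention, and 0.98 blend weight below are illustrative assumptions, not the design of any particular product.

```python
# Minimal sketch: fusing gyroscope and accelerometer readings from a
# smart-glasses IMU with a complementary filter to estimate head pitch.
# Sensor values, axis convention, and the 0.98 blend weight are assumptions.
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer's gravity-based pitch estimate (noisy but drift-free)."""
    pitch_gyro = pitch_prev + gyro_rate * dt      # integrate angular rate (rad)
    pitch_accel = math.atan2(accel_x, accel_z)    # pitch from gravity direction
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# One 10 ms update with the glasses level and no head rotation.
pitch = complementary_filter(0.0, gyro_rate=0.0, accel_x=0.0, accel_z=9.81, dt=0.01)
```

Running this filter at the IMU's sample rate yields an orientation estimate suitable for stabilizing AR overlays against head motion.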
As outlined in Table 4, the software and AI technology modules are crucial for the functionality of smart glasses, powering their advanced capabilities. The multi-modal data fusion submodule integrates visual, voice, and environmental data, enhancing perception and interaction through sophisticated multi-modal models (such as GPT-4). The computer vision submodule includes object detection and recognition for real-time scene analysis and SLAM technology for precise spatial positioning and navigation in augmented reality. The NLP submodule supports voice instruction processing, question answering systems, translation, and environmental semantic analysis. The AR interaction submodule utilizes AR SDKs (such as Unity, ARKit, and ARCore) to overlay and interact with virtual information. The edge computing vs. cloud computing submodule distinguishes between edge computing, which handles simple tasks on the device side to reduce latency, and cloud computing, which relies on remote servers to complete complex data calculations and multi-modal model operation.
Table 4.
Software and AI technology modules for smart glasses
| Submodules | Module details | Advantages | Disadvantages |
|---|---|---|---|
| Multi-modal data fusion | Combines visual, voice, and environmental data to integrate perception and interaction through multi-modal large models (such as GPT-4)189,190. | Enhances user interaction by combining multiple data streams; more accurate decision-making. | Requires significant computational resources; complexity in data synchronization. |
| Computer vision | Object detection and recognition: Recognizes objects and text in a scene in real time191. SLAM technology: Enables precise spatial positioning and navigation in augmented reality192. | Real-time scene analysis; precise spatial positioning and navigation. | Performance may degrade in complex environments; requires high processing power. |
| NLP | Supports voice instruction processing, question answering systems, translation, and environmental semantic analysis193,194. | Facilitates hands-free operation; improves accessibility and global connectivity. | Accuracy can be affected by ambient noise; limited understanding of context in some cases. |
| AR interaction | AR SDKs (such as Unity, ARKit, and ARCore) are used to overlay and interact with virtual information195. | Enhances user experience with immersive interactions; versatile applications. | Dependent on the robustness of AR software development kits; potential latency issues. |
| Edge computing vs. cloud computing | Edge computing: Handles simple tasks on the device side to reduce latency196,197. Cloud computing: Relies on remote servers to complete complex data calculations and multi-modal model operation197. | Reduces latency for edge computing; supports extensive calculations via cloud. | Edge computing has limited capacity; cloud computing is dependent on internet connectivity. |
This analysis outlines the software and AI technology modules integral to smart glasses, highlighting their functionalities, advantages, and limitations.
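The edge-versus-cloud split described in Table 4 can be illustrated with a minimal task-routing sketch. The task names, the routing policy, and the degraded offline fallback are assumptions for illustration only, not the behavior of any shipping device.

```python
# Toy routing policy for the edge-vs-cloud split: cheap, latency-critical
# tasks stay on the glasses; heavy multi-modal tasks go to a remote server.
# Task names and the fallback mode are illustrative assumptions.

EDGE_TASKS = {"wake_word", "step_count", "gesture_detect"}      # run on-device
CLOUD_TASKS = {"multimodal_fusion", "llm_dialogue", "scene_captioning"}

def route_task(task_name, network_available=True):
    """Return where a task should execute under this toy policy."""
    if task_name in EDGE_TASKS:
        return "edge"
    if task_name in CLOUD_TASKS and network_available:
        return "cloud"
    return "edge-degraded"   # fall back to a reduced on-device mode offline

placements = {t: route_task(t) for t in ["wake_word", "llm_dialogue"]}
```

Such a policy captures the trade-off in the table: edge execution minimizes latency for simple tasks, while the cloud supplies the compute that multi-modal models require, at the cost of connectivity dependence.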
The functional modules of smart glasses are meticulously engineered to enhance user experience and expand applicability across a spectrum of use cases, including translation, navigation, health monitoring, scene recognition, and education and entertainment, as detailed in Table 5, thereby catering to a diverse array of user needs.
Table 5.
Elaboration of functional modules for smart glasses
| Submodules | Module details | Advantages | Disadvantages |
|---|---|---|---|
| Real-time translation | Instant communication in multiple languages is facilitated through speech recognition and machine translation technologies, breaking down language barriers and enhancing global connectivity198. | Breaks down language barriers; enhances global communication. | Translation accuracy may vary; depends heavily on the quality of input and algorithm sophistication. |
| Navigation & Positioning | By integrating with AR technology, smart glasses can display navigation paths or points of interest, providing users with intuitive and immersive directional assistance199. | Provides intuitive directional assistance; enhances user experience. | Accuracy may be affected by GPS signal strength; performance in urban canyons could be problematic. |
| Health monitoring | Equipped with biosensors, these glasses monitor health metrics such as heart rate and blood pressure in real time, offering users continuous health tracking and early detection of potential health issues. | Continuous health tracking; early detection of potential health issues. | Limited battery life; sensors’ accuracy might decrease over time without proper calibration. |
| Scene recognition | Helps users understand and make decisions in complex environments, e.g., helping visually impaired people identify obstacles200,201. | Aids in making informed decisions; increases safety in hazardous environments. | Requires substantial training data; performance degradation in unfamiliar settings. |
| Education and entertainment | Smart glasses provide interactive learning experiences or immersive gaming, making education more engaging and entertainment more interactive, thus enhancing both learning outcomes and leisure activities170. | Engages users more effectively; promotes better learning outcomes and leisure activities. | Content creation can be resource-intensive; variability in user engagement levels. |
This analysis highlights smart glasses’ functional modules, emphasizing their advantages and limitations in translation, navigation, health monitoring, scene recognition, and education.
Application scenarios of smart glasses
The core applications of smart glasses encompass visual assistance, smart home control, navigation and positioning, information prompts and reminders, social interaction support, health monitoring, and education and training. Leveraging advanced AI technology and sensor integration, smart glasses offer personalized life support through augmented reality (AR), voice interaction, and other technical capabilities. For example, for visually impaired users, smart glasses can integrate object recognition, real-time obstacle detection, and path planning features, significantly enhancing travel safety and overall convenience48,49.
In the realm of smart homes, smart glasses serve as central control hubs, managing functions such as lighting, temperature, and security systems through voice or visual commands. Building on this capability, the HUAWEI Smart Glasses 2 further extends its functionality by offering application reminders, including weather updates, schedule summaries, and health monitoring, with features like cervical spine fatigue detection50.
Additionally, there are sports-oriented smart glasses designed for fitness enthusiasts, capable of recording exercise data such as steps, heart rate, and GPS routes, providing valuable scientific insights to support training and fitness goals51. In the field of education, smart glasses offer an AR-enhanced learning experience, featuring functions such as translation and recording, supported by visual assistance and interactive learning modules to enhance both learning efficiency and engagement.
Healthcare represents a key development focus for smart glasses, with core technologies encompassing large multi-modal models, high-precision sensors, and real-time data analysis. In the realm of chronic disease management, a study by Guan et al.52 highlights the pivotal role of AI in the prevention and management of diabetes, a field with significant development potential; smart glasses can leverage integrated multi-modal large models and AI algorithms to monitor the health trends of patients with chronic diseases. Current trends focus on incorporating smart sensors into wearable devices, enabling continuous health monitoring within the user's natural environment53. Figure 3 illustrates the various application possibilities of smart glasses.
Fig. 3. Advanced applications for smart glasses in healthcare monitoring.
Smart glasses integrate sensors and AI technology, pair with wearable devices, and utilize big data analytics to enable real-time health monitoring, personalized health recommendations, telemedicine services, and mental health support (By Figdraw).
By incorporating optical sensors, infrared cameras, and edge computing modules, smart glasses can collect and analyze users’ health data, such as heart rate, blood sugar levels, and blood pressure, in real time, offering precise health monitoring and management for patients. For example, Microsoft’s blood pressure monitoring smart glasses utilize optical technology to quickly and accurately record blood pressure metrics54. In the domain of blood glucose monitoring, smart glasses are still under research and development. However, several projects have already demonstrated the feasibility of this technology. Park et al. introduced the development of a wireless smart contact lens glucose biosensor, showcasing its ability to monitor glucose levels as a non-invasive alternative to traditional blood glucose measurements, highlighting the potential of smart contact lenses in non-invasive glucose monitoring55. Emteq Labs’ smart glasses, Sense, can monitor health and capture data at a rate of 6000 times per second56. VR glasses have been applied intraoperatively to monitor patients’ emotional states and deliver interventions—including guided meditation, relaxation techniques, and distraction strategies—to effectively alleviate anxiety, stress, and pain during surgical procedures57–63. Based on the current developments of smart glasses in the field of mental health, this study proposes future application scenarios for mental health management using smart glasses, as outlined in Table 6.
Table 6.
Application and principle of smart glasses in mental health management and virtual follow-up health services
| | Mental health management | Virtual follow-up and health management |
|---|---|---|
| Function | Patients often experience mood swings or stress, and smart glasses can monitor the patient’s emotional state and provide stress management, meditation or relaxation training58,59. | The voice assistant built into the smart glasses can provide psychological support, such as voice comfort, meditation exercises, or psychological counseling. |
| Apply | Smart glasses integrated with mental health monitoring and intervention capabilities allow individuals experiencing depression or anxiety to continuously track and regulate their emotional states, thereby facilitating the mitigation of negative affective symptoms106,202–205. | Patients can self-regulate through an app on their glasses when they feel anxious. |
| Principle | The glasses use facial expression recognition, voice analysis, and physiological indicators such as heart rate changes to assess the patient’s emotional state and recommend interventions206–208. | Through emotion recognition technology, glasses can capture a patient’s mood changes and provide psychological intervention or relaxation techniques through a voice or visual interface. |
This analysis explores smart glasses’ role in mental health management and virtual follow-up services in the future, utilizing emotion recognition, voice analysis, and physiological monitoring to provide stress management, psychological support, and self-regulation tools for users experiencing anxiety or depression.
As data collection improves and technology advances, smart glasses are poised to play a pivotal role in personalized health management and intervention. By leveraging multi-modal data analysis, these glasses can provide patients with tailored health recommendations on diet, exercise, and more64–67. Additionally, they can remind patients to take their medications on time and assess the effectiveness of these treatments. This personalized health management service is particularly beneficial for individuals with chronic diseases, as it offers real-time feedback, empowering patients to make timely adjustments to their lifestyle or medication regimens, thereby enhancing their daily lives and overall well-being68–72. Table 7 provides a detailed description of personalized health advice and reminders, as well as medication management and reminders, based on the capabilities of smart glasses.
Table 7.
Personalized health advice and reminders, medication management and reminders based on smart glasses
| | Personalized health recommendations and reminders | Medication management and reminders |
|---|---|---|
| Function | Based on real-time health data, smart glasses analyze a patient's condition and provide personalized health advice, dietary advice, or exercise guidance through a voice or visual interface. | Smart glasses can remind patients to take their medications on time and monitor the effects of their medications based on their medication history, medical records, and disease status. |
| Apply | For people with high blood pressure, the glasses can issue medication reminders and offer dietary or exercise advice to help control blood pressure209. | For patients with chronic diseases, especially those requiring long-term medication, the glasses can help patients take their medications on time and alert them to abnormal conditions209. |
| Principle | The AI algorithm analyzes the user’s health data, combined with the patient’s personal medical history and health goals, to generate a personalized health plan67. | The glasses’ built-in schedule reminder function, combined with an intelligent voice assistant, provides medication reminders at a user-defined time and monitors medication progression209. |
This analysis examines smart glasses’ capabilities in delivering personalized health advice and medication management in the future. By leveraging AI algorithms and real-time health data, they provide tailored recommendations for diet, exercise, and medication adherence, while monitoring patient progress and alerting on abnormalities to enhance chronic disease management.
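The schedule-reminder principle in Table 7 reduces to a small amount of logic: reminders fire at user-defined times, and doses not confirmed as taken are flagged. The drug name, dose times, and data layout below are hypothetical, used only to sketch the mechanism.

```python
# Minimal sketch of the Table 7 medication-reminder principle: flag doses
# whose scheduled time has passed without confirmation. Drug names, times,
# and the record format are hypothetical.
from datetime import time

schedule = [
    {"drug": "metformin", "at": time(8, 0),  "taken": True},
    {"drug": "metformin", "at": time(20, 0), "taken": False},
]

def due_reminders(schedule, now):
    """Return doses whose time has passed but which are not yet taken."""
    return [d["drug"] + "@" + d["at"].strftime("%H:%M")
            for d in schedule if d["at"] <= now and not d["taken"]]

missed = due_reminders(schedule, now=time(21, 0))
```

In a real system the confirmation signal might come from the glasses' voice assistant or camera, and missed-dose alerts could be escalated to a caregiver, as the medication-management column suggests.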
Smart glasses can assist clinicians in capturing, recording, and storing key findings during consultations, eliminating the need for manual data entry or the use of a scribe73–77. They also facilitate electronic medical record management, enabling direct conversion of records to electronic format, which significantly reduces the time spent transferring data from physical files78–81. With the integration of AI technology, clinicians can analyze rapid test results and gain insights to optimize patient care. Additionally, smart glasses serve as valuable tools for surgeons during procedures46,82–85. Smart glasses have been applied to different types of clinical procedures64,86–92. Following the release of Google Glass, Dr. Phil Haslam and Dr. Sebastian Mafeld demonstrated its potential in interventional radiology. They showcased how Google Glass could assist in liver biopsies and fistuloplasties93, potentially enhancing patient safety, improving operator comfort, and increasing surgical efficiency.
One example is the smart glasses developed by Vuzix specifically for telemedicine applications94. These glasses can bridge the distance between doctors and patients through voice connectivity, which enables caregivers to instantly share medical expertise with healthcare professionals worldwide, offering life-saving guidance from any location95–101. They facilitate real-time exchange of expert medical feedback without compromising patient care, while also providing surgeons with immediate input to reduce errors and enhance surgical precision through AR technology102–104. For patients in remote areas, smart glasses offer doctors the ability to monitor patients' conditions in real time through remote video calls and data sharing, bringing convenient healthcare services directly to those in need105. This model helps address the challenges of unequal distribution of medical resources and significantly enhances the efficiency and accuracy of patient follow-up. Table 8 showcases the role of smart glasses in telemedicine.
Table 8.
Utilization of smart glasses in telemedicine services for enhanced remote health management and virtual follow-up
| | Telehealth support | Virtual follow-up and health management |
|---|---|---|
| Function | Smart glasses can support real-time communication between patients and doctors through video calls, remote diagnosis, real-time data sharing, and other functions154,210–217. | Smart glasses can be used as a health management platform to help doctors remotely monitor and follow up on patients' conditions. |
| Apply | Through the glasses' camera, the doctor can see the patient's symptoms and make recommendations in real time, which is especially valuable for patients living in remote areas96,218,219. | Patients wear the glasses daily to collect health data and connect with the medical system, while doctors track changes in patients' conditions through data analysis69. |
| Principle | Smart glasses can transmit patients' health data and symptom images in real time, providing them to remote doctors in combination with AI analysis results154,216,220,221. | The glasses' built-in data synchronization links to a cloud platform, allowing doctors to obtain patients' health data in real time and provide personalized guidance and intervention. |
This analysis highlights the role of smart glasses in telemedicine and virtual follow-up, enabling real-time patient-doctor communication, remote diagnosis, and health data sharing. Integrated with AI, they facilitate remote monitoring, symptom analysis, and personalized interventions, enhancing accessibility and continuity of care, particularly for remote or chronic disease patients.
With the continuous advancement of sensor technology and AI algorithms, smart glasses will gradually become miniaturized and adapt to more medical scenarios106,107. The combination of these technologies will not only improve the quality of medical care, but also optimize the allocation of medical resources and provide personalized health management services for more people108.
The industrial applications of smart glasses, leveraging AR, Internet of Things (IoT) connectivity, AI analytics, and high-precision vision and motion sensing technologies, have the potential to significantly enhance productivity, safety, and product quality. The following Table 9 outlines specific application scenarios.
Table 9.
Industrial applications of smart glasses leveraging AR, IoT, AI analytics, and high-precision sensing
| | Apply | Case | Principle |
|---|---|---|---|
| Security monitoring and emergency | Sensors built into the glasses monitor workers' environmental conditions (e.g., gas leaks, temperature anomalies) and provide real-time warnings30. | A mining company uses smart glasses to improve the safety of underground operations30. | Sensors and AI algorithms monitor environmental data and determine potential risks in combination with worker status. |
| Production assembly and quality control | Smart glasses provide workers with dynamic guidance on assembly steps or automatically detect product defects during quality control31. | BMW uses smart glasses in production line assembly to reduce the rate of operator errors31. | AR overlays display step-by-step assembly instructions in the worker's field of view, while image recognition compares products against reference standards to flag defects. |
| Logistics & warehouse management | In the warehouse, smart glasses can guide workers to quickly find the location of items through AR technology to improve sorting efficiency32. | Amazon uses smart glasses for shipment location and real-time inventory management32. | Through barcode scanning, GPS positioning, and AR navigation, the glasses show the specific storage location of goods and update the system in real time. |
| Remote collaboration and training | Engineers can share first-person perspectives through smart glasses, and experts can remotely guide them to solve complex problems; the glasses are also used to train new employees33. | Boeing uses smart glasses in aerospace assembly training to significantly reduce learning time33. | Smart glasses transmit video and voice over a high-speed network, combined with AR annotation capabilities, to achieve efficient collaboration. |
| Equipment maintenance and fault diagnosis | The smart glasses provide real-time guidance to technicians by overlaying equipment operation manuals and fault information through AR technology34. | Daimler and GE use AR glasses to reduce equipment downtime and improve maintenance efficiency34. | The glasses capture real-time images of the device through the camera, identify problem parts with AI analysis, and superimpose the maintenance plan. |
This table highlights industrial applications of smart glasses in security, production, logistics, collaboration, and maintenance, leveraging AR, IoT, and AI for enhanced efficiency and safety.
The role and challenges of smart glasses in health management
Under the concept of active health, there is growing emphasis on maintaining a healthy diet and ensuring food quality, with the detection and analysis of food nutrients becoming a major research focus. The proper intake of nutrients such as proteins, fats, carbohydrates, vitamins, and minerals is essential for human health109. Consequently, the development of efficient and precise nutrient detection technologies is crucial for formulating evidence-based dietary plans and upholding food safety standards. Traditional chemical analysis methods, however, are often time-consuming and complex, frequently requiring destructive sampling that limits their practical application110.
Recent advancements in computer vision and deep learning have revolutionized non-destructive nutrient assessment. These technologies excel in automatic feature extraction, accurate classification, and end-to-end learning, positioning them as indispensable tools for food image recognition and nutritional evaluation. For example, the NutriNet system utilizes convolutional neural networks to achieve high classification accuracy but has limitations when processing multi-component images111. The MResNet-50 model, enhanced by natural language processing (NLP), enables automatic recipe extraction, addressing intra-class variability in food images, yet demonstrates limited generalization capabilities for unseen categories112. Wang et al.’s model integrates EfficientNet113, Swin Transformer, and Feature Pyramid Network (FPN) to adapt to complex scenarios, though it requires further development for detailed component identification in traditional Chinese dishes113–116.
The Im2Calories app exemplifies this progress by combining segmentation and classification techniques to evaluate meals with an accuracy rate of 76%, providing a robust solution for fine-grained differentiation117. Liu et al.'s multi-dish recognition model employs EfficientDet to enhance the accuracy of dietary intake reporting, although it necessitates frequent dataset updates to account for seasonal variations118. The ChinaMarketFood109 database has been instrumental in training Inception V3, improving image classification accuracy; however, there remains room for improvement in estimating nutrient content. Emerging models like DPF-Nutrition leverage monocular images along with depth prediction modules to estimate food nutrition, though they encounter limitations when processing stacked images119. The RGB-D feature fusion network integrates color and depth information, enhancing multi-modal learning capabilities and offering solutions for occlusion management and the recognition of complex scenes120. From food image recognition to comprehensive nutritional assessment, deep learning and multi-modal technologies demonstrate significant potential. However, challenges related to adaptability in complex scenarios, model generalizability, and computational costs must be addressed.
Figure 4 depicts the workflow of smart glasses in the realm of food nutrition recognition. This application is primarily designed to offer real-time food identification121, comprehensive nutritional analysis, and tailored health recommendations by integrating state-of-the-art computer vision122, AI, and AR technologies. Initially, real-time food identification is facilitated through the smart glasses' integrated camera and advanced image recognition capabilities, which swiftly scan and analyze the visual characteristics of food. The AI algorithm subsequently cross-references the captured image data with an extensive nutritional database, thereby providing users with detailed insights into the food's composition, encompassing calories, proteins, fats, sugars, and other essential nutrients. Nutritional information is presented in an intuitive and engaging format, equipping users with valuable and actionable dietary knowledge. For instance, ChatDiet123 realizes personalized nutrition-oriented food recommendations through an LLM-augmented framework. It integrates individual and population models: the individual model employs causal discovery and reasoning techniques to evaluate the nutritional effects on specific users, while the population model provides generalized nutritional information about food. A coordinator transmits the outputs of both models to the LLM, thereby offering customized food recommendations. In evaluations, its food recommendation effectiveness reaches 92%.
Fig. 4. A smart glasses-based application framework for food nutrition recognition.
The glasses' integrated camera and image recognition capabilities scan and analyze the visual characteristics of food in real time; an AI algorithm then cross-references the captured images against an extensive nutritional database and presents the food's calories, proteins, fats, sugars, and other essential nutrients in an intuitive, actionable format.
Utilizing AI technology, smart glasses seamlessly superimpose real-time nutritional information onto the user’s field of vision. By simply directing their gaze at food, users are automatically presented with relevant nutrient data and health recommendations. This functionality empowers users to conduct swift nutritional assessments prior to consumption, thereby facilitating healthier decision-making. Moreover, by incorporating user-specific health data—such as weight, age, activity level, and health goals—the smart glasses can deliver personalized dietary suggestions based on their real-time food recognition capabilities. For example, if the system detects the consumption of high-sugar food, the glasses may prompt the user to monitor their sugar intake or suggest healthier alternatives. Furthermore, the glasses can integrate with the user’s broader health management ecosystem, such as a smartwatch or health app, to provide a more holistic health assessment. By continuously monitoring eating habits and physical activity, these integrated devices offer long-term solutions for personalized health management.
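The recognition-to-recommendation flow described above can be sketched in miniature: a classifier's food label is cross-referenced against a nutrition table, and a simple rule flags high-sugar items. The label, database rows, and 25 g advisory threshold are illustrative assumptions, not values from any product or guideline.

```python
# Toy version of the food-recognition flow: look up a recognized label in a
# nutrition table and attach a rule-based advisory. All values are assumed
# per-serving figures for illustration only.

NUTRITION_DB = {
    "apple": {"kcal": 95,  "sugar_g": 19, "protein_g": 0.5},
    "soda":  {"kcal": 150, "sugar_g": 39, "protein_g": 0.0},
}

SUGAR_LIMIT_G = 25   # hypothetical per-item advisory threshold

def analyze_food(label):
    """Look up nutrients for a recognized food and attach a simple advisory."""
    nutrients = NUTRITION_DB[label]
    advisory = "high sugar" if nutrients["sugar_g"] > SUGAR_LIMIT_G else "ok"
    return {"food": label, **nutrients, "advisory": advisory}

result = analyze_food("soda")
```

In practice, the label would come from an on-device or cloud image classifier, and the advisory logic would incorporate the user's health profile (weight, activity level, goals) as described above.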
Integrated active health management platform combining smart glasses and health IoT devices
The proposed platform architecture for AI-powered smart glasses is designed to support proactive health management through a multi-layered approach, integrating advanced technologies to ensure seamless functionality as shown in Fig. 5. The architecture consists of four hierarchical layers: the perceptual layer, data layer, application layer, and interactive layer. Each layer is strategically designed to leverage the latest advancements in technology, ensuring a cohesive system that supports real-time health monitoring and personalized health management.
Fig. 5. Platform architecture and functional module.
The basic architecture of the platform is composed of the perceptual layer, data layer, application layer, and interactive layer.
The perceptual layer comprises a suite of hardware sensors, including AR glasses cameras, heart rate sensors, body temperature sensors, glucose monitoring devices, GPS modules, and other wearable sensors. These components are responsible for real-time data acquisition, enabling comprehensive physiological and environmental monitoring. For example, wearable hydrogel-based health monitoring systems can provide real-time monitoring of health indicators such as glucose, uric acid, lactate, heart rate, blood pressure, and temperature. Additionally, flexible self-powered bioelectronics (FSPB) can dynamically monitor physiological signals, revealing real-time health abnormalities and providing timely, precise treatments124.
The data layer facilitates the processing, storage, and management of the raw data collected by the perceptual layer. Its core modules include data storage, which uses cloud databases and edge storage technologies to hold diverse data formats such as images, videos, and health metrics; cloud databases collect, deliver, replicate, and push data to the edge using hybrid cloud concepts, ensuring efficient data management125. Data cleaning and processing employs technologies such as Apache Kafka, Apache Flink, and TensorFlow for efficient data preprocessing and integration. Data analysis and security combines database systems (e.g., MySQL, MongoDB, InfluxDB) with encryption and key-management services such as AWS KMS and Azure Key Vault to ensure robust data analysis and compliance with privacy standards.
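Before records reach storage, the data layer's cleaning step must discard malformed or implausible sensor readings. The following is a minimal sketch of that step; the field names and plausibility ranges are illustrative assumptions, and a production pipeline would run this logic inside a stream processor such as Apache Flink rather than a plain function.

```python
# Illustrative data-cleaning step: validate raw sensor records before
# storage. Field names and plausibility ranges are hypothetical.

PLAUSIBLE = {"heart_rate": (25, 250), "body_temp_c": (30.0, 43.0)}

def clean(records):
    """Drop malformed or out-of-range records and return them time-ordered."""
    out = []
    for r in records:
        kind, value, ts = r.get("kind"), r.get("value"), r.get("ts")
        if kind not in PLAUSIBLE or value is None or ts is None:
            continue  # drop malformed records
        lo, hi = PLAUSIBLE[kind]
        if lo <= value <= hi:
            out.append({"kind": kind, "value": float(value), "ts": int(ts)})
    return sorted(out, key=lambda r: r["ts"])  # storage expects time order
```

The same validate-then-order contract applies whether the sink is a relational store, a document database, or a time-series database such as InfluxDB.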
The application layer encapsulates the core functionalities of the platform, focusing on health management and user engagement. Data analysis and processing applies algorithms for advanced data interpretation, including health trend prediction and anomaly detection; deep learning algorithms, which have achieved great success in image processing and speech recognition, are expected to open new depths for health monitoring systems. The intelligent recommender system personalizes health interventions through AI-driven insights. Telemedicine services facilitate remote consultation and real-time diagnosis, bridging gaps in healthcare accessibility. Health data management ensures organized and secure storage of user health records for continuous monitoring and evaluation.
The interactive layer is designed to enhance user experience through multi-modal interaction mechanisms. The user interface (UI) offers a health dashboard, real-time data monitoring, and health report views for intuitive visualization. Interaction modules support speech commands, a voice assistant, gesture control, and eye-movement interaction for hands-free operation and accessibility. Multi-modal interaction mechanisms, such as those involving LLMs, can enhance text processing abilities and provide more intuitive user experiences. Personalization and push notifications deliver customized health warnings, insights, and recommendations to the user in real time.
This multi-layered design creates a cohesive system that seamlessly integrates hardware capabilities with advanced software functionalities. It facilitates real-time health monitoring, personalized health management, and enhanced interaction for a wide range of users. The platform’s modular structure ensures scalability, adaptability, and robustness, allowing it to meet the evolving needs of next-generation wearable health technologies.
To enhance the data processing capacity and analysis accuracy of the active health management platform integrated with smart glasses and IoT devices, a multi-dimensional approach is essential. This approach involves strengthening data quality, leveraging advanced data processing techniques, and optimizing the data storage and analysis architecture.
The platform needs to ensure the integration of data from smart glasses, IoT devices (e.g., smart bracelets, smart scales, blood pressure monitors), and other health-related devices. This data should encompass physiological signals, signs, behaviors, and environmental factors to ensure multi-dimensional and diversified inputs. High-precision sensors are critical for real-time data collection, as they reduce data errors and enhance health monitoring reliability.
Regarding data storage and processing architecture, we can leverage edge computing to shift preliminary data analysis and filtering from the cloud to the device side (e.g., smart glasses or smart devices). This will reduce data transmission delay and bandwidth requirements, making real-time health monitoring more efficient. Edge computing will enhance data processing timeliness, while distributed database technology can store large volumes of health data. By scaling out, the platform can efficiently process massive amounts of data while maintaining stability. Additionally, a cloud-based big data analysis framework will process complex health datasets. Using distributed computing and storage ensures the platform can handle various data types at scale and generate real-time analysis reports based on user needs.
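The edge-side filtering described above can be reduced to a simple policy: forward only readings that deviate meaningfully from a personal baseline, and summarize the rest locally. The sketch below assumes a hypothetical relative-deviation threshold; real devices would use learned, per-user baselines rather than a fixed tolerance.

```python
# Illustrative edge-computing policy: upload only anomalous samples,
# summarize normal ones locally to save bandwidth. The 15% tolerance
# is a hypothetical placeholder.

def edge_filter(samples, baseline, tolerance=0.15):
    """Split samples into cloud uploads (deviant) and a local summary."""
    upload, kept = [], []
    for s in samples:
        if abs(s - baseline) / baseline > tolerance:
            upload.append(s)       # worth the bandwidth: send to the cloud
        else:
            kept.append(s)         # routine reading: keep on-device
    summary = {"n": len(kept), "mean": sum(kept) / len(kept)} if kept else None
    return upload, summary
```

Shipping only the summary plus outliers is what cuts transmission delay and bandwidth; the cloud still sees enough to maintain long-term trends.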
To improve the accuracy of data analysis and modeling, machine learning methods such as deep learning (e.g., neural networks)126, reinforcement learning127, and support vector machines (SVMs)128 are employed to analyze and predict users’ health data. Continuous optimization and training of these models enhance the accuracy of the analysis. Furthermore, personalized recommendation algorithms can be developed based on the user’s health history, physical characteristics, and behavioral data. These algorithms provide precise health recommendations tailored to the user’s specific situation, such as chronic medical history and genetic characteristics.
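As a concrete, minimal stand-in for the learned models mentioned above (neural networks, SVMs), a z-score rule already captures the shape of statistical anomaly detection on a health time series. This is a sketch for illustration only; the trained models in the text would replace the fixed threshold with learned decision boundaries.

```python
# Illustrative statistical anomaly detector: flag readings whose z-score
# exceeds a threshold. A stand-in for the learned models (neural nets,
# SVMs) discussed in the text; the threshold of 3.0 is conventional,
# not from the paper.
import statistics

def zscore_anomalies(series, threshold=3.0):
    """Return indices of readings more than `threshold` std devs from the mean."""
    mu = statistics.fmean(series)
    sd = statistics.pstdev(series)
    if sd == 0:
        return []                   # constant signal: nothing to flag
    return [i for i, x in enumerate(series) if abs(x - mu) / sd > threshold]
```

For example, a single 200 bpm spike in an otherwise steady 70 bpm heart-rate stream is flagged, while ordinary variation is not.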
The integration of traditional Chinese and Western medicine theories is achieved by combining the constitution identification of traditional Chinese medicine with the modern medical data of Western medicine to build a hybrid model. This approach improves the predictive ability of users’ health status and the accuracy of dietary recommendations. Natural Language Processing (NLP) technology is used to analyze TCM literature129 and Western medicine research, integrating these theories to make personalized health recommendations. By implementing these multi-dimensional measures, the platform not only enhances the accuracy and reliability of health data analysis but also provides users with more personalized and effective health management services. This approach aligns with the latest advancements in technology, ensuring a cohesive system that supports real-time health monitoring and personalized health management.
The synthesis of smart glasses with Internet of Things (IoT) medical devices represents a pivotal advancement in the realm of active health management platforms. Central to this integration is the establishment of robust, seamless connectivity and interoperable data exchange protocols that facilitate real-time physiological monitoring. As a core wearable technology, smart glasses are envisioned to be outfitted with a comprehensive suite of advanced biosensors, including but not limited to heart rate monitors, pulse oximeters, thermometers, gait analyzers, and accelerometers. These sensors continuously capture granular biometric data from the user.
Supplementing the capabilities of smart glasses, IoT-enabled medical devices such as ambulatory blood pressure monitors, continuous glucose monitors, and bioimpedance scales provide additional critical health parameters, thereby enriching the dataset with metrics like arterial pressure, glycemic levels, and anthropometric measures. This synergistic integration ensures the integrity, comprehensiveness, and precision of the collected health information, offering a panoramic overview of the individual’s wellbeing130.
Figure 6 illustrates the design of integrated solutions for smart glasses and IoT devices.
Fig. 6. Design of integrated solutions for smart glasses and IoT devices.
The intelligent health system architecture encompasses the hardware layer (including smart glasses, IoT devices, sensors), the data transmission layer (wireless communication and real-time encrypted transmission), the data processing and analysis layer (cloud storage, intelligent analysis, etc.), the user interaction layer (interaction methods such as eye movement and gestures), and the application scenario layer (such as user health management), while also taking into account privacy and real-time feedback.
In terms of data transmission, both smart glasses and IoT devices should employ low-power wireless communication standards—such as Bluetooth Low Energy (BLE)131, IEEE 802.11 Wi-Fi, or ZigBee132—to ensure real-time data synchronization. Smart glasses aggregate biometric data from daily activities and transmit it via secure, encrypted channels over a wireless network to a cloud-based platform for storage and processing133. Data security and privacy are paramount; therefore, all transmissions comply with stringent encryption protocols and adhere to pertinent data protection regulations and industry standards.
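At the application layer, integrity and authenticity checks complement the transport encryption described above. The sketch below uses an HMAC-SHA256 tag so the cloud can verify that a biometric payload was not altered in transit; this is an illustrative pattern, not a claim about any specific device, and real deployments would rely on TLS or BLE pairing-level security rather than hand-rolled message sealing.

```python
# Illustrative application-layer integrity check for transmitted biometric
# payloads. This complements (does not replace) transport encryption such
# as TLS; the key-handling shown here is deliberately simplified.
import hashlib
import hmac
import json

def seal(payload: dict, key: bytes) -> bytes:
    """Serialize the payload and append an HMAC-SHA256 tag."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest().encode()
    return body + b"." + tag

def verify(message: bytes, key: bytes) -> dict:
    """Recompute the tag in constant time; reject tampered messages."""
    body, _, tag = message.rpartition(b".")
    expect = hmac.new(key, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("integrity check failed")
    return json.loads(body)
```

The constant-time comparison (`hmac.compare_digest`) matters here: naive byte-by-byte equality would leak timing information an attacker could exploit.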
For analysis and processing, the cloud platform will consolidate multi-modal datasets from smart glasses and IoT devices, harnessing machine learning algorithms and AI for sophisticated analytics. The system will perform continuous health status surveillance, anomaly detection, and trigger alerts or recommendations when deviations from baseline health metrics are observed. For instance, upon detecting tachycardia or bradycardia, the system would promptly notify the user134 and advise appropriate actions, such as resting or seeking medical consultation. Moreover, the platform will generate personalized health management strategies based on each user’s medical history and lifestyle factors, providing tailored services like physical activity guidance and nutritional counseling135.
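The tachycardia/bradycardia example above amounts to mapping a reading onto an advisory. The sketch below uses the common adult resting-heart-rate conventions (below 60 bpm bradycardia, above 100 bpm tachycardia) as illustrative thresholds; a deployed system would personalize these to the user's baseline and context.

```python
# Illustrative alerting rule mirroring the tachycardia/bradycardia example.
# Thresholds follow common adult resting conventions, not the paper, and
# would be personalized in practice.

def hr_alert(bpm, resting=True):
    """Return an advisory string for an abnormal resting heart rate, else None."""
    if not resting:
        return None  # exercise context: different baselines apply
    if bpm > 100:
        return "tachycardia: rest and consider medical consultation"
    if bpm < 60:
        return "bradycardia: rest and consider medical consultation"
    return None
```

The `resting` flag stands in for the contextual awareness the platform derives from activity data: the same 150 bpm that triggers an alert at rest is unremarkable mid-workout.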
The multi-modal interaction design of smart glasses enhances user engagement with the health management platform. Leveraging NLP for voice commands, eye-tracking49,136 for interface navigation, and capacitive touchscreens for manual input, the glasses dynamically adapt their visual displays according to the user’s health indicators, offering real-time monitoring and relevant health advisories.
At the application level, the convergence of smart glasses and IoT devices significantly benefits both end-users and healthcare practitioners. Users gain tools for proactive health management, while clinicians can remotely monitor patient health and provide timely interventions137. Through the integrated platform, physicians can access remote patient monitoring (RPM)138 data, evaluate conditions, assess therapeutic efficacy, and devise personalized care pathways, thus enhancing telemedicine capabilities. Furthermore, the platform facilitates the compilation of detailed, longitudinal health records, archiving all user health data for future reference. Such a repository plays an indispensable role in ongoing health maintenance and predictive analytics for disease prevention.
Building an active health management platform based on the integration of smart glasses and IoT medical devices involves coordinating smart glasses with a variety of IoT devices (such as health monitoring devices and medical sensors) through device-side AI technology. Edge AI ensures that data is processed locally and in real time on the device (i.e., the smart glasses or other IoT devices), reducing latency, improving response speed, and strengthening data security while decreasing dependence on cloud processing139. Figure 7 shows different application scenarios of on-device AI technology.
Fig. 7. Application scenarios of on-device AI technology.
Application scenarios rely on smart glasses to achieve diversified application forms.
For the application of real-time health data monitoring and personalized health management, smart glasses leverage built-in sensors and IoT devices (such as blood glucose meters, heart rate monitors, and blood pressure monitors) to collect real-time health data, enabling continuous health monitoring. The device-side processing capabilities of smart glasses allow physiological data (e.g., heart rate, blood glucose, and body temperature) to be analyzed directly on the device. With embedded AI chips, the glasses can analyze this data in real time and provide immediate feedback, even identifying health abnormalities (such as high blood sugar or an abnormal heart rate) and issuing automatic warnings. This minimizes reliance on cloud services, ensuring real-time monitoring and improving the timeliness and accuracy of health management. Additionally, by integrating with IoT devices like exercise trackers and sleep monitors, the smart glasses offer personalized health recommendations. For instance, based on daily activity and health data, the system may suggest increasing physical activity, adjusting diet, or improving sleep quality. Through localized AI models, the glasses analyze users’ health data and create tailored intervention strategies. As users’ health data evolves, the AI system adapts its recommendations, ensuring continuous, personalized health management that is both network-independent and responsive to real-time needs.
For action recognition and health interventions, smart glasses, in combination with motion sensors (such as IoT smart bracelets), can monitor the user’s movements in real time and detect potential health-risk behaviors, such as prolonged sitting or improper exercise. By leveraging the visual and motion recognition capabilities of the smart glasses, along with motion data from IoT devices, the integrated AI system can evaluate the user’s activity posture and behavior (e.g., sitting or walking posture) in real time.
The system provides corrective guidance instantly, reducing data latency with local processing and feedback. This ensures real-time monitoring and prompt, actionable guidance for improving posture and preventing health risks.
Aiming at the application of medical image analysis and diagnostic support, smart glasses can be seamlessly integrated with IoT devices (such as smart thermometers and blood pressure monitors) and medical imaging equipment (like portable ultrasound and X-ray machines) to assist healthcare professionals in diagnosing conditions. Through computer vision technology, smart glasses analyze real-time image data of the user, combining it with health data from IoT devices to facilitate quicker diagnoses. For example, by visually assessing the user’s face and physical signs, smart glasses can detect potential health issues (such as paleness or eye abnormalities) and provide real-time recommendations to doctors. This integration minimizes data transfer needs and cloud dependencies, ensuring faster diagnostic support.
For automatic detection and alarming of abnormal events, IoT devices integrated with smart glasses can promptly detect health emergencies (such as falls or seizures) and alert medical personnel or family members. By leveraging the sensors and image processing capabilities of smart glasses, along with IoT devices (like smart bracelets and environmental sensors), the system continuously monitors the user’s physiological state. In the event of an emergency, such as a fall or significant heart rate fluctuations, the AI system responds immediately by sending an alarm signal to ensure timely assistance. On-device AI ensures rapid response and processing, reducing dependence on cloud services and enhancing response speed. Seamless integration and collaboration can also be achieved. Smart glasses not only integrate with individual IoT devices but also seamlessly collaborate with multiple devices to enhance data sharing and the comprehensiveness of health management. Through on-device AI technology, smart glasses can interact with various IoT devices, such as smart bracelets, smart home systems, and environmental monitoring devices, to exchange data and provide holistic health management. For instance, based on the user’s health data, smart glasses can automatically adjust the smart home environment—such as temperature, humidity, and air quality—to optimize comfort and well-being.
By deploying edge computing on smart glasses and IoT devices, data processing tasks are moved to edge nodes on or close to the device. This reduces data transmission latency and enables real-time data synchronization and processing between devices, ensuring consistency and accuracy of information. Real-time synchronization protocols such as Message Queuing Telemetry Transport (MQTT)140 and WebSocket141 ensure efficient, stable transmission between devices; these protocols are designed for low-latency communication, which is essential for synchronization and data consistency in IoT environments. Data consistency and verification mechanisms, such as cyclic redundancy checks for error detection and Unix timestamps, ensure that transferred data is not lost or corrupted; even if communication between devices is delayed or interrupted, the system preserves data integrity and consistency. A distributed architecture can ensure a seamless connection between the local data processing of smart glasses and IoT devices and cloud systems142: the device processes data locally and uploads the results to the cloud for further analysis or storage, keeping data synchronized and consistent. Devices can also continue to collect data while offline and reconcile it once the network is restored; mechanisms such as timestamping and versioning keep offline and online data consistent, avoiding loss or duplication. The combination of these technologies improves the efficiency and stability of the active health management platform.
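Two of the mechanisms just described, CRC-based error detection and timestamp-based reconciliation, are compact enough to sketch directly. The framing format and the last-writer-wins merge policy below are illustrative assumptions; production systems would layer these inside the transport protocol and a proper conflict-resolution scheme.

```python
# Illustrative sketches of (1) CRC-32 framing for corruption detection and
# (2) Unix-timestamp reconciliation of offline and cloud data. The framing
# layout and merge policy are hypothetical simplifications.
import zlib

def frame(payload: bytes) -> bytes:
    """Append a 4-byte CRC-32 so the receiver can detect corruption."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def unframe(msg: bytes) -> bytes:
    """Verify and strip the CRC-32; raise on a corrupted frame."""
    payload, crc = msg[:-4], int.from_bytes(msg[-4:], "big")
    if zlib.crc32(payload) != crc:
        raise ValueError("corrupted frame")
    return payload

def reconcile(local, cloud):
    """Last-writer-wins merge keyed by record id: each value is a
    (unix_timestamp, data) pair; the newer timestamp prevails."""
    merged = dict(cloud)
    for rid, (ts, value) in local.items():
        if rid not in merged or ts > merged[rid][0]:
            merged[rid] = (ts, value)
    return merged
```

After a network outage, records buffered on the glasses are replayed through `reconcile`, so a stale cloud copy never overwrites a fresher on-device reading.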
Discussion
In recent years, the rapid development of wearable technology, especially smart glasses, has brought significant innovations to the healthcare field, particularly in areas such as surgical procedures, physician assistance, health monitoring, and patient management. Numerous studies have confirmed the practical value of such devices in improving medical personnel’s response efficiency, enhancing operational accuracy, and alleviating patient anxiety and pain17,143–153. For instance, in emergency scenarios, a study in Thailand showed that using smart glasses to triage incidents involving 11–30 casualties improved accuracy by nearly 9 percentage points (98.0% vs 89.2%) and reduced assessment time by over 50%154. In pediatric surgery, the use of VR glasses reduced treatment time by an average of 5.53 min, significantly alleviating the anxiety and pain experienced by children throughout the treatment process62. Smart glasses equipped with AR and eye-tracking technology achieved high-precision eye tracking at a distance of 1 meter, with an accuracy of 1.0° and a margin of error within ±0.1°, opening new possibilities for health monitoring and medical diagnosis49. Additionally, in anesthesiology, AR smart glasses assisted pediatric radial artery cannulation, increasing the success rate of interns from 71.7% to 89.8%, with a more than 23 percentage point reduction in complication rates144. For patients with Parkinson’s disease, gait parameters obtained through AR glasses showed excellent reliability (ICC > 0.942)108. In neurosurgical procedures, mixed reality navigation systems helped keep the deviation in lesion localization, relative to standard navigation, within 5.0 mm in 81.1% of cases152. Furthermore, smart glasses have shown potential in diet recognition.
For example, the DietGlance system achieved an F1 score of 0.972 for dietary recognition, and its estimates of key macronutrients were rigorously compared with expert assessments; the mean absolute percentage error (MAPE) for macronutrient estimation was approximately 17.92%155. These achievements demonstrate the vast potential of smart glasses in improving medical efficiency and service quality.
However, the application of the technology is not flawless. In the field of pediatric dentistry, a study on children with moderate to severe hearing impairments showed that VR glasses had no significant effect on lowering heart rate or alleviating anxiety during pulpectomy treatment (all assessment indicators had p-values > 0.05)156. Another study involving children aged 4–6 also confirmed that VR technology did not effectively reduce anxiety and pain during dental treatments. A clinical comparison in dental implant surgeries showed that although AR glasses-assisted dynamic computer-aided systems improved the three-dimensional positioning parameters of implants, the differences were not statistically significant108. These studies suggest that there are still technological bottlenecks and limitations in the effectiveness of smart glasses in medical applications, which require further optimization and breakthroughs.
To address the limitations of existing smart glasses and build on their advantages in the medical and healthcare field, this paper proposes a concept for next-generation smart glasses technology aimed at overcoming these shortcomings. The concept can be dissected into two core components.
The first concept is the Non-Humanoid Wearable Robotic Assistant. This assistant is designed as a 24/7 active health companion equipped with advanced AI157. This enables the device to continuously adapt to the user’s lifestyle habits and health requirements158–160. Continuous learning capability161 allows for increasingly precise understanding of individual patterns over time, thereby delivering personalized health recommendations and behavioral guidance. For instance, by monitoring physiological signals such as heart rate and body temperature in real-time, combined with environmental factors (e.g., weather conditions, geographic location) and daily activity patterns, the smart glasses can optimize exercise routines or provide timely reminders for rest162. Furthermore, this type of intelligent assistant is equipped with situational awareness capabilities, utilizing built-in cameras and other sensors to capture visual data and recognize the user’s environment, thereby tailoring its service provision accordingly.
The second is Synchronous Decision-Making and Auxiliary Suggestions. Employing multi-modal large language models163, next-generation smart glasses not only enhance the quality of real-time decision-making164 but also deliver personalized suggestions165 derived from aggregated life and dietary data. These devices transcend the functionality of traditional mobile phone applications, deeply integrating into the user’s daily visual experience to offer more natural and intelligent life guidance166. The crux of this system lies in its interactive nature—it serves as an active learning assistant rather than merely a passive display device, aiding users in making informed decisions, enhancing quality of life, and providing immediate feedback on health prevention measures.
Smart glasses transform collected data into valuable insights, such as generating personalized nutritional plans based on the analysis of meal timing, food types, and physical activity levels, or suggesting optimal rest periods aligned with social activities to maintain peak performance. Moreover, these devices can seamlessly integrate with other health monitoring tools (e.g., smartwatches, glucose monitors) to form a comprehensive personal health ecosystem. To ensure the efficacy and safety of these recommendations, smart glasses regularly update their internal knowledge bases and reference the latest medical research and clinical guidelines. For elderly users, this technology is particularly beneficial as it aids in chronic disease management, enhances independent living capabilities167, and reduces the risk of accidental injuries, promoting longer and healthier aging. Additionally, smart glasses can connect with healthcare professionals, alerting doctors to potential health risks detected for prompt intervention168.
Figure 8 illustrates how smart glasses can serve as an integral part of elderly health management, offering continuous support and personalized health guidance that adapts to the unique needs of each user. The goal of this technology is to create an intelligent system characterized by the resonance of identity, vision, and senses across three screens, forming a non-humanoid multi-modal wearable robot169 that deeply integrates with the user’s life and health management. Smart glasses will serve as the control center for smart healthcare and other devices, allowing users to interact with these devices in real-time via voice recognition and gesture control, forming an efficient IoT ecosystem.
Fig. 8. Next-Generation Smart Glasses.
Enhancing elderly health management through non-humanoid wearable AI and synchronous decision support. a Health problems faced by the elderly. b Possible scenarios and ways for the elderly to use smart glasses (By Figdraw).
The future of smart glasses will see significant advancements in multi-modal interaction technologies, which combine various forms of input and output—such as voice recognition, gesture control, haptic feedback, and AR—to create a more intuitive and efficient user interface. These technologies enable users to interact naturally with their devices without needing to rely on traditional input methods like keyboards or touchscreens. Figure 9 illustrates the application of different interaction modes across multiple scenarios, highlighting how these technologies can be seamlessly integrated into everyday life.
Fig. 9. Application of multi-modal interaction paradigms across diverse scenarios.
Technologies such as gesture interaction, eye-movement recognition, and automatic speech recognition can be applied to construct intelligent interaction models and play a role in fields such as intelligent education and intelligent healthcare.
Voice recognition technology will continue to improve, allowing for more accurate and contextually aware interactions. Users can issue complex commands or queries, and the AI assistant within the smart glasses can respond appropriately, even in noisy environments. Gesture control will evolve to support a wider range of movements, enabling users to navigate menus, select options, or initiate actions through simple hand motions. Haptic feedback, which uses tactile sensations, will provide additional layers of interactivity, confirming user inputs or signaling alerts in a non-intrusive manner. AR overlays will enrich the user’s perception of the world, superimposing useful information directly onto the physical environment170, enhancing both productivity and entertainment. In terms of emotion recognition and situational understanding, future multi-modal interaction systems will be able to combine multiple input signals such as voice, facial expressions, and body movements to accurately identify the user’s emotional state171 and adjust the interaction strategy accordingly. For example, in mental health management and healthcare, the system can identify mood swings based on the user’s facial expressions, tone of voice, and physiological signals to provide personalized intervention recommendations. Multi-modal sentiment analysis172 capabilities will greatly enhance the emotional connection between users and the system, making the interaction more humane and intelligent.
Ultimately, the goal of this technology is to create an intelligent system characterized by the seamless resonance of identity, vision, and senses across multiple interfaces, forming a non-humanoid multi-modal wearable robot that deeply integrates with the user’s life and health management. Smart glasses will serve as the central hub for smart healthcare and other connected devices, allowing users to interact with them in real-time via voice and gesture control, thereby establishing an efficient IoT ecosystem that supports a healthier, more connected lifestyle.
Despite the rapid advancements in smart glasses technology, several critical challenges remain to be addressed for enhancing functionality and user experience. These issues span across hardware optimization, energy management, interaction paradigms, materials science, interdisciplinary collaboration, and clinical validation.
Primarily, hardware optimization plays a crucial role. The processors integrated into smart glasses must facilitate real-time multitasking capabilities, encompassing AR/VR rendering, speech recognition, and execution of AI models. Nevertheless, the compact design of these devices imposes significant constraints on thermal dissipation and power consumption. Future research should therefore prioritize the development of low-power, high-performance chips specifically tailored for smart glasses. Collaborative efforts between software and hardware engineers are essential to enhance energy efficiency, ensuring that computational tasks are executed with optimal power utilization.
Secondly, effective energy management is imperative. The requirement for prolonged usage demands substantial battery longevity, a challenge not adequately addressed by existing lithium-ion batteries due to limitations in volume and capacity. Innovations in battery materials, such as solid-state or graphene-based technologies, offer potential improvements in energy density and safety. Modular battery designs that allow for easy replacement or integration with external power modules could provide users with enhanced flexibility. Moreover, solar charging presents an eco-friendly solution; however, advancements in photovoltaic materials, particularly perovskite cells, are necessary to increase power generation efficiency within the confined surface area of smart glasses frames. Hybrid power systems that combine multiple power sources could ensure an efficient and uninterrupted power supply, addressing the unique challenges posed by wearable technology.
In terms of interaction paradigms, traditional touch and key controls are gradually being replaced by more intuitive interactions, such as gestures, eye movements, and brain signals. Flexible sensors and emerging interface technologies cater to user needs for convenience and efficiency while promoting a shift from “command”-based to “perception”-based interaction models. Yet existing gesture and voice controls may falter in complex environments. Future development should introduce advanced gesture tracking algorithms and noise-resistant speech recognition technologies. Integrating multi-modal interactions—combining gestures, voice, touch, and eye tracking—will offer a more seamless user experience, reducing latency and ensuring smooth operation.
In materials science, flexible materials enable more natural interaction and significantly improve wearing comfort and functional integration. Flexible smart wrist guards can enable gesture recognition by capturing subtle hand movements, whereas eye-tracking technology facilitates touchless interaction. In complex scenarios, multi-modal combinations enhance interaction efficiency. Furthermore, brain-computer interfaces (BCIs)173 that interpret EEG signals into operational commands represent a breakthrough over traditional interaction methods, offering inclusive solutions for individuals with physical disabilities174.
Clinical validation and efficacy evaluation are critical to assessing the effectiveness of smart glasses technology. Future studies should prioritize large-scale clinical trials and real-world data analysis to verify the impact of smart glasses on user health management. By establishing a robust evidence base, these evaluations will facilitate widespread acceptance and adoption of smart glasses in healthcare settings and beyond.
The future development of smart glasses is expected to align with the trends illustrated in Fig. 10, enabling their widespread application across various scenarios and establishing them as intelligent health assistants for personalized and real-time health management and interaction.
Fig. 10. Prediction for the future development of smart glasses.
Smart glasses leverage technologies such as BCI, ASR, and gesture recognition to deliver functions including immersive experiences, smart medical care, reality navigation, and intelligent assistance.
This study reveals the significant potential of smart glasses in digital health management. By integrating IoT and AI technologies, smart glasses have not only achieved real-time health monitoring, chronic disease management, and personalized interventions but also markedly enhanced the speed and accuracy of medical responses. The research finds that the application of smart glasses has transcended traditional monitoring boundaries and is becoming an effective tool for promoting treatment adherence and improving quality of life. Despite challenges such as data synchronization efficiency, hardware comfort, and adaptability, smart glasses are emerging as integral components of personal health ecosystems. They are progressively transforming paradigms of health management and disease prevention, paving the way for more intelligent and personalized healthcare services. Future research should focus on conducting large-scale clinical trials to validate these findings across diverse populations. Additionally, exploring the long-term impact of continuous health monitoring via smart glasses will be crucial for understanding their full potential in transforming healthcare delivery.
Methods
To systematically identify relevant literature on the application of smart glasses in healthcare and health management, we developed a comprehensive and rigorous search strategy to ensure high recall, relevance, and reproducibility.
Search type and conceptual framework
We adopted a Boolean search approach that combined both controlled vocabulary, such as MeSH terms, and free-text keywords to accommodate indexing variations across multiple databases. The search strategy was constructed around two core conceptual domains: the technology of interest, which included terms like “smart glasses,” “AR glasses,” “VR glasses,” “AI glasses,” and “intelligent glasses,” and the application domain, encompassing terms such as “health management,” “healthcare,” “medical,” and “clinical.” The final search string applied consistently across all databases was: (“smart glasses” OR “AR glasses” OR “VR glasses” OR “AI glasses” OR “intelligent glasses”) AND (“health management” OR “healthcare” OR “medical” OR “clinical”).
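For reproducibility, the final search string can be assembled programmatically from the two conceptual domains. The following Python sketch is illustrative (the helper names are ours, not part of the study protocol); it produces exactly the Boolean string stated above, which can then be pasted into each database's query interface:

```python
# The two conceptual domains from the search strategy.
TECHNOLOGY_TERMS = [
    "smart glasses", "AR glasses", "VR glasses", "AI glasses", "intelligent glasses",
]
APPLICATION_TERMS = [
    "health management", "healthcare", "medical", "clinical",
]

def or_group(terms):
    """Join quoted phrases with OR and wrap the group in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

def build_query(technology, application):
    """Combine the two conceptual domains with AND, as in the review protocol."""
    return f"{or_group(technology)} AND {or_group(application)}"

query = build_query(TECHNOLOGY_TERMS, APPLICATION_TERMS)
print(query)
```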
Databases and search scope
To focus on the latest advancements in artificial intelligence and smart glasses, we conducted a literature search in three major academic databases—PubMed, Web of Science Core Collection, and IEEE Xplore—retrieving English-language publications from January 2021 to April 2025 to ensure comprehensive coverage of both medical and engineering fields. PubMed was used to identify biomedical and health-related studies, yielding 534 records; the Web of Science Core Collection provided access to multidisciplinary research, including clinical and engineering studies, with 276 records identified; IEEE Xplore focused on literature related to smart hardware and human-computer interaction, contributing 51 records.
Screening and reference management
All identified records were imported into EndNote reference management software for deduplication.
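The deduplication logic such tools apply can be sketched as follows. This is a minimal illustration, assuming each record carries a DOI and a title; the record structure and the normalization rule are our assumptions, not a description of EndNote's internal matching:

```python
def normalize(title):
    """Reduce a title to lowercase alphanumerics so case and punctuation
    differences do not prevent a match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records):
    """Keep the first occurrence of each record, matching on DOI when present,
    otherwise on the normalized title."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalize(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical records: the second and fourth are duplicates of earlier entries.
records = [
    {"doi": "10.1000/xyz1", "title": "Smart glasses in nursing"},
    {"doi": "10.1000/xyz1", "title": "Smart Glasses in Nursing"},
    {"doi": None, "title": "AR Glasses for Surgery"},
    {"doi": None, "title": "AR glasses for surgery!"},
]
print(len(deduplicate(records)))  # 2 unique records remain
```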
Inclusion and exclusion criteria
The inclusion criteria for this review required studies that explicitly investigated smart glasses or their variants, such as AR, VR, or AI glasses, while excluding those that focused on generalized wearable devices. Eligible studies were those centered on healthcare or medical-related applications, including but not limited to health monitoring, clinical decision support, telemedicine, rehabilitation, nursing care, or disease screening. Studies were excluded if they were duplicate publications across databases (in such cases, only the most complete version was retained), focused on non-glasses wearable devices such as smartwatches or wristbands, or addressed non-healthcare-related contexts such as industrial, military, or entertainment applications. Additionally, non-peer-reviewed literature, including preprints, abstracts without full-text availability, and unpublished theses, was excluded to ensure the scientific rigor and credibility of the included studies.
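The screening rules above can be expressed as a simple eligibility filter. The sketch below is a hedged illustration only: the keyword lists and record fields are simplified stand-ins for the reviewers' actual (human) screening judgments, which cannot be reduced to keyword matching:

```python
GLASSES_TERMS = ("smart glasses", "ar glasses", "vr glasses", "ai glasses")
EXCLUDED_DEVICES = ("smartwatch", "wristband")
HEALTH_TERMS = ("health", "clinical", "medical", "telemedicine",
                "rehabilitation", "nursing", "screening")

def is_eligible(record):
    """Apply the inclusion/exclusion criteria to one record (illustrative)."""
    text = (record["title"] + " " + record["abstract"]).lower()
    if not any(t in text for t in GLASSES_TERMS):
        return False                      # must study glasses, not generic wearables
    if any(d in text for d in EXCLUDED_DEVICES):
        return False                      # exclude smartwatch/wristband studies
    if not any(h in text for h in HEALTH_TERMS):
        return False                      # must be healthcare-related
    return record.get("peer_reviewed", False)  # exclude preprints and theses

record = {"title": "Smart glasses for clinical vital-sign monitoring",
          "abstract": "A peer-reviewed trial of AR glasses in nursing care.",
          "peer_reviewed": True}
print(is_eligible(record))  # True
```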
Acknowledgements
Gratitude is extended to the Department of Biomedical Sciences at City University of Hong Kong for providing an exceptional research environment and theoretical support. Special thanks are due to Dr. Yue Jiang, Associate Professor Hsiang-Yu Sean YUAN and Professor Junyu Wang for their expertise and guidance in the field of public health and smart glasses, which has laid a crucial foundation for this study.
Abbreviations
- 5G
5G Networks
- AI
Artificial Intelligence
- AR
Augmented Reality
- CDSS
Clinical Decision Support System
- DL
Deep Learning
- EC
Edge Computing
- EHR
Electronic Health Records
- FSPB
Flexible Self-Powered Bioelectronics
- IoMT
Internet of Medical Things
- LLM
Large Language Models
- MR
Mixed Reality
- NLP
Natural Language Processing
- OS
Operating System
- PHM
Proactive Health Management
- PHR
Personal Health Records
- RPM
Remote Patient Monitoring
- RTHM
Real-Time Health Monitoring
- SVM
Support Vector Machines
- UI
User Interface
- UX
User Experience
- VR
Virtual Reality
Author contributions
The concept of this study was developed by B.W. and G.X., who also oversaw the project supervision. B.W. outlined the main manuscript text structure and conducted comprehensive review and revision of the manuscript. Y.Z. conducted the literature review and analyzed recent advancements in technology and applications within the domain of AI-powered smart glasses. X.H., L.K., and S.C. contributed to the research project management. G.X. and Z.X. supervised the research and provided overall funding support for the project. S.C. reviewed the entire manuscript and made contributions to the major revisions, particularly in the modification of figures and content. All authors contributed to the final version of the manuscript.
Data Availability
No datasets were generated or analyzed during the current study.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Boyuan Wang, Email: boyuan422@foxmail.com.
Gexin Xiao, Email: gexin_xiao@sina.com.
Zunxiong Xiao, Email: hyfyxzx@163.com.
Shanji Chen, Email: 1506376330@qq.com.
Supplementary information
The online version contains supplementary material available at 10.1038/s41746-025-01715-x.
References
- 1.Brasier, N. et al. Applied body-fluid analysis by wearable devices. Nature636, 57–68 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Ates, H. C. et al. End-to-end design of wearable sensors. Nat. Rev. Mater.7, 887–907 (2022). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Maxmen, A. Google spin-off deploys wearable electronics for huge health study. Nature547, 13–14 (2017). [DOI] [PubMed] [Google Scholar]
- 4.Jeong, H. et al. Data from the all of us research program reinforces existence of activity inequality. npj Digital Med.8, 8 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Zhang, Z., Bai, E., Stepanian, A., Jagannath, S. & Park, S. Y. Touchless interaction for smart glasses in emergency medical services: user needs and experiences. Int. J. Hum. Comput. Interact.41, 2984–3003 (2024).
- 6.Kazanskiy, N. L., Khonina, S. N. & Butt, M. A. A review on flexible wearables–Recent developments in non-invasive continuous health monitoring. Sens. Actuators A Phys.366, 114993 (2024).
- 7.Vo, D.-K. & Trinh, K. T. L. Advances in wearable biosensors for healthcare: current trends, applications, and future perspectives. Biosensors14, 560 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Ghadi, Y. Y. et al. Enhancing patient healthcare with mobile edge computing and 5G: challenges and solutions for secure online health tools. J. Cloud Comput.13, 93 (2024). [Google Scholar]
- 9.NEWS, T. DUBLIN. Wearable Electronics Global Strategic Industry Report 2024 (Global Industry Analysts, Inc., 2024).
- 10.Thomas, S. A. et al. Transforming global approaches to chronic disease prevention and management across the lifespan: integrating genomics, behavior change, and digital health solutions. Front. Public Health11, 1248254 (2023). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Zhang, Y., Stokes, J., Anselmi, L., Bower, P. & Xu, J. Can integrated care interventions strengthen primary care and improve outcomes for patients with chronic diseases? A systematic review and meta-analysis. Health Res. Policy Syst.23, 5 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Wang, Z. et al. Global, regional, and national burden of chronic obstructive pulmonary disease and its attributable risk factors from 1990 to 2021: an analysis for the Global Burden of Disease Study 2021. Respir. Res.26, 2 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Smith, J. D. et al. Preventing and managing chronic disease through implementation science: editor’s introduction to the supplemental issue. Prev. Sci.25, 1–9 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Brewer, D. How smart glasses could change healthcare delivery. HealthTech Magazine (2021).
- 15.Silcox, C. et al. The potential for artificial intelligence to transform healthcare: perspectives from international health leaders. NPJ Digital Med.7, 88 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Parisee. Smart Glasses 2025: innovation and reinvented daily life. https://parisee.com/en/smart-glasses-2025-innovation-serving-daily-life (2025).
- 17.Martínez-Galdámez, M. et al. Smart glasses evaluation during the COVID-19 pandemic: First-use on Neurointerventional procedures. Clin. Neurol. Neurosurg.205, 106655 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Mallek, F., Mazhar, T., Shah, S. F. A., Ghadi, Y. Y. & Hamam, H. A review on cultivating effective learning: synthesizing educational theories and virtual reality for enhanced educational experiences. PeerJ Comput. Sci.10, e2000 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Shah, S. F. A., Mazhar, T., Shahzad, T., Ghadi, Y. Y. & Hamam, H. Integrating educational theories with virtual reality: enhancing engineering education and VR laboratories. Soc. Sci. Humanit. Open10, 101207 (2024). [Google Scholar]
- 20.Klinker, K., Wiesche, M. & Krcmar, H. Smart glasses in health care: a patient trust perspective. Hawaii International Conference on System Sciences (2020).
- 21.Blankenbach, K. Augmented Reality HUDs: Challenges, Solutions and Competitors. In 2023 30th International Workshop on Active-Matrix Flatpanel Displays and Devices (AM-FPD) IEEE. 28–31 (2023).
- 22.Wei, N. J., Dougherty, B., Myers, A. & Badawy, S. M. Using Google Glass in surgical settings: systematic review. JMIR mHealth uHealth6, e9409 (2018). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Server, I. NTT DoCoMo Displays AR Walker Glasses. https://www.india-server.com/news/ntt-docomo-displays-ar-walker-glasses-34296.html (2025).
- 24.Medicine, S. Google Glass helps kids with autism understand faces. https://med.stanford.edu/communitynews/2018fall/google-glass-helps-kids-with-autism-understand-faces.html (2025).
- 25.Kazakou, G. & Koutromanos, G. An overview of ten years of augmented reality smart glasses in education. Creative Educ.14, 2777–2792 (2023). [Google Scholar]
- 26.Vashisht, S., Tuli, N., Sharma, A., Mantri, A. & Sharma, S. A thorough analysis of Microsoft HoloLens applications in the medical and healthcare fields. In 2022 International Conference on Computational Modelling, Simulation and Optimization (ICCMSO) IEEE. 307–314 (2022).
- 27.Daily, C. Ever-evolving technologies help tell ancient stories of China’s millennia-old Liangzhu City. http://www.chinadaily.com.cn/a/202305/15/WS64619d5ea310b6054fad2fe5.html (2025).
- 28.Rasheed, Z., Ghwanmeh, S. & Almahadin, A. A Critical Review of AI-Driven AR Glasses for Realtime Multilingual Communication. In 2024 Advances in Science and Engineering Technology International Conferences (ASET) IEEE. 1–6 (2024).
- 29.Daily, C. Chinese startup Xreal unveils latest AR glasses. https://global.chinadaily.com.cn/a/202408/01/WS66ab64a0a3104e74fddb807f.html (2025).
- 30.Amazon. Echo Frames smart glasses: 11 things to know. https://www.aboutamazon.com/news/devices/echo-frame-smart-glasses-features (2025).
- 31.Meta. Introducing the New Ray-Ban | Meta Smart Glasses. https://about.fb.com/news/2023/09/new-ray-ban-meta-smart-glasses/ (2025).
- 32.China, G. Baidu unveils Xiaodu AI glasses: World’s first AI glasses with native Chinese model support. https://www.gizmochina.com/2024/11/12/baidu-unveils-xiaodu-ai-glasses-first-ai-glasses-native-chinese (2025).
- 33.Solos. Solos AirGo3 smart glasses expand live translate and group communication capabilities; launching at 2024 CES. https://solosglasses.com/blogs/article/solos-airgo3-smart-glasses-expand-live-translate-and-group-communication-capabilities-launching-at-2024-ces (2025).
- 34.Guide, T. S. These new smart glasses come with a camera and ChatGPT-4o. https://www.tomsguide.com/computing/smart-glasses/these-new-smart-glasses-come-with-a-camera-and-chatgpt-4o (2025).
- 35.Research, Q. Global and United States AI Glasses Market Report & Forecast 2024-2030. https://www.qyresearch.com/reports/3285622/ai-glasses (2025).
- 36.Research, A. No-Display AI Smart Glasses Redefine the Future of Wearables with 15 Million Shipments in 2030. https://www.abiresearch.com/press/no-display-ai-smart-glasses-redefine-the-future-of-wearables-with-15-million-shipments-in-2030 (2025).
- 37.Intellect, M. R. Smart glasses: the next big thing in wearable electronics and semiconductor innovations. https://www.marketresearchintellect.com/blog/smart-glasses-the-next-big-thing-in-wearable-electronics-and-semiconductor-innovations/ (2025).
- 38.Kong, C. D. H. China new growth: Smart glasses captivate consumers, sales surge. https://www.chinadailyhk.com/hk/article/598719 (2025).
- 39.Research, C. Ray-Ban Meta crosses 1-million mark; success indicates promising future for lightweight AR glasses. https://www.counterpointresearch.com/insight/post-insight-research-notes-blogs-rayban-meta-crosses-1million-mark-success-indicates-promising-future-for-lightweight-ar-glasses (2025).
- 40.Gollapalli, S., Sharma, V., Al Ghazwi, A. & Heskin, L. Smart glasses in surgery: the theatre and beyond. Surg. Innov.31, 502–508 (2024). [DOI] [PubMed] [Google Scholar]
- 41.Al Shloul, T. et al. Role of activity-based learning and ChatGPT on students’ performance in education. Comput. Educ. Artif. Intell.6, 100219 (2024). [Google Scholar]
- 42.Ghaempanah, F. et al. Metaverse and its impact on medical education and health care system: a narrative review. Health Sci. Rep.7, e70100 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Zuidhof, N., Peters, O., Verbeek, P. P. & Ben Allouch, S. Social acceptance of smart glasses in health care: model evaluation study of anticipated adoption and social interaction. JMIR Form. Res.9, e49610 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Abdenacer, N. et al. Task offloading for smart glasses in healthcare: Enhancing detection of elevated body temperature. In 2023 IEEE International Conference on Smart Internet of Things (SmartIoT) IEEE. 243–250 (2023).
- 45.Baashar, Y. et al. Towards wearable augmented reality in healthcare: a comparative survey and analysis of head-mounted displays. Int. J. Environ. Res. Public Health10.3390/ijerph20053940 (2023). [DOI] [PMC free article] [PubMed]
- 46.Romare, C. & Skär, L. The use of smart glasses in nursing education: a scoping review. Nurse Educ. Pr.73, 103824 (2023). [DOI] [PubMed] [Google Scholar]
- 47.Kim, D. & Choi, Y. Applications of smart glasses in applied sciences: a systematic review. Appl. Sci.11, 4956 (2021). [Google Scholar]
- 48.Song, Y. et al. Multi-sensory visual-auditory fusion of wearable navigation assistance for people with impaired vision. IEEE Trans. Autom. Sci. Eng.22, 4724–4736 (2023).
- 49.Gao, L., Wang, C. & Wu, G. Wearable biosensor smart glasses based on augmented reality and eye tracking. Sensors24, 6740 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50.Huawei. HUAWEI Eyewear 2. https://consumer.huawei.com/en/audio/huawei-eyewear-2/ (2025).
- 51.Pro, P. O. Which smart glasses are the best 2024. https://precisionopticspro.com/which-smart-glasses-are-the-best-2024/ (2025).
- 52.Guan, Z. et al. Artificial intelligence in diabetes management: advancements, opportunities, and challenges. Cell Rep. Med4, 101213 (2023). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Xue, Z., Gai, Y., Wu, Y., Liu, Z. & Li, Z. Wearable mechanical and electrochemical sensors for real-time health monitoring. Commun. Mater.5, 211 (2024). [Google Scholar]
- 54.Holz, C. & Wang, E. J. Glabella: continuously sensing blood pressure behavior using an unobtrusive wearable device. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.1, 1–23 (2017). [Google Scholar]
- 55.Park, W. et al. In-depth correlation analysis between tear glucose and blood glucose using a wireless smart contact lens. Nat. Commun.15, 2828 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Spectrum, I. This eyewear offers a buckshot method to monitor health. https://spectrum.ieee.org/glasses-health-tech (2025).
- 57.Gruber, N. et al. Virtual reality’s impact on children with type 1 diabetes: a proof-of-concept randomized cross-over trial on anxiety, pain, adherence, and glycemic control. Acta Diabetol.61, 215–224 (2024). [DOI] [PubMed] [Google Scholar]
- 58.Kasimoglu, Y., Alpaycetin, E., Ince, G. & Tuna Ince, E. B. Reduction of dental anxiety in children using virtual reality: a randomised controlled trial. Eur. J. Paediatr. Dent.10.23804/ejpd.2024.2109 (2024). [DOI] [PubMed]
- 59.Shamali, M., Vilmann, P., Johansen, N. R. & Konradsen, H. Virtual reality intervention to improve quality of care during colonoscopy: a hybrid type 1 randomized controlled trial. Gastrointest. Endosc.100, 914–922.e912 (2024). [DOI] [PubMed] [Google Scholar]
- 60.Bahrololoomi, Z., Zein Al-Din, J., Maghsoudi, N. & Sajedi, S. Efficacy of virtual reality distraction in reduction of pain and anxiety of pediatric dental patients in an iranian population: a split-mouth randomized crossover clinical trial. Int. J. Dent.2024, 1290410 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Gurbuz, E. & Gurbuz, A. A. Investigation of the effect of virtual reality distraction in patients undergoing mandibular periodontal surgery: a randomized controlled study. J. Esthet. Restor. Dent.36, 813–822 (2024). [DOI] [PubMed] [Google Scholar]
- 62.Gerards, M. et al. Virtual reality for distraction during painful procedures in pediatric surgery: a randomized clinical trial. J. Pediatr. Nurs.82, 116–122 (2025). [DOI] [PubMed] [Google Scholar]
- 63.Iannone, A. & Giansanti, D. Breaking barriers-the intersection of AI and assistive technology in autism care: a narrative review. J. Pers. Med.10.3390/jpm14010041 (2023). [DOI] [PMC free article] [PubMed]
- 64.Kiprijanovska, I. et al. Smart Glasses for Gait Analysis of Parkinson’s Disease Patients. In 2023 46th MIPRO ICT and Electronics Convention (MIPRO) IEEE. 385–390 (2023).
- 65.Fourati, H. & Saidane, L. A. HealthGlasses Project: WBAN Based Communication for Health Monitoring Through Smart Glasses. In 2023 International Symposium on Networks, Computers and Communications (ISNCC) IEEE. 1-6 (2023).
- 66.Bisio, I., Garibotto, C., Hamedani, M. et al. Towards Sensorized Glasses: A Smart Wearable System for Head Movement Monitoring. In Proceedings of the 2024 9th International Conference on Smart and Sustainable Technologies (SpliTech) (2024).
- 67.Hemmerling, D. et al. Multimodal Approach for the Diagnosis of Neurodegenerative Disorders Using Augmented Reality. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) IEEE. 1166–1167 (2024).
- 68.Joseph, P., Plozza, D., Pascarella, L. & Magno, M. Gaze-Guided Semi-Autonomous Quadruped Robot for Enhanced Assisted Living. In 2024 IEEE Sensors Applications Symposium (SAS) 1–6 IEEE (2024).
- 69.Luck-Sikorski, C. et al. Digital communication and virtual reality for extending the behavioural treatment of obesity - the patients’ perspective: results of an online survey in Germany. BMC Med. Inf. Decis. Mak.23, 100 (2023). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70.Hardeman, L. E. S., Geerse, D. J., Hoogendoorn, E. M., Nonnekes, J. & Roerdink, M. Remotely prescribed, monitored, and tailored home-based gait-and-balance exergaming using augmented reality glasses: a clinical feasibility study in people with Parkinson’s disease. Front. Neurol.15, 1373740 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Motahari-Nezhad, H. et al. Digital biomarker-based studies: scoping review of systematic reviews. JMIR Mhealth Uhealth10, e35722 (2022). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Ma, X. et al. Test-retest reliability of subjective visual vertical from different head-tilt angles in young healthy adults. Lin. Chuang Er Bi Yan Hou Tou Jing Wai Ke Za Zhi38, 692–696 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 73.Li, H. H., Lian, J. J. & Liao, Y. H. Design an Adaptive Virtual Reality Game to Promote Elderly Health. In 2023 International Conference on Computer, Information and Telecommunication Systems (CITS) IEEE. 1–7 (2023).
- 74.Jang, Y. E. et al. Smart glasses for radial arterial catheterization in pediatric patients: a randomized clinical trial. Anesthesiology135, 612–620 (2021). [DOI] [PubMed] [Google Scholar]
- 75.Reis, G. et al. Mixed reality applications in urology: requirements and future potential. Ann. Med Surg. (Lond.)66, 102394 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Costa, J. N. et al. Ultrasound training simulator using augmented reality glasses: an accuracy and precision assessment study. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc.2022, 4461–4464 (2022). [DOI] [PubMed] [Google Scholar]
- 77.Kagawa, S. & Iijima, S. Usability test for a smart glass-based application to support nurses’ hospital admission tasks. Comput. Inform. Nurs. 10.1097/cin.0000000000001295 (2025). [DOI] [PubMed]
- 78.Romare, C. et al. Nurse anesthetists’ experiences using smart glasses to monitor patients’ vital signs during anesthesia care: a qualitative study. PLoS ONE16, e0250122 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 79.van Bergem, J. et al. Gait and balance assessments with augmented reality glasses in people with Parkinson’s disease: concurrent validity and test-retest reliability. Sensors10.3390/s24175485 (2024). [DOI] [PMC free article] [PubMed]
- 80.Okachi, S. et al. Virtual bronchoscopy-guided transbronchial biopsy simulation using a head-mounted display: a new style of flexible bronchoscopy. Surg. Innov.29, 811–813 (2022). [DOI] [PubMed] [Google Scholar]
- 81.Munusamy, T. et al. Telemedicine via smart glasses in critical care of the neurosurgical patient-COVID-19 pandemic preparedness and response in neurosurgery. World Neurosurg.145, e53–e60 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 82.Cen, J. et al. Three-dimensional printing, virtual reality and mixed reality for pulmonary atresia: early surgical outcomes evaluation. Heart Lung Circ.30, 296–302 (2021). [DOI] [PubMed] [Google Scholar]
- 83.Wan, X. et al. A novel motionless calibration method for augmented reality surgery navigation system based on optical tracker. Heliyon8, e12115 (2022). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.Li, H. et al. ImTooth: neural implicit tooth for dental augmented reality. IEEE Trans. Vis. Comput. Graph.29, 2837–2846 (2023). [DOI] [PubMed] [Google Scholar]
- 85.Hardeman, L. E. S., Geerse, D. J., Hoogendoorn, E. M., Nonnekes, J. & Roerdink, M. Remotely prescribed and monitored home-based gait-and-balance therapeutic exergaming using augmented reality (AR) glasses: protocol for a clinical feasibility study in people with Parkinson’s disease. Pilot Feasibility Stud.10, 54 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Gadzhiev, N. et al. Role and utility of mixed reality technology in laparoscopic partial nephrectomy: outcomes of a prospective RCT using an indigenously developed software. Adv. Urol.2022, 8992051 (2022). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Lareyre, F. et al. Applications of head-mounted displays and smart glasses in vascular surgery. Ann. Vasc. Surg.75, 497–512 (2021). [DOI] [PubMed] [Google Scholar]
- 88.Sun, X., Gu, S., Jiang, L. & Wu, Y. A low-cost mobile system with multi-AR guidance for brain surgery assistance. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc.2021, 2222–2225 (2021). [DOI] [PubMed] [Google Scholar]
- 89.Fucentese, S. F. & Koch, P. P. A novel augmented reality-based surgical guidance system for total knee arthroplasty. Arch. Orthop. Trauma Surg.141, 2227–2233 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 90.Lambrechts, J. et al. Accuracy of a new augmented reality assisted technique for total knee arthroplasty: an in vivo study. Arthroplast Today30, 101565 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 91.Wang, Y. et al. The effect of smart glasses combined with ultrasound on radial arterial catheterization: a randomized controlled trial. BMC Anesthesiol.24, 444 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 92.Sakai, D. et al. Adolescent idiopathic scoliotic deformity correction surgery assisted by smart glasses can enhance correction outcomes and accuracy and also improve surgeon fatigue. World Neurosurg.178, e96–e103 (2023). [DOI] [PubMed] [Google Scholar]
- 93.Dorey, S. et al. Radiation protection value to the operator from augmented reality smart glasses in interventional fluoroscopy procedures using phantoms. Radiography25, 301–307 (2019). [DOI] [PubMed] [Google Scholar]
- 94.Today, X. Vuzix M400 review: powerful industrial smart glasses. https://www.xrtoday.com/reviews/vuzix-m400-review-powerful-industrial-smart-glasses/ (2025).
- 95.Lee, K. W. et al. Feasibility of wearable display glasses for medical students in the endoscopy room. Clin. Endosc.54, 694–700 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 96.Weerasinghe, K. et al. Real-Time Multimodal Cognitive Assistant for Emergency Medical Services. In 2024 IEEE/ACM Ninth International Conference on Internet-of-Things Design and Implementation (IoTDI) IEEE. 85–96 (2024).
- 97.Baker, J., Schultz, M., Huecker, M. & Shreffler, J. Smart glasses and video conferencing provide valuable medical student clinical exposure during COVID-19. AEM Educ. Train.5, e10571 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 98.Ghavami Hosein Pour, B., Karimian, Z. & Hatami Niya, N. A narrative review of advancing medical education through technology: the role of smart glasses in situated learning. BMC Med. Educ.25, 359 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 99.Punj, A., Negassa, A., Gutierrez, A., Ch’en, P. & Jariwala, S. Technology enhanced medical education using smart glasses for oral and dental examinations: an observational pilot study. BMC Med. Educ.25, 252 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 100.Calik, A., Ozkul, D. & Kapucu, S. Smart glasses use experience of nursing graduate students: qualitative study. BMC Nurs.23, 257 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 101.Glick, Y. et al. Augmenting prehospital care. BMJ Mil. Health167, 158–162 (2021). [DOI] [PubMed] [Google Scholar]
- 102.Heffernan, M. et al. Implications of mobile technology on hospitalization rates in medically underserved areas worldwide: a systematic review. Cureus17, e78409 (2025). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 103.Liebermann, A. et al. Impact of a virtual prosthetic case planning environment on perceived immersion, cognitive load, authenticity and learning motivation in dental students. Eur. J. Dent. Educ.28, 9–19 (2024). [DOI] [PubMed] [Google Scholar]
- 104.Diaka, J. et al. Leveraging smart glasses for telemedicine to improve primary healthcare services and referrals in a remote rural district, Kingandu, DRC, 2019-2020. Glob. Health Action14, 2004729 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 105.Mitrasinovic, S. et al. Clinical and surgical applications of smart glasses. Technol. Health Care23, 381–401 (2015). [DOI] [PubMed] [Google Scholar]
- 106.Hu, Y. et al. Virtual reality in clinical nursing practice over the past 10 years: umbrella review of meta-analyses. JMIR Serious Games11, e52022 (2023). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 107.Kim, J., Heo, N. & Kang, H. Validity and reliability of positive attitudes toward and perceived importance of wearable display technology as an effective learning tool among nursing students. Nurse Educ. Pract.73, 103812 (2023). [DOI] [PubMed] [Google Scholar]
- 108.van Doorn, P. F. et al. Gait parameters can be derived reliably and validly from augmented reality glasses in people with Parkinson’s disease performing 10-m walk tests at comfortable and fast speeds. Sensors10.3390/s25041230 (2025). [DOI] [PMC free article] [PubMed]
- 109.Bermingham, K. M. et al. Effects of a personalized nutrition program on cardiometabolic health: a randomized controlled trial. Nat. Med.30, 1888–1897 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 110.He, Q., Huang, H. & Wang, Y. Detection technologies, and machine learning in food: recent advances and future trends. Food Biosci.62, 105558 (2024). [Google Scholar]
- 111.Mezgec, S. & Koroušić Seljak, B. NutriNet: a deep learning food and drink image recognition system for dietary assessment. Nutrients10.3390/nu9070657 (2017). [DOI] [PMC free article] [PubMed]
- 112.Abdul Kareem, R. S., Tilford, T. & Stoyanov, S. Fine-grained food image classification and recipe extraction using a customized deep neural network and NLP. Comput. Biol. Med.175, 108528 (2024). [DOI] [PubMed] [Google Scholar]
- 113.Tan, M. & Le, Q. In Proceedings of the 36th International Conference on Machine Learning Vol. 97 (eds Kamalika, C. & Ruslan, S.) 6105–6114 (Proceedings of Machine Learning Research, 2019).
- 114.Lin, T. Y. et al. Feature pyramid networks for object detection. In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 936–944 (2017).
- 115.Wang, H. et al. Nutritional composition analysis in food images: an innovative Swin Transformer approach. Front. Nutr.11, 1454466 (2024). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 116.Liu, Z. et al. Swin transformer: Hierarchical vision transformer using shifted windows. In Proceedings of the IEEE/CVF international conference on computer vision 10012–10022 (2021).
- 117.Myers, A. et al. Im2Calories: towards an automated mobile vision food diary. In Proceedings of the IEEE international conference on computer vision 1233–1241 (2015).
- 118. Liu, Y.-C., Onthoni, D. D., Mohapatra, S., Irianti, D. & Sahoo, P. K. Deep-learning-assisted multi-dish food recognition application for dietary intake reporting. Electronics 11, 1626 (2022).
- 119. Han, Y., Cheng, Q., Wu, W. & Huang, Z. DPF-nutrition: food nutrition estimation via depth prediction and fusion. Foods 10.3390/foods12234293 (2023).
- 120. Shao, W. et al. Vision-based food nutrition estimation via RGB-D fusion network. Food Chem. 424, 136309 (2023).
- 121. Rokhva, S., Teimourpour, B. & Soltani, A. H. Computer vision in the food industry: accurate, real-time, and automatic food recognition with pretrained MobileNetV2. Food Humanit. 3, 100378 (2024).
- 122. Han, M., Chen, J. & Zhou, Z. NutrifyAI: an AI-powered system for real-time food detection, nutritional analysis, and personalized meal recommendations. Preprint at https://arxiv.org/abs/2408.10532 (2024).
- 123. Yang, Z. et al. ChatDiet: empowering personalized nutrition-oriented food recommender chatbots through an LLM-augmented framework. Smart Health 32, 100465 (2024).
- 124. Ben Charif, A. et al. Tools for assessing the scalability of innovations in health: a systematic review. Health Res. Policy Syst. 20, 34 (2022).
- 125. Batool, I. Real-time health monitoring using 5G networks: a deep learning-based architecture for remote patient care. Preprint at 10.48550/arXiv.2501.01027 (2025).
- 126. LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444 (2015).
- 127. Wang, X. et al. Deep reinforcement learning: a survey. IEEE Trans. Neural Netw. Learn. Syst. 35, 5064–5078 (2024).
- 128. Cervantes, J., Garcia-Lamont, F., Rodríguez-Mazahua, L. & Lopez, A. A comprehensive survey on support vector machine classification: applications, challenges and trends. Neurocomputing 408, 189–215 (2020).
- 129. Tan, Y. et al. MedChatZH: a tuning LLM for traditional Chinese medicine consultations. Comput. Biol. Med. 172, 108290 (2024).
- 130. Wu, X., Liu, C., Wang, L. & Bilal, M. Internet of Things-enabled real-time health monitoring system using deep learning. Neural Comput. Appl. 35, 14565–14576 (2023).
- 131. Bulić, P., Kojek, G. & Biasizzo, A. Data transmission efficiency in Bluetooth Low Energy versions. Sensors 19, 3746 (2019).
- 132. Kim, Y., Lee, J., Ko, J., Kim, C. & Jung, Y. J. Personalized display techniques for next generation DTV. In 2008 10th International Conference on Advanced Communication Technology Vol. 3, 1564–1568 (IEEE, 2008).
- 133. Abdulqader, A. F. et al. 5G and the Internet of Things collaborate to improve smart glasses for the visually impaired. In 2024 36th Conference of Open Innovations Association (FRUCT) 159–167 (IEEE, 2024).
- 134. Golzar, M., Fotouhi-Ghazvini, F., Rabbani, H. & Zakeri, F. S. Mobile cardiac health-care monitoring and notification with real time tachycardia and bradycardia arrhythmia detection. J. Med. Signals Sens. 7, 193–202 (2017).
- 135. Wang, Z. et al. Precision calories: a promising strategy for personalized health interventions in the precision nutrition framework. Trends Food Sci. Technol. 153, 104727 (2024).
- 136. Schärer, N. et al. ElectraSight: smart glasses with fully onboard non-invasive eye tracking using hybrid contact and contactless EOG. Preprint at https://arxiv.org/abs/2412.14848 (2024).
- 137. Noah, B. et al. Impact of remote patient monitoring on clinical outcomes: an updated meta-analysis of randomized controlled trials. npj Digital Med. 1, 20172 (2018).
- 138. Boikanyo, K., Zungeru, A. M., Sigweni, B., Yahya, A. & Lebekwe, C. Remote patient monitoring systems: applications, architecture, and challenges. Sci. Afr. 20, e01638 (2023).
- 139. Xu, J. et al. On-device language models: a comprehensive review. Preprint at https://arxiv.org/abs/2409.00088 (2024).
- 140. Cho, Y. & Noh, S. D. Design and implementation of digital twin factory synchronized in real-time using MQTT. Machines 12, 759 (2024).
- 141. Pimentel, V. & Nickerson, B. G. Communicating and displaying real-time data with WebSocket. IEEE Internet Comput. 16, 45–53 (2012).
- 142. Li, L., Wang, Y., Wang, H., Hu, S. & Wei, T. An efficient architecture for imputing distributed data sets of IoT networks. IEEE Internet Things J. 10, 15100–15114 (2023).
- 143. Zhang, R., Chen, H., Ma, Y. & Jin, Z. Dental artificial intelligence systems: a review of various data types. Discov. Med. 36, 482–493 (2024).
- 144. Kim, J. et al. Effectiveness of head-mounted ultrasound display for radial arterial catheterisation in paediatric patients by anaesthesiology trainees: a randomised clinical trial. Eur. J. Anaesthesiol. 41, 522–529 (2024).
- 145. Cercenelli, L. et al. Augmented reality to assist skin paddle harvesting in osteomyocutaneous fibular flap reconstructive surgery: a pilot evaluation on a 3D-printed leg phantom. Front. Oncol. 11, 804748 (2021).
- 146. Grajek, J. et al. Multidimensional formats of surgical anatomy in otorhinolaryngology student teaching - a comparison of effectivity. HNO 72, 357–366 (2024).
- 147. Yoon, J. W. et al. Do-it-yourself augmented reality heads-up display (DIY AR-HUD): a technical note. Int. J. Spine Surg. 15, 826–833 (2021).
- 148. Sumner, J. et al. Through the lens: a qualitative exploration of nurses’ experiences of smart glasses in urgent care. J. Clin. Nurs. 34, 948–958 (2025).
- 149. Iacono, V. et al. The use of augmented reality for limb and component alignment in total knee arthroplasty: systematic review of the literature and clinical pilot study. J. Exp. Orthop. 8, 52 (2021).
- 150. Enlöf, P., Romare, C., Jildenstål, P., Ringdal, M. & Skär, L. Smart glasses for anesthesia care: initial focus group interviews with specialized health care professionals. J. Perianesth. Nurs. 36, 47–53 (2021).
- 151. Ceccariglia, F., Cercenelli, L., Badiali, G., Marcelli, E. & Tarsitano, A. Application of augmented reality to maxillary resections: a three-dimensional approach to maxillofacial oncologic surgery. J. Pers. Med. 10.3390/jpm12122047 (2022).
- 152. Qi, Z. et al. Holographic mixed-reality neuronavigation with a head-mounted device: technical feasibility and clinical application. Neurosurg. Focus 51, E22 (2021).
- 153. England, A. et al. A comparison of perceived image quality between computer display monitors and augmented reality smart glasses. Radiography 29, 641–646 (2023).
- 154. Apiratwarakul, K. et al. Smart glasses: a new tool for assessing the number of patients in mass-casualty incidents. Prehosp. Disaster Med. 37, 480–484 (2022).
- 155. Jiang, Z. et al. DietGlance: dietary monitoring and personalized analysis at a glance with knowledge-empowered AI assistant. Preprint at https://ui.adsabs.harvard.edu/abs/2025arXiv250201317J (2025).
- 156. Xu, Z., Wu, J., Lei, J. & Jin, Z. The impact of augmented reality glasses on human visual efficiency and digital eye fatigue. Zhonghua Yan Ke Za Zhi 60, 352–358 (2024).
- 157. Rajpurkar, P., Chen, E., Banerjee, O. & Topol, E. J. AI in health and medicine. Nat. Med. 28, 31–38 (2022).
- 158. Guingrich, R. E. & Graziano, M. S. Chatbots as social companions: how people perceive consciousness, human likeness, and social health benefits in machines. Preprint at https://arxiv.org/abs/2311.10599 (2023).
- 159. Olawade, D. B. et al. Enhancing mental health with Artificial Intelligence: current trends and future prospects. J. Med. Surg. Public Health 3, 100099 (2024).
- 160. Shaban-Nejad, A., Michalowski, M. & Buckeridge, D. L. Health intelligence: how artificial intelligence transforms population and personalized health. npj Digital Med. 1, 53 (2018).
- 161. Hassan, B. M. & Elagamy, S. M. Personalized medical recommendation system with machine learning. Neural Comput. Appl. 10.1007/s00521-024-10916-6 (2025).
- 162. Chang, W. J. et al. A2Fitness: an artificial-intelligence fitness assistance system using augmented reality smart glasses for elderly health promotion. In 2023 International Conference on Consumer Electronics - Taiwan (ICCE-Taiwan) 665–666 (2023).
- 163. Caffagni, D. et al. The revolution of multimodal large language models: a survey. In Findings of the Association for Computational Linguistics: ACL 2024 13590–13618 (Association for Computational Linguistics, Bangkok, Thailand, 2024).
- 164. Zhang, D. et al. MM-LLMs: recent advances in multimodal large language models. In Findings of the Association for Computational Linguistics: ACL 2024 12401–12430 (Association for Computational Linguistics, Bangkok, Thailand, 2024).
- 165. Linseisen, J. et al. Perspective: data in personalized nutrition: bridging biomedical, psycho-behavioral, and food environment approaches for population-wide impact. Adv. Nutr. 10.1016/j.advnut.2025.100377 (2025).
- 166. Taherdoost, H. Wearable healthcare and continuous vital sign monitoring with IoT integration. Comput. Mater. Contin. 81, 79–104 (2024).
- 167. Gacem, M. A. et al. Smart assistive glasses for Alzheimer's patients. In 2019 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT) 1–5 (2019).
- 168. Pierdicca, R. et al. In Augmented Reality, Virtual Reality, and Computer Graphics (eds De Paolis, L. T. & Bourdot, P.) 231–247 (Springer International Publishing, 2020).
- 169. Xia, H. et al. Shaping high-performance wearable robots for human motor and sensory reconstruction and enhancement. Nat. Commun. 15, 1760 (2024).
- 170. Koutromanos, G. & Kazakou, G. Augmented reality smart glasses use and acceptance: a literature review. Comput. Educ. X Real. 2, 100028 (2023).
- 171. Cai, Y. et al. Multimodal sentiment analysis based on multi-layer feature fusion and multi-task learning. Sci. Rep. 15, 2126 (2025).
- 172. Zhi, Y., Li, J., Wang, H., Chen, J. & Wei, W. A multimodal sentiment analysis method based on fuzzy attention fusion. IEEE Trans. Fuzzy Syst. 32, 5886–5898 (2024).
- 173. Sun, Y. et al. Signal acquisition of brain–computer interfaces: a medical-engineering crossover perspective review. Fundam. Res. 10.1016/j.fmre.2024.04.011 (2024).
- 174. Angulo-Sherman, I. N. & Salazar-Varas, R. In Advances in Smart Healthcare Paradigms and Applications: Outstanding Women in Healthcare—Volume 1 (eds Kwaśnicka, H. et al.) 173–197 (Springer Nature, Switzerland, 2023).
- 175.Rokid. Rokid Max AR glasses. https://jp.rokid.com (2025).
- 176. Meta Platforms. Ray-Ban Meta smart glasses. https://www.ray-ban.com (2025).
- 177.RayNeo. RayNeo Air3. https://www.rayneo.com (2025).
- 178.Xiaomi. Mijia Smart Audio Glasses Pilot. https://www.mi.com (2025).
- 179.Xiaomi. Xiaomi Smart Glasses Camera. https://www.mi.com (2025).
- 180.Meizu. MYVU AR Glasses. https://meizu.com/starv/myvu/summary_myvu (2025).
- 181. Huawei Technologies. Huawei’s third-generation smart glasses. https://consumer.huawei.com (2025).
- 182. Huawei Technologies. HUAWEI × GENTLE MONSTER Eyewear II. https://consumer.huawei.com/jp/wearables/gentle-monster-eyewear2/buy (2025).
- 183.XREAL. XREAL Air 2 Smart Glasses. https://jp.shop.xreal.com/en/products/xreal-air-2 (2025).
- 184.Lucyd. Lucyd Smart Glasses. https://www.lucyd.co (2025).
- 185.Amazon. Amazon Echo Frames (Gen 3). https://www.amazon.com (2025).
- 186. Yin, K. et al. Advanced liquid crystal devices for augmented reality and virtual reality displays: principles and applications. Light Sci. Appl. 11, 161 (2022).
- 187. Oudah, M., Al-Naji, A. & Chahl, J. Hand gesture recognition based on computer vision: a review of techniques. J. Imaging 10.3390/jimaging6080073 (2020).
- 188. Lee, L. H. & Hui, P. Interaction methods for smart glasses: a survey. IEEE Access 6, 28712–28732 (2018).
- 189. Ma, C. C. et al. Multimodal fusion with LLMs for engagement prediction in natural conversation. Preprint at https://ui.adsabs.harvard.edu/abs/2024arXiv240909135M (2024).
- 190. Mondal, M., Khayati, M., Sandlin, H.-Â. & Cudré-Mauroux, P. A survey of multimodal event detection based on data fusion. VLDB J. 34, 9 (2024).
- 191. Moosmann, J. et al. Ultra-efficient on-device object detection on AI-integrated smart glasses with TinyissimoYOLO. Preprint at https://ui.adsabs.harvard.edu/abs/2023arXiv231101057M (2023).
- 192. Wang, J. et al. A practical stereo depth system for smart glasses. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 21498–21507 (2023).
- 193. Zhang, D., Li, Y., He, Z. & Li, X. In Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing 631–633 (Association for Computing Machinery, Melbourne, VIC, Australia, 2024).
- 194. Waisberg, E. et al. Meta smart glasses—large language models and the future for assistive glasses for individuals with vision impairments. Eye 10.1038/s41433-023-02842-z (2023).
- 195. Kim, M., Choi, S. H., Park, K.-B. & Lee, J. Y. User interactions for augmented reality smart glasses: a comparative evaluation of visual contexts and interaction gestures. Appl. Sci. 9, 3171 (2019).
- 196. Jin, X., Li, L., Dang, F., Chen, X. & Liu, Y. A survey on edge computing for wearable technology. Digital Signal Process. 125, 103146 (2022).
- 197. Alshahrani, A., Elgendy, I. A., Muthanna, A., Alghamdi, A. M. & Alshamrani, A. Efficient multi-player computation offloading for VR edge-cloud computing systems. Appl. Sci. 10, 5515 (2020).
- 198. Bal, A. et al. In ICT for Intelligent Systems (eds Choudrie, J., Mahalle, P. N., Perumal, T. & Joshi, A.) 279–288 (Springer Nature, Singapore, 2025).
- 199. Hasan, M. A. & Mishuk, M. N. MEMS IMU based pedestrian indoor navigation for smart glass. Wirel. Pers. Commun. 101, 287–303 (2018).
- 200. A, H. A., Rao, S. U., Ranganath, S., Ashwin, T. S. & Reddy, G. R. M. A Google Glass based real-time scene analysis for the visually impaired. IEEE Access 9, 166351–166369 (2021).
- 201. Poy, Y. L., Darmaraju, S., Goh, C. H. & Kwan, B. H. Standalone smart glass system for the blind and visually impaired. In 2024 IEEE 14th Symposium on Computer Applications & Industrial Electronics (ISCAIE) 239–244 (IEEE, 2024).
- 202. van Spaendonck, Z. et al. Comparing smartphone virtual reality exposure preparation to care as usual in children aged 6 to 14 years undergoing magnetic resonance imaging: protocol for a multicenter, observer-blinded, randomized controlled trial. JMIR Res. Protoc. 12, e41080 (2023).
- 203. Rosa, A., De Angelis, R., Pujia, A. M., Cardelli, P. & Arcuri, C. Virtual reality in specialized dentistry: employing virtual reality for the alleviation of pain and anxiety in hereditary angioedema patients. Minerva Dent. Oral. Sci. 74, 20–25 (2025).
- 204. Okay, B. & Üze Okay, Z. Investigating the effect of virtual reality glasses during inhaler therapy use in children: a randomized clinical trial. Paediatr. Child Health 30, 11–16 (2025).
- 205. Abbasnia, F., Aghebati, N., Miri, H. H. & Etezadpour, M. Effects of patient education and distraction approaches using virtual reality on pre-operative anxiety and post-operative pain in patients undergoing laparoscopic cholecystectomy. Pain Manag. Nurs. 24, 280–288 (2023).
- 206. Bannink Mbazzi, F. et al. Use of virtual reality distraction to reduce child pain and fear during painful medical procedures in children with physical disabilities in Uganda: a feasibility study. Pain Med. 23, 642–654 (2022).
- 207. Almedhesh, S. A., Elgzar, W. T., Ibrahim, H. A. & Osman, H. A. The effect of virtual reality on anxiety, stress, and hemodynamic parameters during cesarean section: a randomized controlled clinical trial. Saudi Med. J. 43, 360–369 (2022).
- 208. Dalir, Z., Seddighi, F., Esmaily, H., Abbasi Tashnizi, M. & Ramezanzade Tabriz, E. Effects of virtual reality on chest tube removal pain management in patients undergoing coronary artery bypass grafting: a randomized clinical trial. Sci. Rep. 14, 2918 (2024).
- 209. Lu, L. et al. Wearable health devices in health care: narrative systematic review. JMIR Mhealth Uhealth 8, e18907 (2020).
- 210. Ienghong, K., Cheung, L. W., Wongwan, P. & Apiratwarakul, K. Smart glasses to facilitate ultrasound guided peripheral intravenous access in the simulation setting for Thai emergency medical service providers. J. Multidiscip. Health. 16, 2201–2206 (2023).
- 211. Yoon, H., Kim, S., Lee, Y. & Choi, J. Google Glass-supported cooperative training for health professionals: a case study based on using remote desktop virtual support. J. Multidiscip. Health. 14, 1451–1462 (2021).
- 212. Zhang, Z., Ramiya Ramesh Babu, N. A., Adelgais, K. & Ozkaynak, M. Designing and implementing smart glass technology for emergency medical services: a sociotechnical perspective. JAMIA Open 5, ooac113 (2022).
- 213. Zhang, Z. et al. Applications and user perceptions of smart glasses in emergency medical services: semistructured interview study. JMIR Hum. Factors 9, e30883 (2022).
- 214. Osterwalder, J., Polyzogopoulou, E. & Hoffmann, B. Point-of-care ultrasound - history, current and evolving clinical concepts in emergency medicine. Medicina 10.3390/medicina59122179 (2023).
- 215. Sommer, F. et al. Feasibility of smart glasses in supporting spinal surgical procedures in low- and middle-income countries: experiences from East Africa. Neurosurg. Focus 52, E4 (2022).
- 216. Apiratwarakul, K., Cheung, L. & Ienghong, K. Impact of smart glasses on patient care time in emergency medical services ambulance. Prehosp. Disaster Med. 38, 735–739 (2023).
- 217. Lee, Y. et al. Smart glasses and telehealth services by professionals in isolated areas in Korea: acceptability and concerns. Technol. Health Care 31, 855–865 (2023).
- 218. Park, S. & Lee, H. The effect of communication in emergency department isolation rooms using smart glasses: a mixed-methods study. J. Clin. Nurs. 10.1111/jocn.17690 (2025).
- 219. Reed, T. et al. Seeing yourself through the learner’s eyes: incorporating smart glasses into objective structured teaching exercises for faculty development. J. Contin. Educ. Health Prof. 43, 60–64 (2023).
- 220. van der Kruk, S. et al. Virtual reality as a patient education tool in healthcare: a scoping review. Patient Educ. Couns. 105, 1928–1942 (2022).
- 221. Muroi, K. et al. An analysis of the effectiveness of reflective learning through watching videos recorded with smart glasses - with multiple views (student, patient, and overall) in radiography education. PLoS ONE 19, e0296417 (2024).
- 222. Page, M. J. et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 372, n71 (2021).
Data Availability Statement
No datasets were generated or analyzed during the current study.