Nature Communications. 2026 Feb 3;17:2289. doi: 10.1038/s41467-026-68918-y

Spatial-chromatic encoding cosmetic contact lenses for enhanced natural eye tracking

Hengtian Zhu 1, Heyu Huang 1, Huan Yang 1, Zixu Li 1, Zhenning Qi 1, Yuan Fang 2, Yining Xu 1, Yifeng Xiong 1, Ye Chen 3, Songtao Yuan 2, Fei Xu 1,4,5
PMCID: PMC12976142  PMID: 41634000

Abstract

Vision-based analysis of ocular features represents the predominant approach for eye tracking. However, these features are highly susceptible to interference from illumination, eyelid/eyelash occlusion, and individual variations, leading to low recognition rates and diminished tracking accuracy. To address these limitations, eye movement feature enhanced cosmetic contact lenses are proposed, implementing a spatial-chromatic encoding strategy. Employing a head-mounted eye tracker integrated with RGB cameras, this system enables accurate and robust gaze tracking in natural environments. Under challenging illumination, the lenses achieve a 93% feature recognition rate, significantly surpassing pupil recognition and tolerating highly off-axis camera placement. Experiments on an eye movement model and on human eyes demonstrate superior accuracy (<1°), gaze direction estimation, and continuous fixation positioning. Utilizing these lenses, diverse eye-tracking applications are demonstrated, including image identification, reading analysis, and outdoor interaction. This approach advances the development of lightweight, unobtrusive eye-tracking systems and facilitates broader application of gaze-based interaction technology in real-world settings.

Subject terms: Computer science, Biomedical engineering, Glasses


Eye tracking relies on ocular features that are susceptible to interference from light and individual variations. Here, Zhu et al. present eye movement feature enhanced cosmetic contact lenses, enabling accurate, natural, and homogenized eye tracking.

Introduction

Eye tracking technology provides an objective, real-time observational window into human attention allocation, cognitive processing, and behavioral decision-making by recording and analyzing visual behaviors such as gaze point trajectories, pupillary movements, and blink frequency. Its significance stems from the central role of the visual system in human information processing: approximately 83% of external information is acquired visually. Within fundamental research, this technique serves as a critical tool for exploring perception1, memory2, language comprehension3, and decision-making mechanisms4, capable of revealing implicit cognitive processes often elusive to traditional behavioral experiments. At the applied level, eye tracking has found widespread use in human-computer interaction5,6, user experience research7, clinical medicine8, educational psychology9, and intelligent cockpits10.

To date, researchers have proposed numerous technical approaches to achieve accurate, real-time eye tracking, including electrooculography11, the scleral search coil technique12,13, video/image-based methods14,15, magnetic resonance-based methods16, and electrostatic induction methods17. Among these, video/image-based eye tracking currently represents the most dominant and widely adopted technology1. It utilizes one or more cameras to capture images of the eye18, subsequently employing image processing algorithms to identify eye movement-related ocular features. These features encompass intrinsic properties of the eyeball itself, such as the iris19–21 and pupil22–24, as well as reflection-based features provided by the eye tracking system, such as near-infrared glints25–28 or patterns29 projected from the screen. Accurate identification of these eye movement features is fundamental to achieving eye tracking. However, robustly identifying these features within complex, naturalistic environments remains a significant challenge30,31. On one hand, variable ambient light reflected from the corneal surface is captured by the camera, superimposing random environmental patterns onto the eye movement features. This complicates their automatic extraction using generic algorithms. On the other hand, user-specific factors can reduce the success rate of feature identification and tracking accuracy, including low iris-pupil contrast, occlusion by eyelids and eyelashes, indistinct iris boundaries, and pupil shape irregularities. Achieving lightweight and unobtrusive eye tracking in authentic, natural environments represents an important developmental goal. Yet, in such contexts, eye trackers often prove susceptible to interference. This vulnerability primarily stems from the highly limited and individually varied eye movement characteristics. Consequently, enhancing eye movement-related ocular features is key to realizing robust and high-performance eye tracking.

In recent years, advancements in flexible electronics and bioelectronics have driven flexible devices to revolutionize human-computer interaction32–34. Among these, contact lenses conform to the corneal surface and move with the rotation of the eyeball. Integrating optoelectronic devices into contact lenses serves as an effective approach to enrich eye movement characteristics35,36. For instance, our research team previously integrated inductive–capacitive resonators based on a frequency encoding strategy into contact lenses, achieving precise eye-tracking and eye-command recognition37. Gan et al. incorporated magnetic materials into contact lenses and accurately identified eye orientations across 9 directions wirelessly by measuring magnetic field variations using an 8-channel teslameter38. Fradkin et al. integrated a passive optical module within a contact lens, comprising two overlapping grating layers that form a moiré pattern sensitive to eye movement angles, achieving remarkably high angular resolution and strong resistance to ambient light interference39. Khaldi et al. integrated a wirelessly powered near-infrared laser into contact lenses, where eye movement direction was indicated by the position of the near-infrared spot40. Evidently, integrating devices with diverse physical properties into contact lenses can provide consistent eye movement characteristics. This consistency enhances interference-resistant eye-tracking, improves the usability of eye-tracking technology, and expands its application scenarios, such as enabling eye-tracking during eye closure41. However, at its current stage, eye-tracking contact lenses still need to focus on meeting the fundamental physicochemical properties of contact lenses, such as being lightweight, highly flexible, and exhibiting excellent surface wettability, before they can be widely adopted by broader populations.

To address the aforementioned challenges, cosmetic contact lenses, easily accessible daily items with high public acceptance, are employed to enhance eye movement features, enabling accurate and robust eye tracking in natural scenarios. Using an RGB (Red, Green, Blue) primary-color spatial-chromatic encoding strategy, these lenses carry circularly arrayed green solid circles and red dots, providing brightly colored, standard-shaped eye movement features while readily meeting safety and comfort requirements. By capturing lens images with a common, low-cost RGB camera and applying straightforward hue-space and morphological image processing algorithms, a significant enhancement in eye tracking performance is achieved. The proposed solution offers the following advantages: (a) a superior overall recognition rate of 93% under challenging and complex lighting conditions, a 53% improvement over the bare eye; (b) <1° eye tracking accuracy, gaze direction estimation, and full-frame continuous gaze point positioning; (c) support for highly off-axis camera placement, enabling unobtrusive eye tracking technology. With these features, the proposed eye movement feature enhanced (EMFE) cosmetic contact lens is expected to make eye trackers accessible, reduce the cost of eye tracking technology, and promote its widespread application in real-world, natural daily life scenarios.

Results

Design of eye movement feature enhanced (EMFE) cosmetic contact lens

Cosmetic contact lenses can integrate specific patterns with vivid colors within the lens material to decorate the eye. To achieve accurate and robust eye tracking, we designed the EMFE cosmetic contact lens based on an RGB color-coded strategy, as shown in Fig. 1a. The lens pattern features a peripheral blue ring adorned with 12 evenly spaced green solid circular elements, interspersed with red dots between the green elements. Red, green, and blue are located at 0°, 120°, and 240° in the hue domain, respectively, exhibiting significant chromatic differences. When imaging the eye wearing this lens with an RGB camera, the high color contrast and sharp pattern edges enable recognition algorithms to accurately identify these eye movement features. The peripheral pattern distribution avoids visual field obstruction while ensuring feature visibility even under highly off-axis camera angles. Compared to the single feature (e.g., pupil) in bare-eye images, the 12 uniformly distributed green solid circles significantly enhance pattern recognition robustness against complex lighting interference, eyelid occlusion, and eyelash obstruction. Accurate gaze tracking is achieved through a centroid-angle mapping function that calculates gaze angles from the green solid circles’ centroids. Furthermore, a greater number of green solid circles enhances the accuracy of gaze angle measurement. During dynamic eye movements, motion blur often occurs in images captured by low-frame-rate cameras. Here, the red dots streak into red trajectories. Due to the small diameter of the red dots, the direction of the red trajectories aligns with that of the eye movement. By identifying these red trajectories and combining them with eye movement angles calculated from adjacent static frames, full-frame continuous gaze point positioning can be extracted. This spatiotemporal feature-enhanced tracking method enables eye tracking technology applications in natural environments. Furthermore, the green solid circles are standard shapes with known diameters. Combined with the camera’s imaging parameters, they enable the estimation of the pose information of the green solid circles in the camera coordinate system. This capability aids in determining the spatial position of the EMFE cosmetic contact lens, thereby supporting more advanced eye-tracking functionalities such as gaze direction estimation and camera slippage compensation.

Fig. 1. Eye movement feature enhanced (EMFE) cosmetic contact lens.


a Schematic diagram of the EMFE cosmetic contact lens, which contains green solid circles and red dots with strong color contrast for gaze direction calculation and eye movement trace tracking, respectively. b Photographs of the EMFE contact lens. c Photo of a participant wearing the EMFE cosmetic contact lens and a head-mounted eye tracker.

The EMFE cosmetic contact lens pattern, created using commercial cosmetic dyes, is transferred from a flat design to the spherical lens surface via pad printing technology. A sandwich fabrication process embeds the pattern layer within the lens body. Steam autoclave sterilization ensures the lenses are safe and sterile. As shown in Fig. 1b, the lenses exhibit cornea-matched curvature and diameter. An ultra-thin 80-μm thickness ensures stable adhesion to the corneal surface, enabling the lens to move precisely with the eye. The EMFE contact lens has a measured oxygen permeability (Dk) of 18.2 × 10⁻¹¹ (cm²/s)[mL·O₂/(mL·hPa)] and an oxygen transmissibility (Dk/t) of 19.8 × 10⁻⁹ (cm/s)[mL·O₂/(mL·hPa)]. The water content of the EMFE contact lens is 56%. The lens has excellent surface wettability. As shown in Supplementary Fig. 1, the droplet rapidly spread across the lens surface without maintaining a spherical shape. This behavior indicates that the interaction energy between the water and the lens surface significantly exceeds the cohesive energy of the water itself. The standardized pattern design and fabrication process allow the EMFE lenses to be easily mass-produced at extremely low cost while maintaining excellent pattern consistency. To assess the cytotoxicity of the EMFE cosmetic contact lens, the viability of cells from human corneas (HCE-T) was measured. Extracts of the lens were prepared by immersing the lens in the cell culture medium for 24 h. After culturing the cells using extracts for 24 h and then adding CCK-8 reagent, the absorbance was measured to record the viability of the cells. More details are provided in the Methods. As shown in Supplementary Fig. 2, the cell viability using extracts of the EMFE cosmetic contact lens remained above 95% after 24 h of incubation, with no notable differences from the controls, and was significantly superior to that of commercial contact lenses. This suggests that the EMFE cosmetic contact lens is noncytotoxic during long-term wear and would pose little risk of corneal inflammation. The micrographs of HCE-T after 24-hour culture are shown in Supplementary Fig. 3. The cell distribution densities after incubation in the different extracts were similar, indicating low bio-toxicity. In addition, a 6-h dye leakage test was conducted under tear-mimicking conditions. Two EMFE contact lenses were immersed in phosphate-buffered saline (PBS) and incubated in a water bath maintained at 35 °C for 6 h. As shown in Supplementary Fig. 4, the PBS solution remained clear and colorless after immersion, and no observable changes in color or pattern were detected on the EMFE contact lenses, indicating good biosafety and stability.

Figure 1c shows a participant wearing the EMFE contact lens and the head-mounted eye tracker. The head-mounted eye tracker incorporates two off-axis eye-facing cameras for imaging the eyes and one scene camera capturing the forward view. The three video streams are transmitted via the Universal Serial Bus (USB) protocol, combined using a USB hub onto a single cable connected to the image signal processing terminal, such as a laptop. This system enables real-time acquisition and transmission of three channels of 30-fps eye movement RGB video. Furthermore, cosmetic contact lenses inherently possess vision correction functionality while weighing only approximately 20 mg. This eliminates the need for additional bulky lenses (which weigh approximately 10 g), a critical advantage for weight-sensitive eye-tracking technology applications in emerging fields such as virtual reality (VR) and augmented reality (AR).

Recognition rate characterization of eye movement feature

Accurate identification of eye movement-related features is the prerequisite for eye movement tracking. Complex and random environmental lighting interference poses a significant challenge to eye movement feature recognition in natural environments. This interference is especially severe when using RGB cameras, compared to cameras equipped with narrowband near-infrared filters42. We set up three distinct lighting environments: indoor artificial lighting, outdoor natural lighting, and window-side natural lighting. Under these three lighting conditions, we captured a total of 700 eye images of a volunteer wearing EMFE cosmetic contact lenses, together with naked-eye images, using the same RGB camera. The images were then analyzed using MATLAB R2020b, and the recognition accuracy of the target recognition algorithm was compared. For eye images with cosmetic contact lenses, the green solid circles serve as the feature pattern for eye movement tracking. Thanks to the strong color contrast between the green solid circles and the surrounding blue area, as well as the sharp edges of the pattern, we initially extracted the green region in the image using the HSV threshold method. However, due to lighting conditions and the scene context, some pixels in the eye images may be misidentified, such as highlights and projections of natural objects. Additionally, highlights, eyelids, and eyelashes may overlap with or obscure the green solid circles. Nevertheless, the green solid circle pattern, created using a transfer printing process, is a consistent standard circle that projects as an ellipse in the camera image at different eye movement angles, whereas the aforementioned interferences typically produce irregular patterns. Therefore, by applying threshold screening to the area, eccentricity, and solidity of connected regions, we can achieve high-accuracy recognition of the green solid circle pattern in cosmetic contact lenses. The top row of images in Fig. 2a illustrates representative eye images with cosmetic contact lenses under the three lighting conditions. The algorithm-identified green solid circles are outlined with red lines. Some small or overly flattened green solid circles are actively excluded by the algorithm. Furthermore, in outdoor scenes, green tree projections on the surface of the cosmetic contact lenses are not mistakenly identified as green solid circles, demonstrating the robustness of the recognition algorithm. For naked eye images, pupils, owing to their sharp edges, are commonly used as features for eye movement tracking. We employed the ElSe43, ExCuSe44, PuRe45, PuReST46, and Swirski2D47 algorithms in PupilEXT48, and the Gradient algorithm49 proposed by Timm et al. to identify pupil regions in the images. The centroids of the identified regions are marked with different symbols in the images, as shown in the bottom row of Fig. 2a. The absence of a marker indicates algorithm failure due to excessive centroid deviation or an undetected pupil. Additionally, the low contrast between the iris and pupil colors, as well as lighting and scene projections on the cornea, significantly interferes with pupil recognition accuracy. By manually annotating the green solid circles in cosmetic contact lenses and pupil regions in the naked eye as ground truth, we compared these with the eye movement features identified by the algorithms.
If the centroid of an algorithm-identified region lies within the ground-truth region, the recognition is considered successful, and the pixel deviation between that centroid and the ground-truth region's centroid is calculated. Figure 2b, c compares the recognition rates of green solid circle recognition in cosmetic contact lenses and pupil recognition in naked eyes, as well as the cumulative probability of centroid pixel deviations for these eye movement features. Under indoor artificial lighting, outdoor natural lighting, and window-side natural lighting scenarios, the recognition rates for green solid circles in cosmetic contact lenses were 96.6%, 83.3%, and 99.0%, respectively, while the pupil recognition rates for bare eyes using different algorithms were all below 55%. The significant improvement in recognition rates is attributed to two factors: the strong color contrast and consistent morphological standards of the cosmetic contact lens patterns, and the increased number of feature patterns. This demonstrates the robustness of using EMFE cosmetic contact lenses for eye movement feature recognition in complex and random natural environments. The overall recognition rate for green solid circles was 93%, with 90% of centroid deviations being less than 12 pixels, while pupil centroids exhibited significantly larger pixel deviations. The distribution of centroid deviations of the green solid circles is shown in Supplementary Fig. 5. In subsequent applications, which use the centroids of the green solid circles to map gaze angles via the gaze angle mapping function, the more precise recognition of green solid circle centroids can enhance the accuracy of eye movement angle calculations.
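
To make the screening procedure concrete, the following is a minimal Python/OpenCV sketch of the HSV-threshold and connected-component morphology pipeline described above. All threshold values (the hue band, area range, eccentricity, and solidity limits) are illustrative assumptions, not the calibrated values used in the paper.

```python
import cv2
import numpy as np

def detect_green_circles(bgr_image,
                         hue_range=(35, 85),        # illustrative green band on OpenCV's 0-179 hue scale
                         min_area=80, max_area=5000,
                         max_eccentricity=0.9, min_solidity=0.85):
    """Sketch of HSV thresholding plus connected-component screening.
    All thresholds are assumptions for illustration.
    Returns a list of (cx, cy) centroids of accepted green regions."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       (hue_range[0], 80, 60),      # lower H, S, V bounds
                       (hue_range[1], 255, 255))    # upper H, S, V bounds
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    accepted = []
    for i in range(1, n):                            # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if not (min_area <= area <= max_area):
            continue
        component = (labels == i).astype(np.uint8)
        contours, _ = cv2.findContours(component, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        cnt = max(contours, key=cv2.contourArea)
        # Solidity: component area over convex-hull area; irregular glints score low.
        hull_area = cv2.contourArea(cv2.convexHull(cnt))
        if hull_area == 0 or area / hull_area < min_solidity:
            continue
        # Eccentricity from the fitted ellipse's axes; a projected circle stays moderate.
        if len(cnt) >= 5:
            (_, _), (ax1, ax2), _ = cv2.fitEllipse(cnt)
            ecc = np.sqrt(max(0.0, 1 - (min(ax1, ax2) / max(ax1, ax2)) ** 2))
            if ecc > max_eccentricity:
                continue
        accepted.append(tuple(centroids[i]))
    return accepted
```

The area gate rejects specks and large reflections, the solidity gate rejects ragged glint and eyelash regions, and the eccentricity gate rejects overly flattened, heavily occluded circles, mirroring the three-way screening described in the text.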

Fig. 2. Highly robust off-axis photography of eye movement features based on EMFE cosmetic contact lenses.


a Representative images of the EMFE cosmetic contact lens (top row) and the naked eye (bottom row) under various illumination environments, such as indoor artificial light, outdoor natural light, and window light. In the EMFE contact lens images, the red line indicates the algorithm-identified boundary of the green solid circle. In the bare eye images, circular, upright triangular, inverted triangular, square, diamond, and pentagram markers represent the pupil centroids detected by the ElSe, ExCuSe, PuRe, PuReST, Swirski2D, and Gradient algorithms, respectively. The absence of a marker indicates algorithm failure due to excessive centroid deviation or undetected pupils. b Comparison of eye movement feature recognition rates under different illumination environments. For EMFE cosmetic contact lenses (EMFE-CCL), the feature is the green solid circle; for the naked eye, the feature is the pupil. c Cumulative probability of centroid deviation for the algorithm-identified eye movement features. d Representative images of the EMFE cosmetic contact lens (top) and the bare eye (bottom) under high off-axis camera conditions. e Comparison of gaze range based on feature recognition for EMFE cosmetic contact lenses and the naked eye across different camera pitch angles.

To avoid obstructing the line of sight and to construct a natural, unobtrusive head-mounted eye-tracking device, the camera position must deviate from the eyeball's optical axis when the user looks straight ahead. The camera must capture the eye region at a large tilt angle, which makes it highly susceptible to losing the eye movement feature and causing tracking failures at large eye movement angles. The two-dimensional rotation range of human eye movement is ±40° horizontally and ±30° vertically. Gaze points were set at 10° intervals, and volunteers were guided to sequentially fixate on each point. The camera was placed below and in front of the eyeball, with different pitch angles set to keep the eyeball centered in the image. Eye movement feature recognition was analyzed under different gaze angles for both cases: wearing EMFE cosmetic contact lenses and without lenses. As shown in Fig. 2d, for the images with lenses, the aforementioned feature recognition algorithm was used to identify the green solid circles, with their edges marked in red. For the images without lenses, manual annotation was used. However, due to the large off-axis camera angles and eye movement angles, combined with the fact that the pupil physiologically sits within the eyeball, the pupil is often occluded, leading to the loss of the feature. The lower image in Fig. 2d shows a representative occluded-pupil image when the camera pitch angle is 80° and the eyeball rotates to a yaw angle of 0° and a pitch angle of 30°. As the camera pitch angle increases from 50° to 80°, the feature recognition algorithm can accurately identify the green solid circle pattern across the full range of eye movements for images with EMFE cosmetic contact lenses. In comparison, for images without lenses, camera pitch angles of 50° and 60° allow the pupil region to be captured across the full range of eye movements; however, at pitch angles of 70° and 80°, 14% and 18% of the eye movement range, respectively, lose pupil features. Furthermore, even when the volunteer squints to different degrees, the camera is still able to easily capture the green circular pattern, as shown in Supplementary Fig. 6. This demonstrates that wearing EMFE cosmetic contact lenses enables a high off-axis camera placement, facilitating more natural and unobtrusive eye-tracking technology.

Eye tracking characterization using the eye movement model

A two-dimensional rotation platform (as shown in Fig. 3a) can precisely control the horizontal yaw angle and vertical pitch angle of the model eye, aiding in the characterization of the accuracy of eye tracking technology based on EMFE cosmetic contact lenses. Referencing the rotation range of the human eyeball, the model eye's two-dimensional rotation range was set to ±40° in yaw and ±30° in pitch, with intervals of 10°. Images of the model eye were captured at each eye movement angle. Using the HSV threshold method and the connected domain morphological threshold method, the green solid circles on the EMFE cosmetic contact lens were identified, and their centroids were extracted. Subsequently, a polynomial fitting method was used to construct the mapping function between the green solid circles' centroids and the gaze angle, enabling gaze angle calculation. As shown in Fig. 3b, the set values of the two-dimensional gaze angles (gray squares) and the calculated results (yellow circles) obtained through feature recognition and mapping function calculations are compared. If all 63 gaze angle data points are used for polynomial parameter fitting, it implies a complex and time-consuming calibration process for the user. Therefore, to simplify the calibration process, we selected 9 gaze angles from a 3 × 3 array as calibration data for polynomial parameter fitting to construct the mapping function. The remaining 54 eye movement angle images were used as test data, with the calculated eye movement angle results shown as red points in Fig. 3b. It can be seen that whether all data or 9-point calibration is used, the calculated eye movement angles align well with the set values. The specific angular errors are statistically shown in Fig. 3c. The eye movement angle calculation error using the 9-point calibration method is 0.82° ± 0.58° (mean ± S.D.), while using all 63 eye movement angle data points results in an error of 0.74° ± 0.40°. This indicates that the 9-point calibration method maintains high accuracy in eye tracking while significantly simplifying the calibration process. In addition, a more detailed accuracy characterization was conducted at 1° intervals over the eye rotation range. As shown in Supplementary Fig. 7, the computed gaze angles exhibit minimal errors in the central region of the eye movement range, while larger errors occur mainly at the four corners, corresponding to more extreme gaze angles. These residual errors stem from the reduced number of detectable feature patterns at extreme angles and the inherent limitations of polynomial-based fitting under large-angle rotations. The overall angular error was 0.29° ± 0.21°. To quantitatively evaluate the accuracy of the EMFE contact lens-based eye tracking technique under various environmental conditions, the eye tracking characterization was also conducted beside a window (with a light intensity of 322 lux) and outdoors (with a light intensity of 25,100 lux). The gaze angle maps and error statistics under different illumination scenarios are shown in Supplementary Fig. 8. The tracking accuracy was 0.95° ± 0.61° beside a window and 0.96° ± 0.67° outdoors. The slightly higher angular errors observed in the window-side and outdoor scenarios are attributed to the more complex lighting conditions in those environments, which introduced interference in identifying the centroid coordinates of the green solid circles.
Due to the human eye's attention mechanism, the brain primarily analyzes visual information within a 1–2° field of view corresponding to the fovea50. Therefore, a gaze angle error of less than 1° is sufficient to accurately determine the gaze point and analyze the regions of visual attention. Additionally, the gaze accuracy of a 4-green-solid-circle EMFE cosmetic contact lens was characterized, yielding an angular error of 1.20° ± 0.72°, as shown in Supplementary Fig. 9. The 12-green-solid-circle lens exhibits superior eye-tracking accuracy, which can be attributed to the camera's ability to identify at least three green solid circles across various gaze angles. After averaging the computed gaze angles, the system mitigates the effects of centroid deviations caused by environmental light interference and pattern distortion due to eyelid occlusion. The results indicate that a greater number of green solid circle features can improve the eye tracking accuracy of EMFE contact lens-based systems.
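
As an illustration of the centroid-to-angle mapping, the sketch below fits a polynomial mapping function by least squares from one circle's centroid pixel coordinates to the known gaze angles of the calibration targets. The second-order polynomial form is an assumption for illustration; the paper does not state the polynomial order it uses.

```python
import numpy as np

def fit_mapping(centroids, angles):
    """Fit a polynomial mapping from circle-centroid pixels (u, v) to gaze
    angles (yaw, pitch) by least squares. A 9-point calibration corresponds
    to N = 9 rows taken from a 3 x 3 grid of known target angles.
    centroids: (N, 2) pixel coordinates; angles: (N, 2) known gaze angles."""
    u, v = centroids[:, 0], centroids[:, 1]
    # Design matrix with terms 1, u, v, u*v, u^2, v^2 (degree-2 example).
    A = np.column_stack([np.ones_like(u), u, v, u * v, u ** 2, v ** 2])
    coeffs, *_ = np.linalg.lstsq(A, angles, rcond=None)  # (6, 2) matrix
    return coeffs

def apply_mapping(coeffs, centroid):
    """Evaluate the fitted mapping at one centroid -> (yaw, pitch)."""
    u, v = centroid
    return np.array([1.0, u, v, u * v, u ** 2, v ** 2]) @ coeffs
```

With at least six calibration points, the degree-2 system is overdetermined and the least-squares fit absorbs small centroid noise; the 9-point grid used in the paper comfortably satisfies this.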

Fig. 3. Eye movement model characterization based on EMFE contact lenses.


a High-precision eye movement characterization equipment, which contains 2 rotation stages for yaw rotation and pitch rotation. b Gaze angle coordinates when the eyeball model rotated to different gaze directions. The gray squares represent the set gaze coordinates, and the yellow and red circles represent the calculated gaze coordinates under the full data fit and 9-point calibration, respectively. c Error statistics of gaze angle coordinates under the full data fit and 9-point calibration, respectively. d Gaze direction estimation. e Eye movement feature recognition in static and dynamic frames. f Comparison of the eye movement trace between full-frame and static-frame analysis. Inset: eye movement velocity calculated frame by frame. g Schematic diagram of camera slippage. h Comparison of gaze angle maps with and without compensation when the camera slipped to Position 3. i Error statistics of gaze angle coordinates with and without compensation under camera slippage. In the error statistics, the middle line is determined by the median, the box by the 25th and 75th percentiles, and the whiskers by the 5th and 95th percentiles. n = 63 independent gazing point coordinates.

Using a mapping function for eye tracking compensates for errors introduced by the individual variation in the angle between the visual axis and optical axis, enabling precise gaze angle tracking. However, this method requires a calibration process to obtain polynomial parameters. Alternatively, EMFE cosmetic contact lenses can achieve gaze direction estimation directly. As shown in Fig. 3d, due to the structural consistency of the green circular patterns, the pose information (i.e., the 3D coordinates of the circle center and the normal vector of the circular plane) of each green solid circle in the camera coordinate system can be calculated using the known diameter of the green solid circles and the intrinsic camera parameters51. Using the pose information from multiple green solid circles, the intersection point of their normal vectors gives the center coordinates of the cosmetic lens sphere, and the normal vector of the plane fitted to the 3D coordinates of the circle centers represents the direction of the eye’s optical axis (indicated by the green line in the figure). Supplementary Fig. 10 presents the estimated gaze directions across the full eye movement range (yaw: ±40°, pitch: ±30°) at 10° steps. A comparison between the set gaze directions and the calculated results is shown in Supplementary Fig. 11. The gaze angle estimation error was 3.08° ± 1.54°.
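
The geometric steps of this calibration-free estimation can be sketched as follows: a least-squares "intersection" of the circles' normal lines yields the lens sphere center, and the normal of the plane fitted to the circle centers gives the optical axis. The circle poses themselves are assumed to come from a standard perspective circle-pose solver (not shown); this is an illustrative reconstruction, not the authors' exact implementation.

```python
import numpy as np

def sphere_center_from_normals(centers, normals):
    """Least-squares intersection of the circles' normal lines: each line
    passes through a circle center along its plane normal. Minimizes the
    summed squared distance from the solution point to all lines."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, n in zip(centers, normals):
        n = n / np.linalg.norm(n)
        P = np.eye(3) - np.outer(n, n)   # projector orthogonal to the line
        A += P
        b += P @ c
    return np.linalg.solve(A, b)          # requires non-parallel normals

def optical_axis_from_centers(centers):
    """Normal of the best-fit plane through the 3D circle centers, via SVD
    of the centered coordinates (smallest singular vector)."""
    X = np.asarray(centers, float)
    X = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(X)
    return vt[-1]                          # unit normal of the fitted plane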

Human eye movement modes, such as fixation, smooth pursuit, and saccades, exhibit distinct characteristics in terms of eye movement speed. During static fixation or slow eye movements, cameras can capture clear eye images. However, during dynamic eye movements, motion blur makes it difficult for cameras to recognize eye movement features, leading to the loss of gaze point data. Thanks to the design of the eye movement feature enhancement patterns, common cameras are able to achieve full-frame-rate gaze point localization. For static eye movement images, the green solid circles are used as the eye movement feature for calculation. For dynamic eye movement images, the red trajectories formed by the motion blur of the red dots are used as the eye movement feature, as shown in Fig. 3e. Similarly, the HSV threshold method and connected domain morphological threshold method were used to identify the red trajectory, followed by extracting the skeleton of the red trajectory region as the motion trajectory during the dynamic eye movements. Since dynamic image frames are always preceded and followed by static image frames, by combining the centroid coordinates and eye movement angles of the red points in the preceding and succeeding static image frames with the red point trajectory in the dynamic frame, the gaze point coordinates of the dynamic frame can be calculated. Figure 3f demonstrates the eye movement trace simulated by a two-dimensional eye movement model, where the red curve represents the continuous gaze angle positioning analyzed from static and dynamic images, while the gray curve represents the gaze angle positioning analyzed solely from static images. The inset in Fig. 3f shows the eye rotation speed calculated for each frame of this eye movement trajectory. If only static images are analyzed, the true sampling rate of the eye-tracking technology drops to 65% of the video's full sampling rate, breaking the continuity of gaze points. Preserving this continuity is of great importance in revealing cognitive processing mechanisms, assessing neurological function, and aiding clinical diagnoses. In addition, the eye movement trace results from the low-frame-rate system with the EMFE contact lens were compared with those obtained with a Pico Neo3 Pro eye tracker, which samples at 90 fps. Supplementary Fig. 12 shows the time-series plots of the yaw and pitch angles obtained from both methods, indicating highly consistent gaze angle traces.
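
A minimal sketch of the dynamic-frame processing is given below, assuming the red streak in a blurred frame has already been segmented into a binary mask. It thins the streak to a skeleton and interpolates the gaze angle along it between the two adjacent static frames; interpolating linearly by normalized distance (rather than arc length) is a simplifying assumption of this sketch.

```python
import numpy as np
from skimage.morphology import skeletonize

def gaze_along_blur(red_mask, start_centroid, end_centroid,
                    start_angle, end_angle):
    """Illustrative dynamic-frame gaze estimation from a red blur streak.
    Assumes the streak runs between the red-dot centroids of the adjacent
    static frames, whose gaze angles are known from the mapping function.
    Returns one interpolated (yaw, pitch) per skeleton sample."""
    skel = skeletonize(red_mask > 0)
    ys, xs = np.nonzero(skel)
    pts = np.column_stack([xs, ys]).astype(float)
    start = np.asarray(start_centroid, float)
    end = np.asarray(end_centroid, float)
    # Order skeleton pixels by distance from the starting centroid
    # (adequate for a simple, roughly monotonic streak).
    pts = pts[np.argsort(np.linalg.norm(pts - start, axis=1))]
    total = max(np.linalg.norm(end - start), 1e-9)
    frac = np.clip(np.linalg.norm(pts - start, axis=1) / total, 0.0, 1.0)
    a0 = np.asarray(start_angle, float)
    a1 = np.asarray(end_angle, float)
    return a0 + frac[:, None] * (a1 - a0)
```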

Head movements or slippage of the head-mounted eye tracker cause displacement between the camera and the eyeball, altering the mapping relationship and reducing the accuracy of long-term eye tracking. This is a common challenge in eye-tracking technology. To mitigate gaze angle errors induced by camera slippage, a compensation algorithm has been developed. Based on the three-dimensional coordinates of three or more green solid circle centers, the spatial position of the eye can be calculated. Changes in the eye's spatial position reflect the displacement of the camera relative to the eyeball. After calculating the displacement, spatial compensation is applied to the green solid circles (i.e., subtracting the displacement vector), as shown in Supplementary Fig. 13. The compensated circles are projected back to the image plane to update their centroid coordinates, which are then input to the mapping function to obtain the corrected gaze angle. Here, the camera was calibrated to obtain the mapping function (Position 1), moved upward by 2.5 mm (Position 2), and then away from the eyeball by 2.5 mm (Position 3) under off-axis photography conditions to verify the effect of the compensation algorithm, as shown in Fig. 3g. Supplementary Fig. 14a, b shows the distributions and error statistics of gaze angles when the camera is at Position 1, indicating strong agreement between the estimated and target gaze coordinates. Due to the slippage of the camera, noticeable pixel-level shifts in the green solid circle centroids occurred. Since these shifts did not originate from actual eye rotation, directly inputting the shifted centroid coordinates into the mapping function led to large gaze errors. The comparisons of the gaze angle maps and error statistics with and without compensation at Position 2 and Position 3 are shown in Supplementary Fig. 14c, d and Fig. 3h, i, respectively. The yellow points represent the gaze angle results without compensation, exhibiting a distinct trapezoidal distortion. At Position 3, the overall gaze error reached 17.14° ± 1.88°. After applying the compensation algorithm, the resulting gaze angles, shown as red points, were significantly closer to the targets. As shown in Fig. 3i, the overall gaze error was reduced to 2.03° ± 0.65°, indicating that 88% of the original error was successfully compensated. The residual error after compensation mainly stems from inaccuracies in calculating the 3D positions of the green solid circles and the eye.
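
The subtract-and-reproject step can be sketched as follows, assuming a pinhole camera with known intrinsic matrix K and a displacement vector already estimated from the change in the eye's 3D position. This is an illustrative reconstruction of the procedure described above, not the authors' code.

```python
import numpy as np

def compensate_slippage(circle_centers_3d, displacement, K):
    """Subtract the estimated camera-relative displacement from each
    circle's 3D center (camera coordinates), then re-project with the
    pinhole model u = K X / Z to obtain corrected centroid pixels for the
    gaze mapping function. K is the 3x3 intrinsic matrix."""
    d = np.asarray(displacement, float)
    corrected = []
    for X in circle_centers_3d:
        Xc = np.asarray(X, float) - d     # undo the slippage in 3D
        uvw = K @ Xc                       # homogeneous image coordinates
        corrected.append(uvw[:2] / uvw[2]) # (u, v) after compensation
    return np.array(corrected)
```

The corrected centroids are then fed to the same polynomial mapping function fitted at Position 1, so no recalibration is needed after slippage.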

Eye tracking characterization in human eyes wearing EMFE cosmetic contact lenses

EMFE cosmetic contact lenses were prepared using a sandwich process and underwent high-temperature steam sterilization to ensure safety and sterility. Volunteers wore EMFE lenses and a head-mounted eye tracker with an integrated off-axis camera that captured eye images. Volunteers faced a display screen or whiteboard at a distance of 60 cm. Gaze points with known screen coordinates were marked on the screen or whiteboard to guide the volunteers' fixation, thereby testing the performance of eye-tracking based on EMFE contact lenses (as shown in Fig. 4a). This human experiment was approved by the Ethical Committee of the First Affiliated Hospital of Nanjing Medical University, Jiangsu Provincial People's Hospital, with an ethical approval number of 2025-SR-023. Figure 4b shows photographs of volunteers wearing EMFE lenses while gazing in different directions.

Fig. 4. Eye tracking characterization of human eyes wearing EMFE contact lenses.


a Schematic diagram of eye tracking characterization. Volunteers wore contact lenses and frame glasses integrated with RGB cameras, which captured images of the eye. Red dots were displayed on the screen to guide the volunteer's gaze. Adapted under terms of the CC Attribution license. Copyright 2026, zhuoyi0904, Sketchfab, Inc54. b Photograph of the volunteer gazing in different directions. c Gaze point coordinates when the volunteer gazed in different directions. The gray squares represent the set gaze coordinates, and the red circles represent the calculated gaze coordinates. Each gaze coordinate was recorded six times. d Error statistics of the accuracy of the eye tracking technique based on the EMFE contact lens, which reflects the deviation of the estimated gaze coordinates from the set values. The middle line is determined by the median, the box by the 25th and 75th percentiles, and the whiskers by the 5th and 95th percentiles. n = 486 independent gazing point coordinates. e Error statistics of the precision of the eye tracking technique based on the EMFE contact lens, which reflects the degree of dispersion when gazing at the same location. The middle line is determined by the median, the box by the 25th and 75th percentiles, and the whiskers by the 5th and 95th percentiles. n = 486 independent gazing point coordinates. f Eye movement trace as the volunteer gazed at a red dot on the screen that moved at different speeds. g Frame-by-frame gaze velocity.

To test the accuracy of eye-tracking based on EMFE contact lenses, four volunteers (two males and two females) with different ocular features (Supplementary Table 1) were instructed to observe a whiteboard positioned 60 cm away. A two-dimensional array of gaze targets was arranged on the whiteboard. While the human eye is capable of a wide range of rotational movement, people generally prefer to turn their heads for horizontal angles beyond approximately 16°. Therefore, the target points were confined within a range of ±16° in yaw and ±16° in pitch, spaced at 4° intervals, forming a 9 × 9 grid comprising 81 gaze targets. The volunteers were guided to fixate on each of these points sequentially, and six frames of eye images were captured per gaze point. The green solid circles were identified using MATLAB R2020b, and a mapping function was derived to calculate the gaze angle coordinates. Figure 4c displays the set values (gray squares) and calculated values (red dots) of the gaze angle. Eye-tracking performance is commonly described by accuracy and precision. Accuracy is defined as the average error between the actual and calculated positions of the gaze angle. Precision is defined as the dispersion of the eye-tracking device's estimates when recording the same fixation point. Figure 4d, e summarizes the accuracy and precision of eye-tracking based on EMFE lenses, with values of 0.70° ± 0.38° and 0.22° ± 0.13°, respectively. The distributions of gaze angles, along with the calculated accuracy and precision for all four volunteers, are shown in Supplementary Fig. 15. The specific accuracy and precision values for each volunteer (A–D) are summarized in Supplementary Table 2. All four volunteers achieved eye-tracking accuracy and precision better than 1°, demonstrating that the EMFE-based eye-tracking technique can avoid interference from individual ocular differences. Similarly, an eye movement angle error of less than 1° is sufficient to accurately determine the gaze point and analyze the regions of visual attention52. In addition, the long-term stability was evaluated by wearing the EMFE contact lens continuously for 6 h. As shown in Supplementary Fig. 16, the eye-tracking accuracy and precision with the EMFE contact lens were 0.61° ± 0.34° and 0.28° ± 0.18°, respectively, after long-term wearing, showing no significant difference from the initial wearing state. After removal of the lens, no blurring or deformation of the feature patterns was observed (Supplementary Fig. 17), confirming the stability of the EMFE contact lens for long-term eye-tracking applications.
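
Given the definitions above, accuracy and precision for a single target can be computed as in this brief sketch. The root-mean-square form of precision is a common convention assumed here, since the text specifies only "dispersion".

```python
import numpy as np

def accuracy_precision(gaze, target):
    """gaze: (N, 2) repeated (yaw, pitch) estimates for one fixation target;
    target: (2,) the known target angle.
    Accuracy: mean angular deviation of the estimates from the target.
    Precision: RMS deviation of the estimates from their own mean."""
    gaze = np.asarray(gaze, float)
    target = np.asarray(target, float)
    accuracy = np.linalg.norm(gaze - target, axis=1).mean()
    spread = gaze - gaze.mean(axis=0)
    precision = np.sqrt((spread ** 2).sum(axis=1).mean())
    return accuracy, precision
```

Averaging these per-target values over the 81-point grid and six frames per point gives summary statistics of the kind reported in Fig. 4d, e.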

Blinking-induced eyelid occlusion poses a common challenge for eye tracking by significantly lowering the success rate of ocular feature identification. To quantify this effect, volunteers wore an EMFE contact lens on the left eye while blinking naturally. A frame-by-frame analysis correlated the detectability of the enhanced eye movement feature with the degree of eyelid occlusion. Simultaneously, an infrared camera tracked the right eye via pupil recognition for comparison. As shown in Supplementary Fig. 18a, the number of detectable green solid circles decreased with increasing occlusion. At occlusion levels exceeding 80%, pattern recognition failed completely, resulting in approximately 66 ms of data loss with the EMFE contact lens method. In contrast, pupil-based tracking experienced longer interruptions (approximately 231 ms) due to more frequent and complete pupil occlusion. A similar trend was observed during the prolonged, heavy blink (Supplementary Fig. 18b), where data loss extended to about 198 ms (EMFE contact lens-based) versus 330 ms (pupil-based). In summary, while eyelid occlusion above 80% leads to temporary tracking failure, the EMFE contact lens-based method exhibits significantly shorter failure durations and resumes immediately upon eye reopening, demonstrating superior robustness compared to pupil-based tracking during blinks. However, due to the dynamic interaction between the eyelid and the lens, a slight gaze angle error occurs during the moment of blinking. As shown in Supplementary Fig. 19, during eye reopening, tangential eyelid forces caused vertical lens slippage, introducing a pitch angle error of approximately 2–4° with no notable horizontal displacement. This vertical shift self-corrected rapidly after the eye reopened, and the resulting gaze error was eliminated within 0.3–0.7 s.

Additionally, a red dot moving along a 20 cm × 20 cm rectangular path at different speeds (0.5 s, 0.75 s, and 1 s per side) was displayed on the screen to guide the volunteer to perform eye movements of varying speeds. The gaze point trajectories of the volunteer are shown in Fig. 4f. As the speed of the guided gaze point increases, the density of the participant's gaze points along the movement path becomes sparser, yet continuous gaze point localization is maintained, demonstrating the capability to track dynamic eye movement traces. Figure 4g displays the corresponding eye movement speeds. The colored regions in the figure correspond to periods when the volunteer's gaze point followed the moving red dot, while the uncolored regions correspond to moments when the red dot was stationary. This demonstrates that enhancing eye movement features through EMFE lens patterns enables spatially accurate and temporally continuous eye tracking.

Eye tracking in multiple indoor and outdoor scenarios

Based on the eye-brain consistency hypothesis, the human brain primarily analyzes visual information at the fixation point. By tracking users' eye movements, we can identify regions of interest, infer user intentions, and achieve efficient human-computer interaction. As shown in Fig. 5a, the volunteer was asked to find the image containing a bicycle among nine images. To enable interaction between the eye movements and the screen, the volunteer first underwent a 9-point calibration process. The volunteer was guided to look at each of the nine calibration points on the screen in sequence, with five frames of eye images captured at each calibration point. A mapping function was then constructed based on the feature patterns. As shown in Fig. 5b, the upper part illustrates the calibration point settings and calculated fixation point coordinates, while the lower part provides statistics on the fixation point error distribution, with an accuracy of approximately 0.74 cm ± 0.39 cm. Subsequently, the volunteer freely observed the images on the screen for a period of time, during which a head-mounted eye tracker recorded the eye images of the volunteer wearing EMFE cosmetic contact lenses and identified eye movement features to calculate fixation point coordinates. To account for the precision errors of the eye-tracking system, the algorithm assigned graded scores to pixels according to their distance from the calculated fixation point, with the highest score at the fixation point and decreasing scores with distance. During image observation, the scores accumulated frame by frame. After processing, the score distribution formed a heatmap of regions of interest, as shown in Fig. 5c. The image containing a bicycle clearly had a higher interest score. Combining the pixel display areas of each image, the interest scores calculated for each region are shown in Fig. 5d, which also demonstrates that the user paid more attention to the bicycle image. The entire process of region of interest recognition is shown in Supplementary Video 1.
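
The frame-by-frame scoring can be sketched as a heatmap accumulation in which each fixation contributes a score that peaks at the fixation pixel and decays with distance. The Gaussian falloff and its width below are illustrative assumptions; the text specifies only a graded, distance-decreasing score.

```python
import numpy as np

def accumulate_roi_heatmap(fixations, shape, sigma=40.0):
    """fixations: iterable of (x, y) gaze pixels, one per video frame;
    shape: (H, W) of the screen in pixels; sigma: falloff width in pixels
    (an assumed value, roughly matching the tracker's precision error).
    Returns the accumulated interest heatmap."""
    H, W = shape
    yy, xx = np.mgrid[0:H, 0:W]
    heat = np.zeros((H, W), float)
    for fx, fy in fixations:
        # Each frame adds a score peaking at the fixation and decaying outward.
        heat += np.exp(-((xx - fx) ** 2 + (yy - fy) ** 2) / (2 * sigma ** 2))
    return heat

# Per-picture interest scores (as in Fig. 5d) can then be read off by
# summing the heatmap over each picture's display rectangle.
```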

Fig. 5. Eye tracking in multiple indoor and outdoor scenarios based on EMFE contact lenses.


a Schematic diagram of ROI recognition. The camera recorded the eye movements of the volunteer while watching the pictures, calculated the gaze coordinates and durations, accumulated the interest value of each picture, and visualized the ROI. Adapted under terms of the CC Attribution license. Copyright 2026, zhuoyi0904, Sketchfab, Inc54. b Gaze point coordinates on the screen under 9-point calibration. Each gaze coordinate was recorded 5 times. Bottom: error statistics of gaze point coordinates. The middle line is determined by the median, the box by the 25th and 75th percentiles, and the whiskers by the 5th and 95th percentiles. n = 45 independent gazing point coordinates. c ROI heat map. d ROI sum of each picture. e Eye movement trace during reading. The blue circles represent the coordinates of fixations in turn, and the lines present the eye movement trace. Different colors represent varying eye movement velocities. f Velocity of eye movement during reading. g Eye tracking in the wild under different scenarios and lighting conditions.

The analysis of fixation points during reading represents one of the core applications of eye-tracking technology. It provides an indirect yet objective means to reveal the reader's internal, real-time cognitive processing. By analyzing fixation points, insights can be gained into the difficulty and depth of cognitive processing, as well as into reasoning patterns and individual differences among readers. A text segment was displayed on the screen for the volunteer to read. The head-mounted eye tracker recorded the eye images of the volunteer wearing EMFE cosmetic contact lenses and calculated the fixation point coordinates frame by frame. As shown in Fig. 5e, the eye movement trace during reading consisted of static fixation points and dynamic eye movements. The blue circles in the figure mark the coordinates of the volunteer's fixation points in sequence, and the color changes in the trajectory curve represent variations in the speed of eye movements. Additionally, the trajectory clearly shows reading details such as regressions and re-fixations. The eye movement trace obtained from a 180-fps high-speed RGB camera is shown in Supplementary Fig. 20. Figure 5f illustrates the speed changes during the eye movement process. The entire process of eye tracking during reading is shown in Supplementary Video 2.

Eye tracking in the wild is a crucial functional module for future lightweight mixed reality, enabling dynamic fusion between virtual objects and the real world. However, achieving robust eye tracking faces challenges due to randomly varying outdoor lighting conditions and complex scene changes. The volunteer wearing EMFE contact lenses and a head-mounted eye tracker was tested across multiple scenarios from afternoon till dusk, including beside a shopping mall, along a tree-shaded road, in a square, by a lakeside, and in front of a large electronic screen, as illustrated in Fig. 5g. The volunteer held a red dot marker, moving it to different angles within their field of view while continuously fixating on the dot. During outdoor eye tracking experiments, ambient lighting shifted from intense direct sunlight during the day to soft oblique sunlight at dusk. Despite random scenes projecting interference patterns onto the contact lens surface, eye movement features on the lens were consistently identified, enabling continuous tracking. This process is shown in Supplementary Video 3.

Discussion

Addressing the challenge of achieving robust recognition of eye-movement-related ocular features in natural environments, this paper utilizes cosmetic contact lenses, widely accepted, extremely low-cost decorative lenses in daily life, to enhance eye-movement features. The lens is designed with 12 brightly colored, high-contrast patterns of green solid circles and red dots, enhancing the spatial accuracy and temporal continuity of eye-tracking technology, respectively. Fabricated using commercial-grade cosmetic lens materials, manufacturing processes, and sterilization protocols, the proposed EMFE contact lens exhibits excellent biocompatibility and comfort. The eye-movement feature patterns have standardized colors and consistent structures, enabling accurate recognition using HSV thresholds and connected domain morphological processing. Under random and complex natural lighting conditions, the recognition rate for eye-movement features with the EMFE lens improved by 53% compared to bare eyes, achieving an overall recognition success rate of 93%. The annular distribution of feature patterns along the peripheral ring of the lens permits highly off-axis camera placement. Even with the camera positioned at an 80° pitch angle, full-range eye movement feature acquisition remains achievable, facilitating the unobtrusiveness of head-mounted eye-tracking devices. Validated using a high-precision eye model, the eye-tracking accuracy was 0.82° ± 0.58° over an eye movement range of ±40° in yaw and ±30° in pitch. In the human eye movement tests, the eye-tracking accuracy and precision were 0.70° ± 0.38° and 0.22° ± 0.13°, respectively, over an eye movement range of ±16° in yaw and ±16° in pitch. These sub-1° angular errors are smaller than the visual angle subtended by the fovea on the retina, sufficient for accurate determination of the human gaze region. Furthermore, by calculating the 3D pose information of the green solid circles, gaze direction estimation can be achieved. During dynamic eye movements, the red trails generated by the motion blur of the red dot patterns aid in recognizing eye-movement features, restoring the continuity of eye movement traces captured by low-frame-rate cameras. Based on the EMFE lens, this paper demonstrates human-computer interaction applications such as image region-of-interest identification, fixation point tracking during reading, and in-the-wild eye tracking, showcasing its application potential in fields including human factors design, cognitive research, and psychological studies. A comparison of key parameters, advantages, and limitations between this work and other commercially available eye trackers and state-of-the-art academic solutions is summarized in Supplementary Table 3.

In the future, eye-tracking technology based on EMFE smart contact lenses still has room for improvement. For instance, upgrading to high sampling rate cameras (e.g., ≥200 Hz) would support the capture and analysis of eye movement behaviors such as fixation, smooth pursuit, saccades, and microsaccades, enabling faster and more precise tracking of eye movement trajectories and calculation of peak eye movement velocities. Enhancing the wear stability of EMFE lenses on the eye, by increasing the wet friction on the inner surface of the lens or customizing scleral lenses, could improve their resistance to slippage. Additionally, improving the usability of EMFE-based systems through spatially-chromatically encoded eye movement feature patterns and three-dimensional spatial computation models could further simplify the calibration process and enhance robustness against camera slippage.

Methods

Pattern design and fabrication of the EMFE cosmetic contact lens

The eye-tracking feature pattern of the EMFE cosmetic contact lens is designed based on an RGB primary color strategy. The annular pattern is positioned peripherally on the lens, featuring an outer diameter of 13 mm and an inner diameter of 8 mm to ensure an unobstructed visual window. Considering camera parameters and the precision of transfer printing technology, the blue ring incorporates 12 uniformly distributed green solid circles (1.5 mm diameter) for static image eye feature enhancement and 12 red dots (0.3 mm diameter) for dynamic eye movement trajectory tracking.

The EMFE cosmetic contact lens, fabricated from HEMA hydrogel material, employs a sandwich-like multi-layer preparation process to integrate the eye-tracking pattern within the lens body; fabrication was performed by a cosmetic contact lens manufacturer holding a medical device registration certificate. As the pattern comprises three colors, corresponding steel plates were sequentially prepared for the red, green, and blue elements. Lens pigment was applied to fill the engraved patterns on the plates. Using an elastic silicone transfer printing process, the patterns were transferred from the two-dimensional steel plates onto the three-dimensional spherical surface of the lens. To compensate for the distortion inherent in transferring from a flat surface to a spherical substrate, the plate patterns underwent radial pre-calibration. Furthermore, areas corresponding to the green solid circles and red dots within the blue ring were hollowed out on the plate to ensure color fidelity in the final product. The encapsulated pigment is sealed within the lens interior and withstands steam sterilization, ensuring biocompatibility.

Cytotoxicity tests

Human corneal epithelial cell lines (Cellosaurus CVCL_1272, obtained from Shanghai Zhong Qiao Xin Zhou Biotechnology Co. Ltd.) were employed to test cytotoxicity. HCE-Ts were maintained in HCE-T cell complete medium (ZM1003, obtained from Shanghai Zhong Qiao Xin Zhou Biotechnology Co. Ltd.) at 37 °C in a humidified atmosphere of 5% CO2. After several passages, these cells were harvested and plated at a density of 5000 cells per well. The cells were incubated in complete medium for 24 h.

The extracts were prepared by immersing the EMFE cosmetic contact lenses and commercial contact lenses (Clariti, CooperVision) in complete medium at 37 °C for 24 h, at a ratio of 0.2 g of contact lens per 1 mL of complete medium, according to ISO 10993-5. Complete medium alone was used as a control group, and the extracts of the EMFE cosmetic contact lenses and of the commercial contact lenses were used as the comparison groups. After the cells were cultured in the extracts for 24 h, a cell viability test (n = 5) was performed. The cytotoxicity was assessed by a cell counting kit-8 assay (C0038, Beyotime Biotech Inc). The absorbance was read at 450 nm using a multimode plate reader. The absorbance values were converted into percentage values relative to the absorbance obtained from the cell growth medium alone.
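
For clarity, the absorbance-to-viability conversion described above amounts to the following small sketch; blank subtraction is a common convention assumed here and is not detailed in the paper.

```python
import numpy as np

def relative_viability(sample_abs, control_abs, blank_abs=0.0):
    """Convert CCK-8 absorbance readings (450 nm) into percentage viability
    relative to the medium-only control. sample_abs and control_abs are
    arrays of replicate absorbance values (n = 5 in the paper); blank_abs
    is an assumed background term, set to 0 if no blank well is used."""
    sample = np.asarray(sample_abs, float) - blank_abs
    control = np.mean(np.asarray(control_abs, float)) - blank_abs
    return 100.0 * sample / control
```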

Head-mounted eye tracker

The head-mounted eye tracker comprises two eye-tracking cameras targeting the eyes and one forward-facing scene camera, as shown in Supplementary Fig. 21. Detailed specifications for both camera types are provided in Supplementary Table 4. The two eye-tracking cameras are mounted in the inferior-anterior region of the two eyeballs, positioned approximately 30 mm from the ocular surface. Each camera is pitched upward at 50° to capture eye images. Video streams from all three cameras are consolidated via a USB hub, enabling communication with a host computer through a single USB cable. The head-mounted eye tracker has a total weight of 96 g, supporting prolonged wear and continuous acquisition of the user's eye movements.

Eye movement feature recognition algorithm

The eye movement features on the cosmetic contact lens consist of two types of patterns: green solid circles and red dots. These patterns have standardized, consistent color and structure, and the surrounding blue ring provides strong color contrast. Consequently, the patterns can be identified accurately using a simple HSV threshold followed by morphological filtering of connected components. Recognition results at key stages of the algorithm are illustrated in Supplementary Fig. 22.

To prevent misidentification of similarly colored patterns outside the eye, the algorithm first localizes the eye region and then searches for the green solid circles and red dots only within it. For the green solid circles, an HSV threshold is applied within the eye region to select pixels with plausible saturation and brightness in the green hue range. Illumination reflections, projections of scene objects, and eyelashes can pass this color test, but they differ markedly in morphology from the green circles; filtering connected components by area, eccentricity, and solidity therefore isolates the genuine circle regions and also discards circles severely occluded by the eyelids. Because complex, unpredictable lighting can superimpose reflection artifacts on a green circle and shift its centroid, the algorithm quantifies this interference as the Intersection over Union (IoU) between each connected component and its fitted ellipse; circles whose IoU falls below a set threshold are rejected as excessively corrupted by lighting interference.

Red dots, and the motion blur they produce, are identified analogously: an HSV threshold selects red regions within the eye area. In static frames these regions appear circular, whereas in dynamic frames they appear as linear streaks, so the algorithm again filters connected components by area. Finally, the skeleton of each surviving component is extracted, yielding either the centroid of a static red dot or the trajectory of a motion-blur streak.

Eye movement angle calculation algorithm

The eye-tracking characterization, on both the eye movement model and the human eye, proceeds in two phases: a calibration phase and a tracking phase.

During the calibration phase, the eye movement model is rotated to a set of preset angles, and the volunteers are guided to fixate sequentially on a set of visual targets at known positions (e.g., specified gaze angles or screen coordinates). The eye tracker captures images of the eye at each target. The green solid circles are identified in each image, and their centroid pixel coordinates are computed. A polynomial mapping function is then fitted for each green solid circle, relating its centroid coordinates to the corresponding known gaze directions, which concludes the calibration phase.

During the tracking phase, the eye tracker continuously captures eye images. The centroids of the detectable green solid circles are computed, and their corresponding mapping functions are applied to estimate the gaze angle. When multiple green solid circles are detected during both calibration and tracking, multiple mapping functions are constructed and applied, and the final gaze direction is obtained by averaging the individual estimates from all available circles.

Safety assurance for human eye movement tracking

To evaluate key performance metrics such as the recognition rate and accuracy of the eye-tracking technology in practical use, four healthy participants (two males and two females), aged 22 to 29 years, were recruited to wear the EMFE cosmetic contact lenses. This study was approved by the Ethics Committee of the First Affiliated Hospital of Nanjing Medical University (2025-SR-023). All measurements on participants were performed with their full, informed consent, and all participants approved the public disclosure of their individual data. Participants were required to be free from ocular pathologies such as dry eye syndrome or ocular surface inflammation. Before lens application, the integrity of the packaging was verified, and participants were instructed to wash their hands thoroughly. During the experiments, the head-mounted eye tracker employed no close-range eye illumination, and the wearing time of the cosmetic lens did not exceed 6 h. Collectively, these measures ensured the safety of the experimental procedures.

Reporting summary

Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.

Supplementary information

41467_2026_68918_MOESM2_ESM.pdf (81KB, pdf)

Description of Additional Supplementary Information

Supplementary Video 1 (70.2MB, mp4)
Supplementary Video 2 (39.8MB, mp4)
Supplementary Video 3 (92.4MB, mp4)
Reporting Summary (77.4KB, pdf)

Source data

Source data (935.2KB, xlsx)

Acknowledgments

This research was funded by the National Key R&D Program of China (2021YFA1401103), the National Natural Science Foundation of China (62305153 and 62501260), the Natural Science Foundation of Jiangsu Province (BK20243014), the Quantum Science and Technology-National Science and Technology Major Project (2021ZD0300700), the China Postdoctoral Science Foundation (2024M761393 and 2025T180154), the Basic Research Program of Jiangsu (BK20251253), and the Jiangsu Funding Program for Excellent Postdoctoral Talent (2025ZB039). In addition, the authors gratefully acknowledge Dr. Lei Zhao and Dr. Qiyong Xu from the Yongjiang Laboratory for their assistance in the comparative experiments with the Pico Neo3 Pro eye tracker.

Author contributions

All authors provided active and valuable feedback on the paper. F.X., H.Z., and H.Y. initiated the concept and designed the studies; F.X. supervised the work; H.Z. led the experiments and collected the overall data; H.H. and Z.L. contributed to the recognition algorithm and eye tracking algorithm; Z.Q. contributed to the design and fabrication of the head-mounted eye tracker; S.Y., Y.F., and Y.N.X. contributed to the human experiment; Y.F.X., Y.C., and H.Y. advised on the experiment and manuscript; F.X. and H.Z. co-wrote the paper.

Peer review

Peer review information

Nature Communications thanks Cyril Lahuec, Yuan Lin, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. A peer review file is available.

Data availability

The authors declare that all data supporting the results in this study are present in the paper, and the data sources are uploaded in the Supplementary Information with this paper. Any additional requests for information can be directed to and will be fulfilled by the corresponding authors. Source data are provided with this paper.

Code availability

The code supporting this study's findings is available at https://github.com/yiyinju/feature-recognition-for-EMFE-cosmetic-contact-lens (ref. 53).

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

The online version contains supplementary material available at 10.1038/s41467-026-68918-y.

References

1. Valliappan, N. et al. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat. Commun. 11, 4553 (2020).
2. Chang, H. et al. Sleep microstructure organizes memory replay. Nature 637, 1161–1169 (2025).
3. Gehmacher, Q. et al. Eye movements track prioritized auditory features in selective attention to natural speech. Nat. Commun. 15, 3692 (2024).
4. Pärnamets, P. et al. Biasing moral decisions by exploiting the dynamics of eye gaze. Proc. Natl. Acad. Sci. USA 112, 4170–4175 (2015).
5. Adhanom, I. B., MacNeilage, P. & Folmer, E. Eye tracking in virtual reality: a broad review of applications and challenges. Virtual Real. 27, 1481–1505 (2023).
6. Song, J.-H., van de Groep, J., Kim, S. J. & Brongersma, M. L. Non-local metasurfaces for spectrally decoupled wavefront manipulation and eye tracking. Nat. Nanotechnol. 16, 1224–1230 (2021).
7. Novák, J. S. et al. Eye tracking, usability, and user experience: a systematic review. Int. J. Hum.-Comput. Interact. 40, 4484–4500 (2024).
8. Clark, R. et al. The potential and value of objective eye tracking in the ophthalmology clinic. Eye 33, 1200–1202 (2019).
9. Fu, H. L. et al. Influence of cues on the safety hazard recognition of construction workers during safety training: evidence from an eye-tracking experiment. J. Civ. Eng. Educ. 150, 1 (2024).
10. Xu, J. W. et al. Left gaze bias between LHT and RHT: a recommendation strategy to mitigate human errors in left- and right-hand driving. IEEE Trans. Intell. Veh. 8, 4406–4417 (2023).
11. Homayounfar, S. Z. et al. Multimodal smart eyewear for longitudinal eye movement tracking. Matter 3, 1275–1293 (2020).
12. Robinson, D. A. A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Trans. Bio-Med. Electron. 10, 137–145 (1963).
13. Houben, M. M. J., Goumans, J. & van der Steen, J. Recording three-dimensional eye movements: scleral search coils versus video oculography. Investig. Ophthalmol. Vis. Sci. 47, 179–187 (2006).
14. Ebisawa, Y. & Fukumoto, K. Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras. IEEE Trans. Biomed. Eng. 60, 2952–2960 (2013).
15. Chi, J.-N. et al. Key techniques of eye gaze tracking based on pupil corneal reflection. In 2009 WRI Global Congress on Intelligent Systems 133–138 (2009).
16. Frey, M., Nau, M. & Doeller, C. F. Magnetic resonance-based eye tracking using deep neural networks. Nat. Neurosci. 24, 1772–1779 (2021).
17. Shi, Y. et al. Eye tracking and eye expression decoding based on transparent, flexible and ultra-persistent electrostatic interface. Nat. Commun. 14, 3315 (2023).
18. Villanueva, A. & Cabeza, R. A novel gaze estimation system with one calibration point. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 38, 1123–1138 (2008).
19. Ponz, V. et al. Topography-based detection of the iris centre using multiple-resolution images. In 2011 Irish Machine Vision and Image Processing Conference 32–37 (2011).
20. Valenti, R., Sebe, N. & Gevers, T. Combining head pose and eye location information for gaze estimation. IEEE Trans. Image Process. 21, 802–815 (2012).
21. Park, S., Spurr, A. & Hilliges, O. Deep pictorial gaze estimation. In Computer Vision – ECCV 2018 741–757 (Cham, 2018).
22. Martinikorena, I. et al. Fast and robust ellipse detection algorithm for head-mounted eye tracking systems. Mach. Vis. Appl. 29, 845–860 (2018).
23. Wu, Z. et al. EyeNet: a multi-task deep network for off-axis eye gaze estimation. In 2019 IEEE/CVF International Conference on Computer Vision Workshop 3683–3687 (2019).
24. Morimoto, C. H. et al. Pupil detection and tracking using multiple light sources. Image Vis. Comput. 18, 331–335 (2000).
25. Coutinho, F. L. & Morimoto, C. H. Free head motion eye gaze tracking using a single camera and multiple light sources. In 19th Brazilian Symposium on Computer Graphics and Image Processing 171 (Manaus, Brazil, 2006).
26. Mestre, C., Gautier, J. & Pujol, J. Robust eye tracking based on multiple corneal reflections for clinical applications. J. Biomed. Opt. 23, 1–9 (2018).
27. Beymer, D. & Flickner, M. Eye gaze tracking using an active stereo head. In 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition II-451 (Madison, WI, USA, 2003).
28. Guestrin, E. D. & Eizenman, M. General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Trans. Biomed. Eng. 53, 1124–1133 (2006).
29. Wang, J. et al. Accurate eye tracking from dense 3D surface reconstructions using single-shot deflectometry. Nat. Commun. 16, 2902 (2025).
30. Tonsen, M. et al. Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications 139–142 (Charleston, South Carolina, 2016).
31. Zhu, Z. & Ji, Q. Robust real-time eye detection and tracking under variable lighting conditions and various face orientations. Comput. Vis. Image Underst. 98, 124–154 (2005).
32. Yao, G. et al. Snowflake-inspired and blink-driven flexible piezoelectric contact lenses for effective corneal injury repair. Nat. Commun. 14, 3604 (2023).
33. Zhou, C. et al. Modulus-adjustable and mechanically adaptive dry microneedle electrodes for personalized electrophysiological recording. npj Flex. Electron. 9, 77 (2025).
34. Yao, G. et al. A programmable and skin temperature-activated electromechanical synergistic dressing for effective wound healing. Sci. Adv. 8, eabl8379 (2022).
35. Yao, G. et al. Smart contact lenses: catalysts for science fiction becoming reality. Innovation 5, 100710 (2024).
36. Massin, L. et al. Multipurpose bio-monitored integrated circuit in a contact lens eye-tracker. Sensors 22, 595 (2022).
37. Zhu, H. et al. Frequency-encoded eye tracking smart contact lens for human–machine interaction. Nat. Commun. 15, 3588 (2024).
38. Gan, X. et al. Closed-eye intraocular pressure and eye movement monitoring via a stretchable bimodal contact lens. Microsyst. Nanoeng. 11, 83 (2025).
39. Fradkin, I. M. et al. Contact lens with moiré labels for precise eye tracking. In Proc. SPIE 13414, Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) VI 134141W (San Francisco, California, United States, 2025).
40. Khaldi, A. et al. A laser emitting contact lens for eye tracking. Sci. Rep. 10, 14804 (2020).
41. Othéguy, M., Nourrit, V. & de Bougrenet de la Tocnaye, J.-L. Instrumented contact lens to detect gaze movements independently of eye blinks. Transl. Vis. Sci. Technol. 13, 12 (2024).
42. Mokatren, M., Kuflik, T. & Shimshoni, I. 3D gaze estimation using RGB-IR cameras. Sensors 23, 381 (2023).
43. Fuhl, W. et al. ElSe: ellipse selection for robust pupil detection in real-world environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications 123–130 (Charleston, South Carolina, 2016).
44. Fuhl, W. et al. ExCuSe: robust pupil detection in real-world scenarios. In Computer Analysis of Images and Patterns 39–51 (Cham, 2015).
45. Santini, T., Fuhl, W. & Kasneci, E. PuRe: robust pupil detection for real-time pervasive eye tracking. Comput. Vis. Image Underst. 170, 40–50 (2018).
46. Santini, T., Fuhl, W. & Kasneci, E. PuReST: robust pupil tracking for real-time pervasive eye tracking. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications 1–5 (Warsaw, Poland, 2018).
47. Świrski, L., Bulling, A. & Dodgson, N. Robust real-time pupil tracking in highly off-axis images. In Proceedings of the Symposium on Eye Tracking Research and Applications 173–176 (Santa Barbara, California, 2012).
48. Zandi, B. et al. PupilEXT: flexible open-source platform for high-resolution pupillometry in vision research. Front. Neurosci. 15, 676220 (2021).
49. Timm, F. & Barth, E. Accurate eye centre localisation by means of gradients. In International Conference on Computer Vision Theory and Applications 125–130 (2011).
50. Poletti, M., Rucci, M. & Carrasco, M. Selective attention within the foveola. Nat. Neurosci. 20, 1413–1417 (2017).
51. Zen, C. & Jen-Bin, H. A vision-based method for the circle pose determination with a direct geometric interpretation. IEEE Trans. Robot. Autom. 15, 1135–1140 (1999).
52. Martina, P. An eye for detail: eye movements and attention at the foveal scale. Vis. Res. 211, 108277 (2023).
53. Zhu, H. Spatial-chromatic encoding cosmetic contact lenses for enhanced natural eye tracking: feature recognition for EMFE cosmetic contact lens. Zenodo, 10.5281/zenodo.18254391 (2026).
54. zhuoyi0904. Standing Man. https://skfb.ly/oQnnK.
