Abstract
Background
Inasmuch as the conventional mouse is not an ideal input device for digital pathology, the aim of this study was to evaluate alternative systems with the goal of identifying a natural user interface (NUI) for controlling whole slide images (WSI).
Design
Four pathologists evaluated three webcam-based, head-tracking mouse emulators: Enable Viacam (eViacam, CREA Software), Nouse (JLG Health Solutions Inc), and Camera Mouse (CM Solutions Inc). Twenty dermatopathology WSI cases were randomly selected and examined using the Image Viewer software (Ventana, AZ, USA). The NASA-TLX was used to rate the perceived workload of using these systems, and the time taken was recorded. In addition, a satisfaction survey was used.
Results
The mean total time needed for diagnosis with Camera Mouse, eViacam, and Nouse was 18′57″, 19′37″ and 22′32″, respectively (57, 59 and 68 seconds per case, respectively). The NASA-TLX workload score, where lower scores are better, was 42.1 for eViacam, 53.3 for Nouse and 60.62 for Camera Mouse. This correlated with the pathologists' degree of satisfaction on a scale of 1–5: 3.4 for eViacam, 3 for Nouse, and 2 for Camera Mouse (p < 0.05).
Conclusions
Head-tracking systems enable pathologists to control the computer cursor and virtual slides without their hands using only a webcam as an input device.
- Of the three software solutions examined, eViacam performed best in this study, followed by Nouse and, finally, Camera Mouse.
- Further studies integrating other systems should be performed in conjunction with software developments to identify the ideal device for digital pathology.
Keywords: Digital Pathology, Natural User Interface, Ergonomics, Mouse, Input Device
Introduction
Only recently have pathologists started to implement digital pathology and the use of whole slide image (WSI) files, and the proper input device for a digital environment has not yet been established, since the traditional mouse may not be optimal as a navigation tool for examining WSIs.1 In addition, reading WSIs is perceived to take considerably more time than conventional microscopy.2, 3 From the perspective of ergonomics and time optimization, few groups4, 5, 6, 7, 8 have shown interest in investigating alternatives to the conventional mouse, such as touchpads or multitouch screens,4 a vertical mouse, trackballs, a touchless device (LeapMotion™),5 Rollermouse™, a videogame controller,6 a microscope-stage mimicker (the so-called Ergopointer™), a 6 degrees-of-freedom navigator7 and even speech recognition systems,8 among others.
Human–computer interaction (HCI) studies the way humans interact with computers using the senses of sight (graphic interface and video camera), hearing (speakers or headphones and microphone) and touch (input devices). The effector systems are mainly the hand and the fingers on the keyboard and the mouse. Voice and speech recognition systems,9, 10 as well as eye,11 face12 or body movement detection13 can also play a part, as they are considered natural user interfaces (NUI).
Gaze is also considered a natural mode of input. It is quite easy to focus on items simply by looking at them.14 It is so intuitive that it requires very little or no training at all.15 Gaze-based systems are also very fast as pointing devices: assuming the targets are large enough, they are faster than, though generally not as accurate as, the widely used conventional mouse.16 This concept has been considered since at least the 1960s: if astronauts could control their maneuvering units with their eyes, then their hands could do a better job of controlling other parts of the spacecraft. However, controlling computers using gaze was first achieved in the 1980s. Moving one's eyes or even one's head is natural, requires little conscious effort and frees the hands for other tasks. Given that any on-screen object reached by the gaze can potentially be activated (the so-called Midas touch effect), the dwell time (the time a user's eye rests on an interface element), configurable in the settings of all these software systems, helps to avoid this issue. It can also be solved by other methods: using the mouth as a clicking button, voluntary blinking, a pedal, or any other tool to activate the item previously selected by gaze.17
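To make the dwell-time idea concrete, the following minimal Python sketch shows one way such a mechanism could be implemented; the class and its parameters are hypothetical illustrations, not code from eViacam, Nouse or Camera Mouse, whose settings only expose the dwell duration itself.

```python
import time

class DwellClicker:
    """Fire a click only after the pointer has rested on the same target
    for a configurable dwell period, avoiding the 'Midas touch' effect."""

    def __init__(self, dwell_seconds=1.0):
        self.dwell = dwell_seconds
        self.current_target = None
        self.enter_time = None

    def update(self, target):
        """Call once per frame with the on-screen element under the cursor
        (or None). Returns the target to click, or None."""
        now = time.monotonic()
        if target != self.current_target:
            # The pointer moved to a different element: restart the timer.
            self.current_target = target
            self.enter_time = now
            return None
        if target is not None and now - self.enter_time >= self.dwell:
            self.enter_time = now   # re-arm so the target is not clicked repeatedly
            return target           # caller performs the actual click
        return None
```

The same structure accommodates the alternative activation methods mentioned above (a pedal, voluntary blinking or a mouth switch) simply by bypassing the timer and triggering the click directly.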
A low-cost alternative to infrared eye-tracking technologies is head-based interaction, which allows for cursor control using webcam-based software such as Nouse18 or Camera Mouse.19 These mouse emulators are much cheaper than eye-trackers, which normally cost several thousand dollars. All these technologies have been widely used in different fields. In medicine, eye-tracking analysis has been utilized in the study of neurological diseases.20 In the field of medical education and simulation, trainee performance improvement has also been investigated through the study of different gaze patterns; both radiologists and pathologists have used it to assess differences in expertise while reporting cases.21, 22 Finally, gaze is widely used as an interaction tool, improving communication for patients with physical disabilities that limit their movement.
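As a rough illustration of how such webcam-based emulators work, the sketch below maps the centre of a detected face to the screen cursor using OpenCV and pyautogui. These libraries are assumptions chosen for brevity; the evaluated products track individual facial features and apply smoothing, so this is only a conceptual approximation.

```python
import cv2          # pip install opencv-python
import pyautogui    # pip install pyautogui

# Haar-cascade face detector bundled with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror so head motion matches cursor direction
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # Map the face centre from camera coordinates to screen coordinates.
        cx = (x + w / 2) / frame.shape[1]
        cy = (y + h / 2) / frame.shape[0]
        pyautogui.moveTo(cx * screen_w, cy * screen_h)
    cv2.imshow("head-tracking preview", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```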
Design
With the aim of identifying a natural user interface for controlling WSIs, we decided to compare 3 free head-tracking, webcam-based mouse emulators. The following three systems were included in this comparison: Enable Viacam (eViacam, CREA Software), Nouse (JLG Health Solutions Inc), and Camera Mouse (CM Solutions Inc). The webcam used was a mid-range Logitech C170 and the display was a 30-inch Barco Coronis Fusion 4-MP. The tests were performed on an HP desktop running the Windows 7 operating system (64-bit) with 8 GB RAM. One of the authors of this study, a medical student from the International Federation of Medical Students' Associations (IFMSA) exchange program, reviewed the performance of the 3 software solutions, and a video tutorial was recorded (available at https://youtu.be/Muwy4moM2Ng). For this comparison, 20 WSIs were randomly selected from a general pool of 60 digitized dermatopathology cases of average difficulty, including inflammatory, melanocytic and non-melanocytic neoplastic lesions. They were scanned at 20× magnification using the iScan scanner (Ventana, AZ, USA), and the files were presented to four consultant histopathologists (37, 52, 57, and 58 years of age) using the Image Viewer software (Ventana, AZ, USA). The NASA Task Load Index (TLX), a well-known and widely used instrument, was employed to rate the perceived workload of using these systems to reach a diagnosis, and the time taken was recorded. NASA-TLX is a multidimensional rating procedure that provides an overall workload score based on a weighted average of ratings on six subscales: mental demands, physical demands, temporal demands, own performance, effort and frustration.23 In addition, a 5-point Likert scale satisfaction survey was used, which included the following components: comfort, adaptation time, cursor movement, goal achievement, prolonged use and general satisfaction. Statistical analysis was conducted using SPSS V24 (fig. 1).
Figure 1.
Video tutorial screenshot.
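For readers unfamiliar with the instrument, the weighted NASA-TLX score can be reproduced with a short calculation: each of the six subscales is rated from 0 to 100 and weighted by the number of times it is chosen in the 15 pairwise comparisons. The Python sketch below uses invented ratings and weights purely for illustration; it is not data from this study.

```python
# NASA-TLX: weighted average of six subscale ratings (0-100).
# The weights come from 15 pairwise comparisons, so they sum to 15.
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def nasa_tlx(ratings, weights):
    """Overall workload = sum(rating * weight) / 15."""
    assert sum(weights.values()) == 15, "pairwise weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

# Hypothetical example (not taken from this study):
ratings = {"mental": 55, "physical": 40, "temporal": 50,
           "performance": 35, "effort": 60, "frustration": 45}
weights = {"mental": 4, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 2}
print(nasa_tlx(ratings, weights))  # -> 50.0
```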
Results
Overall assessment (range 1–5) was better for eViacam (3.46), followed by Nouse (2.88), and Camera Mouse (2.33) (fig. 2). Even though adaptation time was considered poorer for Nouse than for the others, Nouse obtained a better rating for prolonged use. The mean total time needed for diagnosis with the Camera Mouse, eViacam, and Nouse systems was 18′57″, 19′37″ and 22′32″, respectively (57, 59 and 68 seconds per case, respectively). There was a significant correlation between time and perceived temporal demand (p < 0.05).
Figure 2.
Assessment questionnaire results.
The NASA-TLX weighted average workload score, where lower scores are better, was 42.1 for eViacam, 53.3 for Nouse and 60.6 for Camera Mouse (fig. 3). This correlated with the pathologists' degree of satisfaction: 3.5 for eViacam, 3 for Nouse, and 2 for Camera Mouse (p < 0.05).
Figure 3.
NASA-TLX results.
Posture and prolonged use ratings were inversely associated with physical demand (p < 0.05) and achievement was also inversely associated with effort (p < 0.05). A strong inverse correlation with performance was found for cursor movement and achievement (p < 0.01).
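The correlations above were computed in SPSS; as an illustration of the kind of analysis involved, the sketch below calculates a Spearman rank correlation between per-case diagnosis time and perceived temporal demand using SciPy. The numbers are invented for the example and are not the study data.

```python
from scipy.stats import spearmanr  # pip install scipy

# Invented values for illustration only (not data from this study).
diagnosis_time_s = [57, 59, 68, 63]   # seconds per case
temporal_demand = [45, 50, 70, 60]    # NASA-TLX temporal-demand rating (0-100)

rho, p_value = spearmanr(diagnosis_time_s, temporal_demand)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```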
Conclusion
The head-tracking systems described in this study enable pathologists to control the computer cursor and virtual slides without their hands, using only a webcam as an input device. Gaze recognition adequately emulates computer cursor movements and actions. It is easy to use, with a short learning curve. Of the three software solutions examined, eViacam seems to be the best of those evaluated in this study, followed by Nouse and, finally, Camera Mouse. The time needed to report cases was in line with average values for dermatopathology biopsies.24, 25 Nevertheless, further studies are needed to compare the time spent by pathologists reporting digital and conventional slides after a period of training with these systems, to verify whether gaze is faster than a traditional mouse, as authors in other fields have reported.26
Head-tracking can be used not only by pathologists with physical impairments but also as an input device of choice by professionals who seek to avoid musculoskeletal disorders or who have some difficulty using a conventional mouse. In addition, it is a viable option when extreme aseptic measures are required. In environments with high hygienic demands, these systems may be useful in providing interaction while requiring that nothing be touched. Keyboards, mice and, increasingly, touch screens are potential sources of infection or contamination, not only in operating rooms, but also in autopsy suites, gross rooms, and endoscopy units where rapid on-site evaluation is performed by cytopathologists or cytotechnicians and review of previous studies may be needed.27 Furthermore, recent situations such as the COVID-19 pandemic, caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), make touchless control of medical image viewers, including WSI viewers, an adequate option to limit exposure, since the virus can remain viable on surfaces.28
From a different perspective, eye or head-tracking analysis offers further possibilities. In the field of artificial intelligence, gaze provides insight into how pathologists achieve diagnosis, making it a valuable source of information when designing artificial intelligence models through machine learning algorithms and deep convolutional neural networks in order to obtain automatic diagnoses.29
The aim of this comparison was to demonstrate that alternatives to the conventional mouse are available; as part of their proactive role, pathologists are well placed to further this line of investigation. Additional studies integrating these head-tracking interactions with speech recognition systems or other input devices available on the market should be performed, in conjunction with software development, to achieve the ideal device for digital pathology from an ergonomics perspective while providing intuitive and fast interaction. Ultimately, however, the choice of input device when designing workstations (pathologists' cockpits) is a matter of personal preference.
Conflicts of Interest
The authors have no conflicts of interest to disclose.
References
- 1. Al-Janabi S., Huisman A., Vink A., Leguit R.J., Offerhaus G.J., ten Kate F.J. Whole slide images for primary diagnostics of gastrointestinal tract pathology: a feasibility study. Hum Pathol. 2012;43:702–707. doi: 10.1016/j.humpath.2011.06.017.
- 2. Al-Janabi S., Huisman A., Jonges G.N., Ten Kate F.J., Goldschmeding R., van Diest P.J. Whole slide images for primary diagnostics of urinary system pathology: a feasibility study. J Renal Inj Prev. 2014;3:91–96. doi: 10.12861/jrip.2014.26.
- 3. Alcaraz-Mateos E., Caballero-Alemán F. Musculoskeletal disorders in Spanish pathologists. Prevalence and risk factors. Rev Esp Patol. 2015;48:9–13. doi: 10.1016/j.patol.2014.10.001.
- 4. Wang Y., Williamson K.E., Kelly P.J., James J.A., Hamilton P.W. SurfaceSlide: a multitouch digital pathology platform. PLoS One. 2012;7:e30783. doi: 10.1371/journal.pone.0030783.
- 5. Alcaraz-Mateos E., Méndez Ríos S., Martínez González-Moro I., Poblet E. Electromyographic analysis of muscle activation while using different input devices in digital pathology. Abstracts: 31st European Congress of Pathology. Virchows Arch. 2019;475(Suppl 1):1–436. doi: 10.1007/s00428-019-02631-8.
- 6. Yagi Y., Yoshioka S., Kyusojin H., Onozato M., Mizutani Y., Osato K. An ultra-high speed whole slide image viewing system. Stud Health Technol Inform. 2012;179:239–49. doi: 10.3233/978-1-61499-086-4-239.
- 7. Molin J., Lundström C., Fjeld M. A comparative study of input devices for digital slide navigation. J Pathol Inform. 2015;6:7. doi: 10.4103/2153-3539.151894.
- 8. Alcaraz-Mateos E., Carceles F., Albarracin M., Hernandez R., Hernandez S., Hernandez L. Input device research for digital pathology: an ergonomic outlook. Abstracts: XXXI International Congress of the IAP and 28th Congress of the ESP. Virchows Arch. 2016;469(Suppl 1):S1–S346. doi: 10.1007/s00428-016-1997-7.
- 9. Hartman D.J. Enhancing and customizing laboratory information systems to improve/enhance pathologist workflow. Surg Pathol Clin. 2015;8:137–143. doi: 10.1016/j.path.2015.02.006.
- 10. Singh M., Pal T.R. Voice recognition technology implementation in surgical pathology: advantages and limitations. Arch Pathol Lab Med. 2011;135:1476–1481. doi: 10.5858/arpa.2010-0714-OA.
- 11. Man D.W., Wong M.S. Evaluation of computer-access solutions for students with quadriplegic athetoid cerebral palsy. Am J Occup Ther. 2007;61:355–364. doi: 10.5014/ajot.61.3.355.
- 12. Kim D.G., Lee B.S., Lim S.E., Kim D.A., Hwang S.I., Yim Y.L. The selection of the appropriate computer interface device for patients with high cervical cord injury. Ann Rehabil Med. 2013;37:443–448. doi: 10.5535/arm.2013.37.3.443.
- 13. Juanes J.A., Gómez J.J., Peguero P.D., Ruisoto P. Digital environment for movement control in surgical skill training. J Med Syst. 2016;40:133. doi: 10.1007/s10916-016-0495-4.
- 14. Stampe D.M., Reingold E.M. Selection by looking: a novel computer interface and its application to psychological research. In: Findlay J.M., Walker R., Kentridge R.W., editors. Eye movement research: mechanisms, processes and applications. Amsterdam: Elsevier Science; 1995. pp. 467–478.
- 15. Majaranta P., Räihä K.J. Text entry by gaze: utilizing eye-tracking. In: MacKenzie I.S., Tanaka-Ishii K., editors. Text entry systems: mobility, accessibility, universality. San Francisco: Morgan Kaufmann; 2007. pp. 175–187.
- 16. Sibert L.E., Jacob R.J.K. Evaluation of eye gaze interaction. CHI '00: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2000:281–288. doi: 10.1145/332040.332445.
- 17. Velichkovsky B., Rumyantsev M., Morozov M. New solution to the Midas touch problem: identification of visual commands via extraction of focal fixations. Procedia Comput Sci. 2014;39:75–82. doi: 10.1016/j.procs.2014.11.012.
- 18. Mah J., Jutai J.W., Finestone H., Mckee H., Carter M. Usability of a low-cost head tracking computer access method following stroke. Assist Technol. 2015;27:158–171. doi: 10.1080/10400435.2015.1006343.
- 19. Man D.W., Wong M.S. Evaluation of computer-access solutions for students with quadriplegic athetoid cerebral palsy. Am J Occup Ther. 2007;61:355–364. doi: 10.5014/ajot.61.3.355.
- 20. Pereira M.L., Camargo M.V., Aprahamian I., Forlenza O.V. Eye movement analysis and cognitive processing: detecting indicators of conversion to Alzheimer's disease. Neuropsychiatr Dis Treat. 2014;10:1273–1285. doi: 10.2147/NDT.S55371.
- 21. Kelly B.S., Rainford L.A., Darcy S.P., Kavanagh E.C., Toomey R.J. The development of expertise in radiology: in chest radiograph interpretation, "expert" search pattern may predate "expert" levels of diagnostic accuracy for pneumothorax identification. Radiology. 2016;280:252–260. doi: 10.1148/radiol.2016150409.
- 22. Krupinski E.A., Tillack A.A., Richter L., Henderson J.T., Bhattacharyya A.K., Scott K.M. Eye-movement study and human performance using telepathology virtual slides: implications for medical education and differences with experience. Hum Pathol. 2006;37:1543–1556. doi: 10.1016/j.humpath.2006.08.024.
- 23. Yurko Y.Y., Scerbo M.W., Prabhu A.S., Acker C.E., Stefanidis D. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool. Simul Healthc. 2010;5:267–271. doi: 10.1097/SIH.0b013e3181e3f329.
- 24. Vyas N.S., Markow M., Prieto-Granada C., Gaudi S., Turner L., Rodriguez-Waitkus P. Comparing whole slide digital images versus traditional glass slides in the detection of common microscopic features seen in dermatitis. J Pathol Inform. 2016;7:30. doi: 10.4103/2153-3539.186909.
- 25. Velez N., Jukic D., Ho J. Evaluation of 2 whole-slide imaging applications in dermatopathology. Hum Pathol. 2008;39:1341–1349. doi: 10.1016/j.humpath.2008.01.006.
- 26. Murata A., Fukunaga D. Extended Fitts' model of pointing time in eye-gaze input system: incorporating effects of target shape and movement direction into modeling. Appl Ergon. 2018;68:54–60. doi: 10.1016/j.apergo.2017.10.019.
- 27. Ebert L.C., Hatch G., Ampanozi G., Thali M.J., Ross S. You can't touch this: touch-free navigation through radiological images. Surg Innov. 2012;19:301–307. doi: 10.1177/1553350611425508.
- 28. Pambuccian S.E. The COVID-19 pandemic: implications for the cytology laboratory. J Am Soc Cytopathol. 2020. doi: 10.1016/j.jasc.2020.03.001 (Epub ahead of print).
- 29. Rashidi H.H., Tran N.K., Betts E.V., Howell L.P., Green R. Artificial intelligence and machine learning in pathology: the present landscape of supervised methods. Acad Pathol. 2019;6:2374289519873088. doi: 10.1177/2374289519873088.



