J Med Internet Res. 2019 May 3;21(5):e11925. doi: 10.2196/11925

Table 3.

Summary of included studies evaluating other devices.

[53]
Device: Camera with a complementary metal-oxide-semiconductor (CMOS) sensor.
Aim: To propose an architecture for a real-time multimodal system that provides a touchless user interface in surgery.
Type of study: Prototype user testing.
Intervention: Gesture detection in computer-assisted surgery.
Results/Conclusions: The preliminary results showed good usability and rapid learning. The average time to click anywhere on the screen was less than 5 seconds. Lighting conditions affected the performance of the system. The surgeon showed strong interest in the system and gave a satisfactory assessment of the use of gestures within the operating room.

[82]
Device: Webcam.
Aim: To describe a vision-based system that can interpret gestures in real time to manipulate objects within a medical data visualization environment.
Type of study: Prototype user testing.
Intervention: Manipulation of medical data (radiology images and selection of medical records) and movement of objects and windows on the screen.
Results/Conclusions: The system, implemented in a sterile environment, demonstrated performance rates between 95% and 100%.

[27]
Device: Canon VC-C4 color camera.
Aim: To describe a vision-based gesture capture system that interprets gestures in real time to manipulate medical images.
Type of study: Beta testing during a surgical procedure; experiment.
Intervention: A beta test of a system prototype was conducted during a live brain biopsy operation, in which neurosurgeons were able to browse through MRIᵃ images of the patient’s brain using the sterile hand gesture interface.
Results/Conclusions: Gesture recognition accuracy was 96%. With each repetition of the trials, task completion time decreased by 28%, and the learning curve leveled off at the 10th attempt. The gestures were learned very quickly, and there was a significant decrease in the number of excess gestures. Rotation accuracy was reasonable. The surgeons rated the system as easy to use, with a rapid response, and useful in the surgical environment.

[26]
Device: Canon VC-C4 camera.
Aim: To evaluate the Gestix system.
Type of study: Prototype user testing.
Intervention: Manipulation of MRI images during a neurosurgical biopsy.
Results/Conclusions: The system setup time was 20 minutes. The surgeons found the Gestix system easy to use, with a rapid response, and easy to learn. The system does not require the use of wearable devices.

[59]
Device: Interaction with gestures in general.
Aim: Fieldwork focusing on work practices and interactions in an angiography suite and on understanding collaborative work practices in terms of image production and use.
Type of study: Ethnographic study of minimally invasive image-guided procedures within an interventional radiology department.
Intervention: Manipulation of radiological images.
Results/Conclusions: The paper discusses the implications of the findings for touchless interaction technologies in this work environment and suggests that these will be important when considering new input techniques in other medical settings.

[115]
Device: Commercial video camera.
Aim: To describe the development of Gestonurse, a robotic system for handling surgical instruments.
Type of study: Proof of concept.
Intervention: Surgical instrumentation using a robot.
Results/Conclusions: 95% of gestures were recognized correctly. The system was only 0.83 seconds slower than a human instrument handler.

[65]
Device: Touchless interaction systems in general.
Aim: To understand and use common practices in the surgical setting, from a proxemics point of view, to uncover implications for the design of touchless interaction systems; that is, to think about touchlessness in terms of its spatial properties and what spatial separation implies for the introduction of touchless control of medical images.
Type of study: Ethnographic study.
Intervention: Field observations of work practices in neurosurgery.
Results/Conclusions: The findings suggest alternative solutions, such as the use of multiple cameras. Such reflections and considerations can be revealed through careful analysis of the spatial organization of activity and the proxemics of particular interaction mechanisms. However, it is important to study current practice before speculating about new systems, because those systems may in turn alter practice.

[122]
Device: Webcam.
Aim: To present a system for tracking the movement of MISᵇ instruments based on an orthogonal webcam system installed in a physical simulator.
Type of study: Experiment.
Intervention: Recording the movements of the instrument within an imaginary cube.
Results/Conclusions: The results showed a resolution of 0.616 mm on each working axis, linearity and repeatability in motion tracking, and automatic detection of the 3D position of the tip of the surgical instruments with sufficient accuracy. The system is a low-cost and portable alternative to traditional instrument tracking devices.

[52]
Device: MK, the LMCᶜ, the Myo armband, and voice control.
Aim: To evaluate the feasibility of using 3 different gesture control sensors (MK, the LMC, and the Myo armband) to interact in a sterile manner with preoperative data and with the functionalities of an integrated operating room during MIS.
Type of study: Pilot user study.
Intervention: 2 hepatectomies and 2 partial nephrectomies on an experimental porcine model.
Results/Conclusions: Natural user interfaces are feasible for interacting directly, in a more intuitive and sterile manner, with preoperative images and integrated operating room functionalities during MIS. The combination of the Myo armband and voice commands provided the most intuitive and accurate natural user interface.

[58]
Device: The Myo armband and the LMC.
Aim: To analyze the value of 2 gesture input modalities (the Myo armband and the LMC) versus 2 clinically established methods (task delegation and joystick control).
Type of study: Comparative user study.
Intervention: Simulation of a diagnostic neuroradiological vascular treatment, with 2 frequently used interaction tasks, in an experimental operating room.
Results/Conclusions: Novel input modalities have the potential to carry out single tasks more efficiently than clinically established methods.

ᵃMRI: magnetic resonance imaging.

ᵇMIS: minimally invasive surgery.

ᶜLMC: Leap Motion Controller.