Touchless interaction in surgery (original)

Interactional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery

Computer Supported Cooperative Work (CSCW), 2014

While surgical practices are increasingly reliant on a range of digital imaging technologies, the ability of clinicians to interact with and manipulate these digital representations in the operating theatre using traditional touch-based interaction devices is constrained by the need to maintain sterility. To overcome these concerns with sterility, a number of researchers have been developing ways of enabling interaction in the operating theatre using touchless interaction techniques, such as gesture and voice, that give clinicians control of these systems. While there have been important technical strides in the area, there has been little work on understanding the use of these touchless systems in practice. With this in mind, we present a touchless system developed for use during vascular surgery. We deployed the system in the endovascular suite of a large hospital for use in the context of real procedures. We present findings from a study of the system in use, focusing on how, with touchless interaction, the visual resources were embedded and made meaningful in the collaborative practices of surgery. In particular, we discuss the importance of direct and dynamic control of the images by the clinicians in the context of talk and of other artefact use, as well as the work performed by members of the clinical team to make themselves sensable by the system. We discuss the broader implications of these findings for how we think about the design, evaluation and use of these systems.

Approach for Intuitive and Touchless Interaction in the Operating Room

J, 2019

The consultation of medical images, 2D or 3D, plays a crucial role in planned or ongoing surgical operations. During an intervention, this consultation causes a loss of sterility for the surgeon because the classical interaction devices are non-sterile. A solution to this problem would be to replace conventional devices with touchless interaction technologies, thereby enabling sterile interventions. In this paper, we present the conceptual development of an intuitive “gesture vocabulary” allowing the implementation of an effective touchless interactive system that is well adapted to the specificities of the surgical context. Our methodology, its implementation and our results are detailed. The suggested methodology and its implementation were both shown to be a valid approach to integrating this means of interaction in the operating room.
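To illustrate the general idea of a gesture vocabulary decoupling recognition from application commands, here is a minimal sketch; the gesture labels and command names are hypothetical and not taken from the paper.

```python
# Illustrative gesture vocabulary (gesture and command names are hypothetical):
# recognition output on the left, image-viewer command on the right. Keeping
# the mapping in one table lets the vocabulary be adapted to the surgical
# context without touching the recognizer itself.
GESTURE_VOCABULARY = {
    "swipe_left":  "previous_image",
    "swipe_right": "next_image",
    "spread":      "zoom_in",
    "pinch":       "zoom_out",
    "fist_hold":   "lock_view",    # guard against accidental activation
    "open_palm":   "unlock_view",
}

def command_for(gesture: str) -> str | None:
    """Translate a recognized gesture label into a viewer command, if bound."""
    return GESTURE_VOCABULARY.get(gesture)

print(command_for("swipe_right"))  # next_image
print(command_for("wave"))         # None: unbound gestures are ignored
```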

Gesture-Based Interaction Using Touchless Technology for Medical Application

2015

In this paper, we present a touchless human-machine interaction system intended for a science museum for children in the short term and for surgeons in the operating room in the long term. Surgeons need to review medical images during minimally invasive surgery without interacting directly with the medical devices. Touchless technology is an interesting solution despite its many challenges, such as low accuracy, poor lighting conditions, occlusion and real-time computing. To demonstrate such an application, an educational game called Anatomia was developed using the Kinect to interact with a 3D model of the human body. Tests and surveys were carried out with children, adults and surgeons to evaluate the touchless technology and improve its ergonomics.
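The abstract does not describe the interaction mapping, but a minimal sketch of what a Kinect-style viewer might do is shown below: hand displacement, as reported by a skeletal tracker in normalized screen coordinates, is converted into yaw/pitch rotation of the 3D model. The tracker, sensitivity value and coordinate convention are assumptions, not details from the paper.

```python
import math

# Minimal sketch (not the paper's implementation): map the displacement of a
# tracked hand, given in normalized coordinates (0..1), to yaw/pitch rotation
# of a 3D anatomy model. A Kinect-style skeletal tracker is assumed to supply
# the hand position each frame.
SENSITIVITY = math.radians(90)  # sweeping the hand across the full frame = 90 degrees

def hand_to_rotation(hand, reference):
    """hand, reference: (x, y) tuples in normalized coordinates.
    Returns (yaw, pitch) in radians to apply to the model this frame."""
    yaw = (hand[0] - reference[0]) * SENSITIVITY
    pitch = (hand[1] - reference[1]) * SENSITIVITY
    return yaw, pitch

# Example: hand captured at frame centre, then moved 20% of the frame width to the right.
print(hand_to_rotation((0.7, 0.5), (0.5, 0.5)))  # ~ (0.314, 0.0) radians
```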

Touchless computer interfaces in hospitals: A review

Health Informatics Journal, 2018

The widespread use of technology in hospitals and the difficulty of sterilising computer controls have increased opportunities for the spread of pathogens. This has led to interest in touchless user interfaces for computer systems. We present a review of touchless interaction with computer equipment in the hospital environment, based on a systematic search of the literature. Sterility provides an implied theme and motivation for the field as a whole, but other advantages, such as hands-busy settings, are also proposed. Overcoming hardware restrictions has been a major theme, but in recent research, technical difficulties have receded. Image navigation is the most frequently considered task and the operating room the most frequently considered environment. Gestures have been implemented for input, system and content control. Most of the studies found have small sample sizes and focus on feasibility, acceptability or gesture-recognition accuracy. We conclude this article with an agenda…

Evaluation of user‐interfaces for controlling movements of virtual minimally invasive surgical instruments

The International Journal of Medical Robotics and Computer Assisted Surgery

Background: Recent tele-mentoring technologies for minimally invasive surgery (MIS) augment the operative field with movements of virtual surgical instruments as visual cues. The objective of this work is to assess different user interfaces that effectively transfer the mentor's hand gestures to the movements of virtual surgical instruments. Methods: A user study was conducted to assess three user-interface devices (Oculus Rift, SpaceMouse, Touch haptic device) under various scenarios. The devices were integrated with an MIS tele-mentoring framework for control of both manual and robotic virtual surgical instruments. Results: The user study revealed that the Oculus Rift is preferred during robotic scenarios, whereas the Touch haptic device is more suitable during manual scenarios for tele-mentoring. Conclusion: A user-interface device in the form of a stylus controlled by the fingers for pointing in 3D space is more suitable for manual MIS, whereas a user interface that can be moved and oriented easily in 3D space by wrist motion is more suitable for robotic MIS. Keywords: minimally invasive surgery, surgical simulations, tele-mentoring, user-interfaces, virtual surgical instruments
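The study's framework is not detailed in the abstract; as a hedged sketch of the kind of mapping such interfaces perform, the code below scales the translation of a tracked device relative to a clutch anchor and passes its orientation through to the virtual instrument. The scaling factor and interfaces are illustrative assumptions.

```python
import numpy as np

# Illustrative only (not the study's framework): transfer a tracked device
# pose to a virtual surgical instrument. Translation is scaled relative to
# the pose captured when the clutch was engaged; orientation is passed through.
MOTION_SCALE = 0.3  # assumed: 10 cm of hand motion -> 3 cm at the instrument

def device_to_instrument(device_pos, device_rot, anchor_pos):
    """device_pos, anchor_pos: length-3 positions in metres.
    device_rot: 3x3 rotation matrix of the device.
    Returns the scaled instrument position and its orientation."""
    device_pos = np.asarray(device_pos, dtype=float)
    anchor_pos = np.asarray(anchor_pos, dtype=float)
    instrument_pos = anchor_pos + (device_pos - anchor_pos) * MOTION_SCALE
    return instrument_pos, np.asarray(device_rot, dtype=float)

# Example: device moved 10 cm along x from where the clutch was engaged.
pos, rot = device_to_instrument([0.10, 0.0, 0.0], np.eye(3), [0.0, 0.0, 0.0])
print(pos)  # [0.03 0.   0.  ]
```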

A Non-Contact Mouse for Surgeon-Computer Interaction

2003

We have developed a system that uses computer vision to replace standard computer mouse functions with hand gestures. The system is designed to enable non-contact human-computer interaction (HCI), so that surgeons will be able to make more effective use of computers during surgery. In this paper, we begin by discussing the need for non-contact computer interfaces in the operating room…
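The paper's own vision pipeline is not reproduced here; as a hedged illustration of the general idea (camera-based hand tracking standing in for mouse movement), the sketch below segments a hand by colour with OpenCV and maps its centroid to screen coordinates. The HSV thresholds, area cutoff and screen resolution are placeholder values, and moving the OS cursor is left as a stub.

```python
import cv2
import numpy as np

# Illustrative sketch: colour-threshold hand segmentation driving a cursor
# position. Not the paper's method; thresholds and sizes are placeholders.
LOWER_HSV = np.array([0, 40, 60])     # placeholder skin-tone range
UPPER_HSV = np.array([25, 180, 255])
SCREEN_W, SCREEN_H = 1920, 1080       # assumed display resolution

def hand_centroid_to_cursor(frame):
    """Return an (x, y) cursor position from the largest skin-coloured blob,
    or None if no hand-sized region is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 2000:  # ignore small noise blobs
        return None
    m = cv2.moments(hand)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    h, w = frame.shape[:2]
    return int(cx / w * SCREEN_W), int(cy / h * SCREEN_H)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    pos = hand_centroid_to_cursor(frame)
    if pos is not None:
        pass  # e.g. move the OS cursor here
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) == 27:          # Esc to quit
        break
cap.release()
```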

Implementation and evaluation of a gesture-based input method in robotic surgery

2011

The introduction of robotic master-slave systems for minimally invasive surgery has created new opportunities for assisting surgeons with partially or fully autonomous functions. While autonomy is an ongoing field of research, the question of how the growing number of offered features can be triggered in a time-saving manner at the master console has not been well investigated. We have implemented a gesture-based user interface in which the haptic input devices commonly used to control the surgical instruments are also used to trigger actions. Intuitive and customizable gestures are learned by the system once, linked to a certain command, and recalled during the operation when the surgeon presents the gesture. Experimental user studies with 24 participants were conducted to evaluate the efficiency, accuracy and user experience of this input method compared with a traditional menu. The results show the potential of gesture-based input, especially in terms of time savings and enhanced user experience.
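The recognizer itself is not described in the abstract; as a hedged sketch of the general "learn a gesture once, bind it to a command, recall it later" pattern, the code below stores one resampled input-device trajectory per command and recalls the nearest template at run time. Class names, the template length and the distance threshold are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative template matcher for "learn once, recall later" gesture input.
# Each gesture is a trajectory of 3-D input-device positions; templates are
# resampled to a fixed length and compared by mean point-to-point distance.
N_POINTS = 32

def resample(path, n=N_POINTS):
    """Resample a (k, 3) trajectory to n points spaced evenly along arc length."""
    path = np.asarray(path, dtype=float)
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    dist = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, dist[-1], n)
    return np.column_stack(
        [np.interp(targets, dist, path[:, i]) for i in range(3)]
    )

class GestureCommands:
    def __init__(self, threshold=0.05):   # metres; illustrative value
        self.templates = {}                # command name -> stored template
        self.threshold = threshold

    def learn(self, command, trajectory):
        """Record one demonstration of a gesture and bind it to a command."""
        self.templates[command] = resample(trajectory)

    def recall(self, trajectory):
        """Return the command of the closest template, or None if none is close enough."""
        probe = resample(trajectory)
        best, best_d = None, float("inf")
        for command, template in self.templates.items():
            d = np.mean(np.linalg.norm(probe - template, axis=1))
            if d < best_d:
                best, best_d = command, d
        return best if best_d < self.threshold else None

# Example: learn a short rightward stroke, then recall a similar one.
gc = GestureCommands()
gc.learn("switch_camera", [[0, 0, 0], [0.05, 0, 0], [0.10, 0, 0]])
print(gc.recall([[0, 0, 0], [0.06, 0.005, 0], [0.11, 0.01, 0]]))  # switch_camera
```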