A Comparative Study for Telerobotic Surgery Using Free Hand Gestures

Implementation and evaluation of a gesture-based input method in robotic surgery

2011

The introduction of robotic master-slave systems for minimally invasive surgery has created new opportunities for assisting surgeons with partially or fully autonomous functions. While autonomy is an ongoing field of research, the question of how the growing number of offered features can be triggered in a time-saving manner at the master console has not been well investigated. We have implemented a gesture-based user interface in which the haptic input devices commonly used to control the surgical instruments are also used to trigger actions. Intuitive and customizable gestures are learned by the system once, linked to a certain command, and recalled during operation when the gesture is presented by the surgeon. Experimental user studies with 24 participants were conducted to evaluate the efficiency, accuracy, and user experience of this input method compared to a traditional menu. The results show the potential of gesture-based input, especially in terms of time savings and enhanced user experience.
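The paper does not disclose its recognition algorithm, but the learn-once, link-to-command, recall-on-demand scheme it describes can be illustrated with a minimal sketch: nearest-template matching of 3D input-device trajectories under dynamic time warping (DTW). The class and method names, the DTW choice, and the rejection threshold are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 3D trajectories (N x 3 and M x 3)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

class GestureCommandMap:
    """Learn each gesture once, link it to a command, recall it by nearest template."""

    def __init__(self):
        self.templates = []  # list of (trajectory, command) pairs

    def learn(self, trajectory, command):
        """Store one demonstrated gesture trajectory together with its command."""
        self.templates.append((np.asarray(trajectory, dtype=float), command))

    def recall(self, trajectory, reject_above=np.inf):
        """Return the command of the closest stored template, or None if too far."""
        traj = np.asarray(trajectory, dtype=float)
        best_cmd, best_d = None, np.inf
        for tmpl, cmd in self.templates:
            d = dtw_distance(traj, tmpl)
            if d < best_d:
                best_cmd, best_d = cmd, d
        return best_cmd if best_d <= reject_above else None
```

A `reject_above` cutoff is one plausible way to avoid firing a command when the surgeon's motion matches no learned gesture, which matters when the same handles also drive the instruments.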

Hand position effects on precision and speed in telerobotic surgery

The International Journal of Medical Robotics and Computer Assisted Surgery, 2007

Background Many surgical robotic interfaces allow users to interact with robots over a wide potential range of motion, yet variation in operator performance across a range of motion remains unexamined. This research identifies and explores a new construct, the surgeon's 'comfortable working envelope' within the available range of motion, as a factor in surgical robotic interface design.

Evaluation of contactless human–machine interface for robotic surgical training

International Journal of Computer Assisted Radiology and Surgery

Purpose Teleoperated robotic systems are nowadays routinely used for specific interventions. The benefits of robotic training courses have already been acknowledged by the community, since manipulation of such systems requires dedicated training. However, robotic surgical simulators remain expensive and require a dedicated human–machine interface. Methods We present a low-cost contactless optical sensor, the Leap Motion, as a novel control device to manipulate the RAVEN-II robot. We compare peg manipulations during a training task with a contact-based device, the electro-mechanical Sigma.7. We perform two complementary analyses to quantitatively assess the performance of each control method: a metric-based comparison and a novel unsupervised spatiotemporal trajectory clustering. Results We show that contactless control does not offer manipulability as good as that of the contact-based device. While part of the metric-based evaluation rates the mechanical control above the contactless one, the unsupervised spatiotemporal trajectory clustering of the surgical tool motions highlights a specific signature induced by each human–machine interface.
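The abstract does not specify the clustering algorithm, so as a loose, hypothetical stand-in the sketch below resamples tool-tip trajectories of differing lengths to a common number of samples and groups them by single-linkage thresholding on Euclidean distance. The function names, the resampling length, and the distance threshold are all assumptions, not the paper's method.

```python
import numpy as np

def resample(traj, k=32):
    """Resample a variable-length 3D tool-tip trajectory to k points by linear interpolation."""
    traj = np.asarray(traj, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, k)
    return np.column_stack([np.interp(t_new, t_old, traj[:, d]) for d in range(traj.shape[1])])

def cluster_trajectories(trajs, threshold):
    """Single-linkage clustering: trajectories closer than `threshold` share a label."""
    X = [resample(t) for t in trajs]
    n = len(X)
    labels = list(range(n))  # each trajectory starts in its own cluster
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(X[i] - X[j]) < threshold:
                old, new = labels[j], labels[i]
                labels = [new if lab == old else lab for lab in labels]  # merge clusters
    return labels
```

Under this kind of grouping, trajectories recorded with different interfaces would fall into different clusters if each interface really imprints a distinct motion signature, which is the qualitative claim the paper makes.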

Mapping Surgeon's Hand/Finger Motion During Conventional Microsurgery to Enhance Intuitive Surgical Robot Teleoperation

2021

Purpose: Recent developments in robotics and artificial intelligence (AI) have led to significant advances in healthcare technologies, enhancing robot-assisted minimally invasive surgery (RAMIS) in some surgical specialties. However, current human-robot interfaces lack intuitive teleoperation and cannot mimic a surgeon's hand/finger sensing and fine motion. These limitations make teleoperated robotic surgery unsuitable for microsurgery and difficult to learn for established surgeons. We report a pilot study showing an intuitive way of recording and mapping a surgeon's gross hand motion and fine synergic motion during cardiac microsurgery as a way to enhance future intuitive teleoperation. Methods: We set out to develop a prototype system able to train a Deep Neural Network (DNN) by mapping wrist, hand and surgical tool real-time data acquisition (RTDA) inputs during mock-up heart microsurgery procedures. The trained network was used to estimate the tool poses from refined hand join...

Touchless interaction in surgery

Communications of the ACM, 2014

With advances in medical imaging technologies in recent decades, we have seen their widespread adoption in the context of surgical procedures.

Ergonomic and gesture performance of robotized instruments for laparoscopic surgery

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011

Shape and mechanical structure of instruments play a large part in the lack of ergonomics during laparoscopic surgery. Providing intra-abdominal mobility and rethinking handles design are two solutions to increase comfort and precision of gestures. Based on previous work that determined the optimal intra-abdominal kinematics, this study analyses the influence of handle design on both gesture performance and ergonomics. A virtual reality laparoscopic simulator was developed to perform an experimental comparison between two novel robotized instruments and standard ones. A group of 10 surgeons and 6 researchers in robotics carried out two representative surgical tasks with each instrument. Based on instrument and arm segments tracking, a gesture performance index and an ergonomics index were computed. The study demonstrates that distal mobilities combined with improved handle design and integration increase ergonomics during laparoscopy and facilitate complex gestures.

Evaluation of user‐interfaces for controlling movements of virtual minimally invasive surgical instruments

The International Journal of Medical Robotics and Computer Assisted Surgery

Background: Recent tele-mentoring technologies for minimally invasive surgery (MIS) augment the operative field with movements of virtual surgical instruments as visual cues. The objective of this work is to assess different user interfaces that effectively transfer a mentor's hand gestures to the movements of virtual surgical instruments. Methods: A user study was conducted to assess three different user-interface devices (Oculus Rift, SpaceMouse, Touch Haptic device) under various scenarios. The devices were integrated with an MIS tele-mentoring framework for control of both manual and robotic virtual surgical instruments. Results: The user study revealed that the Oculus Rift is preferred during robotic scenarios, whereas the Touch haptic device is more suitable during manual scenarios for tele-mentoring. Conclusion: A user-interface device in the form of a stylus controlled by the fingers for pointing in 3D space is more suitable for manual MIS, whereas a user interface that can be moved and oriented easily in 3D space by wrist motion is more suitable for robotic MIS.

KEYWORDS: minimally invasive surgery, surgical simulations, tele-mentoring, user-interfaces, virtual surgical instruments

1 | INTRODUCTION

Tele-medicine is playing an ever-increasing role in clinical practice with the aim to provide clinical healthcare from a distance. 1,2 It entails the use of software/hardware technologies to share clinical information and edit its content in real time. One aspect of tele-medicine, when applied to the surgical context, includes tele-mentoring and tele-collaboration during a surgery. 3-5 Augmented reality based enabling technologies have been developed to facilitate tele-mentoring between an operating and a remote surgeon during a minimally invasive surgery (MIS). This involves the use of user interfaces that assist the mentor (the remote surgeon) in performing screen markings 6-8 or displaying augmented hand gestures 9-11 to the mentee (the operating surgeon). More sophisticated user interfaces allow the …

Master thesis: Usability Evaluation of the Kinect in Aiding Surgeon Computer Interaction

2013

Gesture-based interaction (GBI) in the operating room can provide surgeons with a way to interact with medical images of the patient in a direct and sterile way, as opposed to instructing an assistant. The purpose of this study was to determine whether a modern gesture-based interface using the Kinect is considered desirable and feasible during surgical procedures. To this end, we conducted a usability evaluation of a gesture-controlled medical image viewer with surgeons in a controlled operating room environment in the UMCG Skills Lab. Participants indicated that they would like to incorporate the system in the OR, even though performance measures indicated that the tested system is less accurate and slower than asking an assistant. In a second study, we evaluated two popular gesture-based selection techniques ('Dwell' and 'Push') because of the importance of selection time and accuracy in surgical tasks. Results from this experiment indicated that the tested techniques were moderately less accurate, while selection time was higher for gesture-based selection than for the mouse condition. Overall, our studies show that GBI in the operating room is promising, but possible refinements include additional functionality, improved recognition accuracy, more precise cursor control and more 'natural' gestures for certain functionality.

Comparative Assessment of a Novel Optical Human-Machine Interface for Laparoscopic Telesurgery

Lecture Notes in Computer Science, 2014

This paper introduces a novel type of human-machine interface for laparoscopic telesurgery that employs an optical sensor. A Raven-II laparoscopic robot (Applied Dexterity Inc) was teleoperated using two different human-machine interfaces, namely the Sigma 7 electromechanical device (Force Dimension Sarl) and the Leap Motion (Leap Motion Inc) infrared stereoscopic camera. Based on this hardware platform, a comparative study of both systems was performed through objective and subjective metrics, which were obtained from a population of 10 subjects. The participants were asked to perform a peg transfer task and to answer a questionnaire. The obtained results confirm that fine tracking of the hand can be performed with the Leap Motion sensor. Such tracking comprises accurate finger motion acquisition to control the robot's laparoscopic instrument jaws. Furthermore, the observed performance of the optical interface proved to be comparable to that of traditional electro-mechanical devices, such as the Sigma 7, during adequate execution of highly dexterous laparoscopic gestures.