Augmented reality haptic (ARH): an approach of electromagnetic tracking in minimally invasive surgery

Augmented reality and haptic interfaces for robot-assisted surgery

The International Journal of Medical Robotics and Computer Assisted Surgery, 2011

An architecture for a human-robot navigation system, based on ultra-wideband positioning, twofold ultrasonic sensors for heading, and an augmented reality smart-glasses interface, is presented. The position is obtained by a trilateration algorithm based on an Extended Kalman Filter, and the heading by fusing the ultra-wideband position with the phase difference measured by the ultrasonic system. The phase difference received at the ultrasonic sensor is extracted using the three-parameter sine fitting algorithm. In this application in the CERN tunnel of the Large Hadron Collider, the inspection robot precedes the human during navigation in the harsh environment and collects temperature, oxygen percentage, and radiation level. The environmental measurements are displayed to the operator by the smart glasses, and in case of a dangerous condition the operator is warned by the augmented reality interface. The navigation and monitoring system allows the relative human-robot position to be maintained safely. Preliminary simulation results of the positioning and heading system are discussed to validate the main idea.
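
The three-parameter sine fit used for the phase extraction can be sketched as follows: with the excitation frequency known, the received burst is modeled as y(t) = A·sin(ωt) + B·cos(ωt) + C, solved by linear least squares, and the phase recovered as atan2(B, A); the phase difference between two sensors is then the difference of their fitted phases. The sampling rate, frequency, and signal values below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def three_param_sine_fit(t, y, omega):
    """Least-squares fit of y ~ A*sin(omega*t) + B*cos(omega*t) + C for a
    known angular frequency omega; returns (amplitude, phase, offset)."""
    M = np.column_stack([np.sin(omega * t), np.cos(omega * t), np.ones_like(t)])
    (A, B, C), *_ = np.linalg.lstsq(M, y, rcond=None)
    return np.hypot(A, B), np.arctan2(B, A), C  # y = amp*sin(omega*t + phase) + C

# Synthetic 40 kHz ultrasonic burst sampled at 1 MHz (illustrative values)
fs, f0 = 1e6, 40e3
t = np.arange(200) / fs
y = 2.0 * np.sin(2 * np.pi * f0 * t + 0.7) + 0.5

amp, phase, offset = three_param_sine_fit(t, y, 2 * np.pi * f0)
```

Because ω is known, the fit is linear in its three parameters, which keeps the per-burst phase extraction cheap enough for real-time heading estimation.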

Haptically enhanced VR surgical training system

Current Directions in Biomedical Engineering

This paper proposes a cost-effective VR surgical training system that computes haptic feedback forces when a VR surgical tool interacts with virtual tissues. A 3-degrees-of-freedom (DoF) reverse linear Delta mechanism is used to render the computed force feedback, which is received by the fingertip of the operator. Additionally, the moving plate allows rendering of surface properties and lateral forces arising from a tumor with different stiffness parameters below the skin surface. Controllers are designed and implemented to regulate the haptic feedback device's end-effector position and applied force. The virtual surgical instruments are controlled by a 7-DoF serial link manipulator that captures the operator's movement using various sensors. The controllers regulating forces as well as positions are evaluated with the proposed haptic feedback device. The mean RMSE of the force and the mean error of the angular displacement are 0.0707 N and 1.95°, respectively.
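
The reported force metric is a root-mean-square error between the commanded and measured feedback force; a minimal sketch (with made-up sample values, not data from the paper):

```python
import math

def rmse(reference, measured):
    """Root-mean-square error between reference and measured samples."""
    n = len(reference)
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(reference, measured)) / n)

# Illustrative force-tracking log in newtons (hypothetical values)
f_ref = [0.50, 0.60, 0.70, 0.80]
f_meas = [0.45, 0.62, 0.68, 0.85]
force_rmse = rmse(f_ref, f_meas)
```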

Augmented reality in surgical procedures

Human Vision and Electronic Imaging XIII, 2008

Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stay time by allowing faster recovery and offers substantially improved cost-effectiveness for the hospital and society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand co-ordination is not based on direct vision, but more predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision support tools for MIT by augmenting visual and sensorial feedback. We will present tools based on novel concepts in visualization, robotics and haptics providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver tumors, laparoscopic liver surgery and minimally invasive cardiac surgery will be presented. Demonstrators were developed with the aim to provide a seamless workflow for the clinical user conducting image-guided therapy.

TrEndo, a device for tracking minimally invasive surgical instruments in training setups

Sensors and Actuators A: Physical, 2006

A novel, four-degrees-of-freedom, low-cost device for tracking minimally invasive surgery (MIS) instruments in training setups was developed. The device consists of a gimbal mechanism with three optical computer-mouse sensors. The gimbal guides the MIS instrument, while the optical sensors measure the movements of the instrument. To test the feasibility of using optical mouse sensors to track MIS instruments, the accuracy of these sensors was tested under three conditions: distance between lens and object, velocity of movements, and surface characteristics. The results of this study were used to develop a prototype: the TrEndo.
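
An optical mouse sensor reports relative motion as integer counts at a fixed counts-per-inch (CPI) resolution, so recovering instrument displacement from the sensors reduces to a scaling; the CPI value below is a hypothetical illustration, not the sensor used in the paper.

```python
MM_PER_INCH = 25.4
CPI = 800  # hypothetical counts-per-inch resolution of the mouse sensor

def counts_to_mm(counts, cpi=CPI):
    """Convert accumulated optical-sensor counts to millimetres of travel."""
    return counts * MM_PER_INCH / cpi

# One sensor axis aligned with the shaft reads insertion depth; the
# orthogonal axis reads the arc length caused by shaft rotation.
insertion_mm = counts_to_mm(1600)
```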

Laparoscopic instrument tip position estimation for visual and haptic guidance in the computer assisted surgical trainer

2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2017

A mathematical model of the 4-degrees-of-freedom forward and inverse kinematics is presented to provide visual and haptic guidance for laparoscopic surgery skills training. In addition, a particle swarm optimization technique with color marker detection is used to estimate the instrument tip position accurately. Using a simple heuristic for the visual guidance, inverse kinematics solutions are calculated in real time and are used by the haptic guidance as well. The experimental results illustrate the effectiveness of the proposed method.
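
The 4-DoF kinematics of an instrument pivoting at the trocar can be sketched as below; shaft roll is omitted because it does not move the tip, and the angle conventions (yaw about the vertical axis, pitch about the lateral axis) are assumptions for illustration rather than the paper's exact model.

```python
import math

def tool_tip_position(yaw, pitch, insertion):
    """Forward kinematics: tip position (x, y, z) relative to the trocar
    pivot for given yaw/pitch angles (rad) and insertion depth (mm)."""
    x = insertion * math.cos(pitch) * math.sin(yaw)
    y = insertion * math.sin(pitch)
    z = insertion * math.cos(pitch) * math.cos(yaw)
    return x, y, z

def tool_joint_angles(x, y, z):
    """Inverse kinematics: recover (yaw, pitch, insertion) from a tip position."""
    insertion = math.sqrt(x * x + y * y + z * z)
    pitch = math.asin(y / insertion)
    yaw = math.atan2(x, z)
    return yaw, pitch, insertion
```

Round-tripping a pose through both functions recovers the joint values, which is the consistency property real-time guidance relies on.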

Method for estimating dynamic EM tracking accuracy of surgical navigation tools

Medical Imaging 2006: Visualization, Image-Guided Procedures, and Display, 2006

Optical tracking systems have been used for several years in image-guided medical procedures. Vendors often state static accuracies of a single retro-reflective sphere or LED. Expensive coordinate measurement machines (CMMs) are used to validate the positional accuracy over the specified working volume. Users, however, are interested in the dynamic accuracy of their tools. The configuration of individual sensors into a unique tool, the calibration of the tool tip, and the motion of the tool contribute additional errors. Electromagnetic (EM) tracking systems are considered an enabling technology for many image-guided procedures because they are not limited by line-of-sight restrictions, take up minimal space in the operating room, and use sensors that can be very small. It is often difficult to quantify the accuracy of EM trackers because they can be affected by field distortion from certain metal objects, and many high-accuracy measurement devices can themselves affect the EM measurements being validated. EM tracker accuracy also tends to vary over the working volume and with the orientation of the sensors. We present several simple methods for estimating the dynamic accuracy of EM-tracked tools. We discuss the characteristics of the EM tracker used in the GE Healthcare family of surgical navigation systems. Results for other tracking systems are included.
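
One simple dynamic-accuracy estimate of the kind described is a fixed-separation check: two sensors rigidly mounted a known distance apart are swept through the working volume, and any deviation of the measured inter-sensor distance from the known value bounds the combined dynamic error. The sample trajectory below is illustrative, not measured data.

```python
import math

def separation_error(p1_samples, p2_samples, true_distance):
    """Mean and worst-case deviation of the measured inter-sensor distance
    from the known mounting distance, over a motion sequence."""
    errs = [abs(math.dist(p1, p2) - true_distance)
            for p1, p2 in zip(p1_samples, p2_samples)]
    return sum(errs) / len(errs), max(errs)

# Illustrative sweep (mm): sensors mounted exactly 50 mm apart; the middle
# sample contains a simulated 0.3 mm dynamic tracking error
s1 = [(0, 0, 0), (10, 0, 0), (20, 5, 0)]
s2 = [(50, 0, 0), (60.3, 0, 0), (70, 5, 0)]
mean_err, worst_err = separation_error(s1, s2, 50.0)
```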

The WHaSP: A Wireless Hands-Free Surgical Pointer for Minimally Invasive Surgery

IEEE/ASME Transactions on Mechatronics, 2000

To address the challenges of surgical instruction during minimally invasive surgery (MIS), a wireless hands-free pointer system has been developed. The Wireless Hands-free Surgical Pointer (WHaSP) system incorporates infrared and inertial tracking technologies to address the need for hands-free pointing during MIS. The combination of these technologies allows for optimal movement of the pointer and excellent accuracy while the user is located at a realistic distance from the surgical monitor. Several experimental evaluations were performed to optimize the sensor settings and to validate the system against a commercially available hands-free pointing system. The results show improved performance with the proposed technology as measured by the total trajectory travelled by the pointer and the smoothness of the curve. The technology presented has the potential to significantly improve surgical instruction and guidance during MIS.
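
The evaluation metrics above can be sketched as a path-length sum plus a straightness ratio used as a smoothness proxy (a ratio of 1 means a perfectly direct pointer path); the exact smoothness definition in the paper may differ, and the trajectories below are made up for illustration.

```python
import math

def path_length(points):
    """Total trajectory travelled: sum of successive point-to-point distances."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def straightness(points):
    """Smoothness proxy: straight-line distance over travelled distance (<= 1)."""
    return math.dist(points[0], points[-1]) / path_length(points)

# Illustrative on-screen pointer trajectories in pixels
direct = [(0, 0), (50, 0), (100, 0)]
jittery = [(0, 0), (50, 30), (100, 0)]
```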

Preliminary Design and Evaluation of an Interfacing Mechanism for Maneuvering Virtual Minimally Invasive Surgical Instruments

2022 International Symposium on Medical Robotics (ISMR)

Augmenting the motion of virtual surgical instruments onto a minimally invasive surgical field acts as a visual cue for the operating surgeon. In this work we propose an interfacing mechanism to provide input for maneuvering such virtual surgical instruments. Specifically, an interface in the form of a 3D-printed dodecahedron pen with attached binary squared planar markers is employed. The proposed tracking mechanism computes the pose of the interface from a real-time video feed acquired from a camera. The system provides accurate pose estimation with mean errors of 0.27 ± 0.06 mm in translation and 0.37 ± 0.04 degrees in rotation. The object pose estimation takes ∼6 ms. Using an Azure Kinect camera at 30 FPS and 1280 × 720 video resolution, the tracking speed of the proposed system is ∼25 FPS. The easy-to-integrate, cost-effective setup makes the interfacing mechanism particularly suitable for remote surgical tele-mentoring applications.
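
The pose computation can be illustrated by its core rigid-transform step; the sketch below assumes matched 3D point pairs (e.g. the known corner geometry of the dodecahedron's markers and their observed positions) and uses the Kabsch/Horn least-squares method, a simplification of the paper's full planar-marker pipeline.

```python
import numpy as np

def rigid_pose(model_pts, observed_pts):
    """Least-squares rigid transform (R, t) with observed ~ R @ model + t,
    via the Kabsch/Horn SVD method. Inputs are matched N x 3 point arrays."""
    cm, co = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - cm).T @ (observed_pts - co)     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = co - R @ cm
    return R, t

# Synthetic check: rotate a small model by 0.5 rad about z and translate it
a = 0.5
Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
observed = model @ Rz.T + np.array([2.0, -1.0, 0.5])
R_est, t_est = rigid_pose(model, observed)
```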