Augmented reality haptic (ARH): an approach of electromagnetic tracking in minimally invasive surgery

Augmented reality in surgical procedures

Human Vision and Electronic Imaging XIII, 2008

Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stay time by allowing faster recovery and offers substantially improved cost-effectiveness for the hospital and society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand coordination is not based on direct vision but predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision-support tools for MIT by augmenting visual and sensorial feedback. We present tools based on novel concepts in visualization, robotics and haptics, providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver tumors, laparoscopic liver surgery and minimally invasive cardiac surgery are presented. The demonstrators were developed with the aim of providing a seamless workflow for the clinical user conducting image-guided therapy.

Augmented reality and haptic interfaces for robot-assisted surgery

The International Journal of Medical Robotics and Computer Assisted Surgery, 2011

An architecture for a human-robot navigation system, based on ultra-wideband positioning, twofold ultrasonic sensors for heading, and an augmented reality smart-glasses interface, is presented. The position is obtained by a trilateration algorithm based on an Extended Kalman Filter, and the heading by fusing the ultra-wideband position with the phase difference measured by the ultrasonic system. The phase difference received at the ultrasonic sensor is extracted using the three-parameter sine fitting algorithm. For this application in the CERN tunnel of the Large Hadron Collider, the inspection robot precedes the human during navigation in the harsh environment and collects temperature, oxygen percentage, and radiation level. The environmental measurements are displayed to the operator by the smart-glasses, and in case of a dangerous condition the operator is warned through the augmented reality interface. The navigation and monitoring system allows the relative human-robot position to be maintained safely. Preliminary simulation results of the positioning and heading system are discussed to validate the main idea.
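
The heading estimate described above relies on a known-frequency (three-parameter) sine fit to recover the signal phase at each ultrasonic receiver, after which the phase difference between the two receivers is taken. The Python sketch below illustrates that step under assumed values: the 40 kHz tone, sampling rate, noise level, and all variable names are illustrative, not taken from the paper.

```python
import numpy as np

def three_param_sine_fit(t, y, omega):
    """Least-squares fit of y ~ A*cos(omega*t) + B*sin(omega*t) + C
    for a known angular frequency omega (IEEE 1057 three-parameter fit).
    Returns amplitude, phase (rad), and DC offset."""
    D = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
    (A, B, C), *_ = np.linalg.lstsq(D, y, rcond=None)
    amplitude = np.hypot(A, B)
    phase = np.arctan2(A, B)   # so that y = amplitude*sin(omega*t + phase) + C
    return amplitude, phase, C

# Hypothetical usage: phase difference between two ultrasonic receivers
# sampled at fs, both listening to the same 40 kHz tone.
fs, f0 = 1e6, 40e3
t = np.arange(1024) / fs
omega = 2 * np.pi * f0
y1 = 0.8 * np.sin(omega * t + 0.10) + 0.01 * np.random.randn(t.size)
y2 = 0.8 * np.sin(omega * t + 0.35) + 0.01 * np.random.randn(t.size)
_, ph1, _ = three_param_sine_fit(t, y1, omega)
_, ph2, _ = three_param_sine_fit(t, y2, omega)
dphi = np.angle(np.exp(1j * (ph2 - ph1)))   # wrap to (-pi, pi]
print(f"estimated phase difference: {dphi:.3f} rad")
```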

Haptically enhanced VR surgical training system

Current Directions in Biomedical Engineering

This paper proposes a cost-effective VR surgical training system which computes haptic feedback forces when a VR surgical tool interacts with virtual tissues. A 3-degrees-of-freedom (DoF) reverse linear Delta mechanism is used to render the computed force feedback, which is delivered to the operator's fingertip. Additionally, the moving plate allows rendering of surface properties and of lateral forces arising from a tumor with different stiffness parameters below the skin surface. Controllers are designed and implemented to regulate the haptic feedback device's end-effector position and applied force. The virtual surgical instruments are controlled by a 7-DoF serial link manipulator which captures the operator's movement using various sensors. The controllers regulating force and position are evaluated with the proposed haptic feedback device. The mean RMSE of the force and the mean error of the angular displacement are 0.0707 N and 1.95°, respectively.
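
The abstract mentions controllers that regulate the end-effector's applied force. As a rough illustration of what a force-regulation loop of this kind can look like, here is a minimal PI controller sketch in Python; the gains, the 1 kHz loop rate, and the first-order contact model are assumptions for the example, not the controllers evaluated in the paper.

```python
import numpy as np

class PIForceController:
    """Minimal sketch of a PI loop regulating the contact force applied by a
    haptic end-effector. Gains, limits, and the contact model are illustrative
    assumptions, not the controller reported in the paper."""
    def __init__(self, kp=2.0, ki=15.0, dt=1e-3, u_max=5.0):
        self.kp, self.ki, self.dt, self.u_max = kp, ki, dt, u_max
        self.integral = 0.0

    def update(self, force_ref, force_measured):
        error = force_ref - force_measured
        self.integral += error * self.dt
        u = self.kp * error + self.ki * self.integral
        return float(np.clip(u, -self.u_max, self.u_max))  # actuator command (N)

# Illustrative closed loop against an assumed first-order fingertip-contact model.
ctrl = PIForceController()
force, dt, tau = 0.0, 1e-3, 0.05          # 50 ms contact time constant (assumed)
for _ in range(2000):                      # 2 s of a simulated 1 kHz haptic loop
    u = ctrl.update(force_ref=1.0, force_measured=force)
    force += (u - force) * dt / tau        # first-order lag toward the command
print(f"final contact force: {force:.3f} N (target 1.0 N)")
```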

Evaluation of Electromagnetic Tracking for Stereoscopic Augmented Reality Laparoscopic Visualization

Lecture Notes in Computer Science, 2014

Without the requirement of line-of-sight, electromagnetic (EM) tracking is increasingly studied and used in clinical applications. We designed experiments to evaluate a commercial EM tracking system in three situations: using the EM sensor by itself; fixing the sensor onto the handle of a stereoscopic (i.e., 3D) laparoscope; and placing the sensor on the outside surface of the head of a laparoscopic ultrasound (LUS) transducer. The 3D laparoscope and the LUS transducer are core elements in our stereoscopic laparoscopic augmented reality visualization system, which overlays the real-time LUS image on the real-time 3D laparoscopic video for minimally invasive laparoscopic surgery. Jitter error and static and dynamic positional accuracy were assessed with the use of LEGO basic bricks and building plates. The results show that the EM tracking system under test yields satisfactory accuracy and that attaching the sensor at the planned positions on the probes is feasible.
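
As a concrete reading of the reported metrics, the sketch below shows one common way to compute jitter (RMS spread of repeated samples of a stationary sensor) and static positional accuracy against known grid positions, such as those given by the 8 mm stud pitch of a LEGO building plate. The data, grid layout, and noise levels are synthetic; the paper's exact protocol may differ.

```python
import numpy as np

def jitter_rms(samples):
    """Jitter: RMS deviation of repeated position samples of a fixed sensor
    from their mean (a common definition; the paper's metric may differ)."""
    samples = np.asarray(samples)                 # shape (N, 3), in mm
    centered = samples - samples.mean(axis=0)
    return np.sqrt(np.mean(np.sum(centered**2, axis=1)))

def static_accuracy(measured, ground_truth):
    """Static positional accuracy: distance errors between measured positions
    and known reference positions (e.g., LEGO plate stud locations)."""
    errors = np.linalg.norm(np.asarray(measured) - np.asarray(ground_truth), axis=1)
    return errors.mean(), errors.max()

# Hypothetical data: sensor placed at 25 grid points on an 8 mm-pitch plate.
rng = np.random.default_rng(0)
gx, gy = np.meshgrid(np.arange(5) * 8.0, np.arange(5) * 8.0)
truth = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(25)])
measured = truth + rng.normal(scale=0.4, size=truth.shape)
print("mean/max static error (mm):", static_accuracy(measured, truth))
print("jitter at one point (mm):",
      jitter_rms(truth[0] + rng.normal(scale=0.1, size=(200, 3))))
```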

Design of a haptic system for medical image manipulation using augmented reality

Health and Technology, 2019

This work discusses the development of an augmented reality (AR) interface that allows users to manipulate 3D anatomical structures (segmented and reconstructed from real medical studies) through haptic interactions by means of a five-degrees-of-freedom hand exoskeleton. The exoskeleton allows the hand to move throughout its nominal range of motion. Our interface allows users to manipulate virtual objects more naturally thanks to its three points of contact, unlike most commercial haptic systems, which have only one point of contact. The preliminary results indicated proper 3D segmentation and reconstruction of the anatomical structures of interest, as well as adequate interaction between the haptic exoskeleton and the AR interface.
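
For the segmentation-and-reconstruction step mentioned above, a generic pipeline is intensity-based segmentation of the medical volume followed by marching-cubes surface extraction to obtain a mesh for the AR scene. The Python sketch below (using scikit-image, an assumed dependency) shows such a pipeline on a synthetic volume; it is not the authors' processing chain.

```python
import numpy as np
from skimage import measure  # scikit-image; an assumed dependency for this sketch

def reconstruct_surface(volume, threshold, spacing=(1.0, 1.0, 1.0)):
    """Threshold-based segmentation of a CT/MRI volume followed by
    marching-cubes surface reconstruction. Only a generic sketch of the
    segment-and-reconstruct step; not the paper's pipeline."""
    mask = volume > threshold                        # crude intensity segmentation
    verts, faces, normals, _ = measure.marching_cubes(
        mask.astype(np.float32), level=0.5, spacing=spacing)
    return verts, faces, normals                     # triangle mesh for the AR scene

# Hypothetical volume: a sphere standing in for a segmented anatomical structure.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)
verts, faces, _ = reconstruct_surface(volume, threshold=0.5, spacing=(0.8, 0.8, 0.8))
print(f"{len(verts)} vertices, {len(faces)} triangles")
```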

Augmented Reality in Minimally Invasive Surgery

Lecture Notes in Electrical Engineering, 2010

The advantages of Minimally Invasive Surgery are evident for the patients, but these techniques have some limitations for the surgeons. In medicine, Augmented Reality (AR) technology gives surgeons a sort of "X-ray" vision of the patient's body and can help them during surgical procedures. In this paper we present two applications of Augmented Reality that could be used as support for more accurate preoperative surgical planning and for image-guided surgery. The first AR application supports the surgeon during needle insertion for radiofrequency ablation of liver tumours, guiding the needle towards a precise placement of the instrument within the lesion. The augmented visualization helps to avoid, as far as possible, destroying healthy liver cells. The second AR application supports the surgeon in preoperative surgical planning by visualizing 3D models of the organs built from the patient's medical images and by helping to choose the best insertion points for the trocars in the patient's body.
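
Overlaying a planned target (e.g., the tumour to be ablated) onto the surgical view ultimately reduces to projecting tracked 3D points into the calibrated camera image. The sketch below shows a generic pinhole projection of this kind; the intrinsics, pose, and target coordinates are made-up values, and the authors' registration pipeline is not reproduced here.

```python
import numpy as np

def project_to_image(points_world, T_cam_world, K):
    """Project 3D points (e.g., a planned needle target in the patient/CT frame)
    into the camera image for an AR overlay. T_cam_world and K are assumed to
    come from tracking and camera calibration; generic pinhole model only."""
    pts = np.asarray(points_world, dtype=float)
    homog = np.column_stack([pts, np.ones(len(pts))])        # N x 4 homogeneous
    cam = (T_cam_world @ homog.T).T[:, :3]                   # into the camera frame
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]                            # pixel coordinates

# Hypothetical calibration and tracking data (all values illustrative).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
T_cam_world = np.eye(4)
T_cam_world[2, 3] = -100.0            # camera pose from tracking (assumed)
target_mm = [[10.0, -5.0, 250.0]]     # planned tumour target in the world frame
print(project_to_image(target_mm, T_cam_world, K))
```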

Surgical navigation systems based on augmented reality technologies

ArXiv, 2021

This study considers modern surgical navigation systems based on augmented reality technologies. Augmented reality glasses are used to construct holograms of the patient's organs from MRI and CT data, which are subsequently transmitted to the glasses. Thus, in addition to seeing the actual patient, the surgeon gains visualization inside the patient's body (bones, soft tissues, blood vessels, etc.). The solutions developed at Peter the Great St. Petersburg Polytechnic University make it possible to reduce the invasiveness of the procedure and preserve healthy tissues. This also improves the navigation process, making it easier to estimate the location and size of the tumor to be removed. We describe the application of the developed systems to different types of surgical operations (removal of a malignant brain tumor, removal of a cyst of the cervical spine). We consider the specifics of novel navigation systems designed for anesthesia and for endoscopic operations. Furthermore, we discuss the construct...