Target tracking in 3D ultrasound volumes by direct visual servoing

Improving ultrasound intensity-based visual servoing: Tracking and positioning tasks with 2D and bi-plane probes

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011

Ultrasound imaging is real-time and non-invasive, so it can readily be used in minimally invasive surgery or needle-insertion procedures to visualize an organ or a tumor to be reached. However, manually stabilizing the ultrasound image while the organ moves with the patient's breathing or heartbeat can be very difficult. In this paper, we present an intensity-based approach to control both the in-plane and out-of-plane motions of an ultrasound probe held by a robotic arm in order to reach and follow one organ cross-section. Two methods are proposed to improve the accuracy of this intensity-based approach: estimating on-line the 3D image gradient required in the control law, and using a bi-plane sensor. Robotic experiments performed with two different ultrasound sensors on a realistic abdominal phantom validate this visual servoing approach.
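The intensity-based scheme described above follows the classical visual-servoing law v = -λ L⁺ (s - s*), where the features s are the raw pixel intensities and the interaction matrix L is built from the image gradient. A minimal numerical sketch, restricted to in-plane translation only (a simplification of the paper's full law, with made-up image data):

```python
import numpy as np

def intensity_servo_velocity(img, img_des, lam=0.5):
    """One step of intensity-based visual servoing (in-plane sketch).

    Features are the raw pixel intensities; each row of the
    interaction matrix L comes from the image gradient, using
    dI/dt = -grad(I) . v for in-plane probe translation (x, y only,
    a simplification of the full 6-DOF formulation).
    """
    gy, gx = np.gradient(img.astype(float))          # image gradient
    err = (img - img_des).astype(float).ravel()      # s - s*
    L = np.column_stack([-gx.ravel(), -gy.ravel()])  # interaction matrix
    v = -lam * np.linalg.pinv(L) @ err               # v = -lam * L^+ (s - s*)
    return v
```

With a synthetic bright blob displaced along x relative to the desired image, the computed velocity is dominated by its x component, as expected for a purely lateral registration error.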

Visual servoing for robot-assisted diagnostic ultrasound

2000

A robot system for positioning an ultrasound probe has been developed. Ultrasound visual servoing can be used to help ultrasound technicians position the probe. The feasibility of visual servoing for one-dimensional motion in the plane of the ultrasound probe is addressed here. Real-time tracking of the carotid artery over a long period has been demonstrated. Two image-processing methods have been evaluated: the Star algorithm and a modified discrete-snake algorithm.

Image-guided control of a robot for medical ultrasound

IEEE Transactions on Robotics and Automation, 2002

A robot-assisted system for medical diagnostic ultrasound has been developed by the authors. This paper presents the visual servo controller used in this system. While the ultrasound transducer is positioned by a robot, the operator, the robot controller, and an ultrasound image processor share control over its motion. Ultrasound image features selected by the operator are recognized and tracked by a variety of techniques. Based on this feature tracking, ultrasound image servoing in three axes has been incorporated in the interface and can be enabled to automatically compensate, through robot motions, for unwanted motions in the plane of the ultrasound beam. The accuracy of the system is illustrated through a 3-D reconstruction of an ultrasound phantom. An Internet-based robot-assisted teleultrasound system has also been demonstrated.

Towards ultrasound image-based visual servoing

Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), 2006

Robotized ultrasound examination offers, on one hand, more accuracy for medical diagnosis and, on the other hand, suitable assistance to radiologists. In the literature, very little research deals with ultrasound image-based control of a medical robot holding the ultrasound probe. This paper presents a new visual servoing approach for automatic guidance of the ultrasound probe. The technique is based on modeling the coupling between the ultrasound probe and egg-shaped objects. The robotic task consists in reaching a desired image from an arbitrary initial position, the aim being to assist the radiologist's diagnosis by automatically moving the ultrasound probe to find the ultrasound view of a tumour cross-section recorded at an earlier date. Simulation results of the proposed ultrasound vision-based control scheme are presented.

Guidance of an ultrasound probe by visual servoing

Advanced Robotics, 2006

A new visual servoing technique based on two-dimensional (2-D) ultrasound (US) images is proposed to control the motion of a US probe held by a medical robot. Unlike a standard camera, which provides a projection of the three-dimensional (3-D) scene onto a 2-D image, US information lies strictly in the observation plane of the probe, so visual servoing techniques have to be adapted. In this paper, the coupling between the US probe and a motionless crossed-string phantom used for probe calibration is modeled. A robotic task is then developed that consists of positioning the US image on the intersection point of the crossed-string phantom while moving the probe to different orientations. The goal of this task is to optimize the spatial-parameter calibration procedure for 3-D US systems.

Automatic guidance of an ultrasound probe by visual servoing based on B-mode image moments

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, 2008

We propose a new visual servo approach to automatically control, in real time, the full motion of a 2D ultrasound (US) probe held by a medical robot in order to reach a desired image of a motionless soft-tissue object in B-mode ultrasound imaging. Combinations of image moments of the observed object cross-section are used as feedback in the visual control scheme. These visual features are extracted in real time from the US image by a fast image-segmentation method. Simulations performed with a static US volume containing an egg-shaped object, and ex-vivo experiments using a robotized US probe interacting with a motionless rabbit heart immersed in water, show the validity of this new approach and its robustness to different perturbations. The method shows promise for a variety of US-guided medical interventions that require real-time servoing.
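The moment-based features mentioned above are combinations of image moments of the segmented cross-section. A minimal sketch of such features (area, centroid, and orientation from zeroth-, first-, and second-order moments; this particular feature choice is illustrative, not the paper's exact set):

```python
import numpy as np

def moment_features(mask):
    """Visual features from image moments of a segmented cross-section.

    Returns (area, xg, yg, alpha): the region area, centroid, and main
    orientation, computed from low-order image moments of a binary
    mask (an illustrative feature set for moment-based US servoing).
    """
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))                  # zeroth moment = area (pixels)
    xg, yg = xs.mean(), ys.mean()         # centroid from first moments
    mu20 = ((xs - xg) ** 2).mean()        # normalized central second moments
    mu02 = ((ys - yg) ** 2).mean()
    mu11 = ((xs - xg) * (ys - yg)).mean()
    alpha = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # main orientation
    return m00, xg, yg, alpha
```

A stack of such scalar features, together with the interaction matrix relating their variation to the probe velocity, forms the feedback vector of the control scheme.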

Automatic calibration of a robotized 3D ultrasound imaging system by visual servoing

Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), 2006

Three-dimensional free-hand ultrasound imaging consists of capturing a set of ultrasound images with a 2D ultrasound system, together with their respective locations, in order to position them in a 3D reference frame. Usually the clinician performs the acquisition manually, using an optical or magnetic localization system attached to the ultrasound probe. To assist the clinician, we propose using a robotic system to automatically move the ultrasound probe and measure its position. As in manual 3D ultrasound imaging, precise knowledge of the spatial calibration parameters of the ultrasound system is crucial for accurate 3D imaging. We therefore propose to automate the spatial calibration procedure. A robotic task is developed to automatically position the ultrasound image on the intersection point of a cross-wire phantom used for spatial calibration. To perform this task, a new visual servoing technique based on 2D ultrasound images is used to automatically control the motion of the ultrasound probe held by a medical robot.
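At each control step, the calibration task above reduces to driving the detected cross-wire intersection point to a fixed target location in the US image. A minimal proportional sketch (the pixel-to-metre scale and gain are made-up values, and the intersection detection itself is assumed available):

```python
import numpy as np

def centering_velocity(point_px, target_px, m_per_px=3e-4, lam=0.8):
    """In-plane probe velocity (vx, vy) driving the detected cross-wire
    intersection point toward a target pixel location.

    Simple proportional law v = -lam * e on the metric image error;
    m_per_px is an assumed pixel-to-metre scale factor.
    """
    e = (np.asarray(point_px, float) - np.asarray(target_px, float)) * m_per_px
    return -lam * e
```

Iterating this law moves the probe until the intersection point sits at the target pixel, at which point the commanded velocity vanishes.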

Modeling and 3D local estimation for in-plane and out-of-plane motion guidance by 2D ultrasound-based visual servoing

2009 IEEE International Conference on Robotics and Automation, 2009

This paper presents a new model-free visual servoing scheme able to servo a robotized 2D ultrasound probe interacting with a soft-tissue object. It makes direct use of the B-mode ultrasound images in order to reach a desired one. The approach requires neither the 3D model of the object nor its location in 3D space. The visual features are based on image moments, and the exact analytical form of the interaction matrix relating the image-moment variation to the probe velocity is modeled. To perform model-free servoing, the approach combines the image point coordinates with the probe pose to efficiently estimate the 3D parameters required in the control law. The approach is validated with simulation and experimental results showing its robustness to different errors and perturbations.

3D ultrasound registration-based visual servoing for neurosurgical navigation

International Journal of Computer Assisted Radiology and Surgery, 2017

We present a fully image-based visual servoing framework for neurosurgical navigation and needle guidance. The proposed servo-control scheme compensates for target-anatomy movements, maintains high navigational accuracy over time, and automatically aligns a needle guide for accurate manual insertions. Method: Our system comprises a motorized 3D ultrasound (US) transducer mounted on a robotic arm and equipped with a needle guide. It continuously registers US sweeps in real time with a pre-interventional plan based on CT or MR images and annotations. While a visual control law maintains anatomy visibility and alignment of the needle guide, a force controller is employed for acoustic coupling and tissue pressure. We validate the servoing capabilities of our method on a geometric gel phantom and real human anatomy, and the needle-targeting accuracy using CT images on a lumbar-spine gel phantom under neurosurgery conditions. Results: Despite the varying resolution of the acquired 3D sweeps, we achieved direction-independent positioning errors of 0.35 ± 0.19 mm and 0.61° ± 0.45°, respectively. Our method is capable of compensating for movements of around 25 mm/s and works reliably on human anatomy with errors of 1.45 ± 0.78 mm. In all four manual insertions by an expert surgeon, the needle was successfully inserted into the facet joint, with an estimated targeting accuracy of 1.33 ± 0.33 mm.