Distributed Augmented Reality With 3-D Lung Dynamics—A Planning Tool Concept

IEEE Transactions on Information Technology in Biomedicine, 2000

Augmented Reality (AR) systems add visual information to the world using advanced display techniques. Advances in miniaturization and reduced costs make some of these systems feasible for applications in a wide range of fields. We present a potential component of the cyber-infrastructure for the operating room of the future: a distributed AR-based software-hardware system that allows real-time visualization of 3D lung dynamics superimposed directly on the patient's body. Several emergency events (e.g., closed and tension pneumothorax) and surgical procedures related to the lung (e.g., lung transplantation, lung volume reduction surgery, surgical treatment of lung infections, and lung cancer surgery) could benefit from the proposed prototype.

A Visualisation of 3D Lung Anatomy with Augmented Reality as Interactive Medical Learning

Journal of Physics: Conference Series, 2019

Breathing is the movement of oxygenated (O₂) external air into the body or lungs and the exhalation of air containing carbon dioxide (CO₂), the residue of oxidation, out of the body. Several organs make up the respiratory system, such as the nose, pharynx, larynx, trachea, bronchi, and lungs. Augmented reality can be utilized as a learning medium for recognizing the anatomy of the respiratory system organs, presented in 3D with animation. In this research, the researchers used Blender software for the object modelling process. To display the 3D objects, the researchers applied image-target techniques, using markers obtained from the Sobotta Atlas of Human Anatomy and adapted to the needs of preclinical medical students. When the AR camera captures the marker, it detects the pattern contained in the marker and matches it against the data stored in the database. When the pattern is found and matches what is stored in the database, the application displays the 3D objects of the respiratory system organs with augmented reality technology. The optimal distance for marker detection is 15-35 cm, with inclination angles of 30°-90°.
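The abstract describes a detect-match-display loop for image-target markers, but does not name the toolkit used. As a minimal, hypothetical sketch of that loop, the Python/OpenCV snippet below matches ORB features from a stored marker image against each camera frame and reports when the pattern is "found"; the file name, thresholds, and the print stand-in for 3D rendering are all illustrative.

```python
import cv2

# Hypothetical marker file; the paper's markers are pages adapted from the
# Sobotta Atlas of Human Anatomy.
marker = cv2.imread("sobotta_respiratory_plate.png", cv2.IMREAD_GRAYSCALE)
assert marker is not None, "marker image not found"

orb = cv2.ORB_create(nfeatures=1000)
kp_marker, des_marker = orb.detectAndCompute(marker, None)

# A Hamming-distance brute-force matcher plays the role of the "database" lookup.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp_frame, des_frame = orb.detectAndCompute(gray, None)
    if des_frame is not None:
        matches = matcher.match(des_marker, des_frame)
        good = [m for m in matches if m.distance < 40]   # illustrative threshold
        if len(good) > 30:   # "conformity" found -> render the 3D organ model
            print(f"marker recognized ({len(good)} matches): draw respiratory organs")
    cv2.imshow("AR camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

A production image-target pipeline would additionally estimate the marker's pose via a homography so the model stays registered across the reported 15-35 cm working distance and 30°-90° inclination range.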

Development of a training tool for endotracheal intubation: Distributed Augmented Reality

Studies in health …, 2003

The authors introduce a tool referred to as the Ultimate Intubation Head (UIH) to train medical practitioners' hand-eye coordination in performing endotracheal intubation with the help of augmented reality methods. In this paper we describe the integration of a deployable UIH and present methods for augmented reality registration of real and virtual anatomical models. Assessment of the 52° field-of-view optics of the custom-designed and custom-built head-mounted display shows less than 1.5 arcmin of blur and astigmatism, the two limiting optical aberrations; distortion is less than 2.5%. Preliminary registration of a physical phantom mandible to its virtual counterpart yields less than 3 mm RMS registration error. Finally, we describe an approach to distributed visualization where a given training procedure may be visualized and shared at various remote locations; basic assessments of delays within two data-distribution scenarios were conducted and reported. In a 16-hospital study conducted by the National Emergency Airway Registry between August 1997 and October 1998, 309 complications were reported out of 2392 recorded ETIs, with 132 of these difficulties resulting from intubation procedures. Many anesthesiologists believe that the most common reason for failed intubation is the inability to visualize the vocal cords. In fact, failed intubation is one of the leading causes of anesthesia-related morbidity and mortality. Thus, there is international concern over the need to extensively train paramedics for pre-hospital emergency situations.
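The sub-3 mm RMS figure quoted above is the standard figure of merit for rigid landmark registration. The paper does not spell out its algorithm; a common least-squares solution for corresponding 3-D point sets is the Kabsch/Procrustes method, sketched below with illustrative names and synthetic data.

```python
import numpy as np

def register_rigid(src, dst):
    """Least-squares rigid alignment (Kabsch/Procrustes) of corresponding
    3-D point sets: finds R, t minimizing ||dst - (R @ src + t)|| and
    returns them with the RMS residual."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    residuals = dst - (src @ R.T + t)
    rms = np.sqrt((residuals ** 2).sum(axis=1).mean())
    return R, t, rms

# Synthetic check: rotate and translate fiducial points, then recover the pose.
rng = np.random.default_rng(1)
src = rng.random((8, 3))                          # e.g., phantom fiducials (m)
theta = np.pi / 6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
dst = src @ Rz.T + np.array([0.01, -0.02, 0.03])
R, t, rms = register_rigid(src, dst)
print(f"RMS registration error: {rms * 1000:.3f} mm")  # ~0 on noise-free data
```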

A Distributed Augmented Reality System for Medical Training and Simulation

Augmented Reality (AR) systems are the class of systems that use computers to overlay virtual information on the real world. AR environments allow the development of promising tools in several application domains. In medical training and simulation, the learning potential of AR is significantly amplified by the system's capability to present 3D medical models in real time at remote locations. Furthermore, the applicability of the simulation is broadened by the use of real-time deformable medical models.
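One minimal way to picture presenting a deformable 3D model in real time at remote locations, assuming the common design where only per-frame vertex updates cross the network (the wire format and port below are hypothetical, not the paper's protocol), is:

```python
import socket
import struct

import numpy as np

# Hypothetical wire format: little-endian frame id + flattened float32 vertices.
# Only per-frame vertex updates of the deformable model cross the network;
# each remote viewer applies them to its local copy of the mesh.
def send_mesh_update(sock, addr, frame_id, vertices):
    payload = struct.pack("<I", frame_id) + vertices.astype("<f4").tobytes()
    sock.sendto(payload, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
vertices = np.zeros((1000, 3), dtype=np.float32)   # stand-in deformable model
send_mesh_update(sock, ("127.0.0.1", 5005), 0, vertices)  # port is illustrative
sock.close()
```

A real system would chunk large meshes, add sequencing and loss handling, and likely send compressed deltas rather than full vertex sets.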

Application of augmented reality to visualizing anatomical airways

Helmet- and Head-Mounted Displays VII, 2002

Visualizing information in three dimensions provides an increased understanding of the data presented. Furthermore, the ability to manipulate or interact with data visualized in three dimensions is superior. Within the medical community, augmented reality is being used for interactive, three-dimensional (3D) visualization. This type of visualization, which enhances the real world with computer-generated information, requires a display device, a computer to generate the 3D data, and a system to track the user. In addition to these requirements, however, the hardware must be properly integrated to ensure correct visualization. To this end, we present the components of an integrated augmented reality system consisting of a novel head-mounted projective display, a Linux-based PC, and a commercially available optical tracking system. We demonstrate the system with the visualization of anatomical airways superimposed on a human patient simulator.
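The integration problem named above largely reduces to chaining coordinate frames: the optical tracker reports poses of both the viewer's head and the patient simulator, and the renderer needs the airway model (anchored to the simulator) expressed in the viewer's frame. A minimal sketch of that transform chain, with made-up poses, follows:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Pack a tracker-reported rotation and translation into a 4x4 transform."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# Made-up poses, both expressed in the optical tracker's coordinate frame.
T_tracker_head = pose_to_matrix(np.eye(3), np.array([0.0, 1.6, 0.0]))
T_tracker_simulator = pose_to_matrix(np.eye(3), np.array([0.4, 1.0, -0.6]))

# The airway model is anchored to the simulator; to superimpose it correctly
# in the head-mounted projective display it must be re-expressed in the
# viewer's (head) frame each time the tracker reports new poses.
T_head_simulator = np.linalg.inv(T_tracker_head) @ T_tracker_simulator
model_view = T_head_simulator   # handed to the renderer every frame
print(model_view)
```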

Virtual reality, mixed reality and augmented reality in surgical planning for video or robotically assisted thoracoscopic anatomic resections for treatment of lung cancer

Journal of Visualized Surgery

Virtual reality (VR), mixed reality (MR) and augmented reality (AR) are new ways of interacting with three-dimensional (3D) elements. Video-assisted thoracoscopic surgery (VATS) and robotic VATS (RVATS) anatomic pulmonary resections benefit from a 3D understanding of the patient's anatomy. These procedures demand precision and planning for improved outcomes, especially for conservative surgery such as sublobar anatomic resections. Displaying the patient's 3D data through VR, MR and AR is a useful tool for surgical planning, providing the surgeon with a true and spatially accurate representation of the patient's anatomy. An illustrative case is presented to exemplify the application of this new technology.

Real-Time Simulation and Visualization of Subject-Specific 3D Lung Dynamics

19th IEEE Symposium on Computer-Based Medical Systems (CBMS'06), 2006

In this paper we discuss a framework for modeling the 3D lung dynamics of normal and diseased human subjects and visualizing them in an Augmented Reality (AR) based environment. The framework is based on results obtained from pulmonary function tests and on lung image data of human subjects obtained from 4D High-Resolution Computed Tomography (HRCT). The components of the framework include a parameterized pressure-volume (PV) relation estimated from normal human subjects, and a physics- and physiology-based 3D deformable lung model extracted from the 4D HRCT data of normal and tumor-influenced human subjects. The parameterized PV relation allows modeling different breathing conditions of a human subject. The 3D deformable lung model allows visualizing the 3D shape changes of the lung for the breathing condition simulated by the PV relation. Additionally, the 3D lung model is deformed using a graphics processing unit (GPU) and its vertex shaders, which satisfies the real-time frame-rate requirements of the AR environment.
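The abstract does not give the functional form of the parameterized PV relation. A widely used parameterization of the quasi-static pulmonary PV curve is the four-parameter sigmoid of Venegas et al.; the sketch below uses that form with illustrative parameter values (not the paper's fits) and shows how a pressure trace could drive a normalized breathing phase for the deformable model.

```python
import numpy as np

def lung_volume(p, a=0.3, b=2.7, c=8.0, d=4.0):
    """Sigmoidal PV curve V(P) = a + b / (1 + exp(-(P - c) / d)).
    a: lower volume asymptote (L); b: volume span (L);
    c: pressure at the inflection point (cmH2O); d: steepness (cmH2O).
    Parameter values are illustrative, not fits from the paper."""
    return a + b / (1.0 + np.exp(-(p - c) / d))

# A pressure trace drives a normalized breathing phase in [0, 1]; the phase
# then selects how far the deformable lung model morphs from end-expiration
# toward end-inspiration.
pressures = np.linspace(0.0, 20.0, 5)            # cmH2O
volumes = lung_volume(pressures)
phase = (volumes - volumes.min()) / (volumes.max() - volumes.min())
for p, ph in zip(pressures, phase):
    print(f"P = {p:5.1f} cmH2O -> breathing phase {ph:.2f}")
```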

Simulating 3-D Lung Dynamics Using a Programmable Graphics Processing Unit

IEEE Transactions on Information Technology in Biomedicine, 2007

Medical simulations of lung dynamics promise to be effective tools for teaching and training clinical and surgical procedures related to the lungs. Their effectiveness may be greatly enhanced when visualized in an augmented reality (AR) environment. However, the computational requirements of AR environments limit the availability of the central processing unit (CPU) for simulating lung dynamics under different breathing conditions. In this paper, we present a method for computing lung deformations in real time by taking advantage of the programmable graphics processing unit (GPU), which frees CPU time for other AR-associated tasks such as tracking, communication, and interaction management. We start from an approach for simulating three-dimensional (3-D) lung dynamics using Green's formulation for the upright position, and extend it to other orientations as well as the subsequent changes in breathing. Specifically, the proposed extension presents a computational optimization and its implementation on a GPU. Results show that the computational requirements for simulating the deformation of a 3-D lung model are significantly reduced with point-based rendering.
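As a CPU-side reference for the per-vertex work the paper offloads to vertex shaders, the sketch below applies a displacement field scaled by breathing phase; the random field is a stand-in for what the Green's-function formulation would precompute.

```python
import numpy as np

# CPU-side reference for the per-vertex work the paper moves into GPU vertex
# shaders. Each lung-surface vertex carries a displacement vector (here a
# random stand-in for the precomputed Green's-function field), and the current
# breathing phase in [0, 1] scales that displacement.
rng = np.random.default_rng(0)
rest_vertices = rng.random((5000, 3)).astype(np.float32)    # end-expiration shape
displacements = 0.05 * rng.standard_normal((5000, 3)).astype(np.float32)

def deform(phase):
    """One multiply-add per vertex; run as a shader, this leaves the CPU free
    for tracking, communication, and interaction management."""
    return rest_vertices + phase * displacements

mid_inspiration = deform(phase=0.5)
```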