Validation System of MR Image Overlay and Other Needle Insertion Techniques

Submitted to the Proceedings for Medicine Meets Virtual Reality 15, February 2007, Long Beach, California. To appear in Studies in Health Technology and Informatics.

Validation System of MR Image Overlay and Other Needle Insertion Techniques

Gregory S. FISCHER a, Eva DYER a, Csaba CSOMA a, Anton DEGUET a, and Gabor FICHTINGER a
a Engineering Research Center, Johns Hopkins University, Baltimore, MD

Abstract. In order to develop accurate and effective augmented reality (AR) systems for MR- and CT-guided needle placement procedures, a comparative validation environment is necessary. Clinical equipment is prohibitively expensive and often inadequate for precise measurement. We have therefore developed a laboratory validation system for measuring operator performance with different assistance techniques. Electromagnetically tracked needles are registered with the preoperative plan to measure placement accuracy and the insertion path. The validation system provides an independent measure of accuracy that can be applied to assistance methods ranging from augmented reality guidance to tracked navigation systems and autonomous robots. In preliminary studies, the validation system is used to evaluate the performance of the image overlay, bi-plane laser guide, and traditional freehand techniques.

Keywords. validation system, augmented reality, image overlay, MRI, needle placement, percutaneous procedures, image-guided surgery

Introduction

A comparative validation environment is necessary for an efficacious analysis of the CT/MRI-guided assistance techniques used in needle placement procedures. Clinical equipment is prohibitively expensive and often inadequate for precise validation: measurement of placement accuracy in MRI is greatly limited by paramagnetic needle artifact and the lack of distinct small targets, and scanner time can exceed $500/hour, making statistically significant trials impractical. We have therefore developed a laboratory validation system for measuring operator performance with different assistance techniques. The validation system can be applied to assistance methods ranging from augmented reality guidance to tracked navigation systems and autonomous robots.

Preliminary accuracy assessment of our MR image overlay system has been performed, but the cost of scanner time has thwarted a large-scale study of the system's accuracy. An off-line validation system has therefore been created to study needle placement accuracy; in particular, we examine the accuracy of the image overlay and compare it to that of other insertion guidance methods. The system also provides a means to study the trajectory and hand gestures throughout the insertion procedure in addition to endpoint accuracy. Studying the hand gestures associated with each method yields information that can help minimize the number of re-insertion attempts, since each re-insertion causes significant discomfort to the patient.

This system provides a less resource-intensive and more accurate means of validating needle insertion procedures. In this paper, we describe the validation system shown in Fig. 1 and its use for comparative analysis of the virtual image overlay, the bi-plane laser guide, and unassisted freehand techniques. The image overlay displays CT/MR images and a virtual needle guide over the patient [1] and is calibrated such that the overlaid MR or CT image and virtual needle guide appear to float inside the patient at the correct size and position, as shown in Fig. 2(a,c). The bi-plane laser guide uses intersecting transverse and adjustable parasagittal laser planes to mark the trajectory of insertion [2], as shown in Fig. 2(b,d).

Figure 1. The validation environment shown with the image overlay system.

1. Validation System

Electromagnetic (EM) tracking (Aurora, Northern Digital, Waterloo, Ontario) is used to provide the position of the tip and the orientation of the shaft of an instrumented needle, as described in [3]. All components must be registered with one another in order to track the needle with respect to the preoperative plan generated on the MR/CT images. The components of the system are the Aurora EM tracker, a tracked needle, the tracked phantom, the MR/CT images used for preoperative planning, and the AR guidance system. The system is shown in Fig. 1. (A brief sketch of how a tracked needle pose reduces to a tip position and shaft axis follows the phantom description below.)

1.0.1. Phantom Design

A human cadaver lumbar spine phantom was designed to mimic the anatomy of a patient and to aid in the registration process. Lumbar vertebrae and simulated intervertebral discs, placed in proper alignment, are embedded in a layered tissue-mimicking gel (SimTest, Corbin, White City, OR) of two different densities emulating fat and muscle. The gel phantom with the lumbar spine is placed into an acrylic enclosure that was accurately laser-cut with 24 divot points spread over four sides for rigid-body registration. Stereotactic fiducial markers (MR-Spots, Beekley, Bristol, CT) were placed on the phantom in precisely positioned laser-cut slots. The markers were arranged in a Z-shaped pattern on three sides, allowing automatic registration between the anatomical images and the phantom. The acrylic enclosure was designed such that different phantoms can be placed inside it for studying other procedures.
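For illustration only (the actual needle calibration is described in [3]), the following sketch shows how a tracked 6-DOF sensor pose might be reduced to the needle tip position and shaft direction. The 4x4 pose matrix, the tool-frame tip offset, and the convention that the tool z-axis lies along the shaft are assumptions of the sketch, not measured parameters of our system.

# Minimal illustration (not the calibration procedure of [3]): reducing a tracked
# 6-DOF sensor pose to a needle tip position and shaft direction in tracker space.
# Assumptions: the pose is a 4x4 homogeneous matrix T_tracker_tool, the tool z-axis
# lies along the needle shaft, and the tip offset in the tool frame is known.
import numpy as np

def needle_tip_and_axis(T_tracker_tool, tip_offset_tool=(0.0, 0.0, 120.0)):
    """Return (tip_xyz, unit_shaft_axis) expressed in the tracker frame."""
    R = T_tracker_tool[:3, :3]                    # sensor orientation
    p = T_tracker_tool[:3, 3]                     # sensor origin (mm)
    tip = p + R @ np.asarray(tip_offset_tool)     # tip = sensor origin + rotated offset
    axis = R[:, 2] / np.linalg.norm(R[:, 2])      # tool z-axis taken as the shaft direction
    return tip, axis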

Figure 2. Image overlay (a,c) and bi-plane laser guide (b,d) AR needle placement systems with the spine phantom: MR scanner feasibility trials (a,b) and the laboratory validation system with tracked needle (c,d).

1.0.2. MR Image Registration

To register preoperatively obtained MR or CT images (and their respective preoperative plans) to the corresponding physical space, techniques similar to those described in [4] are used. The Z-frame registration uses three stereotactic fiducial markers arranged in the shape of a Z on each of the left, right, and bottom faces of the phantom (Fig. 4). Axial images are taken near the center of the phantom; the locations of the other images of the phantom are known with respect to this reference. The central image is used for registration: the nine fiducial markers are segmented by applying an adaptive threshold and morphological operations to the image, the centroid of each marker is found, and the position of each marker with respect to the DICOM image is recorded, yielding a set of nine points. From these nine points, the transformation from the scanner's image space to the phantom's coordinate system is computed. The RMS error of the image-to-phantom registration for a typical MR image was 1.26 mm.

1.0.3. Electromagnetic Tracker Registration

The NDI Aurora EM tracking system is used to localize the instrumented needle with respect to the phantom. A 6 degree-of-freedom (DOF) reference tool is fixed to the phantom, and a calibrated pointer tool is used for rigid-body registration of the phantom to the tracker. Data were obtained by pivoting the pointer in the 24 pre-defined divot points. These points were used to register the phantom coordinate system to that of the EM tracker by finding the transformation that aligns the known point locations, obtained from the mechanical design specifications, with the collected data points. The RMS error of the rigid-body registration was 0.93 mm. Fig. 3 illustrates the placement of the fiducial markers in the image and phantom spaces.
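Both registration steps above reduce to a least-squares rigid alignment of corresponding point sets followed by an RMS fiducial error check. The paper does not state the particular solver used; the sketch below shows one standard closed-form (SVD-based) solution as an illustration, with purely illustrative variable names.

# Sketch of a closed-form least-squares rigid registration between corresponding
# point sets (e.g., segmented fiducial centroids vs. their known design locations).
# This is the standard SVD (Arun/Horn) solution; treat it as one plausible
# implementation rather than the authors' actual code.
import numpy as np

def register_rigid(model_pts, measured_pts):
    """Find R, t minimizing ||R*model + t - measured||^2 and the RMS fiducial error."""
    A = np.asarray(model_pts, float)       # Nx3 known locations (design / image plan)
    B = np.asarray(measured_pts, float)    # Nx3 corresponding measured locations
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)              # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    residuals = (A @ R.T + t) - B
    rms = np.sqrt((residuals ** 2).sum(axis=1).mean())  # RMS fiducial registration error
    return R, t, rms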

Figure 3. Phantom design, showing the spine partially embedded in gel and the fiducial markers both on the phantom and in the corresponding MR image.

This registration process requires only 5-10 minutes and is necessary only when the 6-DOF reference tool is repositioned on the phantom; the rest of the registration process is automatic. Once both registration steps are complete, the instrumented needle can be tracked as it maneuvers along a planned path within the phantom. To maximize the system's accuracy, future efforts will include distortion mapping and error compensation as described in [5].

Figure 4. Frame transformations for the registration process, shown on the spine phantom and its corresponding MR images. The tracked needle is represented in the original image space where the preoperative plan was made.
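To make the frame chaining of Fig. 4 concrete, the sketch below composes the two registrations to express a tracked needle measurement in the image space of the preoperative plan. The matrix and function names are illustrative assumptions rather than the system's actual code.

# Illustration of the frame chaining in Fig. 4: a needle tip/axis measured in the
# EM tracker frame is mapped into the preoperative image space by composing the
# tracker-to-phantom and phantom-to-image registrations (each a 4x4 homogeneous
# transform obtained from the registration steps above).
import numpy as np

def to_homogeneous(R, t):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def needle_in_image_space(T_image_phantom, T_phantom_tracker, tip_tracker, axis_tracker):
    """Map a tip point and unit shaft axis from tracker space into image space."""
    T = T_image_phantom @ T_phantom_tracker           # tracker -> image
    tip_img = (T @ np.append(tip_tracker, 1.0))[:3]   # points use the full transform
    axis_img = T[:3, :3] @ axis_tracker               # directions use rotation only
    return tip_img, axis_img / np.linalg.norm(axis_img)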

2. Experimental Methods

Prior to beginning trials, numerous needle paths were created in the EasySlice planning software developed by the authors and described in [1]. The software stores the insertion and target points for each planned path as well as the insertion angle required to reach the desired target accurately. For each of the three needle insertion methods presented (image overlay, bi-plane laser guide, and freehand), subjects were randomly assigned three different paths in three different axial MR slices. The entire insertion attempt was recorded with the tracking software. The software then reports insertion-point and target-point error, both in and out of the image plane, as well as needle axis orientation error. Simple forms of gesture tracking are also provided, including distances from the planned trajectory during insertion and the number of re-insertion attempts.
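As an illustration of the reported metrics, the sketch below computes a target-point error split into in-plane and out-of-plane components of the axial slice, together with the needle axis orientation error. The decomposition convention and variable names are assumptions; the software may compute these quantities differently.

# Sketch of placement error metrics of the kind reported by the software: target
# error split into in-plane and out-of-plane components of the axial slice, and
# the angle between planned and measured needle axes. slice_normal is the unit
# normal of the axial image plane (an assumed convention).
import numpy as np

def placement_errors(target_plan, tip_meas, axis_plan, axis_meas, slice_normal):
    e = np.asarray(tip_meas, float) - np.asarray(target_plan, float)
    n = np.asarray(slice_normal, float) / np.linalg.norm(slice_normal)
    out_of_plane = abs(e @ n)                      # error along the slice normal (mm)
    in_plane = np.linalg.norm(e - (e @ n) * n)     # error within the slice (mm)
    a, b = (np.asarray(v, float) for v in (axis_plan, axis_meas))
    cosang = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cosang))      # orientation error (degrees)
    return in_plane, out_of_plane, angle_deg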

2.1. Results

To demonstrate workflow, four needle insertions were performed with each technique in a clinical MRI environment. As expected, accuracy could not be assessed due to large needle artifacts, as shown in Fig. 5(a). In the validation testbed, the measured needle trajectories were graphically overlaid on the plan and the targeting MR image, as shown in Fig. 5(b). Twenty insertions were performed with each technique, and position and orientation errors were measured. Initial analysis showed that the results correlate with the direct validation performed using fluoroscopy described in [2]. The image overlay's mean error in the image plane was 1.4 mm and 2.5° with standard deviations of 0.5 mm and 1.9°, respectively. The laser guide's mean error was 1.8 mm and 2.0° (standard deviations 1.2 mm and 1.8°), and freehand insertion produced mean errors of 2.0 mm and 5.2° (standard deviations 1.4 mm and 2.3°).

Figure 5. Typical results from an MR image (a) and a tracked path (b). Logged trajectory showing multiple corrections (c) and a direct path (d).

3. Discussion

Initial assessments of the image overlay, laser guide, and freehand needle insertions were performed with the validation system. Experiments with experienced radiologists are currently underway. Future experiments will provide independent, large-scale accuracy assessment of needle insertion procedures using commercial surgical navigation systems, image overlay, laser guidance, and traditional techniques. The goal is to quantitatively compare placement accuracy, consistency, and other important characteristics such as the needle trajectory throughout the entire placement procedure.

In typical needle placement procedures, the interventionalist will often probe the patient's anatomy until the desired target is reached. This probing can cause considerable discomfort to the patient as well as significant bruising of the area. Analysis of the needle trajectory can provide information about the number of insertion and repositioning attempts made during an intervention: Fig. 5(c) illustrates a trajectory that resulted from repeated re-insertions, and Fig. 5(d) shows an insertion with minimal repositioning. This information enables researchers to study a system's ability to minimize patient discomfort during the procedure. We intend to use gesture tracking techniques similar to those described in [6]. We also hope to deploy the tracking system in clinical trials within a CT scanner, using the gesture tracking information it provides for planning interventions. Future applications of this system may include providing real-time accuracy and position feedback to a user in vivo and evaluating the accuracy of needle placement in clinical training settings. Applications may also extend to autonomous robotic systems and other augmented reality systems.

Acknowledgements

Partial funding for this project was provided by NSF Engineering Research Center Grant #EEC-97-31478, Siemens Corporate Research, and the William R. Kenan, Jr. Fund.

References

[1] Fischer GS, Deguet A, Schlattman D, Taylor RH, Fayad L, Zinreich SJ, Fichtinger G. MRI image overlay: applications to arthrography needle insertion. Studies in Health Technology and Informatics - Medicine Meets Virtual Reality 14, 2006; 119:150-155.
[2] Fischer GS, Wamsley C, Zinreich SJ, Fichtinger G. Laser-Assisted MRI-Guided Needle Insertion and Comparison of Techniques. Int Soc for Comp Asst Orthopaedic Surg, Jun 2006.
[3] Wood BJ, Zhang H, Durrani A, Glossop N, Ranjan S, Lindisch D, Levy E, Banovac F, Borgert J, Krueger S, Kruecker J, Viswanathan A, Cleary K. Navigation with Electromagnetic Tracking for Interventional Radiology Procedures: A Feasibility Study. J Vasc Interv Radiol. 2005; 16:493-505.
[4] Lee S, Fichtinger G, Chirikjian GS. Novel algorithms for robust registration of fiducials in CT and MRI. Med Phys. 2002; 29(8):1881-1891.
[5] Fischer GS, Taylor RH. Electromagnetic Tracker Measurement Error Simulation and Tool Design. In Proceedings of Medical Image Computing and Computer Assisted Intervention (MICCAI). 2005; LNCS 3750:73-80.
[6] Lin HC, Shafran I, Murphy T, Okamura AM, Yuh DD, Hager GD. Automatic Detection and Segmentation of Robot-Assisted Surgical Motions. In Proceedings of Medical Image Computing and Computer Assisted Intervention (MICCAI). 2005; LNCS 3749:802-810.