Image Thickness Correction for Navigation with 3D Intra-cardiac Ultrasound Catheter

Hua Zhong 1, Takeo Kanade 1, and David Schwartzman 2
1 Computer Science Department, Carnegie Mellon University, USA
2 University of Pittsburgh Medical Center, USA

Abstract. In this paper we present an algorithm that corrects the 3D reconstruction errors of a 3D ultrasound catheter caused by the thickness of its ultrasound image plane. We also provide a method to quickly measure the thickness of an ultrasound image plane. With thickness correction, the registration accuracy of a navigation system using 3D ultrasound catheters can be improved by 20%.

1 Introduction

In recent years, many navigation systems have been developed for minimally invasive heart surgery. Both research systems [1][2] and commercially available systems (CartoMerge and EnSite Fusion) use position-sensor-tracked catheters to touch the heart wall at several locations during an operation and register those locations with pre-operative images to enable instrument navigation. A more recent navigation system [3] uses an ultrasound catheter to quickly scan the heart wall, reconstruct 3D heart-surface points during the operation, and register them with a pre-operative CT scan. Such a system greatly improves the efficiency of collecting intra-operative surface data for registration (a speed-up of hundreds of times has been claimed). To reconstruct 3D heart-surface points, the system uses an edge-detection algorithm to find, in each ultrasound image, the first edge pixel from the transducer's center, corresponding to the first reflected sound. With a position sensor on the ultrasound catheter, the 3D coordinates of those pixels can be computed. This method assumes that the ultrasound image plane is infinitely thin, but in reality the image plane has a finite thickness.

1.1 Error Caused by Image Plane Thickness

Figure 1 shows an ultrasound image plane (bold black line) with finite thickness (thin black lines) intersecting an object surface (horizontal blue line). Because the image plane is not perpendicular to the object surface, part of the image plane first hits the object surface at point a and reflects some ultrasound energy, while o is where the center of the image plane hits the object surface and reflects energy. As a result, the object surface appears in the ultrasound image as a wide band (Figure 1), not the infinitely thin line it would be with a zero-thickness image plane. If we simply take the first edge pixel from the transducer in the ultrasound image (representing the first reflection of the sound waves) as the surface location, o will be treated as a point on the object surface, while the real 3D point on the object surface should be a. This error is proportional to the thickness of the ultrasound image plane at the depth of o. The image-plane thickness of an intra-cardiac ultrasound catheter ranges from 3 to 6 mm, and errors caused by this thickness have been observed [4].

Fig. 1. An object surface intersecting an ultrasound image plane of finite thickness.

In the following sections, we first propose a method to measure the thickness of an ultrasound image plane (Section 2). We then provide an algorithm that corrects the errors of the 3D surface points using the measured thickness (Section 3) and registers the corrected points with a pre-operative 3D heart-surface model (Section 3.2). A phantom-model test and its result analysis are presented in Section 4 to verify the improvement achieved by our algorithm.
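The paper does not give code for the reconstruction step described above; the following is a rough sketch, under simplifying assumptions (a rectilinear image in which scan lines are columns, a fixed intensity threshold, and a hypothetical tracked 4x4 pose matrix image_to_world), of how first-edge pixels could be detected and mapped to 3D. Real intra-cardiac images are sector scans, so this is illustrative only.

```python
import numpy as np

def first_edge_points(image, origin_row=0, threshold=120.0):
    """For each image column, find the first pixel below the transducer
    row whose intensity exceeds the threshold. Returns (row, col) pairs."""
    edges = []
    rows, cols = image.shape
    for c in range(cols):
        column = image[origin_row:, c]            # scan away from the transducer
        hits = np.nonzero(column > threshold)[0]
        if hits.size:
            edges.append((origin_row + hits[0], c))
    return edges

def pixels_to_world(edges, pixel_size_mm, image_to_world):
    """Map in-plane pixel coordinates to 3D world coordinates using a
    tracked 4x4 homogeneous pose of the image plane."""
    pts = []
    for r, c in edges:
        p_img = np.array([c * pixel_size_mm, r * pixel_size_mm, 0.0, 1.0])
        pts.append((image_to_world @ p_img)[:3])
    return np.array(pts)

# Synthetic example: a dark image with one bright horizontal stripe.
img = np.zeros((200, 64)); img[120:130, :] = 255.0
pose = np.eye(4)                                  # identity pose for illustration
pts = pixels_to_world(first_edge_points(img), 0.1, pose)
```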

2 Ultrasound Image Thickness Measurement

The thickness of an ultrasound image plane (also called the beam width) is not uniform; it can be treated as a function of depth (distance from the ultrasound transducer). It can be measured with carefully built phantom models [5][6]. The basic idea is to intersect the ultrasound image plane with a flat surface at a 45-degree angle; in that case, the width of the band in the ultrasound image equals the thickness of the image plane at that depth. The transducer is then moved up and down, or multiple parallel surfaces are used, to measure the thickness at different depths. These methods require carefully built models and accurate movement of the transducer. In this paper we relax these restrictions to a single sloped surface at any angle (0-90 degrees) above a flat surface (the bottom of a water tank), and we give a general formula for the thickness of an ultrasound image.

2.1 Phantom Model Setup

Our method requires only a single slope on a flat surface, as shown in Figure 2. There is no restriction on the slope's angle α. A clamp is used to hold the ultrasound catheter so that the ultrasound image plane (blue plane) is perpendicular to the flat surface. This can be verified by rotating the catheter about its proximal axis: when the white band in the ultrasound image representing the flat surface is at its thinnest, the image plane is perpendicular to it. This thin straight line in the ultrasound image is called our reference line; we will need it later to compute the thickness. The sloped surface is then moved into the image plane. As we move the slope back and forth, the white band representing the slope surface sweeps across the ultrasound image at different depths. We sweep most of the ultrasound image multiple times to make sure we have enough samples.

Fig. 2. Measuring the thickness of an ultrasound image plane.
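As a rough illustration of the perpendicularity check described above (not the authors' implementation), the sketch below measures the vertical extent of the bright band in each frame, assuming each frame has already been cropped to the region around the flat surface, and picks the catheter rotation that gives the thinnest band. The intensity threshold is an assumed value.

```python
import numpy as np

def band_width_px(frame, threshold=120.0):
    """Vertical extent (in pixels) of the bright band: distance between
    the first and last rows containing any pixel above the threshold."""
    bright_rows = np.nonzero((frame > threshold).any(axis=1))[0]
    if bright_rows.size == 0:
        return 0
    return int(bright_rows[-1] - bright_rows[0] + 1)

def most_perpendicular(frames):
    """Among frames captured at different catheter rotations, return the
    index of the frame whose flat-surface band is thinnest, i.e. the
    rotation at which the image plane is perpendicular to the surface."""
    return int(np.argmin([band_width_px(f) for f in frames]))
```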

2.2 Computing the Thickness Function

In each ultrasound image, the image plane intersects the sloped surface and generates a wide band. As shown in Figure 2 (c), the blue plane aba'b' is the slope and the yellow plane bcb'c' is the center plane of the image slice. Because the image plane has a finite thickness, it hits the slope along the strip from a to a', whose projections onto the center plane are c and c'. The corresponding ultrasound image is also shown in Figure 2: the white band is the reflection from the slope surface, and the thin white line is the reference line (the tank bottom), to which the image plane is perpendicular; the points c, c' and o correspond to the same points as in Figure 2 (c). Given that ac is perpendicular to the plane bcb'c', that cc' is perpendicular to bc, and that bc is parallel to the reference line, we draw a line cd in the plane abc so that cd ⊥ ad; then od ⊥ ab. The angle ∠odc is therefore the slope's angle α, which can be measured directly. The angle ∠obc between the reference line and the center line of the white band generated by the slope surface can be measured in the ultrasound image (using a line-detection algorithm based on the Radon transform); we call it β. The width cc' of the white band along the direction perpendicular to the reference line can also be measured automatically; we call it w, so oc is w/2, and ac is half of the thickness of the image plane at the depth of o. With the perpendicularity relations above, we obtain

\mathrm{Thickness} = \frac{w}{\sqrt{\tan^2\alpha - \tan^2\beta}}    (1)

Previous methods are special cases of ours with α = 45° and β = 0, for which Equation (1) reduces to Thickness = w.
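For a quick numeric check of Equation (1), here is a minimal sketch with illustrative names (not code from the paper), assuming α and β are given in degrees and w in millimetres:

```python
import math

def slice_thickness_mm(w_mm, alpha_deg, beta_deg):
    """Thickness of the image plane (Eq. 1) from the band width w, the
    slope angle alpha, and the apparent band angle beta measured against
    the reference line. Requires beta < alpha."""
    ta, tb = math.tan(math.radians(alpha_deg)), math.tan(math.radians(beta_deg))
    if tb >= ta:
        raise ValueError("apparent angle beta must be smaller than slope angle alpha")
    return w_mm / math.sqrt(ta**2 - tb**2)

# Classical special case: 45-degree slope swept perpendicular to its
# strike (beta = 0), where the thickness equals the band width.
assert abs(slice_thickness_mm(3.0, 45.0, 0.0) - 3.0) < 1e-9
```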

We use Equation (1) to compute the thickness for each ultrasound image, sampled at various depths. For depths with no samples, the thickness can be interpolated from neighboring samples, yielding a continuous thickness function Thickness = f(depth). Figure 3 shows the result for the Acuson ICE catheter (at 8.5 MHz) used in our experiment. The middle range, with the least thickness, is the focus region of the image plane; the thickness increases as the depth decreases or increases away from the focus region.

Fig. 3. Measured thickness of the ultrasound image plane (mm) versus image depth (mm): a 3D visualization of the thickness of an ultrasound image and the measured thickness function. Zero thickness means no sample was captured at that depth.

The measured thickness is not the true physical thickness of the ultrasound image plane but its visible thickness: when the ultrasound energy spreads too much it no longer generates visible signals in the ultrasound image, and since our measurement is based on visible features in the images, only the visible part is measured. Because this measurement is used for 3D reconstruction, which is also based on visible edges in ultrasound images, it serves the purpose well; the imaging settings used for 3D reconstruction should simply be the same as those used when measuring the thickness.
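Since the sampled measurements are interpolated into a continuous Thickness = f(depth), here is a small sketch of one way to do that with linear interpolation; the sample values are made up for illustration and are not measured data.

```python
import numpy as np

def make_thickness_function(depths_mm, thickness_mm):
    """Build Thickness = f(depth) from scattered (depth, thickness)
    samples by sorting and linear interpolation; depths outside the
    sampled range are clamped to the nearest measured value."""
    order = np.argsort(depths_mm)
    d = np.asarray(depths_mm, dtype=float)[order]
    t = np.asarray(thickness_mm, dtype=float)[order]
    return lambda depth: float(np.interp(depth, d, t))

# Illustrative samples only (thinnest near the focus region).
f = make_thickness_function([15, 25, 35, 45, 55], [4.5, 2.5, 2.0, 3.0, 5.0])
print(f(30.0))   # interpolated thickness at 30 mm depth
```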

3 Image Thickness Correction for Registration

3.1 Correction of Error from Image Thickness

Figure 4 shows an ultrasound image plane (yellow surface) with the first edge point from the transducer at o and the transducer's center at t. With the 3D tracker, we know the normal of the ultrasound image plane and the 3D coordinates of t. There are two possible object-surface points that could generate the edge at o in the ultrasound image: a and b. Suppose we know the normal of the object surface near o. We can draw a plane with this surface normal through b (the red plane in Figure 4). This plane intersects the line segment ot, which would mean that o is not the first edge pixel from the transducer in the ultrasound image; this contradicts the fact that o was detected as the first edge from the transducer, so b cannot be on the object surface. Similarly, we can create a plane through a with the object surface normal (the green plane in Figure 4 (c)). It does not intersect the line segment ot, so a should be the true point on the object surface. By applying this logic to every reconstructed 3D surface point, we correct the errors caused by ultrasound image thickness.

Fig. 4. 3D position correction.

We now only need the object surface normal at point o. It can be estimated by first registering the un-corrected 3D points to the 3D surface model of the object (usually from pre-operative CT or MRI); after this registration, we take the normal of the surface-model point closest to o as the estimated object surface normal. Because we only use this normal to decide which of a and b is the true object-surface point, a rough estimate suffices.

3.2 Registration with Thickness Correction

The registration process using a 3D ultrasound catheter with thickness correction can be summarized as follows (a sketch of the correction step appears after the list):

1. Scan the object surface with the 3D ultrasound catheter's image plane and reconstruct the un-corrected 3D surface points.
2. Register the un-corrected 3D points to the pre-operative surface model and estimate the object surface normal for every un-corrected 3D surface point o.
3. Correct the position of each o using the method described in Section 3.1.
4. Use the corrected 3D surface points for a final registration to the 3D object surface model.
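Below is a minimal geometric sketch of the correction in Section 3.1 under stated assumptions: the elevation direction (image-plane normal) and the half-thickness at the depth of o are known, the two candidates a and b are placed half a thickness on either side of the image plane at o (an elevation-only offset, which is a simplification), and the candidate whose tangent plane does not cut the segment ot is accepted. Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def plane_cuts_segment(p0, p1, plane_point, plane_normal, eps=1e-9):
    """True if the plane through plane_point with plane_normal intersects
    the open segment p0-p1 (signed distances have opposite signs)."""
    d0 = np.dot(p0 - plane_point, plane_normal)
    d1 = np.dot(p1 - plane_point, plane_normal)
    return d0 * d1 < -eps

def correct_point(o, t, elevation_dir, half_thickness, surface_normal):
    """Pick, between the two candidates offset by +/- half_thickness along
    the elevation direction, the one whose tangent plane (built from the
    estimated surface normal) does NOT cut the segment o-t (Sec. 3.1)."""
    e = elevation_dir / np.linalg.norm(elevation_dir)
    n = surface_normal / np.linalg.norm(surface_normal)
    cand_a = o + half_thickness * e
    cand_b = o - half_thickness * e
    a_ok = not plane_cuts_segment(o, t, cand_a, n)
    b_ok = not plane_cuts_segment(o, t, cand_b, n)
    if a_ok and not b_ok:
        return cand_a
    if b_ok and not a_ok:
        return cand_b
    return o   # ambiguous (e.g. surface nearly parallel to the beam): keep o

# Example: beam along +z, image plane vertical, surface tilted 45 degrees.
o = np.array([0.0, 0.0, 30.0]); trans = np.array([0.0, 0.0, 0.0])
n = np.array([1.0, 0.0, 1.0])        # rough surface-normal estimate
print(correct_point(o, trans, np.array([1.0, 0.0, 0.0]), 2.0, n))
```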

4 Experiments and Results

4.1 Test Setup

We use a phantom model with a simple shape, as shown in Figure 5. Because its shape consists of only a few flat surfaces and is not rotationally symmetric, it reduces registration errors caused by the shape itself. Its 3D surface model is also shown in Figure 5.

Fig. 5. The phantom model used in the registration test and its 3D surface model.

Registration Error Measurement. To obtain a more realistic error measurement, we use a separate set of points, the evaluation point set, whose corresponding points on the object surface are known (the blue stars in Figure 5). During the test, we first scan the model with the 3D ultrasound catheter to capture surface points for registration. We then use a catheter whose tip is tracked by a 3D position sensor to touch the corners of the model (the blue points in Figure 5) and record their coordinates as our evaluation points. After registration (the evaluation points do not participate in the registration), we apply the transformation matrix found by the registration to the evaluation points and measure how far they are from their corresponding points on the surface model (a minimal sketch of this computation appears at the end of this subsection). This is a more realistic error measurement because it reflects the distance between where the registered model says certain positions of interest are and their true locations.

Accuracy Improvement and Intersecting Angles. The error caused by image-plane thickness depends on the intersecting angle between the image plane and the object surface: if the image plane is perpendicular to the object surface, the thickness causes no error, and the smaller the intersecting angle, the larger the error. To understand how the intersecting angle affects registration error and how our thickness-correction algorithm performs, we ran a series of tests. First we scanned the phantom model with the ultrasound image plane at various intersecting angles (0-90 degrees) and saved all the images. We then sampled them to form several subsets of ultrasound images, each with a different average intersecting angle; for example, one subset has an average intersecting angle of 80 degrees and another of 40 degrees. Theoretically, registration with un-corrected points from the 80-degree set should have less error than registration with un-corrected points from the 40-degree set, while after thickness correction both should improve and reach similar errors. The expected relation among the registration errors is

P^{40}_{\text{un-corrected}} > P^{80}_{\text{un-corrected}} > P^{40}_{\text{corrected}} = P^{80}_{\text{corrected}}    (2)

where P^{x}_{\text{un-corrected}} denotes the registration error with un-corrected points from a subset whose average intersecting angle is x degrees, and P^{x}_{\text{corrected}} denotes the registration error with the corrected points.
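A short sketch of the evaluation-error computation described under "Registration Error Measurement" above: a rigid 4x4 registration transform is applied to the touched evaluation points and distances to their known counterparts on the surface model are averaged. The names are illustrative and this is not the authors' code.

```python
import numpy as np

def evaluation_error_mm(T, eval_points, true_points):
    """Apply the 4x4 rigid registration transform T to the evaluation
    points (N x 3 array) and return the mean distance in mm to their
    known corresponding points on the surface model (N x 3 array)."""
    pts_h = np.hstack([eval_points, np.ones((len(eval_points), 1))])
    mapped = (pts_h @ T.T)[:, :3]
    return float(np.mean(np.linalg.norm(mapped - true_points, axis=1)))
```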

4.2 Result and Analysis

Figure 6 shows overall and zoomed-in views of the thickness-corrected (red 'x') and un-corrected (black '+') surface points. The corrected points have a tighter fit to the model than the un-corrected ones. After registration, the result for the evaluation points is shown in Figure 6 (c): the results from the corrected points (red 'x') are closer to the ground truth (blue '*') than those from the un-corrected points (black '+').

Fig. 6. Corrected (red 'x') and un-corrected (black '+') surface registration points, and the result for the evaluation points.

The full results are shown in Figure 7, where the x-axis is the average intersecting angle and the y-axis is the registration error. In the range from 40 to 70 degrees the results fit Equation (2) well, and the corrected points always have less error than their un-corrected counterparts. With thickness correction, the registration error is reduced by 20.45% on average. More importantly, the results show that the algorithm achieves consistent accuracy independent of the intersecting angle. In practice, catheter flexibility and the size of the human heart chambers may prevent physicians from scanning at intersecting angles near 90 degrees; with our algorithm this is no longer a problem, which makes the registration process even easier.

Fig. 7. Registration error (mm) versus average intersecting angle between the image planes and the model surface, for corrected and un-corrected points.

5 Conclusion and Future Work

We have presented an algorithm that corrects the 3D reconstruction errors of an intra-cardiac ultrasound catheter caused by image-plane thickness and demonstrated its effectiveness (a 20.45% improvement in registration accuracy) on a phantom model. In future work we will perform more tests on real patient data to evaluate the full potential of this algorithm in practice.

References

1. Verma, A., Marrouche, N.F., Natale, A.: Novel method to integrate three-dimensional computed tomographic images of left atrium with real-time electroanatomic mapping. Cardiovasc. Electrophysiol. 26, 365-370 (2004)
2. Zhong, H., Kanade, T., Schwartzman, D.: Sensor guided ablation procedure for left atrium endocardium. In: Duncan, J.S., Gerig, G. (eds.) MICCAI 2005. LNCS, vol. 3750, pp. 1-8. Springer, Heidelberg (2005)
3. Zhong, H., Kanade, T., Schwartzman, D.: "Virtual touch": an efficient registration method for catheter navigation in left atrium. In: Larsen, R., Nielsen, M., Sporring, J. (eds.) MICCAI 2006. LNCS, vol. 4190, pp. 437-444. Springer, Heidelberg (2006)
4. Prager, R.W., Rohling, R.N., Gee, A.H., Berman, L.: Rapid calibration for 3-D freehand ultrasound. Ultrasound in Medicine and Biology 24(6), 855-869 (1998)
5. Richard, B.: Test object for measurement of section thickness at US. Radiology 211, 279-282 (1999)
6. Skolnick, M.: Estimation of ultrasound beam width in the elevation (section thickness) plane. Radiology 180, 286-288 (1991)