A User Interface for Robot-Assisted Diagnostic Ultrasound


P. Abolmaesumi, S.E. Salcudean, W.H. Zhu, S.P. DiMaio and M.R. Sirouspour
The University of British Columbia, Department of Electrical and Computer Engineering
Vancouver, B.C., Canada, V6T 1Z4
puranga@ece.ubc.ca, tims@ece.ubc.ca

Abstract

A robot-assisted system for medical diagnostic ultrasound has been developed by the authors. This paper presents the key features of the user interface of this system. While the ultrasound transducer is positioned by a robot, the operator, the robot controller, and an ultrasound image processor share control over its motion. Ultrasound image features selected by the operator are recognized and tracked by a variety of techniques. Based on this feature tracking, ultrasound image servoing in three axes has been incorporated into the interface; when enabled, it automatically compensates, through robot motions, for unwanted motions in the plane of the ultrasound beam. The stability and accuracy of the system are illustrated through a 3D reconstruction of an ultrasound phantom.

1 Introduction

Medical ultrasound exams often require that ultrasound technicians hold the transducers in awkward positions for prolonged periods of time, sometimes exerting large forces. A number of studies indicate that they suffer from an unusually high incidence of musculoskeletal disorders (e.g. [16]). Motivated initially by the need to alleviate these problems and to present a more ergonomic interface to ultrasound technicians, a teleoperation approach to diagnostic ultrasound has been proposed. The system consists of a master hand controller, a slave manipulator that carries the ultrasound probe, and a computer control system that allows the operator to remotely position the ultrasound transducer relative to the patient's body.
An inherently safe, light, backdrivable, counterbalanced robot has been designed and tested for use in carotid artery examinations, the purpose of which is to diagnose occlusive disease in the left and right common carotid arteries, a major cause of strokes [14]. The motion of the robot arm is based on measured positions and forces, acquired ultrasound images, and/or taught position and force trajectories. The system uses a shared control approach that is capable of achieving motion, force and image control simultaneously. The ability to interactively position the ultrasound probe via a teleoperated system, while being assisted by force and image controllers, has major advantages over other similar interfaces for ultrasound examination. In [10], a Mitsubishi PA-10 industrial robot was used with a force controller to help ultrasound technicians move the ultrasound probe against the patient's body; however, the probe can only be moved by the robot through a pre-specified trajectory, which limits the flexibility of the examination, and no shared control, teleoperation or ultrasound image servoing was reported. Other approaches, such as [4, 9], focus primarily on providing an interface for 3D ultrasound image reconstruction. In addition to the ergonomic benefits, the ability to remotely position the ultrasound probe could also be used in teleradiology [1]. Although a number of interfaces for tele-ultrasound examinations have been proposed in the literature [15, 5], all of them require that an ultrasound technician be present at the examination site to manipulate the probe under the supervision of a remote radiologist, via real-time visual and voice interaction. Our proposed system allows the radiologist to view and to manipulate the ultrasound transducer at the remote site.
The ability to automatically guide the ultrasound probe as a function of its acquired images, an approach termed ultrasound image servoing, could be a useful feature for diagnostic examinations when used in conjunction with human supervisory control, in order to reduce operator fatigue [2]. During the ultrasound examination, the operator interacts with a graphical user interface and a hand controller. The resulting operator commands are coordinated with a local visual servoing system in order to control the robot, and thus the motion of the ultrasound probe. This paper addresses image servoing in the control of three degrees of freedom of the robot. Section 2 describes the system setup. Section 3 presents the theory behind ultrasound image servoing, along with experimental results. An application of the system to the 3D reconstruction of an ultrasound phantom is described in Section 4. Finally, Section 5 provides a summary and concluding remarks.

2 System Setup

Figure 1: Block diagram of the ultrasound robot system.

Figure 1 shows the block diagram of the experimental setup. The system consists of a user interface, a slave manipulator carrying the ultrasound probe, and a computer control system.

2.1 The User Interface

The operator interacts with the system through the user interface, which consists of a master hand controller (a SpaceMouse/Logitech Magellan [6]) and a graphical user interface (GUI), written using the gtk+ library under the Linux operating system. Figure 2 shows a screen shot of the GUI. It allows the operator to activate/deactivate the robot, to enable/disable the force control and the visual servoing, and to enable/disable individual degrees of freedom of the robot. Other features, such as changing the sensitivity of the robot to the user's positioning commands and switching the robot working frame between the world and the probe, are also incorporated in the GUI. Ultrasound images are captured in real time and displayed in the GUI, together with a 3D rendered model of the ultrasound probe that shows the orientation of the transducer in real time. One or more features can be selected from the ultrasound image with the mouse pointer. These features are passed to the image controller, which compensates for motions in the plane of the ultrasound beam with the goal of maintaining the features at a desired position in the image. The velocity of every axis of the robot can be controlled with the Magellan mouse, and all buttons and sliders in the GUI can be operated from its keypad.

2.2 Ultrasound Robot

A lightweight robot with limited force capability has been designed for teleoperated ultrasound examinations [12]. The robot moves fast enough to allow the examination to take place at a pace close to that achieved by an unassisted sonographer. In addition, the robot joints are backdrivable, so that the arm can be pushed out of the way if necessary and controlled effectively in force mode. During the examination, the ultrasound transducer is carried by the end-effector of the robot. Because of the fully counterbalanced mechanical design, only small motor driving torques are required to generate a maximum force of 10 N at the end-effector, which makes the system safe in the case of a power failure. A resolution of 0.1 mm in translation and 0.09° in rotation of the end-effector has been achieved.
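The working-frame switch described in Section 2.1 amounts to re-expressing the operator's commanded twist in either world or probe coordinates. A minimal sketch of such a transform (the function names and the example probe orientation are ours, not from the paper):

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the z-axis (used here as an example probe orientation)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def to_working_frame(v_world, w_world, R_world_probe, frame="world"):
    """Express a commanded twist in the selected working frame.

    frame="world": pass the command through unchanged.
    frame="probe": rotate both the linear and the angular components
    into probe coordinates (R_world_probe maps probe -> world).
    """
    if frame == "world":
        return v_world, w_world
    R = R_world_probe.T  # world -> probe
    return R @ v_world, R @ w_world

v = np.array([1.0, 0.0, 0.0])   # operator command, world frame
w = np.zeros(3)
R = rotation_z(np.pi / 2)        # probe rotated 90 degrees about world z
v_p, w_p = to_working_frame(v, w, R, frame="probe")
print(v_p)  # -> [0., -1., 0.]
```

In probe mode a "push forward" on the hand controller always moves the probe along its own axis, regardless of how the probe is oriented in the world, which is the ergonomic point of the switch.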

Figure 2: Graphical user interface.

2.3 Robot Controller

The control approach is explained in [17]. Its objective is to let a linear combination of the velocity and the scaled force of the ultrasound probe track the hand controller command (displacement). There is no explicit switching between the contact and free-motion states. Safety is ensured by a control command reset, or disturbance accommodation function, that never allows position errors to become large. The controller uses the measured probe positions and forces, the acquired ultrasound images, and/or the commanded position and force trajectories simultaneously, in a shared control approach, to control the robot arm. The controller runs on the VxWorks™ real-time operating system.

3 Ultrasound Image Servoing

One of the main features of the current system is its ability to visually track features in ultrasound images in real time, which can help ultrasound technicians guide the motion of the ultrasound probe during the examination. The image processing system captures the ultrasound images at a rate of 30 frames/s using an MV-delta frame grabber. The Star-Kalman motion tracking algorithm [2, 3] provides the image controller with the required feature coordinates (e.g. the center of the carotid artery) to control the robot. The feasibility of ultrasound image servoing for controlling three axes of the robot can be determined by examining the ultrasound image Jacobian, which relates differential changes in the image features to differential changes in the configuration of the robot. Figure 3 illustrates the concept. Let $P_i = [u_i, v_i]^T$ be a feature point in the plane of the ultrasound image, with coordinates $[{}^e x, {}^e y, {}^e z]^T$ in the probe-attached frame.
Assuming an orthographic projection model [8] with scale $a$ for the ultrasound image, and that $P_i$ remains in the image plane, the coordinates of $P_i$ in the two-dimensional ultrasound image become $[u_i - u_0,\; v_i - v_0]^T = [a\,{}^e x,\; a\,{}^e z]^T$, where $[u_0, v_0]^T$ is the center of the ultrasound image. It can be shown that

$$
\begin{bmatrix} \dot u_i \\ \dot v_i \end{bmatrix}
=
\begin{bmatrix}
a & 0 & 0 & 0 & v_i - v_0 & 0 \\
0 & 0 & a & 0 & -(u_i - u_0) & 0
\end{bmatrix}
{}^e\dot X
= J_{v_i}\,{}^e\dot X \qquad (1)
$$

where ${}^e\dot X$ collects the translational and angular end-effector velocities in end-effector coordinates and $J_{v_i} \in \mathbb{R}^{2n \times 6}$ is the ultrasound image Jacobian matrix for $n$ feature points: each additional point considered in the image adds a similar pair of rows to (1). The rank of the resulting Jacobian is at most three. Two or more feature points whose motions are not along the lines connecting them to each other generate a Jacobian of rank three. Thus, as expected, with non-trivial ultrasound images it is possible to control the motion of the ultrasound transducer in its image plane in three degrees of freedom. The visual controller, depicted in Figure 4, is described by

$$
{}^e\dot X = J_{v_i}^{+}\left[-K_d\,(f_i - f_r) + \dot f_r\right], \qquad (2)
$$
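The rank argument can be checked numerically. The sketch below builds the stacked image Jacobian with the row structure of equation (1) for one and then two feature points; the scale value and helper names are our own illustrative choices:

```python
import numpy as np

def image_jacobian(u, v, u0=0.0, v0=0.0, a=1.0):
    """2x6 image Jacobian of one feature point, rows as in eq. (1):
    maps the end-effector twist [vx, vy, vz, wx, wy, wz] to the
    in-plane pixel velocities [du, dv]. The columns for vy, wx and
    wz are zero: those motions take the point out of the image plane."""
    return np.array([
        [a, 0.0, 0.0, 0.0,  (v - v0), 0.0],
        [0.0, 0.0, a, 0.0, -(u - u0), 0.0],
    ])

def stacked_jacobian(points):
    """Stack one 2x6 block per feature point (2n x 6 overall)."""
    return np.vstack([image_jacobian(u, v) for u, v in points])

# One point gives rank 2; two points in general position give rank 3,
# and never more: only the x/z translations and the rotation about y
# are observable in the image.
J1 = stacked_jacobian([(10.0, 20.0)])
J2 = stacked_jacobian([(10.0, 20.0), (-30.0, 5.0)])
print(np.linalg.matrix_rank(J1), np.linalg.matrix_rank(J2))  # -> 2 3
```

Adding further points enlarges the stack but cannot raise the rank past three, which is why three robot axes is the most that pure in-plane image servoing can control.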

Figure 3: Definition of the frames for the ultrasound robot.

where $f_i = [u_i, v_i]^T$ is the actual image feature location in the ultrasound image, $f_r$ is its desired location, $K_d$ is the controller gain and $J_{v_i}^{+}$ is the pseudo-inverse of $J_{v_i}$. Substituting $J_{v_i}$ from (1) in (2), and assuming that the dynamics of the robot control loop are negligible relative to the image control loop, the following equation is derived:

$$
\dot f_i + K_d f_i = K_d f_r + \dot f_r, \qquad (3)
$$

which guarantees that image feature servoing,

$$
f_i \to f_r, \qquad (4)
$$

is achieved. An ultrasound phantom has been designed to validate the visual servoing algorithm. Figure 5 illustrates a CAD model of the phantom: three plastic tubes are positioned in a solution [11] along three different axes.

Figure 5: A CAD model of the ultrasound phantom.

Ultrasound image servoing at rates as high as 30 Hz has been achieved to control up to three axes of the robot, while tracking one or two features in the ultrasound image in real time. With the probe coordinate frame illustrated in Figure 3, the controlled axes are the translations along the x-axis and z-axis and the rotation about the y-axis. Figure 6 shows the ultrasound image servoing performance for one of the axes. For this experiment, a feature (the center of one of the pipes in the phantom ultrasound image) is selected before enabling the visual servoing. While the operator moves the probe along the y-axis, the feature position is automatically maintained in the center of the image. The performance of the system while tracking two features (the centers of two pipes in the phantom ultrasound image) simultaneously is shown in Figure 7. The two features are moved away from their reference points at t = 0.7 s by moving the robot along the x-axis, and are moved back by the image servoing action. In this experiment, $K_d = I_{4\times4}$ Hz.

4 3D Ultrasound Imaging

The current system can be used for other applications, such as 3D ultrasound imaging.
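The first-order error dynamics of equation (3) predict exponential convergence of $f_i$ to $f_r$. A discrete-time sketch of control law (2) for a static target ($\dot f_r = 0$); the Jacobian entries, gain and step size are illustrative values of our own, not the paper's:

```python
import numpy as np

def servo_step(f, f_r, J, Kd, dt):
    """One image-controller cycle: commanded twist from the
    pseudo-inverse of the image Jacobian, then the resulting
    feature motion f_dot = J @ x_dot (static target, f_r_dot = 0)."""
    x_dot = np.linalg.pinv(J) @ (-Kd @ (f - f_r))
    return f + dt * (J @ x_dot)

# Illustrative 2x6 image Jacobian of one feature point.
J = np.array([[1.0, 0.0, 0.0, 0.0, 20.0, 0.0],
              [0.0, 0.0, 1.0, 0.0, -10.0, 0.0]])
Kd = np.eye(2)                          # identity gain, as in the experiment
f = np.array([50.0, -30.0])             # initial feature position (pixels)
f_r = np.zeros(2)                       # desired position: image center
for _ in range(200):
    f = servo_step(f, f_r, J, Kd, dt=0.05)
print(np.linalg.norm(f - f_r))          # error has decayed close to zero
```

Because $J$ has full row rank, $J J^{+} = I$ and the closed loop reduces exactly to $\dot f_i = -K_d (f_i - f_r)$, the continuous-time behavior that equation (3) describes.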
Since the location of the ultrasound transducer can be determined via the forward kinematics of the slave manipulator, three-dimensional ultrasound images can be reconstructed from a series of two-dimensional image slices. The robot has been used to reconstruct a 3D model of the tubes in the phantom, with ultrasound image servoing keeping the pipes centered in the image during the examination. Two different 3D reconstruction methods have been implemented, as discussed in the subsections that follow.

4.1 Stradx

Stradx [7] is a tool for the acquisition and visualization of 3D ultrasound images using a conventional 2D ultrasound machine and an Ascension Bird™ position sensor. The contours that specify an anatomic region of interest are drawn by the operator in a series of two-dimensional image slices, and these contours are mapped to 3D space using the position information provided by the sensor. In this application, the measured robot/probe position was substituted for the sensor data as input to the Stradx program. Since the resolution of the robot is higher than that of the Bird sensor, and since it is unaffected by metal in the surrounding environment, the system provides a powerful tool for accurate 3D reconstruction of ultrasound images. Figure 8 shows the 3D reconstruction of the ultrasound phantom using this approach.
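The slice-to-volume mapping that both reconstruction methods rely on, taking an image pixel into 3D through the probe pose supplied by the robot kinematics, can be sketched as follows (the pixel scale and the example pose are hypothetical, not values from the paper):

```python
import numpy as np

def pixel_to_world(u, v, T_world_probe, scale=0.0001, u0=0.0, v0=0.0):
    """Map an image pixel to world coordinates.

    In the probe-attached frame the image plane is y = 0: the u axis
    scales to x and the v axis to z (scale in metres per pixel,
    a hypothetical calibration value). T_world_probe is the 4x4
    homogeneous probe pose from the robot kinematics.
    """
    p_probe = np.array([(u - u0) * scale, 0.0, (v - v0) * scale, 1.0])
    return (T_world_probe @ p_probe)[:3]

# Example pose: probe translated 0.1 m along world x, no rotation.
T = np.eye(4)
T[0, 3] = 0.1
pt = pixel_to_world(100.0, 200.0, T)
print(pt)  # -> [0.11  0.    0.02]
```

Applying this mapping to every contour point of every slice, with the pose updated per frame, yields the 3D point cloud from which the phantom model is rebuilt.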

Figure 4: Ultrasound image controller.

Figure 6: Experimental results for image servoing in a single axis.

4.2 Star-Kalman Based Reconstruction

By using the Star-Kalman algorithm [2] to extract each pipe's contour in the ultrasound image in real time, and the inverse kinematics of the robot to map each contour to world coordinates, a 3D image of each pipe is reconstructed. Figure 9 shows a partially reconstructed image; no smoothing filter has been used. In contrast to the Stradx approach, where the operator has to specify the contour of the desired region in each ultrasound image frame separately, this method extracts each pipe's contour automatically in real time. In addition, the data storage requirements are much lower than those of the Stradx approach, as only the contour coordinates are needed for 3D reconstruction. However, the image is substantially noisier.

Figure 7: Experimental results for image servoing in three axes; the positions of the two features are changed by moving the robot in the x direction.

5 Summary and Conclusions

A new medical ultrasound examination system that uses a robot to remotely position the ultrasound probe has been presented. The system uses a shared control approach that is capable of achieving motion, force and image control simultaneously. An ultrasound image servoing capability is embedded in the system, and ultrasound visual servoing to control three axes of the robot has been demonstrated. An application of the system to 3D ultrasound image reconstruction has also been described. In the future, the passive mouse will be replaced by a PowerMouse haptic interface [13], in order to realize bilateral teleoperation with force feedback. The system will also be used and tested for ultrasound diagnostics at the Vancouver General Hospital, UBC.
A human factors study comparing the presented approach with conventional ultrasound examination is also under way.

Figure 8: Partial 3D image reconstruction of the ultrasound phantom.

Figure 9: Partial 3D image reconstruction of the ultrasound phantom using the Star-Kalman contour extraction method.

References

[1] P. Abolmaesumi, S.P. DiMaio, S.E. Salcudean, R. Six, W.H. Zhu, and L. Filipozzi. Teleoperated robot-assisted diagnostic ultrasound. Demo Presentation, IRIS/PRECARN 2000, Montreal.

[2] P. Abolmaesumi, S.E. Salcudean, and W.H. Zhu. Visual servoing for robot-assisted diagnostic ultrasound. World Congress on Medical Physics and Biomedical Engineering, Chicago, 2000.

[3] P. Abolmaesumi, M.R. Sirouspour, and S.E. Salcudean. Real-time extraction of carotid artery contours from ultrasound images. Computer-Based Medical Systems, pages 181-186, Texas, June 2000.

[4] K. Baba, K. Satch, S. Satamoto, T. Okai, and I. Shiego. Development of an ultrasonic system for three-dimensional reconstruction of the foetus. J. Perinat. Med., 17, 1989.

[5] W.J. Chimiak, R.O. Rainer, N.T. Wolfman, and W. Covitz. Architecture for a high-performance tele-ultrasound system. Medical Imaging: PACS Design and Evaluation: Engineering and Clinical Issues, 2711:459-465, Newport Beach, CA, 1996.

[6] J. Dietrich, G. Plank, and H. Kraus. Optoelectronic system housed in plastic sphere. European Patent No. 0 240 023; US Patent No. 4,785,180; JP Patent No. 1 763 62.

[7] A.H. Gee and R.W. Prager. Sequential 3D diagnostic ultrasound using the Stradx system. MICCAI'99, Cambridge, UK, September 1999.

[8] S. Hutchinson, G. Hager, and P.I. Corke. A tutorial on visual servo control. IEEE Trans. on Rob. Auto., 12(5):651-670, October 1996.

[9] R. Ohbuchi, D. Chen, and H. Fuchs. Incremental volume reconstruction and rendering for 3-D ultrasound imaging. Vis. Biomed. Comput., pages 312-323, 1992.

[10] F. Pierrot, E. Dombre, and E. Dégoulange. Hippocrate: A safe robot arm for medical applications with force feedback. Medical Image Analysis, 3:285-300, 1999.
[11] D.W. Rickey, P.A. Picot, D.A. Christopher, and A. Fenster. A wall-less vessel phantom for Doppler ultrasound studies. Ultrasound in Medicine and Biology, 21(9):1163-1176, 1995.

[12] S.E. Salcudean, G. Bell, S. Bachmann, W.H. Zhu, P. Abolmaesumi, and P.D. Lawrence. Robot-assisted diagnostic ultrasound - design and feasibility experiments. MICCAI'99, Cambridge, UK, September 1999.

[13] S.E. Salcudean and N.R. Parker. 6-DOF desk-top voice-coil joystick. 6th Symp. Haptic Inter. for Virtual Env. and Teleop. Sys., Intl. Mech. Eng. Congr. Exp. (ASME Winter Annual Meeting), 61:131-138, Dallas, Texas, November 16-21, 1997.

[14] S.E. Salcudean, W.H. Zhu, P. Abolmaesumi, S. Bachmann, and P.D. Lawrence. A robot system for medical ultrasound. ISRR'99, pages 152-159, Snowbird, Utah, Oct. 9-12, 1999.

[15] J. Sublett, B. Dempsey, and A.C. Weaver. Design and implementation of a digital teleultrasound system for real-time remote diagnosis. Computer-Based Medical Systems, pages 292-299, Texas, June 1995.

[16] H.E. Vanderpool, E.A. Friis, B.S. Smith, and K.L. Harms. Prevalence of carpal tunnel syndrome and other work-related musculoskeletal problems in cardiac sonographers. Journal of Occupational Medicine, 35:604-610, June 1993.

[17] W.H. Zhu, S.E. Salcudean, S. Bachman, and P. Abolmaesumi. Motion/force/image control of a diagnostic ultrasound robot. Proc. IEEE Int. Conf. Rob. Auto., San Francisco, 2000.