Increasing the accuracy with a rich sensor system for robotic laser osteotomy


Holger Mönnich (moennich@ira.uka.de), Daniel Stein (dstein@ira.uka.de), Jörg Raczkowsky (rkowsky@ira.uka.de), Heinz Wörn (woern@ira.uka.de)

Abstract. In laser osteotomy, bone is cut precisely with a laser. A robot positions an end-effector carrying a scanhead, which deflects the cutting laser. An optical tracking system (OTS) tracks the positions of the bone, in our case a skull, and of the end-effector. The skull is registered to its model, based on segmented CT data, and the robot is registered to the OTS. This yields a complete transformation chain between the model and the robot coordinate system. Using this chain, the position from which the laser beam is emitted and the position where it has to incise the skull can be computed in the robot coordinate system. Each step of the registration process introduces its own errors. To achieve clean cuts, the correct distance between the scanhead and the ablated bone segment is crucial. Therefore a distance sensor is attached to the end-effector, measuring the distance to the skull with an accuracy of 100 µm. This value is used to adjust the positioning to the correct distance.

I. INTRODUCTION

One of the main benefits of laser cutting of bone is the possibility of very narrow cuts. A single laser pulse ejects a piece of bone with a diameter down to 200 µm. Since the ablation process works best in the focus of the cutting laser, an inaccurate distance may not only cause a less efficient ablation through a lower energy density, but also carbonisation of the surrounding tissue [1]. The focus point of the laser with respect to the scanhead was determined in [2].

The complete registration process introduces different errors in every step. The first inaccuracies are introduced by the CT data set, e.g. by the slice thickness of the data and the segmentation of the screws.
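The transformation chain described above can be sketched with 4x4 homogeneous transforms. The following is a minimal illustration only; the numeric rotations and translations are invented for the example, not values from the paper.

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical registration results (identity rotations, for illustration):
T_ots_model = transform(np.eye(3), np.array([10.0, 0.0, 0.0]))  # model -> OTS
T_robot_ots = transform(np.eye(3), np.array([0.0, 5.0, 0.0]))   # OTS -> robot

# Chaining them maps a point planned in the model into robot coordinates.
T_robot_model = T_robot_ots @ T_ots_model

p_model = np.array([1.0, 2.0, 3.0, 1.0])   # homogeneous point in the model
p_robot = T_robot_model @ p_model
print(p_robot[:3])   # -> [11.  7.  3.]
```

Every matrix in the chain carries its own registration error, which is why the paper adds further sensors to correct the final position.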
The titanium screws on the skull are commonly registered with a pointer device, which introduces errors in acquiring the positions and in fitting them to the positions in the model. The transformation is calculated using the method of Horn [3]. The transformation between the tracking system and the robot is estimated via a modified Gauss-Newton algorithm. Although this algorithm works well, the robot introduces new errors due to the flexibility of its joints [4], despite a repeatability of up to 50 µm. To circumvent this registration error, the robot is controlled via visual servoing with the optical tracking system. In a last step the distance between skull and end-effector is corrected: the distance between the robot and the skull is measured with a laser distance sensor and compared with the distance computed from the model. The position of the robot is then corrected to obtain the best possible focusing of the CO2 laser.

II. MATERIAL & METHODS

A. Registration

A standard marker-based registration method is used. Marker spheres are attached to the skull to track it; these spheres define the skull's local coordinate system. Additionally, titanium screws are drilled into the skull and a CT scan is acquired, see figure 1. The titanium screws can be identified in the model built from the segmented CT data of the skull. On the real skull a standard pointer device is used; both are tracked by the OTS. The tip of the pointer is placed on a screw and moved around it. The tracked pointer positions, acquired in the skull's local coordinate system, are fitted to a sphere; the centre of this sphere is the position of the screw. Each titanium screw is pivotized with the pointer in this way.

978-1-4244-5335-1/09/$26.00 2009 IEEE. IEEE SENSORS 2009 Conference
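The sphere fit used for pivotizing each screw can be done with a standard algebraic least-squares formulation; since ||p - c||^2 = r^2 is linear in (2c, r^2 - c·c), the centre falls out of one linear solve. This is a generic sketch with synthetic data, not the paper's implementation.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit; returns (centre, radius)."""
    points = np.asarray(points, dtype=float)
    # ||p||^2 = 2 p.c + (r^2 - c.c)  is linear in the unknowns [c, r^2 - c.c]
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = x[:3]
    radius = np.sqrt(x[3] + centre @ centre)
    return centre, radius

# Synthetic pointer-tip samples on a sphere around a screw at (10, 20, 5) mm,
# with an assumed 75 mm distance from tip to tracked pointer marker:
rng = np.random.default_rng(0)
dirs = rng.normal(size=(50, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
samples = np.array([10.0, 20.0, 5.0]) + 75.0 * dirs
centre, radius = fit_sphere(samples)
print(np.round(centre, 3))   # -> [10. 20.  5.]
```

With noisy tracked positions the same solve gives the maximum-likelihood centre under the algebraic error metric, which is what makes pivotization robust to how the pointer is swept.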

The resulting point cloud is then registered to the point cloud from the CT data set according to Horn [3] to obtain the transformation between the model and the skull's local coordinate system, see figure 2.

Fig. 1: Bone with marker spheres, titanium screws and pointer, and the segmented 3D model

The orientation is not unambiguously defined for this task. The vector between the incision point of the laser on the skull and the virtual scanhead defines the orientation of the laser when the mirrors in the scanhead are in zero joint position, see figure 2. This is also the x-axis of the end-effector's local coordinate system. A rotation of the scanhead around this axis gives poses which are equally useful to cut the bone; only the planning of the mirror movements in the scanhead has to be adjusted to the final orientation of the scanhead. To find a well-defined orientation which is most likely reachable, the following method is used: the x-axis is given by the scanhead position and the incision point; the robot base and the x-axis define a plane; the y-axis is given by the normal vector to the x-axis inside this plane; the cross product of the two gives the z-axis.

B. Registration of robot and OTS

Marker spheres added to the robotic tool allow recording the position of the robot simultaneously in the robot coordinate system and in the OTS coordinate system. A modified Gauss-Newton algorithm is then used to calculate the transformation matrix between the robot and the tracking system. Because of the joint deflections, the TCP positions reported by the robot include an inherent bias depending on the joint configuration, and so does the transformation matrix. After the complete registration process, arbitrary positions can be translated between the coordinate system of the robot and the local coordinate system of the model.
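The point-to-point registration step can be sketched as follows. The paper uses Horn's closed-form solution with unit quaternions [3]; the sketch below uses the SVD-based solution of the same absolute-orientation problem, which yields the identical rigid transform. The screw coordinates are synthetic test data.

```python
import numpy as np

def register_points(source, target):
    """Closed-form rigid registration of corresponding point clouds.

    Returns R, t such that target_i ~= R @ source_i + t (SVD solution of
    the absolute-orientation problem; equivalent to Horn's quaternion form).
    """
    source = np.asarray(source, dtype=float)
    target = np.asarray(target, dtype=float)
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    H = (source - cs).T @ (target - ct)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cs
    return R, t

# Synthetic check: screw positions moved by a known rotation/translation.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])          # 90 degrees about z
t_true = np.array([5.0, -2.0, 1.0])
screws = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
measured = screws @ R_true.T + t_true
R, t = register_points(screws, measured)
```

With exact correspondences the known transform is recovered to machine precision; with noisy pivotized screw positions the result is the least-squares rigid fit.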
Hence it is possible to assign a point in the model and to move the robot to this point, as well as to calculate the actual position of the robot in the model. The method is described in detail in [5].

Fig. 2: KUKA lightweight robot with laser distance sensor, positioned relative to the skull

C. Pose Calculation

The position of the robot is defined inside the model. The user can define the position of the scanhead, and the program shows the corresponding target point for the laser on the skull, see figure 3. Using the position of the scanhead and the transformations acquired from the registration, the position of the robot in the coordinates of the tracking system is computed.

Fig. 3: Calculated position for the robot and the direction of the laser

D. Laser Cutting

The laser used during the experiments is a CO2 laser from Rofin (Rofin Sinar SCx10). It has a maximum output power of 100 W at a wavelength of 10 µm. The maximum repetition rate is 100 Hz with a pulse duration of ~80 µs. The complete system, consisting of laser, control PC, articulated mirror arm and an ARGES scanhead, is called OsteoLas x10 and was developed by CAESAR, Bonn for laser ablation of bone. The specimen used was a femur of cattle; the bone material was cut into slices [6] [7], see figures 4 & 5.
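The well-defined orientation described in section A (x-axis from scanhead to incision point, y-axis inside the plane spanned by the x-axis and the robot base, z = x × y) can be sketched as a frame construction. Variable names and the numeric poses are illustrative only.

```python
import numpy as np

def scanhead_frame(scanhead, incision, robot_base):
    """Build the end-effector rotation matrix from the construction above."""
    x = incision - scanhead
    x /= np.linalg.norm(x)                    # laser beam direction
    n = np.cross(x, robot_base - scanhead)    # normal of the base/x-axis plane
    n /= np.linalg.norm(n)
    y = np.cross(n, x)                        # in-plane, perpendicular to x
    z = np.cross(x, y)
    return np.column_stack([x, y, z])         # columns are the x, y, z axes

# Illustrative geometry (metres): scanhead 30 cm above the incision point.
R = scanhead_frame(np.array([0.0, 0.0, 0.3]),
                   np.array([0.0, 0.0, 0.0]),
                   np.array([-0.5, 0.0, 0.0]))
```

By construction the result is an orthonormal, right-handed rotation matrix, so any residual ambiguity is reduced to the planned rotation of the scanhead mirrors about the beam axis.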

Fig. 4: Experimental system for laser ablation of bone with a KUKA lightweight robot

Three experiments were performed. In the first experiment the bone was cut while the robot held its position. This showed the principal feasibility of laser cutting with the lightweight robot and serves as a reference for the following experiments regarding the cleanness of the cuts (figure 5, right). The second experiment tested the effect of a repeated approach on the cut: the cut was partly performed, the robot was moved away, moved back again, and so on. Regarding repeatability, the lightweight robot already proved adequate for laser cutting; a repeated approach to the target does not result in a visible widening of the cut (figure 5, middle). In the third experiment the effect of the so-called nullspace movement on the cut was tested. A nullspace movement holds the pose of the end-effector while changing the joint configuration of the redundant kinematics. When the robot performs nullspace movements during laser cutting, the cut widens slightly (figure 5, left). This problem arises from minimal inaccuracies caused by the swinging of the robot.

Since the registration error between OTS and robot is compensated by the control scheme, the main error source is the registration of the patient. The precision is therefore already good enough, but the accuracy can be improved: the error propagated through the complete registration and calibration chain is up to 1 mm so far.

Fig. 5: Results of the first cutting experiments. The right cut was performed while the robot was holding its position. The left cut was performed with a nullspace movement of the robot, a movement of the redundant kinematics with identical pose. The middle cut was performed with a repeated approach of the robot to the target, to test the repeatability.

E. Accuracy of robot and tracking system

The accuracy of the KUKA lightweight robot is measured with a measurement arm from FARO, see figure 2.
This measurement arm has a total accuracy of 50 µm. To test the robot, a cube of 20 cm edge length is scanned with a step size of 10 cm, 27 points in total. For every combination of two points (27·13 = 351 pairs) the distance is calculated and compared with the planned distance.

Fig. 6: Comparison of the accuracies of the ART tracking system and the KUKA lightweight robot, with the FARO measurement arm as reference

The deviation is calculated and averaged over equal planned distances, with the FARO measurement arm as reference. The test is performed both with the OTS and with the robot. The result is a linear drift behaviour of the optical tracking system, see figure 6. The general tendency for the robot is that larger distances from the origin result in larger errors, comparable to the tracking system. The robot's error, however, is nonlinear, because its main contribution comes from the nonlinear joint elasticities of the kinematics. The lightweight robot measures the force in each joint and performs a model-based compensation of the weight of the end-effector. This already yields quite good repeatability, but deflection and calibration errors that cannot be measured result in an accuracy lower than the repeatability.
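The pairwise-distance evaluation above can be sketched as follows. The grid and noise level are synthetic stand-ins (a 3x3x3 grid with ~0.1 mm Gaussian noise), chosen only to show the bookkeeping of the 351 point pairs.

```python
import numpy as np
from itertools import combinations
from collections import defaultdict

def distance_deviation(planned, measured):
    """Mean |measured - planned| pairwise distance, grouped per planned
    distance (27 grid points -> 27*13 = 351 pairs)."""
    errors = defaultdict(list)
    for i, j in combinations(range(len(planned)), 2):
        d_plan = np.linalg.norm(planned[i] - planned[j])
        d_meas = np.linalg.norm(measured[i] - measured[j])
        errors[round(d_plan, 6)].append(abs(d_meas - d_plan))
    return {d: float(np.mean(e)) for d, e in sorted(errors.items())}

# 20 cm cube, 10 cm step (values in metres):
grid = np.array([[x, y, z] for x in (0.0, 0.1, 0.2)
                           for y in (0.0, 0.1, 0.2)
                           for z in (0.0, 0.1, 0.2)])
rng = np.random.default_rng(1)
measured = grid + rng.normal(scale=1e-4, size=grid.shape)  # ~0.1 mm noise
stats = distance_deviation(grid, measured)
```

Plotting the per-distance means against the planned distance is what reveals the linear drift of the tracking system versus the nonlinear error of the robot in figure 6.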

F. Laser Distance Sensor

A laser distance sensor mounted on the robotic end-effector is used to determine the distance to the patient, see figure 2. The distance sensor outperforms the tracking system in accuracy (0.1 mm for the laser sensor versus 0.3 mm for the tracking system) and in speed (1 kHz for the distance sensor versus 60 Hz for the tracking system). Its delay is also lower: <10 ms, compared with 80 ms for the optical tracking system and 100 ms for the robot. The error introduced by the registration can thus be resolved with respect to the distance to the patient. For this reason the laser sensor is used to reposition the robot along the axis, defined inside the model, between the incision point on the skull and the point defined by the user, to obtain the best position for the focus point of the laser. The accuracy along this axis is between 0.1 and 0.2 mm. If the distance deviates, the laser can be turned off before the tracking system delivers the positioning data.

G. Visual Servoing Control Schema

Fig. 7: Schema for the calculation of the desired position and the control schema for the visual servoing application

The position for the robot is calculated inside the 3D model. The visual servoing control loop reads this data from the segmented 3D model and, with the transformation matrices from the registration and calibration process, the position of the skull in the model and in reality is transformed to the coordinate system of the optical tracking system [8]. The difference, with an applied threshold filter to avoid high speeds, is sent to the robot. To avoid oscillation of the robot, the error is damped before it is sent, because of the high delays of the different systems (OTS 80 ms and robot 160 ms).

Fig. 8: Positioning error with visual servoing

H. Robotic End-Effector

Fig. 9: End-effector with camera, laser distance sensor, scanhead, marker spheres and pilot laser. In gravity mode it is possible to guide the robot with the end-effector by hand.

The end-effector shown in figure 9 integrates all needed parts. On top of the end-effector, calibration spheres are used to track its position within the working area, using the optical multi-camera tracking system from ART. A laser distance sensor measures and controls the distance to the target; it works with an accuracy of 100 µm. Its working distance is 5 cm, which explains the length of the plate it is mounted on, specially designed to assure proper functioning of the distance sensor. Using these sensors, the accuracy increases from 1-2 mm to 0.2 mm for the distance and to 0.4-0.5 mm in total. The CO2 cutting laser is delivered through an articulated mirror arm, which is attached to the scanhead through a coupling plate.

III. RESULTS

The setup shows the influence of different error sources: registration, calibration, absolute accuracy and repeatability of the robot, modelling and measurement. The error influenced by the robot is 1-2 mm and can be reduced to 0.2-0.6 mm with the optical tracking system and visual servoing. Before the correction of the distance, the error for the complete process, including the positioning of the robot and the transformations between the model, the optical tracking system and the robot coordinate system, is about 1 mm. With the laser distance sensor the error for the distance to the skull and the focus point can be reduced to 0.2 mm. This shows the benefit of a multi-sensor concept for the accuracy of laser cutting.
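The two corrections behind the reported accuracy gain (the damped, thresholded visual-servoing update of section G and the distance repositioning of section F) can be sketched as below. The gains, step limits and sign convention of the axis correction are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

def servo_step(p_target, p_measured, gain=0.25, max_step=0.002):
    """One damped visual-servoing update (values in metres, illustrative).

    The raw error between the planned pose (transformed from the model into
    OTS coordinates) and the tracked pose is damped and clipped before being
    sent to the robot, to avoid oscillation caused by the system delays.
    """
    step = gain * (p_target - p_measured)     # damping against oscillation
    norm = np.linalg.norm(step)
    if norm > max_step:                       # threshold to avoid high speeds
        step *= max_step / norm
    return step

def distance_correction(p_cmd, axis, d_measured, d_model):
    """Reposition along the laser axis so the measured stand-off matches the
    distance computed from the model (assumed sign: positive axis points
    from scanhead towards the skull)."""
    axis = axis / np.linalg.norm(axis)
    return p_cmd + (d_measured - d_model) * axis

# A 10 mm error is damped to 2.5 mm and clipped to the 2 mm step limit:
step = servo_step(np.array([0.0, 0.0, 0.01]), np.zeros(3))
# A 2 mm stand-off surplus moves the commanded position 2 mm along the axis:
p_new = distance_correction(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                            0.052, 0.050)
```

Because the distance sensor runs at 1 kHz with <10 ms delay, the second correction can act between the much slower tracking updates.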

IV. OUTLOOK

Currently, different approaches are being investigated to increase the accuracy of the system. Firstly, a camera attached to the end-effector is used to track the marker body of the tracking system; this data could be used to increase the accuracy of the tracking system. The data of the laser distance sensor could be integrated better into the system with a data fusion approach, such as a Kalman filter that fuses the positioning data of the robot, the tracking system and the distance sensor. The camera could also be used to track the position of the pilot laser to verify the scanhead calibration, and to supervise the cutting trajectory. Furthermore, a motion compensation algorithm is currently being tested to decrease the error in motion, which is mainly introduced by the delays of the different systems. For the motion compensation we are also testing additional laser scanners or distance sensors to acquire the error in 3D faster than is possible with the optical tracking system. This data could be used to decrease the error in motion further by using the scanhead, which can be controlled faster than the robot.

ACKNOWLEDGMENT

This work has been funded by the European Commission's Sixth Framework Programme within the project Accurate Robot Assistant (ACCUROBAS) under grant no. 045201.

REFERENCES

[1] M. Werner: Ablation of hard biological tissue and osteotomy with pulsed CO2 lasers. Dissertation, Heinrich-Heine University of Düsseldorf, Germany, 2006.
[2] J. Burgner, J. Raczkowsky, H. Wörn: End-effector calibration and registration procedure for robot assisted laser material processing: Tailored to the particular needs of short pulsed CO2 laser bone ablation. ICRA 2009, Kobe, Japan.
[3] B. K. P. Horn: Closed-form solution of absolute orientation using unit quaternions. JOSA A, vol. 4, pp. 629-642, 1987.
[4] A. Albu-Schäffer, G. Hirzinger: Cartesian Impedance Control Techniques for Torque Controlled Light-Weight Robots. ICRA 2002, Washington DC, USA.
[5] D. Stein, H. Mönnich, J. Raczkowsky, H. Wörn: Automatic and hand guided self-registration between a robot and an optical tracking system. ICAR 2009.
[6] H. Peters, H. Knoop, W. Korb, et al.: Bringing laser for osteotomy into the operation theatre. CARS 2005, Berlin, Germany, June 2005.
[7] http://www.caesar.de/, 2009.
[8] S. Hutchinson, G. Hager, P. Corke: A tutorial on visual servo control. IEEE Transactions on Robotics and Automation, 1996.