High-accuracy ultrasound target localization for hand-eye calibration between optical tracking systems and three-dimensional ultrasound




Ralf Bruder 1, Florian Griese 2, Floris Ernst 1, Achim Schweikard 1
1 Institute for Robotics and Cognitive Systems, University of Lübeck
2 University of Lübeck
bruder@rob.uni-luebeck.de

Abstract. Real-time target localization in ultrasound is useful in many clinical and scientific areas. In radiation therapy, for example, tumors can be localized in real time and irradiated with high accuracy. To measure the position of an ultrasound target in a global coordinate system, or to extend the tracking volume by moving the ultrasound transducer, an optical marker is attached to the transducer and observed by an optical tracking system. The necessary calibration matrices from marker to ultrasound volume are obtained using hand-eye calibration algorithms, which take sets of corresponding observations of the optical marker and an ultrasound target as input. The quality of these calibration matrices depends strongly on the measured observations. While the accuracy of optical tracking systems is very high, accurate tracking in ultrasound is difficult because of the low resolution of the ultrasound volume, artifacts and noise. Accurate hand-eye calibration between ultrasound and optical tracking systems is therefore difficult. We have tested different phantoms, matching and sub-pixel strategies to provide highly accurate tracking results in 3D ultrasound volumes as the basis for hand-eye calibration. Tests have shown that, using the described methods, calibration results with RMS errors of less than 1 mm between observed and calibrated targets can be reached.

1 Introduction

With the recent enhancements in imaging quality and speed, three-dimensional ultrasound has become attractive for automatic guidance during robotized interventions and for high-accuracy target tracking applications.
We use a modified GE Vivid 7 Dimension 3D cardiovascular ultrasound station for real-time volume processing and direct target localization in radiosurgery [1]. This station is capable of providing ultrasound volume scans of the target region at more than 20 frames per second. A framework was established to run image-processing algorithms directly on the ultrasound machine in order to track targets inside the ultrasound volume. To map the obtained target positions to world coordinates, the ultrasound transducer is itself tracked using an attached optical marker and an accuTrack 250 (atracsys LLC, Bottens, CH) optical tracking system. The position of the ultrasound target in world coordinates can then be calculated from both tracking results and the static transformation between ultrasound transducer and optical marker. Among other strategies, the best results for this transformation are usually obtained [2] using a hand-eye calibration algorithm [3] and the calibration setup shown in Fig. 1. The equation T_i B U_i = C is solved for the unknown calibration matrix B by capturing multiple sets of corresponding optical and ultrasound positions T_i and U_i for different transducer positions. One problem of hand-eye calibration is that the quality of the measured position sets strongly influences the calibration result. While the measurements of optical tracking systems are highly accurate, tracking in ultrasound is difficult because of the low resolution of the volume data, artifacts and noise. To obtain an accurate calibration, this tracking has to be optimized.

2 Materials and Methods

A calibration setup is used (Fig. 1). An application has been developed to capture both ultrasound and optical tracking results. To obtain corresponding datasets from both systems, the capture time of each measurement is calculated using high-accuracy timestamps and an estimate of the system latencies. The optical tracking result is then chosen to match the capture time of the corresponding ultrasound volume. For intuitive handling during hand-held sample acquisition, an automatic capture trigger algorithm has been implemented in addition to the manually triggered capture process. This algorithm triggers during phases of minimal movement in both tracking results; in this way, motion artifacts in the measured tracking results are further minimized. After completion of a predefined sample count, the result of the hand-eye calibration is calculated and the distance error between the calibration and the measured points is computed.
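For intuition, the calibration equation T_i B U_i = C can be set up as a linear least-squares problem in the entries of B and the fixed world point C, with the rotation part projected back onto SO(3) afterwards. The sketch below is an illustrative DLT-style formulation, not necessarily the algorithm of [3]; the function name is ours and numpy is assumed.

```python
import numpy as np

def solve_hand_eye(T_list, u_list):
    """Least-squares sketch for T_i B u_i = c.

    T_list: optical marker poses (4x4 arrays), u_list: target positions
    in the ultrasound volume (3-vectors).  Returns the marker-to-volume
    transform B (4x4) and the fixed world point c.  Unknowns (15 total):
    the 9 rotation entries of B, its translation t, and c.  Each
    observation contributes 3 linear equations:
        R_Ti @ (R @ u_i + t) + t_Ti = c
    """
    A, b = [], []
    for T, u in zip(T_list, u_list):
        Rt, tt = T[:3, :3], T[:3, 3]
        for row in range(3):
            eq = np.zeros(15)
            for k in range(3):          # coefficients of the entries R[k, j]
                for j in range(3):
                    eq[3 * k + j] = Rt[row, k] * u[j]
            eq[9:12] = Rt[row]          # coefficients of t
            eq[12 + row] = -1.0         # coefficient of c
            A.append(eq)
            b.append(-tt[row])
    x, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    R = x[:9].reshape(3, 3)
    # project the estimated rotation back onto SO(3)
    U, _, Vt = np.linalg.svd(R)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    B = np.eye(4)
    B[:3, :3], B[:3, 3] = R, x[9:12]
    return B, x[12:15]
```

With noise-free observations this recovers B exactly; with real measurements, the residual of the fit corresponds to the distance error used above to judge calibration quality.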
Depending on these values, an iterative post-processing algorithm identifies up to ten percent of the samples with high distance errors, eliminates these samples and recalculates the calibration result. In this way, errors such as wrong target detections in ultrasound are eliminated. Two types of phantoms are used for tracking in ultrasound: a single target (Fig. 2, left) is used to maximize tracking accuracy, while later a complex phantom (Fig. 2, right) with multiple features is used to increase targeting accuracy and reduce the number of transducer positions that must be measured.

Fig. 1. Setup for hand-eye calibration.

2.1 Tracking a single target

A lead bulb on a nylon wire in a liquid tank is used as target. The lead bulb reflects a high amount of ultrasound energy and can easily be detected in the ultrasound volume using a maximum intensity search. As our ultrasound volume has a worst-case resolution of more than 1.5 mm per pixel, this method alone is not sufficient for high-accuracy tracking. To overcome this limitation, we use cubic splines to interpolate the target region around a detected position with maximum intensity. The extreme value of the interpolated volume is used as a sub-pixel approximation of the lead bulb position. Another tracking possibility is template matching: using a predefined pattern of the lead bulb, the sum of squared distances (SSD) is calculated to find an optimal match between the pattern and the lead bulb. With sub-pixel enhancement through iterative interpolation of the target region, a high-accuracy tracking result can be obtained. Fig. 3b shows the different tracking positions for a two-dimensional slice of the target. To test the sub-pixel quality of each strategy, a lead bulb is moved through a liquid tank on a linear trajectory while being continuously tracked (Fig. 3c). While both strategies increase the tracking accuracy, template matching shows the best linearity and the highest accuracy.

2.2 Tracking multiple targets

Three-dimensional ultrasound offers the possibility to track multiple targets inside one volume simultaneously.
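The single-target localization of Sec. 2.1 combines a maximum-intensity search with interpolation around the detected peak. The sketch below illustrates the idea; for brevity it refines the peak with a per-axis parabolic fit through the maximum and its two neighbours instead of the cubic-spline interpolation used in the paper (numpy assumed, function name illustrative).

```python
import numpy as np

def subvoxel_peak(volume):
    """Locate an intensity maximum with sub-voxel precision.

    Step 1: plain maximum-intensity search over the whole volume.
    Step 2: along each axis, fit a parabola through the peak voxel and
    its two neighbours and take the vertex as the refined coordinate
    (a simple stand-in for the paper's cubic-spline interpolation).
    """
    idx = np.unravel_index(np.argmax(volume), volume.shape)
    pos = np.array(idx, dtype=float)
    for ax in range(volume.ndim):
        i = idx[ax]
        if 0 < i < volume.shape[ax] - 1:
            lo = volume[idx[:ax] + (i - 1,) + idx[ax + 1:]]
            mid = volume[idx]
            hi = volume[idx[:ax] + (i + 1,) + idx[ax + 1:]]
            denom = lo - 2.0 * mid + hi
            if denom != 0.0:
                # vertex of the parabola through (-1, lo), (0, mid), (1, hi)
                pos[ax] += 0.5 * (lo - hi) / denom
    return pos
```

Template matching follows the same pattern: the SSD score map takes the place of the intensity volume, and the refinement is applied around the best-matching offset.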
Fig. 2. Ultrasound transducer with attached optical marker on the phantoms. Left: lead bulb phantom; right: complex phantom.

We use a complex wire phantom (Dansk Fantom Service, Fig. 4a) to find a defined geometry of twisted nylon wires inside the volume. The wires are clearly visible in the ultrasound volume and can easily be tracked using a repeated maximum intensity search with an additional distance criterion, so that features in close proximity to already located points are not detected again. Alternatively, template matching can be used with multiple predefined patterns for wires at different angles to the beam. The obtained feature positions are matched against a predefined virtual phantom, or against the first captured phantom, using the ICP algorithm [4], which may be replaced by RANSAC [5] to eliminate erroneous feature detections. The resulting transformation matrix between the virtual and the detected feature cloud is used as tracking result U_i. Fig. 4 shows the alignment results.

Fig. 3. A captured lead bulb with artifacts is shown in (a). The crosses in (b) show the maximum intensity (red), the spline-interpolated maximum intensity (black) and the center position of a found template using sub-pixel matching (blue). (c) shows the observed position changes for a linear lead bulb movement using the same color scheme.

Fig. 4. The complex wire phantom in ultrasound is shown in (a). In (b) the extracted feature cloud (green) is registered to the predefined phantom (blue). While RANSAC (black) works as expected, the ICP result (red) is rotated due to false feature detections.
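The feature-cloud registration above can be illustrated with a minimal point-to-point ICP: match each feature to its nearest counterpart, fit a rigid transform with the SVD method of Arun et al., and repeat. This is a sketch under simplified assumptions, not the variant of [4]; a RANSAC alternative would instead fit transforms on random minimal subsets and keep the one with the most inliers, which is what makes it robust to the false feature detections visible in Fig. 4b.

```python
import numpy as np

def best_rigid(src, dst):
    """SVD-based rigid alignment (Arun et al.) of matched point sets."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                            # reflection-safe rotation
    return R, cd - R @ cs

def icp(src, dst, iters=50):
    """Minimal point-to-point ICP: nearest-neighbour matching plus a
    rigid fit, iterated until the correspondences stabilize."""
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = src @ R.T + t
        d2 = ((moved[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]          # closest dst per src point
        R, t = best_rigid(src, matched)
    return R, t
```

As in Fig. 4b, plain ICP converges only when the initial nearest-neighbour matches are mostly correct; outliers in the feature cloud can rotate the result, which motivates the RANSAC replacement.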

Table 1. Calibration results.

Calibration method                      Positions                        RMS error
Lead bulb, standard resolution          100                              1.811 mm
Lead bulb, sub-pixel enhanced           50                               0.773 mm
Complex phantom, standard resolution    3 positions, 50 targets each     2.330 mm
Complex phantom, sub-pixel enhanced     3 positions, 50 targets each     0.505 mm

3 Results

To measure the calibration quality, we compute the RMS distance error between all calibrated and measured target points. Table 1 shows this quality measure for the different calibration scenarios. For the single target, an automatic calibration with 100 captured points is performed without manual interaction. For the complex phantom, measurements at three different positions with 50 located targets in each volume are performed. Both calibration methods show acceptable results with RMS errors of less than 2 mm at standard resolution. With high-accuracy ultrasound target localization, both methods can be significantly improved: for single targets, the RMS error can be reduced to less than 1 mm using only half the number of captured points. As ICP is very sensitive to position errors, the accuracy gain for complex phantom calibration due to better target localization is even higher.

4 Conclusion

Two calibration methods were implemented and enhanced with sub-pixel target localization. Especially with complex phantoms, a stable calibration can be performed in only three steps, which makes the method attractive for common use.

References

1. Bruder R, Ernst F, Schlaefer A, et al. Real-time tracking of the pulmonary veins in 3D ultrasound of the beating heart. In: 51st Annual Meeting of the AAPM. vol. 36 of Med Phys; 2009. p. 2804.
2. Poon TC, Rohling RN. Comparison of calibration methods for spatial tracking of a 3-D ultrasound probe. Eur J Ultrasound. 2005;31(8):1095–108.
3. Tsai RY, Lenz RK. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration. IEEE Trans Rob Autom. 1989;5(3):345–58.
4. Rusinkiewicz S, Levoy M. Efficient variants of the ICP algorithm. In: Proc Int Conf 3D Digit Imaging Model; 2001. p. 145–52.
5. Chen C, Hung Y, Cheng J. RANSAC-based DARCES: a new approach to fast automatic registration of partially overlapping range images. IEEE Trans Pattern Anal Mach Intell. 1999;21:1229–34.