Parallel Tracking and Mapping for Small AR Workspaces

Parallel Tracking and Mapping for Small AR Workspaces
Georg Klein and David Murray, Active Vision Lab, Oxford
This is a PDF of the slides of the talk given at ISMAR 2007.

Aim
- AR with a hand-held camera
- Visual tracking provides registration
- Track without a prior model of the world
- Challenges: speed, accuracy, robustness, interaction with the real world

Existing attempts: SLAM
- Simultaneous Localisation and Mapping
- Well-established in robotics (using a rich array of sensors)
- Demonstrated with a single handheld camera by Davison 2003
(Courtesy of Oxford Mobile Robotics Group)

SLAM applied to AR
- Davison et al 2004
- Williams et al ICCV 2007
- Reitmayr et al ISMAR 2007
- Chekhlov et al ISMAR 2007

Model-based tracking vs SLAM
Lepetit, Vacchetti & Fua ISMAR 2003

Model-based tracking vs SLAM
Model-based tracking is:
- More robust
- More accurate
Why? SLAM fundamentally harder?

Frame-by-frame SLAM: DIFFICULT!
- Find features
- Update camera pose and entire map (many DOF)
- Draw graphics
(all within one frame's time)

Frame-by-frame SLAM
- Updating entire map every frame is expensive
- "Mandates sparse map of high-quality features" - A. Davison
Our approach:
- Use dense map (of low-quality features)
- Don't update the map every frame: keyframes
- Split the tracking and mapping into two threads

Parallel Tracking and Mapping: Easy! :-)
Thread 1: Tracking (one frame)
- Find features
- Update camera pose (6-DOF)
- Draw graphics
Thread 2: Mapping
- Update map

Thread 1: Tracking
- Responsible for estimation of camera pose and rendering augmented graphics
- Must run at 30Hz
- Make as robust and accurate as possible
Thread 2: Mapping
- Responsible for providing the map
- Can take lots of time per keyframe
- Make as rich and accurate as possible

Mapping thread
- Stereo initialisation
- Wait for new keyframe (from tracker)
- Add new map points
- Optimise map
- Map maintenance

Stereo initialisation
- Use five-point-pose algorithm (Stewenius et al '06)
- Requires a pair of frames and feature correspondences
- Provides initial map
- User input required: two clicks for two keyframes, smooth motion for feature correspondence

Wait for new keyframe
Keyframes are only added if:
- There is a baseline to the other keyframes
- Tracking quality is good
When a keyframe is added:
- The mapping thread stops whatever it is doing
- All points in the map are measured in the keyframe
- New map points are found and added to the map

Add new map points
- Want as many map points as possible
- Check all maximal FAST corners in the keyframe: check Shi-Tomasi score, check if already in map
- Epipolar search in a neighbouring keyframe
- Triangulate matches and add to map
- Repeat at four image pyramid levels

Optimise map
- Use batch SfM method: bundle adjustment*
- Adjusts map point positions and keyframe poses
- Minimises reprojection error of all points in all keyframes (or use only last N keyframes)
- Cubic complexity with keyframes, linear with map points
- Compatible with M-estimators (we use Tukey)
* According to Engels, Stewenius and Nister, this rules
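A minimal bundle-adjustment sketch using SciPy, jointly refining keyframe poses and map points by minimising reprojection error. SciPy's robust losses do not include Tukey, so 'soft_l1' stands in for the Tukey M-estimator the slide mentions; all names here are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(params, n_cams, n_pts, K, cam_idx, pt_idx, obs):
    """Stacked pixel residuals for all observations.

    params packs, per camera, a rotation vector and translation (6 values),
    followed by 3 coordinates per map point. cam_idx/pt_idx say which
    camera and point each of the M observations (obs, Mx2) belongs to.
    """
    poses = params[:n_cams * 6].reshape(n_cams, 6)
    points = params[n_cams * 6:].reshape(n_pts, 3)
    R = Rotation.from_rotvec(poses[cam_idx, :3])
    Xc = R.apply(points[pt_idx]) + poses[cam_idx, 3:]   # world -> camera
    proj = (K @ Xc.T).T
    proj = proj[:, :2] / proj[:, 2:3]
    return (proj - obs).ravel()

def bundle_adjust(poses0, points0, K, cam_idx, pt_idx, obs):
    x0 = np.hstack([poses0.ravel(), points0.ravel()])
    res = least_squares(reprojection_residuals, x0, loss='soft_l1',
                        args=(len(poses0), len(points0), K, cam_idx, pt_idx, obs))
    n = len(poses0) * 6
    return res.x[:n].reshape(-1, 6), res.x[n:].reshape(-1, 3)
```

The dense solve here scales cubically with the number of keyframes, matching the complexity note on the slide; production BA exploits the sparse camera-point structure.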

Map maintenance
- When the camera is not exploring, the mapping thread has idle time: use this to improve the map
- Data association in bundle adjustment is reversible
- Re-attempt outlier measurements
- Try to measure new map features in all old keyframes

Tracking thread
- Responsible for estimation of camera pose and rendering augmented graphics
- Must run at 30Hz
- Make as robust and accurate as possible
- Track/render loop with two tracking stages

Tracking thread
- Pre-process frame
- Coarse stage: project points, measure points, update camera pose
- Fine stage: project points, measure points, update camera pose
- Draw graphics

Pre-process frame
- Make mono and RGB versions of the image
- Make four pyramid levels: 640x480, 320x240, 160x120, 80x60
- Detect FAST corners

Project points
- Use motion model to update camera pose
- Project all map points into the image to see which are visible, and at what pyramid level
- Choose subset to measure: ~50 biggest features for the coarse stage, 1000 randomly selected for the fine stage
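The projection-and-visibility test can be sketched as follows (a minimal version: it ignores pyramid-level selection and radial distortion, and the function name is mine):

```python
import numpy as np

def project_points(points_w, R, t, K, shape):
    """Project map points into the image and flag the visible ones.

    points_w: Nx3 world points; R, t: world-to-camera pose; K: intrinsics;
    shape: (height, width) of the image. Returns (uv, visible_mask).
    """
    Xc = points_w @ R.T + t                       # world -> camera frame
    in_front = Xc[:, 2] > 0
    uvw = Xc @ K.T
    # Guard the divide for points behind the camera (they are culled anyway)
    uv = uvw[:, :2] / np.where(in_front, uvw[:, 2], 1.0)[:, None]
    h, w = shape
    visible = in_front & (uv[:, 0] >= 0) & (uv[:, 0] < w) \
                       & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return uv, visible
```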

Measure points
- Generate 8x8 matching template (warped from source keyframe)
- Search a fixed radius around the projected position
- Use zero-mean SSD
- Only search at FAST corner points
- Up to 10 inverse-composition iterations for subpixel position (for some patches)
- Typically find 60-70% of patches
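The zero-mean SSD score, evaluated only at FAST corner candidates, can be sketched as below (illustrative names; the subpixel inverse-composition refinement is omitted):

```python
import numpy as np

def zero_mean_ssd(template, patch):
    """Zero-mean sum of squared differences between two equal-size patches.

    Subtracting each patch's mean makes the score robust to uniform
    brightness changes between the keyframe and the current frame.
    """
    t = template.astype(np.float64) - template.mean()
    p = patch.astype(np.float64) - patch.mean()
    return np.sum((t - p) ** 2)

def best_match(template, image, candidates):
    """Pick the corner candidate whose patch best matches the template.

    candidates: list of (x, y) FAST-corner positions inside the search radius.
    Returns ((x, y), score) for the lowest-scoring candidate.
    """
    h, w = template.shape
    best, best_score = None, np.inf
    for (x, y) in candidates:
        patch = image[y:y + h, x:x + w]
        if patch.shape != template.shape:
            continue  # candidate too close to the image border
        score = zero_mean_ssd(template, patch)
        if score < best_score:
            best, best_score = (x, y), score
    return best, best_score
```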

Update camera pose
- 6-DOF problem
- 10 IRWLS iterations
- Tukey M-estimator
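The robust pose update can be illustrated with a generic IRWLS loop using the Tukey biweight. In PTAM the linear system stacks 2x6 point-measurement Jacobians and solves for the 6-DOF pose update; the sketch below (names and constants mine) works for any linear system:

```python
import numpy as np

def tukey_weight(r, c=4.685):
    """Tukey biweight: down-weights large residuals, zeroes gross outliers."""
    w = np.zeros_like(r, dtype=float)
    inl = np.abs(r) < c
    w[inl] = (1.0 - (r[inl] / c) ** 2) ** 2
    return w

def irwls(J, z, iters=10):
    """Iteratively reweighted least squares: solve J x ~= z robustly."""
    x = np.linalg.lstsq(J, z, rcond=None)[0]      # plain LS initialisation
    for _ in range(iters):
        r = z - J @ x
        # Robust scale estimate via the median absolute deviation
        sigma = 1.4826 * np.median(np.abs(r)) + 1e-12
        w = tukey_weight(r / sigma)
        Jw = J * w[:, None]
        x = np.linalg.solve(J.T @ Jw, Jw.T @ z)   # weighted normal equations
    return x
```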

Draw graphics
- What can we draw in an unknown scene?
- Assume a single plane visible at start
- Run VR simulation on the plane
- Radial distortion
- Want proper blending

Rendering pipeline:
- Input image: 640x480
- Undistort: 1600x1200
- Render: 1600x1200
- Re-distort: 640x480

Tracking quality monitoring
- Heuristic check based on fraction of found measurements
- Three quality levels: Good, Poor, Lost
- Only add to map on 'Good'
- Stop tracking and relocalise on 'Lost'
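The heuristic can be sketched as a simple classifier over the fraction of found measurements; the threshold values here are illustrative, not the ones PTAM uses.

```python
def tracking_quality(n_found, n_attempted, good_frac=0.65, poor_frac=0.25):
    """Classify tracking quality from the fraction of found measurements.

    Returns 'good' (safe to add keyframes to the map), 'poor' (keep tracking
    but do not map), or 'lost' (stop tracking and relocalise).
    """
    if n_attempted == 0:
        return 'lost'
    frac = n_found / n_attempted
    if frac >= good_frac:
        return 'good'
    if frac >= poor_frac:
        return 'poor'
    return 'lost'
```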

Results Is it any good? Yes

Comparison to EKF-SLAM (ICCV 2007, ISMAR 2007)
- More accurate
- More robust
- Faster tracking

Results Video ISMAR 2007

Areas in need of improvement
- Outlier management
- Still brittle in some scenarios: fine repeated texture, stereo initialisation
- No occlusion reasoning
- Point cloud is inadequate for AR

Point cloud is inadequate for AR (no occlusion reasoning)
- User interaction?
- Automatic primitive detection? Reitmayr et al ISMAR 2007
- Live dense reconstruction? Chekhlov et al ISMAR 2007

Parallel Tracking and Mapping for Small AR Workspaces Georg Klein and David Murray Active Vision Lab, Oxford