
Laser Pointer Tracking in Projector-Augmented Architectural Environments Daniel Kurz, Ferry Häntsch, Max Große, Alexander Schiewe, Oliver Bimber Bauhaus-University Weimar, ARGroup

What is this about?

Motivation Spatial Augmented Reality for Architecture Visualization Interaction Projector-based AR Laser pointer tracking Video see-through AR

Outline Related work System overview Custom-built PTZ camera Distributed software framework System calibration Self-registration Texture capturing Projector calibration System while in use Laser pointer tracking Applications Video see-through LED tracking Conclusions Summary Results and Discussion Future work

Related Work Laser Pointer Interaction Interaction techniques Mapping position to mouse movement [Kirstein98] Gesture-controlled events [Olsen01] Customized hardware Additional buttons [Oh02] [Bi05] Multiple laser rays [Matveyev03] [Wienss06] Infrared laser pointer Avoids visible misregistration and jitter [Cavens02] [Cheng03] Large displays/multi-user Discrimination of lasers [Davis02] [Oh02] Multiple cameras [Ahlborn05]

Related Work PTZ Camera Systems Calibration of multi-projector systems Un-calibrated camera [Chen00] Acquisition of environment geometry and texture Depth enhanced panoramas [Bahmutov04] Large scale acquisition [Bahmutov06] PTZ camera and projector [Pinhanez01, Ehnes04] converts everyday surfaces into interactive displays Projector-based display Finger tracking

How does our system work?

Custom-built PTZ Camera Two video cameras PTZ detail camera Wide-angle context camera Microcontroller Stepper motors Attached laser module Connected to a PC Image analysis System control

Distributed Software Framework Client/server-based architecture

System Calibration System's intrinsic parameters Remain constant Pre-calibrated System's position and orientation Changes when placed in an environment/room Will be estimated fully automatically

Automatic Self-Registration Reference model Taken in advance as part of the architectural surveying process Low resolution Stored on server In world coordinate system Sampling the room Adaptive LOD sampling avoids oversampling Results in a point cloud in device coordinate system
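A minimal sketch of how such device-coordinate samples could be formed, assuming each sample consists of the device's pan/tilt configuration plus a measured distance along the viewing ray (the axis conventions below are an assumption, not the authors' exact device model):

```python
# Sketch only: converts PTZ samples (pan, tilt, distance) into Cartesian
# points in the device coordinate system. Assumes pan rotates about the
# device's vertical axis and tilt about the horizontal axis.
import numpy as np

def ptz_sample_to_point(pan_deg, tilt_deg, distance):
    pan = np.radians(pan_deg)
    tilt = np.radians(tilt_deg)
    # Direction of the measuring ray for the given pan/tilt configuration.
    d = np.array([np.cos(tilt) * np.sin(pan),
                  np.sin(tilt),
                  np.cos(tilt) * np.cos(pan)])
    return distance * d

# Example: a wall point sampled at pan = 30 deg, tilt = -10 deg, 3.2 m away.
p = ptz_sample_to_point(30.0, -10.0, 3.2)
```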

Automatic Self-Registration (cont'd) Match sampled points to reference model Outliers (missing details in model) are removed Final rigid body transformation matrix (position and orientation) is estimated numerically
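A minimal sketch of the final estimation step, assuming correspondences between the sampled cloud and the reference model have already been established and outliers removed. The SVD-based Kabsch method below is a standard way to compute such a rigid transform, not necessarily the authors' exact solver:

```python
# Estimate a rigid transform (R, t) from corresponding point pairs so that
# model_pts ~ R @ device_pts + t (Kabsch / orthogonal Procrustes).
import numpy as np

def estimate_rigid_transform(device_pts, model_pts):
    """device_pts, model_pts: (N, 3) arrays of corresponding points."""
    cd, cm = device_pts.mean(axis=0), model_pts.mean(axis=0)
    H = (device_pts - cd).T @ (model_pts - cm)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cm - R @ cd
    return R, t
```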

Texture Capturing Capturing environment's reflectance Composed of several photos taken under different orientations Short exposure version Everything appears black except bright areas Binarized version is used to mask out such critical regions (later)
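A small sketch of how the binary mask of critical regions could be derived from the short exposure photo (OpenCV-based; the threshold value and the dilation step are assumptions for illustration):

```python
# Build the binary "critical regions" mask from a short exposure image:
# everything is dark except bright areas such as lamps and windows,
# which are marked for later exclusion.
import cv2

def critical_region_mask(short_exposure_gray, threshold=200):
    _, mask = cv2.threshold(short_exposure_gray, threshold, 255,
                            cv2.THRESH_BINARY)
    # Slightly grow the regions so halos around bright lights are covered too.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.dilate(mask, kernel)
```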

Projector Calibration Unaligned projectors Automatic calibration One after another Geometric projector calibration PTZ camera delivers 3D positions of known projected 2D points Radiometric compensation Blending
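To illustrate the geometric part: once the PTZ camera has measured the 3D surface positions of known projected 2D points, a projector pose can be recovered by treating the projector as an inverse camera. The sketch below assumes the projector's intrinsic matrix is already known and uses OpenCV's solvePnP; the authors' calibration may also recover intrinsics, so this shows only the principle:

```python
# Recover a projector's pose from 2D-3D correspondences: the projected
# pixel coordinates act as "image points", the measured 3D surface
# positions as "object points".
import numpy as np
import cv2

def estimate_projector_pose(surface_3d, pattern_2d, K):
    """surface_3d: (N, 3) measured points, pattern_2d: (N, 2) projected
    pixel coordinates, K: 3x3 projector intrinsic matrix."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(surface_3d, dtype=np.float32),
        np.asarray(pattern_2d, dtype=np.float32),
        K, None)                    # no lens distortion assumed
    R, _ = cv2.Rodrigues(rvec)      # rotation vector -> 3x3 matrix
    return R, tvec
```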

How to track a laser pointer in such an environment?

Laser Pointer Tracking Auto exposure Short exposure Laser identification Short exposure to get rid of the background Intensity thresholding Critical regions Bright areas (e.g. lights, windows) Masked out by rendering the binary environment map with detail camera settings and multiplying it with the current image
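A compact sketch of this identification step (OpenCV-based; the threshold value and the exact masking operation are assumptions, the slide describes a multiplication with the rendered environment map):

```python
# Detect the laser spot in a short exposure image: suppress known bright
# areas (lamps, windows) with the pre-captured binary environment map,
# threshold, and take the brightest remaining pixel.
import cv2

def find_laser_spot(short_exposure_gray, critical_mask, threshold=220):
    masked = cv2.bitwise_and(short_exposure_gray,
                             cv2.bitwise_not(critical_mask))
    _, bright = cv2.threshold(masked, threshold, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(bright) == 0:
        return None                          # no laser spot visible
    _, _, _, max_loc = cv2.minMaxLoc(masked, mask=bright)
    return max_loc                           # (x, y) pixel of the spot
```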

Laser Pointer Tracking (cont'd) Laser's position on the image plane is known Due to known camera parameters, a ray can be defined pointing to that spot Intersection of this ray and the environment's geometry serves as the 3D position of the laser
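A sketch of both steps, assuming a pinhole camera with intrinsics K and pose (R, t) and a triangulated environment model. The ray/triangle test is the standard Moeller-Trumbore algorithm; in practice it would run against all triangles of the reference model (or an acceleration structure), not just one:

```python
import numpy as np

def pixel_to_ray(u, v, K, R, t):
    """Camera with intrinsics K and pose (R, t), world -> camera: x_c = R x_w + t."""
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    origin = -R.T @ t                        # camera center in world coords
    direction = R.T @ d_cam
    return origin, direction / np.linalg.norm(direction)

def intersect_triangle(origin, direction, a, b, c, eps=1e-9):
    """Moeller-Trumbore ray/triangle intersection; returns the 3D hit or None."""
    e1, e2 = b - a, c - a
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:
        return None                          # ray parallel to triangle
    inv_det = 1.0 / det
    s = origin - a
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t_hit = np.dot(e2, q) * inv_det
    return origin + t_hit * direction if t_hit > eps else None
```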

Laser Pointer Tracking (cont'd) The detail camera is realigned to center the laser spot when it leaves the central region of the image Interplay of detail and context camera If the detail camera loses the spot, the context camera delivers a rough position to realign the detail camera
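A rough sketch of the re-centering decision; the size of the central region and the conversion from pixel offset to pan/tilt angles are assumptions for illustration:

```python
# If the laser spot drifts out of a central window of the detail camera
# image, convert the pixel offset into pan/tilt corrections using the
# camera's angular resolution (degrees per pixel).
def recenter_detail_camera(spot_xy, image_size, deg_per_pixel,
                           central_fraction=0.5):
    w, h = image_size
    x, y = spot_xy
    cx, cy = w / 2.0, h / 2.0
    inside_x = abs(x - cx) < central_fraction * w / 2
    inside_y = abs(y - cy) < central_fraction * h / 2
    if inside_x and inside_y:
        return 0.0, 0.0                      # spot still central, no move
    d_pan = (x - cx) * deg_per_pixel
    d_tilt = -(y - cy) * deg_per_pixel       # image y grows downward
    return d_pan, d_tilt
```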

What's all this good for?

Applications Demonstrating basic object manipulation Translation Rotation Scaling Applies laser pointer tracking and projector-based visualization Switching mode using spatial gestures
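A very small sketch of what the translation mode could look like when driven by the tracked 3D laser position; rotation, scaling and the gesture-based mode switch are omitted, and the structure is illustrative rather than the authors' implementation:

```python
# Translation mode: the frame-to-frame change of the tracked 3D laser
# position is applied to the selected object's position.
import numpy as np

class TranslateMode:
    def __init__(self, obj_position):
        self.obj_position = np.asarray(obj_position, dtype=float)
        self.last_laser = None

    def update(self, laser_3d):
        laser_3d = np.asarray(laser_3d, dtype=float)
        if self.last_laser is not None:
            self.obj_position += laser_3d - self.last_laser
        self.last_laser = laser_3d
        return self.obj_position
```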

Applications (cont'd)

Applications (cont'd) Colored Architecture [Tonn06] Color and light simulation using real-time radiosity On-site planning tool Color drag&drop Replaces time-consuming and expensive classical techniques

Applications (cont'd)

Applications in General (so far) Interaction directly on the physical surface Visualization of structures that lie on the physical surface

What about floating 3D structures?

Video See-Through Augmentation of the whole environment No need for projectors Integrating CG into live video stream Required information Camera parameters (intrinsic/extrinsic) Environment's geometry Correct occlusions
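One common way to obtain correct occlusions with a known environment geometry is to render that geometry into the depth buffer only, as a "phantom", before drawing the virtual content. The sketch below uses PyOpenGL-style calls and illustrates the principle, not the authors' renderer:

```python
# Depth-only phantom rendering for video see-through occlusion handling.
from OpenGL.GL import (glColorMask, glClear, glEnable, GL_FALSE, GL_TRUE,
                       GL_DEPTH_TEST, GL_DEPTH_BUFFER_BIT, GL_COLOR_BUFFER_BIT)

def render_augmentation(draw_environment_model, draw_virtual_objects):
    glEnable(GL_DEPTH_TEST)
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)

    # 1) Depth-only pass: the real environment geometry writes depth but no
    #    color, leaving the live video frame visible as the background.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE)
    draw_environment_model()

    # 2) Normal pass: virtual objects are depth-tested against the phantom
    #    geometry, so real surfaces correctly occlude them.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE)
    draw_virtual_objects()
```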

Stereoscopic Projector-based AR View-dependent stereoscopic projection in real environments [Bimber05] Geometric warping Radiometric compensation Interpolation of precalibrated scene parameters Interactive rates On a per-pixel basis Multiple projectors
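The per-pixel radiometric compensation mentioned here follows, roughly, the standard form used in this line of work (cf. [Bimber05]): for every projector pixel, the input intensity I needed to make a colored or textured surface appear with the desired radiance R is I = (R - EM) / (FM), where EM is the environment light reflected by the surface and FM combines the projector's form factor with the surface reflectance, with the result clipped to the projector's displayable range. This is a simplified statement of the idea, not the authors' exact formulation.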

LED-Marker Head Tracking Projector-based approaches require head tracking Simple active LED marker is tracked by PTZ camera Rough indications of observer's position No need for extra tracking hardware!

Summary

Results and Discussion Results: fully automatic calibration; laser pointer tracking as required for real-world applications; usable by non-experts (e.g. architects); a multi-purpose tool (projector calibration, interaction, video see-through, simple head tracking); performance (accuracy, speed and latency). Issues: only one task at a time (laser pointer tracking, marker tracking, or video see-through); occlusion problems (user occlusion, occluded surfaces).

Performance Camera calibration: takes ~25 min; deviation between known 3D points and their tracked positions avg. 0.18 deg (8 mm @ 2.54 m distance). Laser pointer tracking: comparing laser dot and cursor, avg. deviation 14 mm; latency ~300 ms; speed ~30 fps. Projector calibration: takes ~1.5 min; deviation between known 3D points and their projections avg. 0.16 deg (10.7 mm @ 3.83 m distance).
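As a quick consistency check, the angular and metric deviations agree: tan(0.18 deg) * 2540 mm = 0.00314 * 2540 mm ≈ 8 mm, and tan(0.16 deg) * 3830 mm = 0.00279 * 3830 mm ≈ 10.7 mm.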

Performance LED marker tracking: avg. deviation 72 mm (@ 2.5 m distance); latency ~300 ms; speed ~10 fps.

Future Work Improving the tracking Latency Speed Precision More DOF Multiple cameras Cover a larger area Simultaneously track multiple entities Solve occlusion problems New prototype PTZ projector-camera system Embedded into tachymeter hardware Single, rotatable system that supports Tracking Scene analysis Projector-based augmentation within the targeted region of interest

Future Work (cont'd) Placement techniques for interaction items, considering... Surface geometry and reflectivity Surface visibility Hand jittering New and innovative architectural applications New projector-camera enabled interaction tasks e.g. material copy-and-paste Select material samples Scan, analyze, enlarge Reproduced at other surface portions via projector-based AR Final user study

Thank you! References
[Ahlborn05] B. A. Ahlborn, et al. A practical system for laser pointer interaction on large displays.
[Bahmutov04] G. Bahmutov, V. Popescu, and E. Sacks. Depth enhanced panoramas.
[Bahmutov06] G. Bahmutov, V. Popescu, and M. Mudure. Efficient large scale acquisition of building interiors.
[Bi05] X. Bi, Y. Shi, X. Chen, and P. Xiang. Facilitating interaction with large displays in smart spaces.
[Bimber05] O. Bimber et al. Enabling view-dependent stereoscopic projection in real environments.
[Cavens02] D. Cavens, F. Vogt, S. Fels, and M. Meitner. Interacting with the big screen: pointers to ponder.
[Chen00] Y. Chen et al. Automatic alignment of high-resolution multi-projector display using an un-calibrated camera.
[Cheng03] K. Cheng and K. Pulo. Direct interaction with large-scale display systems using infrared laser tracking devices.
[Olsen01] D. R. Olsen Jr. and T. Nielsen. Laser pointer interaction.
[Davis02] J. Davis and X. Chen. Lumipoint: multi-user laser-based interaction on large tiled displays.
[Ehnes04] J. Ehnes, K. Hirota, and M. Hirose. Projected augmentation - augmented reality using rotatable video projectors.
[Kirstein98] C. Kirstein and H. Mueller. Interaction with a projection screen using a camera-tracked laser pointer.
[Matveyev03] S. V. Matveyev and M. Göbel. Direct interaction based on a two-point laser pointer technique; and: The optical tweezers: multi-point interaction technique.
[Oh02] J. Oh and W. Stuerzlinger. Laser pointers as collaborative pointing devices.
[Pinhanez01] C. Pinhanez. Using a steerable projector and a camera to transform surfaces into interactive displays.
[Tonn06] C. Tonn and D. Donath. Color, material and light in the design process -- a software concept.
[Wienss06] C. Wienss et al. Sceptre: an infrared laser tracking system for virtual environments.

Thank you! Further information http://www.uni-weimar.de/medien/ar (ARGroup) http://www.sarc.de (Project website) Authors Daniel Kurz Daniel.Kurz@medien.uni-weimar.de Ferry Häntsch Ferry.Haentsch@medien.uni-weimar.de Max Große Max.Grosse@medien.uni-weimar.de Alexander Schiewe Alexander.Schiewe@medien.uni-weimar.de Oliver Bimber Oliver.Bimber@medien.uni-weimar.de Supported by Deutsche Forschungsgemeinschaft (DFG) under contract number PE 1183/1-1