
Time of Flight Technology for Gesture Interaction

Oliver Kirsch (1), Dr. Alexander van Laack (2), Gert-Dieter Tuzar (3), Judy Blessing (4)

(1) Advanced Technologies, Visteon Electronics Germany GmbH
(2) Design Experience Europe, Visteon Innovation & Technology GmbH
(3) Design Experience Europe, Visteon Electronics Germany GmbH
(4) Market & Trends Research Europe, Visteon Innovation & Technology GmbH

Abstract

In this paper we present Visteon's findings in the field of time-of-flight (ToF) gesture interaction. We give a brief overview of current technologies before introducing our advanced technology development, which was integrated into a drivable vehicle. A significant number of hand gestures were implemented with our proprietary software and can be tested in the vehicle. We conclude the paper with a summary of our findings and an outlook on next steps.

1 Introduction

Multimodal human-machine interaction (HMI) is used in today's vehicles to offer the driver several redundant ways to control functions. Voice and touch interaction have already become standard features. Because gesture interaction helps keep the driver's eyes on the road while various functions are controlled safely, it is seen as a promising additional modality. Future driver interaction systems will require solutions for driver monitoring and for gesture control. This functionality can be provided by 2D/3D interior camera systems and related computer vision software. In particular, 3D sensing promises a new modality for interacting with the cockpit electronics and infotainment system of the vehicle in a more natural and intuitive way, because the exact position of objects in 3D space is known.

2 State-of-the-Art

Looking at camera-based solutions, several principles and technologies are available on the market and in the research domain for recording 3D depth images. Triangulation is probably the most common and well-known principle for acquiring 3D depth measurements with reasonable distance resolution. It can be implemented either as a passive system based on a stereo vision (SV) setup using two 2D cameras, or as an active system using a single 2D camera together with a projector that beams a light pattern onto the scene. The drawback of active triangulation systems is that they have difficulties with shadows in the scene. SV systems have difficulties with scene content that has little or no contrast, since SV-based triangulation relies on identifying the same features in both images.

Figure 1: ToF measurement principle (RF-modulated light source with phase detector) [1]

[1] Büttgen, B. (2005) CCD/CMOS Lock-in pixel for range imaging, p. 3

An alternative depth measurement method, which Visteon uses in one of its technology vehicles, is the ToF principle (see Figure 1). A camera system using this technology can provide an amplitude image and a depth image of the scene (within the camera's field of view) at a high frame rate and without moving parts inside the camera. The amplitude image looks similar to an image from a 2D camera: well-reflecting objects appear brighter than poorly reflecting ones. The distance image provides distance information for each sensor pixel of the scene in front of the camera.
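To make the principle concrete, the following sketch (Python with NumPy) shows how a continuous-wave ToF pixel array of the kind described by Büttgen et al. is commonly read out: four correlation samples per pixel, taken at phase offsets of 0, 90, 180 and 270 degrees, are combined into the amplitude and distance images mentioned above. The four-sample scheme, the 20 MHz modulation frequency and all variable names are illustrative assumptions, not a description of Visteon's camera.

import numpy as np

C = 299_792_458.0    # speed of light in m/s
F_MOD = 20e6         # assumed RF modulation frequency (illustrative only)

def amplitude_and_distance(a0, a1, a2, a3):
    # a0..a3: 2-D arrays holding one correlation sample per sensor pixel,
    # measured at 0, 90, 180 and 270 degrees of modulation phase.
    phase = np.arctan2(a3 - a1, a0 - a2)    # phase shift of the received light
    phase = np.mod(phase, 2.0 * np.pi)      # keep the phase in [0, 2*pi)

    # Amplitude of the modulated signal: well-reflecting objects appear brighter.
    amplitude = 0.5 * np.sqrt((a3 - a1) ** 2 + (a0 - a2) ** 2)

    # One full phase cycle corresponds to c / (2 * f_mod) metres of distance,
    # because the light travels to the object and back.
    distance = (C / (2.0 * F_MOD)) * (phase / (2.0 * np.pi))
    return amplitude, distance

if __name__ == "__main__":
    # Example with random samples for a 240 x 320 pixel frame.
    rng = np.random.default_rng(0)
    a0, a1, a2, a3 = (rng.random((240, 320)) for _ in range(4))
    amp, dist = amplitude_and_distance(a0, a1, a2, a3)
    print(amp.shape, dist.shape)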

The essential components of a ToF camera system are the ToF sensor, an objective lens and an infrared illumination source that emits RF-modulated light. The sensor consists of an array of independent ToF pixels, each of which measures the arrival time of the modulated scene illumination with high accuracy. This matters because the distance between the sensor and an object is derived from the time the light needs to travel from the light source to the sensor after being reflected by the object. These measurements take place in parallel in every pixel, so the camera provides one amplitude and one distance value per pixel, and thus one amplitude and one distance image per camera frame. These two images form the basic information used as input for the computer vision algorithms that detect hand gestures.

3 Advanced Development

3.1 Overview

Visteon started to investigate ToF technology several years ago in its advanced development department to build up knowledge and to understand the performance and maturity of the technology. Besides the technology investigation in the lab, a proprietary hardware and software solution has been developed to enable recognition of hand gestures in mid-air, as well as further use cases that can be linked to certain hand poses or to hand and finger positions in 3D space. The right column of Figure 2 shows the system partitioning of a ToF camera system and provides high-level information about the different software blocks in our current system. The left column provides an overview of the system architecture with a high-level flow chart of how the gesture recognition concept works (an illustrative sketch of such a processing chain is given at the end of this subsection).

Figure 2: Visteon's ToF camera technology and system architecture

This automaker-independent development phase was necessary to reach the following goals:

- Find out the potential and limitations of current ToF technology and investigate ways to push beyond its present confines.
- Build up a good understanding of feasible use cases that can work robustly in a car environment by implementing and testing them and executing improvement loops based on the findings.
- Investigate the necessary hardware and software components of such a system.

These activities put us in a position to build up our own know-how in this field, demonstrate it to automakers and provide recommendations about which use cases (per our experience and knowledge) can work robustly in a car environment. They also reduce risk, whether by offering a solution for co-innovation projects or by responding to RFQs for serial implementation.
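Figure 2 itself is not reproduced here, so as a purely illustrative stand-in, the sketch below outlines the kind of per-frame processing chain such a system runs: depth-based hand/arm segmentation, extraction of a simple feature (the hand centroid), and a minimal dynamic-gesture classifier for a lateral swipe. The interaction volume, thresholds, stage boundaries and gesture labels are assumptions made for this example and do not describe Visteon's proprietary software blocks.

import numpy as np

# Assumed interaction volume in front of the camera, in metres (illustrative).
NEAR, FAR = 0.10, 0.60
SWIPE_MIN_TRAVEL = 0.15   # lateral hand travel (metres) that counts as a swipe

def segment_hand(distance, amplitude, min_amplitude=0.05):
    # Pixels that plausibly belong to a hand or arm: inside the interaction
    # volume and returning enough modulated signal.
    return (distance > NEAR) & (distance < FAR) & (amplitude > min_amplitude)

def hand_centroid(distance, mask):
    # Simple feature: centroid row, centroid column and mean depth of the hand.
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return np.array([rows.mean(), cols.mean(), distance[mask].mean()])

def classify_swipe(track, metres_per_pixel=0.002):
    # Minimal dynamic-gesture classifier: left/right swipe from the horizontal
    # travel of the tracked centroid.
    if len(track) < 2:
        return "none"
    travel = (track[-1][1] - track[0][1]) * metres_per_pixel
    if travel > SWIPE_MIN_TRAVEL:
        return "swipe_right"
    if travel < -SWIPE_MIN_TRAVEL:
        return "swipe_left"
    return "none"

def process_frame(amplitude, distance, track):
    # One pass of the pipeline: segmentation -> feature extraction -> classification.
    mask = segment_hand(distance, amplitude)
    centroid = hand_centroid(distance, mask)
    if centroid is not None:
        track.append(centroid)
    return classify_swipe(track)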

3.2 Vehicle integration

During the advanced development activities, one ToF camera system was integrated into a drivable vehicle to investigate use cases for this new input modality, which can serve, for example, driver information and infotainment systems by reducing driver distraction and enhancing the user experience. The implementation in the car demonstrates the current status of the functionality achieved so far with a camera positioned in the overhead console (see Figure 3 and Figure 4).

Figure 3: ToF camera setup in Visteon's ToF technology vehicle

This car is used to demonstrate the possibilities of ToF technology to automakers. In parallel, the vehicle serves continuously as the main development platform to implement new use cases and to improve our software algorithms, in order to enable robust hand/arm segmentation and tracking under real environmental and installation conditions.

Figure 4: Camera head of the ToF prototype camera (left); hand gesture interaction within the car (right)

3.3 Investigation of use cases beyond simple 3D gestures

Because the ToF camera system provides depth information in addition to the amplitude image, it can be used for more than detecting simple gestures (like a swipe or an approach at short range). The camera sees the details of the scene (e.g. the number of visible fingers and the pointing direction) as well as the exact position in 3D space, and is therefore ready to realize a more natural HMI within a large interaction space.

Figure 5: Examples of hand gestures and poses for HMI interaction

The following use cases are already integrated and showcased in the technology vehicle to demonstrate the potential of the technology and to enable new ways of interacting with the vehicle (see Figure 5; a small code sketch for one of them follows the list):

- Highly accurate proximity sensing, based on the 3D coordinates of calculated features
- 3D gesture recognition (dynamic gestures performed in mid-air)
- Hand pose detection (static hand poses that can be used as shortcuts)
- 2D gesture recognition (performed on surfaces such as a display or an interior part)
- Touch detection on a display
- Functional areas on interior trim parts that can act as a button, slider, TP, etc.
- Driver / co-driver hand distinction
- Function release without the need for physical contact (hygiene)
- 3D HMI on 3D displays
- A high level of design freedom for seamless information integration on curved surfaces, enabling new premium aesthetics and a merging of smart (functional) decorative parts with the display area
- Addressing a functional area that is out of reach of the hand and triggering the related function (e.g. opening or closing the window on the passenger side)
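As one compact example, the driver / co-driver hand distinction can be illustrated with a simple heuristic: seen from an overhead-console camera, the arm stays connected to the occupant it belongs to, so the image border it reaches toward indicates its owner. The sketch below assumes a left-hand-drive vehicle and a plain entry-side test; it illustrates the idea only and is not Visteon's actual decision logic.

import numpy as np

def hand_owner(hand_mask, driver_side="left"):
    # hand_mask: 2-D boolean array of pixels classified as hand/arm.
    # driver_side: "left" for a left-hand-drive vehicle (assumed here).
    if not hand_mask.any():
        return "none"
    cols = np.nonzero(hand_mask)[1]
    left_gap = cols.min()                                 # pixels to the left image border
    right_gap = (hand_mask.shape[1] - 1) - cols.max()     # pixels to the right image border
    entered_from = "left" if left_gap <= right_gap else "right"
    return "driver" if entered_from == driver_side else "co-driver"

if __name__ == "__main__":
    # Example: an arm reaching in from the right half of a 240 x 320 mask.
    mask = np.zeros((240, 320), dtype=bool)
    mask[100:140, 200:320] = True
    print(hand_owner(mask))   # -> "co-driver" for a left-hand-drive vehicle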

4 Conclusion

In this paper, we investigated the implementation and application of a gesture recognition system in a vehicle. Based on extensive state-of-the-art research, ToF technology was identified as the most suitable approach for detecting a high diversity of spatial and on-surface gestures with high accuracy inside the vehicle. To address the demands of automakers, a new ToF camera was developed and implemented in a test vehicle. The current implementation of the ToF system offers sufficient resolution to identify gestures with a high level of detail. Thanks to the camera's fairly large field of view, it can detect both the driver's and the passenger's hands and therefore understand who is trying to interact at a given point in time. This is a first step toward a contextual system that can react to active as well as passive gestures. Future research will have to focus not only on active gestures but also on passive ones. Driver monitoring and understanding the driver's posture will certainly become more relevant when we consider hand-over scenarios for autonomous driving.

5 Literature References

Büttgen, B., Oggier, T., Lehmann, M., Kaufmann, R., Lustenberger, F. (2005). CCD/CMOS lock-in pixel for range imaging: challenges, limitations and state-of-the-art. In: Proceedings of the 1st Range Imaging Research Day, p. 3.

6 Authors

Dr. Alexander van Laack

As human-machine interaction (HMI) and technical design manager in Visteon's European design experience group, Dr. Alexander van Laack is responsible for developing strategies for the use of specialized, automotive-related HMI concepts. In his role, he leads driving-simulator-based user experience research initiatives and is responsible for HMI and user experience concept implementation. Dr. van Laack has more than eight years of automotive experience. He previously held a position as research engineer at Ford Research and Advanced Engineering, developing methodologies to measure user experience and the quality perception of interiors and HMIs. Dr. van Laack has a master's degree in business administration and engineering, as well as a PhD in engineering, from RWTH Aachen University in Germany.

Judy Blessing

Judy Blessing brings 17 years of research experience to her manager position in market and trends research. Her in-depth knowledge of all research methodologies allows her to apply the proper testing and analysis to showcase Visteon's automotive expertise to external customers and industry affiliates. Judy holds a German university diploma degree in marketing/market research from the Fachhochschule Pforzheim, Germany. She has more than ten years of automotive experience, investigating topics such as consumer-perceived quality, user experience, usability and advanced product research. She previously held positions as a researcher in different technology companies and research institutes, measuring consumer reactions to innovative products.

Gert-Dieter Tuzar, MFA

As Principal Designer HMI and Human Factors Expert in Visteon's European design experience group, Gert-Dieter Tuzar is responsible for the human-centered HMI design development process and for interaction concepts. In his role he leads HMI design development for infotainment products and driver information systems. Gert-Dieter Tuzar has more than twenty years of automotive experience. He holds a Master of Fine Arts degree from The Ohio State University, Columbus, OH, USA, as well as a Diplom-Industrie-Designer (FH) degree from the University of Applied Sciences in Pforzheim, Germany.

Oliver Kirsch

As Innovation Project Manager in Visteon's European advanced innovation group, Oliver Kirsch is responsible for investigating new technologies, from technology monitoring to proof of concept. In his role he investigates advanced camera technologies for interior applications, with a focus on computer vision algorithms for time-of-flight-based 3D hand gesture recognition. Previously, he held roles in other areas of cockpit electronics such as instrument clusters and head-up displays. Oliver Kirsch brings more than twenty years of automotive engineering experience. He earned a degree in electrical engineering from the Bergische Universität Wuppertal, Germany.

Note: Co-author Oliver Kirsch's contributions to this paper were made during his previous employment at Visteon Corporation.

About Visteon

Visteon is a global company that designs, engineers and manufactures innovative cockpit electronics products and connected car solutions for most of the world's major vehicle manufacturers. Visteon is a leading provider of instrument clusters, head-up displays, information displays, infotainment, audio systems and telematics solutions; its brands include Lightscape, OpenAir and SmartCore. Visteon also supplies embedded multimedia and smartphone connectivity software solutions to the global automotive industry. Headquartered in Van Buren Township, Michigan, Visteon has nearly 11,000 employees at more than 40 facilities in 18 countries. Visteon had sales of $3.25 billion in 2015. Learn more at www.visteon.com.

Visteon Corporation
One Village Center Dr.
Van Buren Township, MI 48188
1-800-VISTEON
www.visteon.com

Copyright 2016 Visteon Corporation