Head-Coupled Perspective




Introduction

Head-Coupled Perspective (HCP) refers to a technique of rendering a scene that takes into account the position of the viewer relative to the display. As the viewer moves in front of the display, the scene's projection is adjusted to match, creating the impression that the display is a window onto a scene existing behind it. The viewer's head motion controls the scene intuitively: leaning left reveals more of the scene to the right, ducking reveals more of the scene above, and so on. As the viewer leans in, features in the scene remain a constant size (measured in view angle), but a wider field of view becomes visible.

The necessary input for creating an HCP display is the viewer's eye position relative to the physical display. This sample uses a feature of the Intel Perceptual Computing Software Development Kit (SDK) to track the viewer's face location using a user-facing webcam or the built-in camera on an Ultrabook device. The sample scene is controlled using standard mouse/keyboard camera controls, plus position and zoom offsets derived from the stream of face detection data provided by the SDK.

Use Cases of Head-Coupled Perspective

The sample includes two scenes that illustrate potential uses of HCP. The first is an indoor scene, with objects stretching from the near to middle distance. This scene is representative of a first-person shooter (FPS) style game with foreground objects that can occlude more distant objects. The user moves about the scene using mouse and keyboard input. When the HCP camera is enabled, the user can move their head to look over, under, or around a foreground object without changing their in-game position. This view mechanic might be used in a game to peek out from behind cover or around a corner.

Figure 1 - Pipe Room scene

The second scene is an outdoor mountain scene with a fixed camera position. This scene represents a fixed window onto a rendered scene, and the user interacts just by moving relative to the display. This mode might be used to represent a view from an airplane window or from a box seat in a stadium. Another application would be a retail kiosk or trade show display, where passersby see the rendered scene shift as they walk by.

Figure 2 - Environment Window scene

Implementation

The location of the viewer is obtained from the Intel Perceptual Computing SDK using a color camera sensor. Facial data obtained from the SDK consist of image size, face position, face size, and the positions of six facial landmarks in (X, Y) coordinates.

The scene's rendering (virtual) camera is implemented as two coupled cameras, initially aligned, with the second set slightly behind the first (i.e., offset along the first camera's Z axis). Mouse and keyboard inputs are handled by the first camera, and head-coupled offsets are applied to move the second camera about the shared axis. The scene is then rendered from the second camera's point of view. When facial input is enabled, the second camera is offset in X and Y (in the first camera's frame of reference) in proportion to the offset of the face from the center of the captured image. The second camera's look-at point remains coincident with the first's, so the position offset translates into a yaw/tilt effect on the scene.

The sample initially assumes that the image acquisition camera has a horizontal field of view similar to that of the synthetic rendering camera, about 60 degrees. If this assumption does not hold, the scene will appear to yaw too little or too much, so an XY Gain slider allows the user to adjust to suit.

The sample includes the option to zoom in or out of the scene according to how near the detected face is to the camera. This effect is intended to mimic a scene existing at a distance behind the display. In a normal, non-HCP display, the scene is unchanged as the viewer gets closer and individual features grow larger.
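The proportional mapping from face position to camera offset can be sketched as follows. This is a minimal illustration with hypothetical names, not the sample's actual code; the sign convention depends on whether the capture image is mirrored.

```python
def head_coupled_offset(face_cx, face_cy, image_w, image_h, xy_gain=1.0):
    """Map a detected face center (in image pixels) to a camera-space
    XY offset for the second (head-coupled) camera.

    The face position is normalized to [-1, 1] about the image center,
    then scaled by the user-tunable XY gain described in the text.
    """
    nx = (face_cx - image_w / 2.0) / (image_w / 2.0)
    ny = (face_cy - image_h / 2.0) / (image_h / 2.0)
    # Negated so the virtual camera moves with the viewer's head when the
    # capture image is mirrored (adjust signs for a non-mirrored feed).
    return (-nx * xy_gain, -ny * xy_gain)
```

Because the second camera's look-at point stays fixed on the first camera's target, applying this offset to its position yields the yaw/tilt effect rather than a lateral pan.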
In the real world, as one approaches a window with a distant scene behind it, the scene's features remain nearly unchanged in size, but the overall field of view expands. When the sample's zoom effect is enabled, an estimate is made of how close the face is to the display, and the rendered field of view is adjusted to match. The proximity estimate is based both on the size of the face

rectangle provided by the SDK and on measurements of the distances between selected pairs of facial landmarks. These data, the landmarks especially, are noisy except under near-ideal lighting conditions, so heavy smoothing is applied to reduce the resulting jitter in zoom values.

Face Tracking with Intel Perceptual Computing SDK

The Intel Perceptual Computing SDK provides a wide range of sensor data and control inputs from camera, microphone, and other hardware sensors. Within the camera sensor subset, the SDK provides face and hand/gesture data. The facial data set includes face count, position, size, facial landmarks, individual face recognition, and classification of age, gender, and expression. From this available data, the HCP sample uses just the position and size of a single face and the locations of facial landmarks within that face.

The SDK provides a COM-like API that allows fine-grained control over all aspects of capture and processing of camera images. It also supplies a UtilPipeline class that abstracts away much of this complexity and provides a simple, asynchronous callback interface applicable to many common uses; this sample uses the UtilPipeline class. When the UtilPipeline-derived capture class is initialized, a separate thread is spawned to capture camera images. Within the rendering loop, a non-blocking check is made for new image data; when data are available, the OnNewFrame() method is called to extract facial data. The new facial data are merged with the existing data using linear interpolation weighted by the elapsed time since the last detection; the more time elapsed, the more weight is given to the new data. After any new face data have been incorporated, the scene camera is updated to reflect the face position. Linear smoothing is again applied during the camera position calculation to minimize position stutter between frames that received new facial data and those that did not.
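The time-weighted linear interpolation described above can be sketched in a few lines. The time constant here is a hypothetical tuning value, not one taken from the sample:

```python
def smooth_face_data(previous, new, elapsed_s, time_constant_s=0.25):
    """Blend a new face measurement into the running estimate.

    The weight given to the new sample grows linearly with the time
    elapsed since the last detection, capped at 1.0: stale estimates
    are replaced quickly, while rapid updates are averaged together
    to suppress detection jitter.
    """
    w = min(elapsed_s / time_constant_s, 1.0)
    return previous + w * (new - previous)
```

Applied per component (face X, Y, and size), this keeps the camera steady between detections while still converging on fresh data within a fraction of a second.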
Limitations

This sample is intended as a proof of concept of how HCP might be integrated into games or kiosks, and to spur thought among game developers. The implementation relies on an initial beta release of the Intel Perceptual Computing SDK and commonly available sensors. Responsiveness depends heavily on environmental lighting and the particular camera sensor in use, and it is reasonable to expect improved HCP latency as the SDK and the available sensors mature.

The sample's coupled-camera setup currently implements only pointing-angle and field-of-view adjustments; it does not adjust the projection matrix to compensate for the off-center view of the display. Consider two equally sized (in screen pixels) objects at either side of the screen. When the viewer is positioned nearer to one side of the screen, the object at the closer edge appears larger to the viewer than the one at the far edge, and the display outline becomes trapezoidal. To compensate, the projection should be transformed with a shear that maintains the apparent size of the two objects. Calculating the appropriate shear requires the viewer's absolute distance from the display, which is not available from current Ultrabook or webcam sensors.
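The shear described above corresponds to an off-axis (asymmetric) perspective projection. As a minimal sketch, assuming a screen centered at the origin, a tracked eye position expressed in the screen's units, and a glFrustum-style projection setup (this is illustrative, not part of the sample):

```python
def off_axis_frustum(eye_x, eye_y, eye_z, half_w, half_h, near):
    """Asymmetric frustum bounds for a physical screen centered at the
    origin, viewed from (eye_x, eye_y, eye_z) with eye_z > 0.

    The screen corners are projected onto the near clip plane, so the
    frustum stays anchored to the physical display edges as the eye
    moves; the result feeds a glFrustum-style projection call.
    """
    scale = near / eye_z
    left = (-half_w - eye_x) * scale
    right = (half_w - eye_x) * scale
    bottom = (-half_h - eye_y) * scale
    top = (half_h - eye_y) * scale
    return (left, right, bottom, top)
```

With the eye centered the frustum is symmetric; as the eye moves toward one edge of the screen, the bounds shift the other way, producing exactly the shear the sample omits. The catch, as noted, is that eye_z must be a real physical distance, which current webcam face tracking cannot supply.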

The SDK head tracker does a considerable amount of behind-the-scenes work to track individual faces, and it appears to report face data only when it has fairly high confidence; when confidence is low, it is more likely to say nothing than to guess incorrectly. This conservative behavior can lead to tracking drop-outs in less-than-ideal lighting, which in turn break the HCP effect. Future releases of the SDK, along with evolving sensor hardware, should alleviate this problem over time. Distance estimation for the zoom effect relies on accurate facial-landmark positions, and these data currently exhibit a significant amount of noise and jitter. Heavy smoothing is applied to minimize the noise, but the smoothing in turn introduces a noticeable lag in response to actual movement.

Future Enhancements

A future enhancement of this sample may include support for camera sensors that report distance as well as color values. With actual distance data, the zoom function should be much smoother and more responsive. Knowing the actual physical distance from the viewer to the screen would also allow calculation of the correct projection transformation to maintain a constant apparent size of objects as the viewer moves in front of the display.

A standard display is typically viewed from well within the viewer's binocular depth discrimination range. Even when the HCP motion response indicates that a scene is distant, the viewer's binocular depth perception indicates that they are in fact looking at a close, flat screen, blunting the effect. When displaying an HCP scene on a 3-D display, binocular depth would instead reinforce the HCP effect, leading to enhanced immersion. Adding 3-D display support is a logical and likely enhancement of this sample.

Notices

INFORMATION IN THIS DOCUMENT IS PROVIDED IN CONNECTION WITH INTEL PRODUCTS.
NO LICENSE, EXPRESS OR IMPLIED, BY ESTOPPEL OR OTHERWISE, TO ANY INTELLECTUAL PROPERTY RIGHTS IS GRANTED BY THIS DOCUMENT. EXCEPT AS PROVIDED IN INTEL'S TERMS AND CONDITIONS OF SALE FOR SUCH PRODUCTS, INTEL ASSUMES NO LIABILITY WHATSOEVER AND INTEL DISCLAIMS ANY EXPRESS OR IMPLIED WARRANTY, RELATING TO SALE AND/OR USE OF INTEL PRODUCTS INCLUDING LIABILITY OR WARRANTIES RELATING TO FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR INFRINGEMENT OF ANY PATENT, COPYRIGHT OR OTHER INTELLECTUAL PROPERTY RIGHT. UNLESS OTHERWISE AGREED IN WRITING BY INTEL, THE INTEL PRODUCTS ARE NOT DESIGNED NOR INTENDED FOR ANY APPLICATION IN WHICH THE FAILURE OF THE INTEL PRODUCT COULD CREATE A SITUATION WHERE PERSONAL INJURY OR DEATH MAY OCCUR. Intel may make changes to specifications and product descriptions at any time, without notice. Designers must not rely on the absence or characteristics of any features or instructions marked "reserved" or "undefined." Intel reserves these for future definition and shall have no responsibility whatsoever for conflicts or incompatibilities arising from future changes to them. The information here is subject to change without notice. Do not finalize a design with this information.

The products described in this document may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request. Contact your local Intel sales office or your distributor to obtain the latest specifications and before placing your product order. Copies of documents which have an order number and are referenced in this document, or other Intel literature, may be obtained by calling 1-800-548-4725, or go to: http://www.intel.com/design/literature.htm Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark* and MobileMark*, are measured using specific computer systems, components, software, operations, and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products. Any software source code reprinted in this document is furnished under a software license and may only be used or copied in accordance with the terms of that license. Intel, the Intel logo, and Ultrabook are trademarks of Intel Corporation in the US and/or other countries. Copyright 2013 Intel Corporation. All rights reserved. *Other names and brands may be claimed as the property of others.