The In and Out: Making Games Play Right with Stereoscopic 3D Technologies. Samuel Gateau




Outline: NVIDIA 3D Vision (stereoscopic driver and HW display solutions); Stereoscopic basics (definitions and equations); Rendering techniques (practical recommendations); Advanced stereo effects (Selection Marquee, 3D Video).

How does it work? NVIDIA 3D VISION

NVIDIA 3D Vision: a DirectX stereoscopic 3D driver supporting several display and glasses combinations: micropol displays with passive polarized glasses; any monitor with red/blue anaglyph glasses; 120 Hz LCDs, DLP TVs, and 3D projectors with active shutter glasses.

How It Works: 3D game data is sent to the stereoscopic driver. The driver takes the 3D game data and renders each scene twice: once for the left eye and once for the right eye. A stereoscopic display then shows the left eye view for even frames (0, 2, 4, etc.) and the right eye view for odd frames (1, 3, 5, etc.).

How It Works: Active shutter glasses black out the right lens when the left eye view is shown on the display and black out the left lens when the right eye view is shown. This means the refresh rate of the display is effectively cut in half for each eye (e.g. a display running at 120 Hz delivers 60 Hz per eye). The resulting image for the end user is a combined image that appears to have depth in front of and behind the stereoscopic 3D display.

NVIDIA 3D Vision NVAPI Stereoscopic Module: NVAPI is NVIDIA's core software development kit that allows direct access to NVIDIA GPUs and drivers. NVAPI now exposes a Stereoscopic Module giving developers access to the stereoscopic driver settings: detect if the system is 3D Vision capable, manage the stereo profile settings for the game, and dynamically control the stereo parameters from within the game engine for a better stereo experience. For download and documentation see http://developer.nvidia.com/object/nvapi.html
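A minimal setup sketch, assuming the NVAPI SDK headers and library (nvapi.h, nvapi.lib) and omitting error reporting; NvAPI_Initialize, NvAPI_Stereo_IsEnabled and NvAPI_Stereo_CreateHandleFromIUnknown are the NVAPI entry points, everything else here is illustrative:

#include <d3d9.h>
#include "nvapi.h"

StereoHandle gStereoHandle = NULL;

// Returns true when the system is 3D Vision capable and a stereo handle
// could be bound to the rendering device.
bool InitStereo(IDirect3DDevice9* d3dDevice)
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;                          // No NVIDIA driver / NVAPI available

    NvU8 stereoEnabled = 0;
    if (NvAPI_Stereo_IsEnabled(&stereoEnabled) != NVAPI_OK || !stereoEnabled)
        return false;                          // Stereo is not available or disabled

    // Bind a stereo handle to the device to control the stereo parameters later
    if (NvAPI_Stereo_CreateHandleFromIUnknown(d3dDevice, &gStereoHandle) != NVAPI_OK)
        return false;

    return true;
}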

Under the hood STEREOSCOPIC BASICS

Stereo Basics: Standard Mono. The scene is viewed from one mono eye and projected onto the near clipping plane in the viewport.

Stereoscopic Basics: Two eyes, one screen, two images. Left and right eyes: obtained by shifting the mono eye along the X axis, giving a left and a right frustum. One virtual screen: the plane where the left and right frustums converge. Two images: the two images are generated at the near clipping plane, one per view, then presented independently to each eye of the user on the real screen.

Stereoscopic Basics: Interaxial. The distance between the two virtual eyes in eye space. The mono, left and right eye directions are all parallel.

Stereoscopic Basics: Screen Depth. Also called convergence: the screen's virtual depth in eye space, the plane where the left and right frustums intersect.

Stereoscopic Basics: Left / Right Projection. The projection matrix for each eye is a horizontally modified version of the regular mono projection matrix, shifting the X coordinate left or right.
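As an illustration of such a horizontally modified projection, here is a sketch, assuming a left-handed D3DX setup, that builds the per-eye view and projection from the mono camera; the sign convention (right eye at +Interaxial/2 on the eye-space X axis) and the helper name BuildStereoMatrices are assumptions, not part of the talk:

#include <d3dx9.h>
#include <cmath>

// eyeSign = -1 for the left eye, +1 for the right eye
void BuildStereoMatrices(float fovY, float aspect, float zNear, float zFar,
                         float interaxial, float screenDepth, float eyeSign,
                         const D3DXMATRIX& monoView,
                         D3DXMATRIX* outView, D3DXMATRIX* outProj)
{
    // Mono near-plane extents
    float top   = zNear * tanf(fovY * 0.5f);
    float right = top * aspect;

    // Horizontal skew so that both frustums converge on the virtual screen plane
    float frustumShift = (interaxial * 0.5f) * (zNear / screenDepth);

    D3DXMatrixPerspectiveOffCenterLH(outProj,
        -right - eyeSign * frustumShift,   // left extent
         right - eyeSign * frustumShift,   // right extent
        -top, top, zNear, zFar);

    // Shift the eye itself along X in eye space (applied after the mono view)
    D3DXMATRIX eyeOffset;
    D3DXMatrixTranslation(&eyeOffset, -eyeSign * interaxial * 0.5f, 0.0f, 0.0f);
    *outView = monoView * eyeOffset;
}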

Stereoscopic Basics: Parallax. The signed distance on the screen between the projected positions of one vertex in the left and right images. Parallax is a function of the depth of the vertex in eye space.

Stereoscopic Basics: In / Out of the Screen. Parallax creates the depth perception relative to the screen: when parallax is positive, the vertex appears in the screen; when parallax is negative, the vertex appears out of the screen.

Stereoscopic Basics: Parallax in equation. In normalized window space, after the perspective division by W: Parallax = DepthFactor * ( 1 - ScreenDepth / W ). Parallax is 0 at screen depth. DepthFactor is the maximum parallax for an object at infinity. Parallax diverges quickly to negative infinity for objects closer to the eye.
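A small numeric illustration of this formula (names and values are arbitrary):

// Parallax in normalized window space; W is the vertex depth in eye space
// (the clip-space w produced by the projection).
float Parallax(float depthFactor, float screenDepth, float W)
{
    return depthFactor * (1.0f - screenDepth / W);
}

// Example with depthFactor = 0.05 and screenDepth = 10 units:
//   W = 10  -> parallax =  0      (exactly at screen depth)
//   W = 100 -> parallax =  0.045  (in the screen, approaching depthFactor)
//   W = 5   -> parallax = -0.05   (out of the screen)
//   W -> 0  -> parallax -> -infinity (why very close objects are uncomfortable)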

Stereoscopic Basics: Left / Right surfaces. The back buffer is duplicated. Intermediate full screen render targets used to process the final image should also be duplicated (high dynamic range, blur, bloom, screen space ambient occlusion). Some render targets DON'T need to be duplicated: shadow maps and other view-independent render targets.

Stereoscopic Basics: Stereoscopic Surfaces & Separation. Stereoscopic rendering is the result of two independent actions: stereoscopic surfaces (rendering surfaces are duplicated) and stereoscopic separation (a parallax shift is applied to the geometry vertices).

Stereoscopic Basics: Stereoscopic Surfaces in the NVIDIA driver. Automatic duplication is based on the surface size: surfaces equal to or larger than the back buffer size are duplicated; square surfaces are NOT duplicated; small surfaces are NOT duplicated. The heuristic is defined by the driver profile settings; consult the documentation for fine tuning. Explicit duplication will be available programmatically soon.

Stereoscopic Basics: Stereoscopic Separation in the NVIDIA driver. Every draw call is issued twice, once into the left and once into the right surface. The parallax shift is applied in the vertex shader to the vertex's clip position output. When separation is not required, render the geometry at screen depth (e.g. a full screen quad doing image filtering); no separation is applied if the surface is mono.
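Conceptually, the shift the driver appends to the vertex shader output looks like the following sketch; the names (separation, convergence, eyeSign) are illustrative and the exact driver implementation is not exposed:

struct ClipPos { float x, y, z, w; };

// Offset the clip-space X as a function of clip-space W: vertices whose W equals
// the convergence (screen depth) are not shifted, vertices further away are
// pushed apart and closer ones are pulled inward.
ClipPos ApplyStereoSeparation(ClipPos p, float eyeSign,   // -1 left, +1 right
                              float separation, float convergence)
{
    p.x += eyeSign * separation * (p.w - convergence);
    return p;
}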

Recommendations RENDERING TECHNIQUES

Rendering Techniques: 3D Objects. All the 3D objects should be rendered using a single perspective projection in a given frame. The sky box should be drawn with a valid depth further than the regular scene; best is at the far distance. All the 3D objects must have a coherent depth relative to the scene.

Rendering Techniques: Parallax Budget. Define the screen depth for the best usage of the parallax budget in and out of the screen, just like when dealing with the near and far planes.

Rendering Techniques: Defining Screen Depth. Maximize the usage of the depth range to see the variation of separation along the depth: the scene depth should range between ScreenDepth and 100 * ScreenDepth. With NVIDIA 3D Vision, the screen depth can be set from NVAPI.
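For example, a hedged sketch of driving the convergence from the engine, reusing the gStereoHandle created in the earlier setup sketch (NvAPI_Stereo_SetConvergence is the NVAPI entry point; the wrapper name is illustrative):

// Place the virtual screen so that the visible scene spans roughly
// [screenDepth, 100 * screenDepth] in eye space, as recommended above.
void SetScreenDepth(float screenDepthInEyeUnits)
{
    NvAPI_Stereo_SetConvergence(gStereoHandle, screenDepthInEyeUnits);
}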

Rendering Techniques: 3D Objects Culling. Culling should be extended horizontally to cover the extra left and right frustum space beyond the screen depth.
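One simple way to implement this, sketched below under the assumption that the engine already has frustum-plane helpers (ExtractFrustumPlanes, AabbInFrustum and the Aabb/Plane types are hypothetical engine code, not part of the talk), is to accept any object visible from either eye:

bool VisibleInStereo(const Aabb& box,
                     const D3DXMATRIX& leftViewProj,
                     const D3DXMATRIX& rightViewProj)
{
    Plane leftPlanes[6], rightPlanes[6];
    ExtractFrustumPlanes(leftViewProj,  leftPlanes);
    ExtractFrustumPlanes(rightViewProj, rightPlanes);

    // Anything inside the union of the two eye frustums must be drawn
    return AabbInFrustum(box, leftPlanes) || AabbInFrustum(box, rightPlanes);
}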

Rendering Techniques: 2D Objects. 2D overlay elements (defined in window space) must be drawn at a valid depth. At the screen depth to look mono: head up display interface, UI elements. At the correct depth when interacting with the 3D scene: the mouse cursor at the pointed object's depth (cannot use the HW cursor), crosshair, labels or billboards in the scene. The depth is provided by the game engine; the projection needs to be modified to take this depth into account.

Rendering Techniques: 2D Objects hybrid projection. Proposed vertex shader:

float4 Objects2D_VertexShader(
    in float2 posClip : POSITION,   // Input position in clip space
    uniform float depth             // Depth where to draw the 2D object
    ) : POSITION                    // Output position in clip space
{
    return float4(
        posClip.xy * depth,   // Simply scale posClip by the depth to compensate
                              // for the division by W performed before rasterization
        0,                    // Z is not used if the depth buffer is not used;
                              // if needed, Z = ( depth * f - n*f ) / ( f - n )
        depth );              // W is the Z in eye space
}
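A hedged usage sketch for the C++ side, setting the depth uniform for a HUD element that should appear at screen depth; mapping the uniform to register c0 is an assumption (in practice query the compiled shader's constant table):

float screenDepth = 10.0f;                            // whatever convergence is set to
float depthConst[4] = { screenDepth, 0.0f, 0.0f, 0.0f };
D3DDev->SetVertexShaderConstantF(0, depthConst, 1);   // feed 'depth' to the shader
// ... then draw the 2D element with its position expressed in clip space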

Stereoscopic Basics: Out of the screen objects. The user's brain fights against the perception of objects hovering out of the screen, so extra care must be taken to achieve a convincing effect. Make sure the object is not clipped by the edges of the window, and be aware of the extra guard bands. Move the object slowly from inside the screen to the outside area to give the eyes time to adapt. Make smooth visibility transitions, with no blinking. Realistic rendering helps.

The cool part ADVANCED STEREOSCOPIC EFFECTS

Advanced Stereoscopic Effects: Selection marquee, something is broken. In mono, the selection marquee is naturally defined in window space as a 2D shape; it can simply be drawn as a rectangle in 2D in window space. In stereo, the same solution does not work: each view defines its own selection rectangle in its clip space, and the vertical edges of the rectangles don't match.

Advanced Stereoscopic Effects: Selection marquee, in fact it's a volume. The selection marquee seen in mono is in fact a 2D shape projected onto the 3D scene and intersecting the scene surface: the 2D shape defined on the near clip plane generates a selection volume.

Advanced Stereoscopic Effects: Selection marquee, representing the selection surface. In stereo, we want to represent the selection marquee as the fragments of the 3D scene inside the selection volume, just like a shadow volume: render the selection volume with the z test against the depth buffer, only incrementing the stencil buffer; then render the selection volume again, displaying only the fragments with an odd stencil value (inside the volume).

Advanced Stereoscopic Effects: Selection marquee, representing the selection surface (per-eye view). The same two passes are performed in the left and right viewports: render the selection volume with the z test against the depth buffer, only incrementing the stencil buffer; then render it again, displaying only the fragments with an odd stencil value (inside the volume).
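A hedged DirectX9 sketch of these two stencil passes; DrawSelectionVolume() stands for the engine call that draws the selection volume mesh and is not part of the talk:

// Pass 1: count the volume faces that pass the z test against the scene depth,
// writing only to the stencil buffer (no color, no depth writes).
D3DDev->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
D3DDev->SetRenderState(D3DRS_COLORWRITEENABLE, 0);
D3DDev->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE);       // count front and back faces
D3DDev->SetRenderState(D3DRS_STENCILENABLE, TRUE);
D3DDev->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_ALWAYS);
D3DDev->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_INCR);
DrawSelectionVolume();

// Pass 2: draw the volume again, but only where the stencil count is odd,
// i.e. where the scene surface lies inside the selection volume.
D3DDev->SetRenderState(D3DRS_COLORWRITEENABLE, 0x0000000F);
D3DDev->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW);        // front faces only
D3DDev->SetRenderState(D3DRS_ZFUNC, D3DCMP_ALWAYS);
D3DDev->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_EQUAL);
D3DDev->SetRenderState(D3DRS_STENCILREF, 1);
D3DDev->SetRenderState(D3DRS_STENCILMASK, 0x00000001);      // test only the low bit
D3DDev->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);
DrawSelectionVolume();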

Advanced Stereoscopic Effects: Selection marquee, going back to the 2D space. For shading purposes, we need to know the 2D coordinates of the fragment expressed in the original 2D selection space. The fragment is on the selection volume: at that fragment, fetch the view-space position of the 3D scene from a previously rendered position buffer, then transform that position back to the mono 2D window space.
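A sketch of that last transform, assuming the position buffer stores view-space positions and using the mono projection (the function name is illustrative):

#include <d3dx9.h>

D3DXVECTOR2 ViewToMonoWindow(const D3DXVECTOR3& viewPos,
                             const D3DXMATRIX& monoProj,
                             float viewportW, float viewportH)
{
    // Project with the mono matrix; TransformCoord performs the divide by W
    D3DXVECTOR3 ndc;
    D3DXVec3TransformCoord(&ndc, &viewPos, &monoProj);

    // Map NDC [-1,1] to window pixels (D3D convention: Y grows downward)
    return D3DXVECTOR2((ndc.x * 0.5f + 0.5f) * viewportW,
                       (1.0f - (ndc.y * 0.5f + 0.5f)) * viewportH);
}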

Advanced Stereoscopic Effects: Selection marquee, conclusion. It works with any 2D selection marquee shape; you just need to generate the corresponding selection volume (lasso!). It is an accurate solution for representing the selection marquee in stereo, but it requires an extra buffer containing the view-space position of the 3D scene fragments.

Advanced Stereoscopic Effects: 3D Video, displaying a stereo frame. How to display existing stereo content in stereo? Replay 3D video, display 3D photos, prerecorded cut scenes.

Advanced Stereoscopic Effects: 3D Video, presenting a stereo frame from 2 images. On a 3D Vision driver enabled platform under DirectX9: upload the new frame (left and right) into a video surface of size (2 * Width) x (Height + 1), with the left image on the left and the right image on the right; edit the stereo tag in the last extra row; StretchRect the video surface into the back buffer. Presenting the back buffer will then display the stereo frame.

Advanced Stereoscopic Effects: 3D Video, presenting a stereo frame: creating the source image.

IDirect3DSurface9* gImageSrcLeft;       // Left source image surface in video memory
IDirect3DSurface9* gImageSrcRight;      // Right source image surface in video memory
int gImageWidth  = 1680;                // Source image width
int gImageHeight = 1050;                // Source image height

IDirect3DSurface9* gImageSrc = NULL;    // Source stereo image being created
D3DDev->CreateOffscreenPlainSurface(
    gImageWidth * 2,                    // Stereo width is twice the source width
    gImageHeight + 1,                   // Stereo height adds one row to encode the signature
    D3DFMT_A8R8G8B8,
    D3DPOOL_DEFAULT,                    // Surface is in video memory
    &gImageSrc, NULL);

// Blit the left source image to the left side of the stereo surface
RECT srcRect     = { 0, 0, gImageWidth, gImageHeight };
RECT dstRectLeft = { 0, 0, gImageWidth, gImageHeight };
D3DDev->StretchRect(gImageSrcLeft, &srcRect, gImageSrc, &dstRectLeft, D3DTEXF_LINEAR);

// Blit the right source image to the right side of the stereo surface
RECT dstRectRight = { gImageWidth, 0, 2 * gImageWidth, gImageHeight };
D3DDev->StretchRect(gImageSrcRight, &srcRect, gImageSrc, &dstRectRight, D3DTEXF_LINEAR);

Advanced Stereoscopic Effects: 3D Video, stereo signature.

// Stereo Blit defines
#define NVSTEREO_IMAGE_SIGNATURE 0x4433564e   // NV3D

typedef struct _Nv_Stereo_Image_Header
{
    unsigned int dwSignature;
    unsigned int dwWidth;
    unsigned int dwHeight;
    unsigned int dwBPP;
    unsigned int dwFlags;
} NVSTEREOIMAGEHEADER, *LPNVSTEREOIMAGEHEADER;

// ORed flags in the dwFlags field of the _Nv_Stereo_Image_Header structure above
#define SIH_SWAP_EYES    0x00000001
#define SIH_SCALE_TO_FIT 0x00000002

Advanced Stereoscopic Effects: 3D Video, tagging the stereo image with the stereoscopic signature.

// Lock the stereo image
D3DLOCKED_RECT lr;
gImageSrc->LockRect(&lr, NULL, 0);

// Write the stereo signature in the last row of the stereo image
LPNVSTEREOIMAGEHEADER pSIH =
    (LPNVSTEREOIMAGEHEADER)(((unsigned char*)lr.pBits) + (lr.Pitch * gImageHeight));

// Update the signature header values
pSIH->dwSignature = NVSTEREO_IMAGE_SIGNATURE;
pSIH->dwBPP       = 32;
pSIH->dwFlags     = SIH_SWAP_EYES;    // Src image has left on left and right on right
pSIH->dwWidth     = gImageWidth * 2;
pSIH->dwHeight    = gImageHeight;

// Unlock the surface
gImageSrc->UnlockRect();
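A hedged sketch of the per-frame present step described two slides above: stretch the tagged surface onto the back buffer and present (the exact rectangles may need adjusting to the swap chain; the driver detects the signature row and splits the image into the two eye views):

IDirect3DSurface9* backBuffer = NULL;
D3DDev->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer);

// Source rectangle covers the side-by-side image, excluding the signature row
RECT srcStereoRect = { 0, 0, gImageWidth * 2, gImageHeight };
D3DDev->StretchRect(gImageSrc, &srcStereoRect, backBuffer, NULL, D3DTEXF_LINEAR);
backBuffer->Release();

D3DDev->Present(NULL, NULL, NULL, NULL);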

Advanced Stereoscopic Effects: 3D Video, recording a stereo frame. Saving the full stereoscopic frame (left and right) currently in the back buffer to system memory is the reverse operation; it's called the reverse stereo blit. It can be used to generate stereoscopic videos.

Advanced Stereoscopic Effects: 3D Video, saving a stereo frame. On an NVIDIA driver enabled platform under DirectX9: allocate a video surface of the stereo image size, (2 * Width) x Height; enable the reverse stereo blit mode with a specific call to NVAPI; StretchRect the back buffer into the video surface (left image on the left, right image on the right); then disable the reverse stereo blit mode.

Advanced Stereoscopic Effects: 3D Video, reverse stereo blitting the back buffer to a stereo image.

// Create the reverse blit destination double-width surface
IDirect3DSurface9* gStereoImage = NULL;
D3DDev->CreateOffscreenPlainSurface(
    IMAGE_WIDTH * 2,                  // Stereo width is twice the source width
    IMAGE_HEIGHT,                     // Stereo height is the back buffer height
    D3DFMT_A8R8G8B8,
    D3DPOOL_DEFAULT,                  // Surface is in video memory
    &gStereoImage, NULL);

// Turn on the reverse blit
NvAPI_Stereo_ReverseStereoBlitControl(gStereoHandle, true);

// Set the destination rectangle to twice the back buffer's width
RECT backBufferRect  = { 0, 0, IMAGE_WIDTH, IMAGE_HEIGHT };
RECT stereoImageRect = { 0, 0, 2 * IMAGE_WIDTH, IMAGE_HEIGHT };

// Reverse blit
D3DDev->StretchRect(BackBuf, &backBufferRect, gStereoImage, &stereoImageRect, D3DTEXF_LINEAR);

// Turn off the reverse blit
NvAPI_Stereo_ReverseStereoBlitControl(gStereoHandle, false);
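To turn captured frames into a stereoscopic video, each side-by-side image can then be written out, for instance with D3DX (file name and format are arbitrary; a real recorder would stream frames to an encoder instead):

// Dump the captured side-by-side frame to disk
D3DXSaveSurfaceToFileA("stereo_frame.png", D3DXIFF_PNG, gStereoImage, NULL, NULL);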

QUESTIONS?

Acknowledgements: everyone in the stereo driver team! The usual suspects in the demo and devtech teams.

How To Reach Us. During GDC: Expo Suite 656, West Hall; Developer Tool Open Chat, 1:30 to 2:30 pm (25th-27th). Online: Twitter: nvidiadeveloper; Website: http://developer.nvidia.com; Forums: http://developer.nvidia.com/forums

Stereo Basics: Depth Factor. In virtual space the stereo separation depends on the ratio Interaxial / VirtualScreenWidth, with VirtualScreenWidth = ScreenDepth * Width / Near. The user can adjust the depth perception with a scaling factor (DepthScale): DepthFactor = DepthScale * Interaxial * Near / ( Width * ScreenDepth ).
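A direct transcription of this formula (the interpretation of Width as the near clipping plane width in eye space is an assumption based on the VirtualScreenWidth expression above):

// Maximum parallax in normalized window space for objects at infinity
float ComputeDepthFactor(float depthScale, float interaxial,
                         float zNear, float nearPlaneWidth, float screenDepth)
{
    return depthScale * interaxial * zNear / (nearPlaneWidth * screenDepth);
}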

Stereo Basics: Depth Factor in reality. The reality of the visualization system: the interaxial is the distance between the two eyes of the viewer (interocular); on average, the interocular distance is 2.5 inches (6.4 cm). The screen width is fixed, and a screen distance to the viewer is recommended for comfortable usage (the bigger the screen, the further away the viewer). The depth factor is a function of Interocular / ScreenWidth.