Developing with Oculus: Mastering the Oculus SDK. Volga Aksoy, Software Engineer, Oculus

Developing with Oculus: Mastering the Oculus SDK. Volga Aksoy, Software Engineer, Oculus. Michael Antonov, Chief Software Architect, Oculus

Table of Contents
- New SDK Features: Direct-to-Rift; Latency Testing & Dynamic Prediction in DK2; Pixel Luminance Overdrive
- Real-time Engine Integration Tips: SDK rendering vs. App rendering; Troubleshooting Judder; FOV, texture resolution; Asymmetric/symmetric frustum issues; Left/right separate eyes vs. one buffer; Multi-threading and timing
- Looking Into The Future: Time Warp Improvements; SDK Roadmap

New SDK Features: Direct To Rift

Extended Mode
- Headset shows up as an OS display
- Application must place a window on the Rift monitor
- Icons and windows end up in the wrong place
- Windows compositor handles Present
- There is usually at least one extra frame of latency; more if no CPU & GPU sync is done

Direct To Rift
- Outputs to the Rift without the display being part of the desktop
- Headset not seen by the OS: avoids jumping windows and icons
- Decouples Rift v-sync from the OS compositor
- Avoids extra GPU buffering for minimum latency
- Use ovrHmd_AttachToWindow: the window's swap chain output is directed to the Rift
- We hope to see Direct Mode become the longer-term solution

Direct To Rift - Mirroring
- Direct to Rift supports duplicating the image to a window, with minimum overhead
- Can be disabled with ovrHmdCap_NoMirrorToWindow
- Possible to display a different image instead, such as messages, on this screen

Direct To Rift - Implementation
- LibOVR shims the Direct3D/GL runtime; must call ovr_InitializeRenderingShim() before D3D use
- The window swap chain has an associated Rift Primary Buffer
- On DK2, the back buffer is copied and rotated to the Primary for display; for best quality, use the reported resolution
- A user/kernel-mode video driver shim hides the Rift from the OS, allowing us to find and render to it
- Allows flipping with the Rift's v-sync (no composition queue)

Direct To Rift - Challenges so Far (1/2)
- Symptom: Rift doesn't turn on
- Optimus (NVidia + Intel HD GPU laptops): discrete GPU not connected to the HDMI port, so software has to create a bridge and copy data
- Use the NVidia driver bridge on Win 7/8; Win 8.1 has cross-adapter support
- Works on Razer Blade laptops; on some machines the copy is slow (10 ms+, making it unusable)
- Multi-GPU (SLI/Crossfire): Rift plugged into the wrong adapter, need to copy

Direct To Rift - Challenges so Far (2/2)
- DisplayLink: conflicting driver shim (working with DisplayLink to resolve); too slow for Rift output
- Multi-monitors: may sync to the wrong display => v-sync tear or judder; resolved with an upcoming driver update
- Other software shims: ASUS Splendid color, etc.
- Plan: add debugging logic, fix incrementally

Direct To Rift - Future
- Move distortion/composition to a separate process
- Remove shims from the application: avoids state-related rendering issues, supports time warp layer composition
- Asynchronous Time Warp (TBD)
- Multi-GPU TW rendering: implementing time warp on the secondary GPU with Optimus
- Researching better GPU pipelining / preemption

New SDK Features: Latency Testing & Dynamic Prediction in DK2

VR Motion-To-Photon Latency [pipeline diagram: Input → USB → Game Engine → New Image Write → Display → Pixel Switching]

VR Motion-To-Photon Latency
- Keeping latency low is crucial for a solid VR experience
- Goal is < 20 ms, and hopefully close to 5 ms
- We're almost there! (stay tuned for some numbers)

Latency Testing in DK1
- Required custom hardware: the Oculus Latency Tester
- App goes into a latency-testing mode
- Too cumbersome, ultimately not very practical; minimal developer utilization

Latency Testing in DK2
- DK2 can sample the top-right pixel color and time-stamp the color change
- Messaged to the host PC via USB
- SDK handles the rest of the latency orchestration on the host PC

Latency Testing in DK2
- SDK can internally time everything up to the point of calling Present()/SwapBuffers() (i.e. pre-present)
- Post-present timing is a black box to the VR app
- SDK relies on DK2 latency testing for post-present timing

Latency Testing in DK2
- Variable frame counts primarily due to differences in: OS, GPU, render APIs, driver settings

Latency Testing in DK2
- At Oculus, we've seen post-present latencies from 0 to 4+ frames
- That's a delta of 50+ ms @ 75 Hz!

Latency Testing in DK2 - High-level implementation
- PC renders the top-right pixel with a given color & records the time
- HMD pings the PC with the detected color & time stamp of the pixel
- SDK matches up the detected color & calculates the time difference
- Latency is reported after enough successive matches

Latency Testing in DK2
- Final latency calculated by the SDK as: time difference from final HMD pose sampling to present/swap + post-present delay duration from the DK2 latency tester
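
The sum above can be sketched in plain C. The function and parameter names here are illustrative, not part of the Oculus SDK API:

```c
#include <assert.h>
#include <math.h>

/* Illustrative sketch of the latency formula above; hypothetical
   helper, not SDK API. */
static double TotalMotionToPhotonMs(double poseSampleTimeMs,
                                    double presentTimeMs,
                                    double postPresentDelayMs)
{
    /* Pre-present portion is timed internally by the SDK. */
    double prePresentMs = presentTimeMs - poseSampleTimeMs;
    /* Post-present portion is measured by the DK2 latency tester. */
    return prePresentMs + postPresentDelayMs;
}
```

For example, a pose sampled 5 ms before Present plus one 75 Hz frame (~13.3 ms) of post-present delay gives roughly 18.3 ms motion-to-photon.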

Latency Testing in DK2
- If (for any reason) timing fails, the SDK falls back to an HMD-specific best-guess value
- Latency info is accessible to the VR app; see the OculusWorldDemo HUD info (press Space to view)

OculusWorldDemo showing HUD Info

Latency Testing in DK2 - Timing info can be N/A because:
- Not full-screened on the HMD
- Not running v-synced
- Unaccounted-for gamma curve in the back buffer (sRGB, gamma 2.2, etc.)

Latency Testing in DK2
- Ren: latency from the point the left-eye HMD pose was queried from the SDK
- TWrp: latency from EndFrame()
- PostPresent: latency from the point the GPU executed Present()/SwapBuffers()

Latency Testing in DK2
- Captured from a MacBook Pro w/ GeForce GT 750M on Windows 8.1
- ~0.0 ms Present, w/ TW at < 7 ms; mid-display-scan lands @ ~10 ms
- We can now force display immediately after Present() (when using direct mode)!

Latency Testing in DK2
- In extended mode, PostPresent will usually be a multiple of 13.3 ms
- PostPresent will increase Ren & TWrp

Latency Testing in DK2
- So we have motion-to-photon latency from the SDK. Then what?
- Remember with the 0.2 SDK, you provided a prediction time offset for every HMD pose query?
- Enter dynamic prediction

Dynamic Prediction in DK2
- The 0.4 SDK internally knows how much prediction the app needs to stabilize head tracking
- The VR app no longer needs to worry about prediction

New SDK Features: Pixel Luminance Overdrive

Pixel Luminance Overdrive
- Artifact called 2-frame-rise-delay: the display fails to hit requested luminance levels fast enough
- We account for this in the 0.4 SDK

Pixel Luminance Overdrive [diagram: pixel driving voltage vs. pixel transmission over one frame period, comparing normal driving with overdriving]

Pixel Luminance Overdrive - Current formula in the distortion shader code:
overdriveColor = newColor + (newColor - prevColor) * overdriveScale;
- For each RGB channel where newColor > prevColor: overdriveScale = 0.1
- For each RGB channel where newColor < prevColor: overdriveScale = 0.05
- Currently in D3D10/11; coming to OpenGL and D3D9
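
As a rough illustration, the per-channel formula can be written in plain C; the real code runs per-pixel in the distortion shader, and clamping to the displayable range is omitted here:

```c
#include <assert.h>
#include <math.h>

/* Plain-C sketch of the per-channel overdrive formula above;
   illustrative only, not the shipped shader code. */
static float OverdriveChannel(float newColor, float prevColor)
{
    /* Scale factors from the slide: 0.1 when rising, 0.05 when falling. */
    float overdriveScale = (newColor > prevColor) ? 0.1f : 0.05f;
    return newColor + (newColor - prevColor) * overdriveScale;
}
```

A rising transition overshoots the target slightly and a falling one undershoots it, nudging the slow LCD pixel toward the requested level within one frame.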

Pixel Luminance Overdrive
- SDK needs to hold onto the last presented frame
- Requires v-sync; otherwise the GPU won't know the contents of the display
- Yet another reason to make sure v-sync is always on

Pixel Luminance Overdrive
- Current implementation fixes many 2-frame-rise artifacts
- e.g. in OculusWorldDemo, look through the windows to the outside, or outside at the cypress trees against the sky
- Exceptions: the staircase and fireplace

Pixel Luminance Overdrive - Black smear
- Anomalous behavior in the 2-frame-rise-delay response on extremely-dark-green to slightly-dark-green transitions
- Evaluating fine-tuned LUTs for the anomalous zone: replace the overdrive scale factors with small texture look-ups

Real-time Engine Integration Tips

Real-time Engine Integration Tips
- SDK rendering vs. App rendering
- Troubleshooting Judder
- FOV, texture resolution
- Asymmetric/symmetric frustum issues
- Split vs. Shared Eye Textures
- Multi-threading and timing

Real-time Engine Integration Tips: SDK Rendering vs. App Rendering

SDK rendering vs. App rendering
- The 0.2 SDK didn't do any rendering; it only provided the parameters needed for proper rendering
- The new rendering backend in SDK 0.4 takes over critical rendering features
- The app gives the SDK L & R eye textures via ovrHmd_EndFrame()

SDK rendering vs. App rendering - SDK 0.4 finishes rendering with the following features:
- Barrel distortion with chromatic aberration & time warp
- Internal latency testing & dynamic prediction
- Low-latency v-sync and flip (even better with direct-to-rift)
- Pixel Luminance Overdrive
- Health & Safety Warning

SDK rendering vs. App rendering
- Our goal: allow every engine to use SDK rendering
- Unity 3D and Unreal Engine 4 use SDK rendering
- All SDK rendering code is open source, and we highly encourage experimentation
- If your engine cannot use SDK rendering, then *please* let us know why not; some valid reasons exist

SDK rendering vs. App rendering
- Don't have your engine's source code (e.g. XNA)?
- Need to support a new render API (e.g. AMD's Mantle)?
- Need a stereo 3D post-fx with distortion (a la CryEngine)?
- Need to do post-distortion rendering (e.g. debug info)?
- Show-stopper bugs or missing features in our SDK?

SDK rendering vs. App rendering
- In some cases, what you think is unsupported by our SDK might actually be supported, but we still want to know!
- Ping us. We will listen.

SDK rendering vs. App rendering - Recommended SDK integration path:
- Follow OculusRoomTiny for bootstrapping the SDK; it demonstrates both SDK rendering & App rendering
- Then move on to OculusWorldDemo for advanced features
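
The integration path above boils down to a short frame loop. The following is a non-compilable pseudocode sketch in the style of OculusRoomTiny; consult the 0.4 SDK headers for exact signatures, and note that RenderSceneForEye, renderPose, and eyeTexture stand in for the app's own code and data:

```
// Pseudocode sketch of the SDK-rendering frame loop (not compilable as-is).
ovrHmd_BeginFrame(hmd, 0);                         // SDK frame timing starts
for (int i = 0; i < 2; ++i)
{
    ovrEyeType eye = hmd->EyeRenderOrder[i];
    renderPose[eye] = ovrHmd_GetEyePose(hmd, eye); // sample pose as late as possible
    RenderSceneForEye(eye, renderPose[eye]);       // app draws into eyeTexture[eye]
}
ovrHmd_EndFrame(hmd, renderPose, eyeTexture);      // SDK distorts, time warps, presents
```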

Real-time Engine Integration Tips: Troubleshooting Judder

Judder, aka Jitter-in-time
- Very few complained about judder on the DK1
- DK2 has low persistence + precise tracking & visuals
- Tracking hitches are magnified when tracking otherwise works accurately (think uncanny valley of VR tracking)

Judder Reasons
- Low fps: too much work per frame (provide GFX options!)
- Highly variable prediction: unpredictable frame loads
- Bad refresh rate: not v-syncing with the HMD display
- Bad multi-threading: incorrect usage of timing in the SDK

Real-time Engine Integration Tips: FOV & Eye Texture Resolution

FOV & Eye Texture Resolution
- SDK provides a recommended eye texture resolution
- Eye buffer texture resolution can be modified per frame and pushed into ovrHmd_EndFrame()
- SDK will account for changes during distortion rendering
- e.g. can drop eye resolution for better perf

FOV & Eye Texture Resolution
- Runtime modification can be tested in OculusWorldDemo: hit Tab & enable Render Target -> Dynamic Res Scaling
- Will get better with layered distortion rendering for UI (e.g. high-res UI vs. lower-res 3D scene)

FOV & Eye Texture Resolution [images: symmetric vs. asymmetric projection]
- SDK provides recommended FOVs
- FOV port: left, right, top, bottom tangent angles
- Helpers convert a FOV port to a projection matrix
- Please: do not cook up your own projection matrix that looks good enough using your own magic constants. We have seen games with divergent stereo parallax. BAD!
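
To see why the FOV port's tangents matter, here is an illustrative plain-C construction of just the horizontal terms of an off-center projection. The names are made up for this example; real code should use the SDK's FOV-port-to-projection helper:

```c
#include <assert.h>
#include <math.h>

/* Illustrative horizontal terms of an off-center projection built from a
   FOV port's left/right half-angle tangents. Not SDK code. */
typedef struct { float scaleX; float offsetX; } ProjX;

static ProjX ProjectionXFromTangents(float tanLeft, float tanRight)
{
    ProjX p;
    p.scaleX  = 2.0f / (tanLeft + tanRight);                 /* horizontal scale */
    p.offsetX = (tanLeft - tanRight) / (tanLeft + tanRight); /* off-center shift */
    return p;
}
```

With symmetric tangents the shift is zero; an asymmetric FOV port yields a nonzero off-center term, which is exactly what ad-hoc "magic constant" projections get wrong, producing divergent stereo parallax.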

FOV & Eye Texture Resolution
- FOV can be set based on your game's needs, via ovrHmd_ConfigureRendering()
- SDK will generate a new distortion mesh and account for changes during distortion rendering
- Changing FOV every frame is not recommended

Wide FOV
- HMDs have a locked physical FOV; unlike classic monitors, it is not an arbitrary setting
- Virtual FOV settings larger than the HMD's physical FOV are generally wasteful (though you could modify FOV to account for time warp filling)

Narrow FOV
- A modified virtual FOV should respect the physical FOV
- Otherwise the view will look like it's viewed through binoculars (that should be a separate zooming feature)

FOV & Eye Texture Resolution - Outer FOV angle > inner FOV angle?
- In real life the nose limits the inner FOV
- Better HMD screen utilization by providing a larger total horizontal FOV with less frustum overlap
- DK1 FOVs are more asymmetric than DK2's
- Asymmetric/off-center projection to the rescue!

Real-time Engine Integration Tips: Asymmetric/Symmetric Frustum

Asymmetric/Symmetric Frustum
- Some engines always expect a symmetric camera frustum
- Many sub-systems can be fixed, but it can be cumbersome: camera frameworks with only symmetric FOV options, deferred rendering interpolation artifacts, culling and LOD algorithms, and many more

Asymmetric/Symmetric Frustum - Make the FOV port symmetric
- Inner FOV & outer FOV = max(outer FOV, inner FOV)
- Clipped-symmetric frustums behave similarly to asymmetric frustums
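
A minimal sketch of that symmetrization in plain C; the field names are illustrative, not the SDK's FOV-port type:

```c
#include <assert.h>

/* Illustrative symmetrization of one eye's horizontal FOV tangents, as
   described above: both sides take the larger of the two. Not SDK code. */
typedef struct { float innerTan; float outerTan; } HorizFov;

static HorizFov MakeSymmetric(HorizFov fov)
{
    float m = (fov.innerTan > fov.outerTan) ? fov.innerTan : fov.outerTan;
    HorizFov symmetric = { m, m };
    return symmetric;
}
```

The symmetric frustum fully covers the original asymmetric one, so a symmetric-only engine still renders everything the HMD can show, at the cost of some pixels that distortion later clips.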

Asymmetric/Symmetric Frustum
- Caveat: wasted pixel draw on the distortion-clipped area
- Perf issue? Scissor the 3D scene render to the visible distortion area.

Union Camera?
- Culling can be expensive, especially if done once for each HMD eye camera
- Some engines cull once from a center eye & live with culling artifacts

Union Camera?
- A union HMD camera would allow culling once without artifacts
- Let us know if you'd want a helper function for this
- Takes in: 2x camera transforms & 2x projections; returns: a camera transform & projection

Real-time Engine Integration Tips: Split vs. Shared Eye Textures

Split vs. Shared Eye Textures - To share or not to share?
- SDK allows splitting into two eye textures vs. sharing a single render target for both eye textures
- ovrEyeRenderDesc provides the viewport for each scenario
- OculusWorldDemo option: "Shared RenderTarget"

Split vs. Shared Eye Textures - Combined eye texture, pros:
- Per-draw alternate-eye rendering without switching render targets; pipelining on some GPU types
- Can save state & constant uploads
- Single-quad post-fx on both eyes

Split vs. Shared Eye Textures - Split eye textures, pros:
- Separable workload if needed
- Smaller texture per distortion pass
- Can modify eye texture resolutions independently w/o wasting space
- Accidental cross-eye sampling with time warp is not a problem

Real-time Engine Integration Tips: Multi-Threading & Timing

Multi-threading & Timing
- Input devices are usually the first thing the game processes; the HMD is also an input device!
- Some multi-threaded engines have 2+ frames of input latency
- @ 75 Hz, that's 25+ ms of latency given away!
- Please use time warp no matter what!

Multi-threading & Timing
- e.g. a producer/consumer two-thread engine, i.e. main & render thread
- On the main thread, sample the HMD with thread-safe functions: ovrHmd_GetFrameTiming() + ovrHmd_GetTrackingState()
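
A non-compilable pseudocode sketch of that two-thread pattern; the function names are from the 0.4 SDK, while frameIndex, renderPose, and eyeTexture stand in for app state:

```
// Main (simulation) thread: thread-safe sampling only.
ovrFrameTiming timing  = ovrHmd_GetFrameTiming(hmd, frameIndex);
ovrTrackingState state = ovrHmd_GetTrackingState(hmd, timing.ScanoutMidpointSeconds);
// ... simulate the frame using state.HeadPose ...

// Render thread: owns the SDK's per-frame timing.
ovrHmd_BeginFrame(hmd, frameIndex);
// ... re-sample eye poses here, just before drawing, to cut positional latency ...
ovrHmd_EndFrame(hmd, renderPose, eyeTexture);
```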

Multi-threading & Timing
- New API call coming soon: ovrHmd_GetEyePoses()
- Returns the HMD pose + L & R eye poses; no need for ViewAdjust by the VR app
- Can be used on the main thread & re-used on the render thread

Multi-threading & Timing
- The 0.4 SDK time warp only fixes up orientation latency
- To decrease positional latency: resample the HMD pose via ovrHmd_GetEyePose(s) before the 3D-scene draw on the *render thread*

Multi-threading & Timing
- Time warp needs precise timing triggers on the render thread: ovrHmd_BeginFrame() or ovrHmd_BeginFrameTiming(), and ovrHmd_EndFrame() or ovrHmd_EndFrameTiming()
- No other API call directly affects SDK timing each frame!

Multi-threading & Timing
- Time warp parameters are sent to the SDK via ovrHmd_EndFrame(): ovrPosef renderPose[2] & ovrTexture eyeTexture[2]
- Important: time warp doesn't care when and via what SDK call renderPose[2] was generated (ovrHmd_GetTrackingState(), ovrHmd_GetEyePose(), etc.)

Looking Into The Future: Time Warp Improvements

Time Warp
- Re-projects rendering for a later point in time, done simultaneously with distortion
- Reduces perceived latency; accounts for the DK2 rolling shutter
- The 0.4 SDK only handles orientation; positional is possible

Synchronous Time Warp
- The 0.4 SDK supports synchronous time warp: each frame does distortion + TW after both eyes render
- Time warp uses a later sensor reading
- Judder if we don't hit frame rate; dropping a frame can be very uncomfortable for the user, much worse than low fps on regular monitors
- Can we use time warp to fill in frames? Run the game engine at half rendering speed

Time Warp
- OculusWorldDemo has features to show time warp in action
- Hit C to pause 3D-scene rendering; time warp will continue to do its thing

Time Warp
- You can keep rotating the HMD to look around in the last 3D-scene render
- Artifact: missing scenery is black
- Artifact: UI warps with the rest of the 3D scene

Asynchronous Time Warp
- Time warp in a separate thread: the best hope for reducing judder
- Fills in frames based on old rendering
- Implemented on Android for Gear VR; works well for smoothing head tracking in a 30 fps game
- Requires: GPU priorities, GPU preemption

Async Time Warp Sample Diagram [timeline over frames N..N+3: a CPU eye-render thread renders left and right eyes each frame, a separate CPU time-warp thread issues TW every v-sync, and the GPU command processor interleaves eye rendering with time warp of the previous frame's eyes]

Asynchronous Time Warp - PC
- Current OS/Windows limitations: no fine-grained GPU preemption; GPU rendering priorities are not exposed
- Working with GPU vendors: AMD supports compute shader preemption; NVidia supports GPU context priorities internally
- Exploring adaptive and cooperative techniques: generating extra TW data just in case, checking TW time from game code

Time Warp Layers
- May need separate time warp transformations for: world space, FPS avatar/cockpit space, static UI space
- Allows feeding controller input into world-space time warp while avatar-space time warp works as usual
- Solution: layered time warp composition

Looking Into The Future: SDK Roadmap

SDK Roadmap - Short Term
- Driver improvements & fixes: multi-monitor fixes to v-sync-related latency & judder, reduced post-present latency, handling Optimus and multi-GPU scenarios
- Linux support
- Asynchronous/adaptive time warp: reduce judder by generating extra frames when needed
- Time warp layers: handle cockpit and overlays separately from the scene

SDK Roadmap - Long Term
- Runtime DLL: better forward compatibility with HW & runtime updates, less frequent API versioning
- Advanced VR composition (TBD): layered VR composition with time warp, better state management & preemption, overlays from a secondary application (launcher, etc.)
- Audio: see Brian Hook's talk on VR audio

Questions? Michael Antonov, michael.antonov@oculusvr.com. Volga Aksoy, volga.aksoy@oculusvr.com, @volgaksoy
