Deferred Shading. Shawn Hargreaves

Overview: Don't bother with any lighting while drawing scene geometry. Render to a fat framebuffer format, using multiple rendertargets to store data such as the position and normal of each pixel. Apply lighting as a 2D postprocess, using these buffers as input.

Comparison: Single Pass Lighting. For each object: render the mesh, applying all lights in one shader. Good for scenes with small numbers of lights (e.g. outdoor sunlight). Difficult to organize if there are many lights, and easy to overflow shader length limitations.

Comparison: Multipass Lighting. For each light, for each object affected by the light: framebuffer += object * light. Worst-case complexity is num_objects * num_lights. Sorting by light and sorting by object are mutually exclusive, so it is hard to maintain good batching. Ideally the scene should be split exactly along light boundaries, but getting this right for dynamic lights can be a lot of CPU work.

Deferred Shading: For each object, render to multiple targets. For each light, apply the light as a 2D postprocess. Worst-case complexity is num_objects + num_lights. Perfect batching. Many small lights are just as cheap as a few big ones.
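
To make the complexity argument concrete, here is a minimal C++-style sketch of the deferred frame structure described above. The helper functions and types (RenderToGBuffer, DrawLightAs2DPass, Object, Light) are hypothetical placeholders, not part of the original talk.

```cpp
#include <vector>

struct Object { /* mesh data omitted */ };
struct Light  { /* light parameters omitted */ };

// Assumed helpers, not a real API: stubbed out so the sketch compiles.
void RenderToGBuffer(const Object&)  { /* write position/normal/material targets */ }
void DrawLightAs2DPass(const Light&) { /* read the G-buffer, add into the lighting buffer */ }

void RenderFrame(const std::vector<Object>& objects,
                 const std::vector<Light>& lights)
{
    // Geometry pass: touch each object exactly once, with no lighting work.
    for (const Object& obj : objects)
        RenderToGBuffer(obj);

    // Lighting pass: touch each light exactly once, as a screen-space pass.
    for (const Light& light : lights)
        DrawLightAs2DPass(light);

    // Total work scales with num_objects + num_lights, not their product.
}
```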

Multiple Render Targets: Required outputs from the geometry rendering are:
- Position
- Normal
- Material parameters (diffuse color, emissive, specular amount, specular power, etc.)
This is not good if your lighting needs many input values! (Spherical harmonics would be difficult.)
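
Conceptually, the geometry pass leaves every screen pixel with a record like the one below. The struct and field names are illustrative assumptions for clarity, not the talk's actual layout.

```cpp
// Illustrative per-pixel contents of the G-buffer after the geometry pass.
struct GBufferSample
{
    float position[3];       // worldspace position (later optimized down to a single depth)
    float normal[3];         // surface normal
    float diffuseColor[3];   // albedo
    float emissive;          // self-illumination amount
    float specularAmount;    // specular intensity
    float specularPower;     // specular exponent
};
```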

Fat Framebuffers: The obvious rendertarget layout goes something like:
- Position: A32B32G32R32F
- Normal: A16B16G16R16F
- Diffuse color: A8R8G8B8
- Material parameters: A8R8G8B8
This adds up to 256 bits per pixel. At 1024x768, that is 24 MB, even without antialiasing! Also, current hardware doesn't support mixing different bit depths for multiple rendertargets.
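
As a quick sanity check on those numbers, here is the arithmetic spelled out (a throwaway illustration, not code from the talk):

```cpp
#include <cstdio>

int main()
{
    const int bitsPerPixel = 128   // position, A32B32G32R32F
                           + 64    // normal,   A16B16G16R16F
                           + 32    // diffuse,  A8R8G8B8
                           + 32;   // material, A8R8G8B8
    const double megabytes = 1024.0 * 768.0 * bitsPerPixel / 8.0 / (1024.0 * 1024.0);
    std::printf("%d bits per pixel, %.0f MB at 1024x768\n", bitsPerPixel, megabytes);
    return 0;   // prints: 256 bits per pixel, 24 MB at 1024x768
}
```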

Framebuffer Size Optimizations: Store normals in A2R10G10B10 format. Material attributes could be palettized, then looked up in shader constants or a texture. There is no need to store position as a vector3: each 2D screen pixel corresponds to a ray from the eyepoint into the 3D world (think raytracing). You inherently know the 2D position of each screen pixel, and you know the camera settings, so if you write out the distance along this ray, you can reconstruct the original worldspace position.
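
Here is a minimal sketch of that reconstruction, assuming the depth target stores linear view-space Z (one common variant of the "distance along the ray" idea). The function and parameter names are mine, not the talk's.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// u, v:   pixel centre in [0,1] texture coordinates, origin at the top left.
// viewZ:  linear view-space depth fetched from the R32F target.
// fovY:   vertical field of view in radians; aspect = width / height.
Vec3 ReconstructViewSpacePosition(float u, float v, float viewZ,
                                  float fovY, float aspect)
{
    // Convert texture coordinates to normalized device coordinates in [-1, 1].
    float ndcX = u * 2.0f - 1.0f;
    float ndcY = 1.0f - v * 2.0f;            // texture V grows downwards

    // Half-extents of the view frustum at a depth of 1.
    float tanHalfFovY = std::tan(fovY * 0.5f);
    float tanHalfFovX = tanHalfFovY * aspect;

    // Scale the ray through this pixel by the stored depth.
    Vec3 p;
    p.x = ndcX * tanHalfFovX * viewZ;
    p.y = ndcY * tanHalfFovY * viewZ;
    p.z = viewZ;
    return p;   // multiply by the inverse view matrix to get worldspace
}
```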

My Framebuffer Choices: 128 bits per pixel = 12 MB @ 1024x768:
- Depth: R32F
- Normal + scattering: A2R10G10B10
- Diffuse color + emissive: A8R8G8B8
- Other material parameters: A8R8G8B8
My material parameters are specular intensity, specular power, occlusion factor, and shadow amount. I store a 2-bit subsurface scattering control in the alpha channel of the normal buffer.
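
One possible way to pack a normal plus the 2-bit subsurface scattering control into a single A2R10G10B10 value might look like this; the exact encoding is my assumption, not necessarily what the talk used.

```cpp
#include <cstdint>

// Pack a unit normal plus a 2-bit subsurface scattering code into one
// A2R10G10B10 value (alpha in bits 31-30, red 29-20, green 19-10, blue 9-0).
uint32_t PackNormalA2R10G10B10(float nx, float ny, float nz, uint32_t scatter2Bit)
{
    // Map each component from [-1, 1] into the 10-bit range [0, 1023].
    auto quantize = [](float v) -> uint32_t {
        float clamped = v < -1.0f ? -1.0f : (v > 1.0f ? 1.0f : v);
        return static_cast<uint32_t>((clamped * 0.5f + 0.5f) * 1023.0f + 0.5f);
    };
    uint32_t alpha = scatter2Bit & 0x3u;     // 2 bits of subsurface control
    return (alpha << 30) | (quantize(nx) << 20) | (quantize(ny) << 10) | quantize(nz);
}
```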

Depth Buffer. Specular Intensity / Power.

Normal Buffer

Diffuse Color Buffer

Deferred Lighting Results

How Wasteful: One of my 32-bit buffers holds depth values, but the hardware is already doing this for me! It would be nice if there were an efficient way to get access to the existing contents of the Z buffer. Pretty please?

Global Lights: Things like sunlight and fog affect the entire scene. Draw them as a fullscreen quad. Position, normal, color, and material settings are read from texture inputs, the lighting calculation is evaluated in the pixel shader, and the output goes to an intermediate lighting buffer.
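
The per-pixel work of such a fullscreen directional light amounts to something like the loop below, written CPU-side purely for illustration; in practice it runs in the pixel shader, and all names here are hypothetical.

```cpp
#include <algorithm>
#include <cstddef>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// normals/albedo stand in for the G-buffer contents; lighting is the
// intermediate lighting buffer that later passes keep adding into.
void ApplyDirectionalLight(const Vec3* normals, const Vec3* albedo,
                           Vec3* lighting, std::size_t pixelCount,
                           Vec3 lightDir /* unit vector towards the light */,
                           Vec3 lightColor)
{
    for (std::size_t i = 0; i < pixelCount; ++i)
    {
        float nDotL = std::max(0.0f, Dot(normals[i], lightDir));   // Lambert term
        lighting[i].x += albedo[i].x * lightColor.x * nDotL;
        lighting[i].y += albedo[i].y * lightColor.y * nDotL;
        lighting[i].z += albedo[i].z * lightColor.z * nDotL;
    }
}
```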

Local Lights: These only affect part of the scene, so we only want to render to the affected pixels. This requires projecting the light volume into screenspace, and the GPU is very good at that sort of thing. At author time, build a simple mesh that bounds the area affected by the light. At runtime, draw this in 3D space, running your lighting shader over each pixel that it covers.
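
For example, a unit-radius hull mesh authored offline could be placed at runtime with a world matrix like the one below. This is a sketch under my own conventions; the talk does not prescribe any particular math library.

```cpp
// Sketch: position a unit-radius light hull mesh around a point light.
// Row-major 4x4 matrix with the translation in the last row, as D3D expects.
struct Matrix4 { float m[4][4]; };

Matrix4 MakeLightVolumeWorldMatrix(float lightX, float lightY, float lightZ,
                                   float lightRadius)
{
    Matrix4 world = {};                  // zero everything first
    world.m[0][0] = lightRadius;         // scale the unit hull up to the light's range
    world.m[1][1] = lightRadius;
    world.m[2][2] = lightRadius;
    world.m[3][3] = 1.0f;
    world.m[3][0] = lightX;              // then translate it onto the light position
    world.m[3][1] = lightY;
    world.m[3][2] = lightZ;
    return world;
}
```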

Convex Light Hulls: The light bounding mesh will have two sides, but it is important that each pixel only gets lit once. As long as the mesh is convex, backface culling can take care of this.

Convex Light Hulls #2: If the camera is inside the light volume, draw backfaces.

Convex Light Hulls #3: If the light volume intersects the far clip plane, draw frontfaces. If the light volume intersects both the near and far clip planes, your light is too big!

Volume Optimization: We only want to shade the area where the light volume intersects scene geometry. There is no need to shade where the volume is suspended in midair, or where it is buried beneath the ground.

Stencil Light Volumes: This is exactly the same problem as finding the intersection between scene geometry and an extruded shadow volume, so the standard stencil buffer solution applies. But using stencil requires changing renderstate for each light, which prevents batching up multiple lights into a single draw call. Using stencil may or may not be a performance win, depending on context.

Light Volume Z Tests: A simple Z test can get things half-right, without the overhead of using stencil. When drawing light volume backfaces, use D3DCMP_GREATER to reject the "floating in the air" portions of the light.

Light Volume Z Tests #2: If drawing frontfaces, use D3DCMP_LESS to reject the "buried underground" light regions. Which is faster depends on the ratio of aboveground vs. underground pixels: use a heuristic to make a guess.
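
Putting the last few slides together, the per-light render state might be set up roughly like this in Direct3D 9. This is a hedged sketch of one possible arrangement (the talk suggests a heuristic for choosing between the front/back cases in general); it is not code from the talk.

```cpp
// Sketch: render state for drawing a convex light volume in D3D9.
// Assumes the default winding (clockwise triangles are front faces) and a
// unit hull mesh already placed with a world matrix; error checking omitted.
#include <d3d9.h>

void SetLightVolumeState(IDirect3DDevice9* device, bool cameraInsideVolume)
{
    // Lighting passes only read depth, never write it.
    device->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);

    // Accumulate light additively into the lighting buffer.
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE);
    device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);

    if (cameraInsideVolume)
    {
        // Draw backfaces (cull the clockwise front faces) and reject pixels
        // where the volume floats in the air in front of the scene geometry.
        device->SetRenderState(D3DRS_CULLMODE, D3DCULL_CW);
        device->SetRenderState(D3DRS_ZFUNC, D3DCMP_GREATER);
    }
    else
    {
        // Draw frontfaces (cull the counterclockwise back faces) and reject
        // pixels where the volume is buried underground behind the geometry.
        device->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW);
        device->SetRenderState(D3DRS_ZFUNC, D3DCMP_LESS);
    }
}
```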

Alpha Blending: This is a big problem! You could simply not do any lighting on alpha blended things, and draw them after the deferred shading is complete. The emissive and occlusion material controls are useful for crossfading between lit and non-lit geometry. The return of stippling? Depth peeling is the ultimate solution, but it is prohibitively expensive, at least for the time being.

High Dynamic Range: You can do deferred shading without HDR, but that wouldn't be so much fun. Render your scene to multiple 32-bit buffers, then use a 64-bit accumulation buffer during the lighting phase. Unfortunately, current hardware doesn't support additive blending into 64-bit rendertargets. Workaround: use a pair of HDR buffers, and simulate alpha blending in the pixel shader.

DIY Alpha Blending: Initialize buffers A and B to the same contents. Set buffer A as a shader input while rendering to B, and roll your own blend at the end of the shader. This only works as long as your geometry doesn't overlap in screenspace. When things do overlap, you have to copy all modified pixels from B back into A, to make sure the input and output buffers stay in sync. This is slow, but not totally impractical as long as you sort your lights to minimize screenspace overlaps.
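
The hand-rolled blend itself is just the usual blend math moved into the shader. A CPU-side illustration (all names are mine) might be:

```cpp
// Illustration of "roll your own blend": read the previous HDR value from
// buffer A, combine it with the new light contribution, write the result to B.
struct HdrColor { float r, g, b; };

HdrColor BlendAdditive(HdrColor previous /* fetched from buffer A */,
                       HdrColor lightContribution)
{
    // Equivalent of SRCBLEND = ONE, DESTBLEND = ONE, done manually because
    // the hardware cannot blend into 64-bit rendertargets.
    HdrColor result;
    result.r = previous.r + lightContribution.r;
    result.g = previous.g + lightContribution.g;
    result.b = previous.b + lightContribution.b;
    return result;   // this value is written to buffer B
}
```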

HDR Results

Volumetric Depth Tests: A neat side effect of having scene depth available as a shader input: gradually fade out alpha as a polygon approaches the Z-fail threshold. No more hard edges where alpha blended particles intersect world geometry!
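
The fade amounts to a clamped ratio of how far the fragment is from the depth already in the scene. A small sketch, with the fade distance and names being my own assumptions:

```cpp
// Sketch of the "soft particle" fade: scale alpha by how far the fragment is
// from the scene depth stored in the deferred depth buffer, so intersections
// fade out instead of producing a hard clipped edge.
float SoftParticleAlpha(float particleAlpha,
                        float sceneDepth,     // depth read from the deferred depth buffer
                        float particleDepth,  // depth of the particle fragment
                        float fadeDistance)   // distance over which to fade out
{
    float separation = (sceneDepth - particleDepth) / fadeDistance;
    float fade = separation < 0.0f ? 0.0f : (separation > 1.0f ? 1.0f : separation);
    return particleAlpha * fade;   // fully faded out right at the Z-fail threshold
}
```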

Deferred Shading Advantages: Excellent batching. Render each triangle exactly once, and shade each visible pixel exactly once. Easy to add new types of lighting shader. Other kinds of postprocessing (blur, heat haze) are just special lights, and fit neatly into the existing framework.

The Dark Side: Alpha blending is a nightmare! Framebuffer bandwidth can easily get out of hand. Can't take advantage of hardware multisampling. Forces a single lighting model across the entire scene (everything has to be 100% per-pixel). Not a good approach for older hardware, but it will be increasingly attractive in the future.

Any questions? The End