High Dynamic Range and other Fun Shader Tricks. Simon Green




Demo Group Motto: "If you can't make it good, make it big. If you can't make it big, make it shiny."

Overview The OpenGL vertex program and texture shader extensions enable hardware acceleration of effects that were previously only possible in offline rendering. Three case studies: High Dynamic Range Images, Real-Time Caustics, Procedural Terrain.

Case Study 1: High Dynamic Range Summary: displaying HDR images in real time. Definition of dynamic range: the ratio of the maximum intensity in an image to the minimum detectable intensity. Most imagery used in computer graphics today is stored in 8 bits per component. Low Dynamic Range: 0 = black, 255 = white. Light in the real world is not constrained in this way! The dynamic range between bright sunshine and shadow can easily be 10,000 to 1.

Exposure time = 0.270 s

Exposure time = 0.133 s

Exposure time = 0.066 s

Exposure time = 0.033 s

What is High Dynamic Range? The human visual system adapts automatically to changes in brightness. In photography, shutter speed and lens aperture are used to control the amount of light that reaches the film. HDR imagery attempts to capture the full dynamic range of light in real-world scenes. It measures radiance, the amount of energy per unit time per unit solid angle per unit area, in W/(sr·m²). 8 bits is not enough!

Why Do We Need HDR? It effectively allows us to change the exposure after we've taken or rendered the picture. Dynamic adaptation effects, e.g. moving from a bright outdoor environment to indoors. Allows physically plausible image-based lighting; BRDFs may need high dynamic range. Enables realistic optical effects: glows around bright light sources, more accurate motion blur.

Creating HDR Images from Photographs "Recovering High Dynamic Range Radiance Maps from Photographs", Debevec and Malik, SIGGRAPH 1997. Using several images of the same scene taken with different exposures, the method calculates the non-linear response curve of the camera and recovers the actual radiance at each pixel. Environment maps can be captured by either photographing a mirrored sphere (a "light probe") or combining two or more 180-degree fisheye images.

Displaying HDR Images To display an HDR image at a given exposure, we use the following equation: Z = f(E·t), where Z = pixel value, E = irradiance value, t = exposure time, and f = camera response curve.
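
A minimal sketch of this display step in Python, with a simple power-law curve standing in for the calibrated camera response f (the real f comes from the Debevec-Malik calibration; the gamma value here is an assumption):

def display_pixel(E, t, gamma=2.2):
    # Z = f(E*t): map irradiance E at exposure time t to an 8-bit value.
    # A power-law curve stands in for the recovered camera response f.
    x = max(0.0, min(1.0, E * t))          # exposure, clamped to [0, 1]
    return round(255 * x ** (1.0 / gamma))

# Halving the exposure time darkens the result by one stop:
for t in (0.270, 0.133, 0.066, 0.033):
    print(t, display_pixel(2.0, t))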

Displaying HDR Images using Graphics Hardware Previous work: "Real-Time High Dynamic Range Texture Mapping", Cohen, Tchou, Hawkins, Debevec, Eurographics Rendering Workshop 2001. Splits the HDR image into several 8-bit textures and displays it by recombining them using multitexturing and register combiners on NVIDIA TNT2 and above. Hard, because combiners treat texture values as fixed-point numbers between 0 and 1; the largest number you can multiply by is 4. Requires different combiner setups for different exposure ranges, so exposure can only be changed on a per-primitive basis.

Representing HDR Imagery in OpenGL GeForce3/4 support a 16-bit texture format known as HILO, which stores two 16-bit components as (HI, LO, 1) and is filtered by hardware at 16-bit precision. The signed version is intended for storing high-precision normal maps, but we can also use this format to store high(er) dynamic range imagery: remap the floating-point HDR data to a gamma-encoded 16-bit fixed-point range [0, 65535]. Unfortunately there are only two components, so we need two HILO textures to store RGB.
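
A sketch of this remapping, assuming a gamma of 2.2 and a scale chosen so the scene's maximum radiance maps to 1.0 (both are illustrative choices, not part of the original talk):

def encode_hilo(r, g, b, max_radiance, gamma=2.2):
    # Remap floating-point radiance to gamma-encoded 16-bit fixed point.
    # Two HILO pairs are needed: (R, G) in one texture, (B, unused) in the other.
    def to_u16(x):
        x = max(0.0, min(1.0, x / max_radiance))   # normalize to [0, 1]
        return int(65535 * x ** (1.0 / gamma))     # gamma encode, quantize
    return (to_u16(r), to_u16(g)), (to_u16(b), 0)

print(encode_hilo(1200.0, 300.0, 50.0, max_radiance=10000.0))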

Displaying HDR Images using the OpenGL Texture Shader Extension To display the image, we need to multiply the HDR radiance values by the exposure factor and then re-map them to the displayable [0, 255] range. This can be achieved using the GL_DOT_PRODUCT_TEXTURE_2D operation of the OpenGL texture shader extension. Exposure is sent as texture coordinates, and the dot product performs the multiply for both channels. We create a 2D texture that maps the result back to displayable values.

Displaying HDR Images using OpenGL Texture Shaders NVParse code:

!!TS1.0
texture_cube_map();
dot_product_2d_1of2(tex0);
dot_product_2d_2of2(tex0);

Pseudo code:

0: hilo  = texture_cube_map(hdr_texture, s0, t0, r0)
1: dot1  = s1*hi + t1*lo + r1*1.0   // = r_exposure*r + 0 + r_bias
2: dot2  = s2*hi + t2*lo + r2*1.0   // = 0 + g_exposure*g + g_bias
3: color = texture_2d(lut_texture, dot1, dot2)

Displaying HDR Images using OpenGL Texture Shaders Requires 2 passes to render RGB, using glColorMask to mask off color channels.

First pass renders R and G:
texcoord1 = (r_exposure, 0.0, r_bias)
texcoord2 = (0.0, g_exposure, g_bias)

Second pass renders B:
texcoord1 = (0.0, 0.0, 0.0)
texcoord2 = (b_exposure, 0.0, b_bias)

Exposure = 0.25

Exposure = 0.0625

Exposure = 0.015625

HDR Effects: HDR Fresnel, glow, automatic exposure, vignette.

HDR Fresnel Surfaces more tangent to the viewer reflect more; reflectivity can vary by a factor of 20 or more, so an HDR environment map produces more accurate results. Calculate per vertex in a vertex program, approximating the Fresnel function as (1 - V.N)^p, and send the exposure down as a texture coordinate, as in the sketch below.
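
The approximation as a minimal sketch; V and N are assumed normalized, and the exponent p is an artist-tuned value:

def fresnel_approx(v_dot_n, p=4.0):
    # Approximate Fresnel reflectance as (1 - V.N)^p.
    return (1.0 - max(0.0, v_dot_n)) ** p

# Grazing angles (V.N near 0) reflect far more than head-on (V.N near 1):
print(fresnel_approx(0.05), fresnel_approx(0.95))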

Image-Space Glow Also known as glare, specular bloom, or flare. Blur an image of the bright parts of the scene; hardware mipmap generation and LOD bias can be used to compute the box filtering. Ideally the convolution should be done with HDR values, and a real Gaussian blur would be smoother. Blend the result back on top of the original image, so the glow reaches around the object; a sketch follows.
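
A sketch of the pipeline on a single row of pixels, with a zero-padded box filter standing in for the hardware mipmap/LOD-bias blur; the radius and brightness threshold are arbitrary choices:

def glow(row, radius=2, threshold=0.8):
    # Keep only the bright pixels, box-blur them, then add the blur
    # back on top of the original (glow = original + blurred).
    bright = [x if x > threshold else 0.0 for x in row]
    n = len(row)
    blurred = [sum(bright[max(0, i - radius):i + radius + 1]) / (2 * radius + 1)
               for i in range(n)]
    return [o + b for o, b in zip(row, blurred)]

# The glow from the bright HDR pixel bleeds onto its dark neighbours:
print(glow([0.1, 0.1, 4.0, 0.1, 0.1]))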

Original image

Blurred version

Glow = original + blurred

Image Based Lighting Lighting synthetic objects with real light. An environment map represents all light arriving at a point, for each incoming direction. By convolving (blurring) an environment map with the diffuse reflection function (N.L) we can create a diffuse reflection map: indexed by surface normal N, it gives the sum of N.L for all light sources in the hemisphere. Very slow to create, but the result is low frequency, so the cube map can be small, e.g. 32x32x6. HDRShop will do this for you; a brute-force sketch follows.
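
A brute-force sketch of the convolution, assuming the environment map is supplied as (direction, radiance) samples with unit direction vectors:

def diffuse_convolve(normal, env_samples):
    # Sum N.L-weighted radiance over every environment direction in the
    # hemisphere around the normal: one output texel of the diffuse map.
    total = 0.0
    for light_dir, radiance in env_samples:
        n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
        if n_dot_l > 0.0:                  # only lights in the hemisphere
            total += n_dot_l * radiance
    return total / len(env_samples)

env = [((0.0, 1.0, 0.0), 5.0), ((1.0, 0.0, 0.0), 1.0),
       ((0.0, -1.0, 0.0), 2.0)]           # tiny illustrative environment
print(diffuse_convolve((0.0, 1.0, 0.0), env))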

References "Recovering High Dynamic Range Radiance Maps from Photographs", Debevec, Malik, Siggraph 1997 "Real-time High Dynamic Range Texture Mapping", Cohen, Tchou, Hawkins, Debevec, Eurographics Rendering Workshop 2001 Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments, Gene S. Miller and C. Robert Hoffman, Siggraph 1984 Course Notes for Advanced Computer Graphics Animation "Real Pixels", Greg Ward, Graphics Gems II P.80-83 http://www.debevec.org/

Case Study 2: Real-Time Caustics Summary: simulating refractive caustics in real time using OpenGL and vertex programs. Inspired by Jos Stam's work at Alias|Wavefront. What are caustics? The light patterns seen on the bottom of swimming pools, caused by the focusing of reflected or refracted light. Traditionally calculated offline using photon mapping and similar techniques; usually approximated in real time using precalculated textures.

Step 1: Generate Water Surface Drawn as a triangle mesh, displaced using 4 octaves of procedural noise; each octave translates at a speed proportional to its frequency. Calculated on the CPU.

Step 2: Refract Light Ray Using a vertex program: calculate the light ray from the local light source to the surface vertex, calculate the refraction of the ray about the vertex normal, then find the intersection between the refracted ray and the floor plane:

y = Yo + Yd * t = 0  =>  t = -Yo / Yd

Set the vertex position to the intersection. This gives the refracted mesh on the bottom of the pool; a sketch follows.
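
A sketch of this per-vertex computation; refract() is the standard Snell's-law construction for unit vectors, and eta is about 1/1.33 going from air into water:

def normalize(v):
    m = sum(x * x for x in v) ** 0.5
    return tuple(x / m for x in v)

def refract(I, N, eta):
    # Snell's law for unit incident direction I and unit normal N;
    # returns None on total internal reflection.
    cos_i = -sum(i * n for i, n in zip(I, N))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None
    return tuple(eta * i + (eta * cos_i - k ** 0.5) * n for i, n in zip(I, N))

def hit_floor(origin, direction):
    # Intersect origin + t*direction with the floor plane y = 0:
    # Yo + Yd*t = 0  =>  t = -Yo / Yd
    t = -origin[1] / direction[1]
    return tuple(o + d * t for o, d in zip(origin, direction))

ray = refract(normalize((0.3, -1.0, 0.0)), (0.0, 1.0, 0.0), 1.0 / 1.33)
print(hit_floor((0.5, 2.0, 0.0), ray))   # where this vertex's light lands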

Step 3: Simulate Light Focusing Use additive blending, but we want intensity to be inversely proportional to the area of each triangle: assuming the same amount of light hits each triangle on the surface, smaller triangles mean more focused, therefore brighter, light. We could send all three triangle vertices to calculate the area, but that would be slow. Trick: use texture LOD as a measure of projected area.

Step 3: Simulate Light Focusing (cont.) Create a texture with just two mipmap levels: a 2x2 level that is all black and a 1x1 level that is all white. Apply the texture to the refracted mesh, setting texture coordinates so that pixels map roughly to texels. With tri-linear filtering, this produces shades of gray depending on how much the texture is minified (see the sketch below). Unfortunately this is view dependent. Solution: render the caustics from above using an orthographic projection, then copy to a texture.
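
A sketch of why the two-level texture acts as an area meter: trilinear filtering blends the levels by fractional LOD, and the LOD rises with the texel footprint per pixel, i.e. with how much refraction has compressed a triangle. The footprint values below are illustrative:

import math

def caustic_shade(footprint):
    # Trilinear blend between the black 2x2 level (LOD 0) and the white
    # 1x1 level (LOD 1); minification (footprint > 1) pushes toward white.
    lod = max(0.0, math.log2(footprint))
    return min(lod, 1.0)   # 0 = dim (large triangles), 1 = bright (focused)

for f in (1.0, 1.4, 2.0, 4.0):
    print(f, round(caustic_shade(f), 2))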

Step 4: Final Surface Refraction Calculate the refraction of the view vector about the surface normal, intersect the refracted ray with the floor, and calculate texture coordinates for the caustic texture. Also calculate the reflected ray, used to index into an environment cubemap, and attenuate the reflection using the Fresnel approximation. Result: convincing refractive caustics in real time. We can also refract three times with different indices of refraction to simulate refractive dispersion (aka "chromatic aberration").

References Jos Stam's "Periodic Caustic Textures": http://www.dgp.toronto.edu/people/stam/reality/research/periodiccaustics/index.html

Case Study 3: Procedural Terrain Summary: generate procedural terrain using vertex programs, register combiners and 3D textures. Advantages of procedural modeling: small storage requirements, non-repeating, parameterized. Disadvantages of procedural modeling: computation time, harder to control, not really practical on current hardware.

Step 1: Noise in Vertex Program Displace a triangle mesh using procedural noise; the geometry doesn't move, just the displacement. Similar to Perlin noise: uses a permutation table stored in constant memory, generates a repeatable random value via recursive lookups into the table based on vertex position, and interpolates between 4 neighbors to produce a smooth result. Requires 42 vertex program instructions! A 1D sketch follows.
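
A 1D sketch of this lattice noise; the 16-entry table is hypothetical (the hardware version keeps a larger table in constant memory, hashes 3D positions with recursive lookups, and blends 4 neighbors):

PERM = [11, 4, 14, 1, 7, 12, 0, 9, 5, 15, 2, 8, 13, 3, 10, 6]  # hypothetical table

def lattice(i):
    # Repeatable pseudo-random value in [-1, 1] from a table lookup.
    return PERM[i % len(PERM)] / 7.5 - 1.0

def noise(x):
    # Smoothly interpolate between the lattice values bracketing x.
    i, f = int(x), x - int(x)
    f = f * f * (3.0 - 2.0 * f)        # smoothstep fade curve
    return lattice(i) * (1.0 - f) + lattice(i + 1) * f

print([round(noise(x / 4.0), 2) for x in range(8)])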

Step 2: Ridged Multi-fractal Function Ken Musgrave's trick to make noise look more like terrain. Ridges: take the absolute value of signed noise, subtract it from 1, and square the result to produce sharper ridges. Multi-fractal: scale each octave by the previous result, so valleys are smooth while peaks are rough. We only have room for 2 octaves; a sketch follows.
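
A sketch of Musgrave's construction with two octaves, as on the hardware. Any signed noise source works; a cheap sine-based stand-in is used here to keep the sketch self-contained:

import math

def snoise(x):
    # Signed-noise stand-in; the real version uses the lattice noise above.
    return 0.5 * math.sin(12.9898 * x) + 0.5 * math.sin(4.1414 * x)

def ridge(x):
    # 1 - |noise|, squared: the crease of |noise| at each zero crossing
    # becomes a sharp ridge line at the top of the profile.
    r = 1.0 - abs(snoise(x))
    return r * r

def ridged_multifractal(x):
    # Two octaves; the second is scaled by the first, so valleys (where
    # the first octave is small) stay smooth while peaks pick up detail.
    o1 = ridge(x)
    o2 = ridge(x * 2.0) * o1
    return o1 + 0.5 * o2

print(round(ridged_multifractal(1.3), 3))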

Ridged Multi-fractal Function

Step 3: Lighting It is hard to calculate normals in a vertex program, so calculate them in image space using register combiners instead. Render the terrain height field from above with color = height, copy it to a texture, and bind it to three texture units with offset texture coordinates. Calculate an approximate normal in the register combiners as (h(x,y) - h(x+1,y), h(x,y) - h(x,y+1), 1), then calculate diffuse lighting as N.L (sketched below).
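
A sketch of the combiner math on a tiny height field; heights are assumed in [0, 1], L is a unit light direction, and unlike the combiners the sketch normalizes the approximate normal:

def shade(h, x, y, L):
    # Approximate normal from forward differences of the height field:
    # N ~ (h(x,y)-h(x+1,y), h(x,y)-h(x,y+1), 1), then diffuse light as N.L.
    n = (h[y][x] - h[y][x + 1], h[y][x] - h[y + 1][x], 1.0)
    m = sum(c * c for c in n) ** 0.5
    return max(0.0, sum((c / m) * l for c, l in zip(n, L)))

heights = [[0.0, 0.2],
           [0.1, 0.4]]
print(round(shade(heights, 0, 0, (0.0, 0.0, 1.0)), 3))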

Step 4: Texturing Use a 3D texture with a different terrain type in each 2D slice; the R texture coordinate selects the terrain type and is computed in the vertex program based on height. Unfortunately 3D textures are mip-mapped in 3D, so from a distance all layers blend to a single image; duplicating slices helps avoid the blending. Add reflective lakes: render the scene upside down to a texture, then displace it in image space using GL_OFFSET_PROJECTIVE_TEXTURE_2D_NV.

Terrain Textures

The Future Hardware is getting faster, more programmable, and higher precision. Today's off-line rendering effects will be real-time tomorrow. Start thinking about it now!

Questions? E-mail: sgreen@nvidia.com