OpenGL ES 2.0 Lighting. CS421 Advanced Computer Graphics Jay Urbain, Ph.D.

OpenGL ES 2.0 Lighting CS421 Advanced Computer Graphics Jay Urbain, Ph.D. 1

Objectives: Lighting effects concepts. Modeling ambient and diffuse light. Light sources. Using lighting effects in OpenGL ES 2.0. References: Fallout Software: OpenGL Lighting, or How Light Sources Work (long, in-depth tutorial); Clockworkcoders Tutorials: Per Fragment Lighting; Lighthouse3D.com: The Normal Matrix; arcsynthesis.org OpenGL Tutorials: Normal Transformation; OpenGL Programming Guide, Chapter 5: Lighting; OpenGL ES 2.0 Programming Guide; http://www.learnopengles.com 2

Light What we perceive as light is really the aggregation of trillions of tiny particles called photons. Photons fly out of a light source, bounce around thousands or millions of times, and eventually reach our eyes, where we perceive them as light. How can we simulate the effects of light via computer graphics? 3

Ray Tracing in Nature Think of a "ray" as a stream of photons traveling along the same path. A light source emits a ray of light, which eventually intersects a surface that interrupts its progress. At the point of intersection, any combination of three things might happen to this light ray: it may be reflected, absorbed, or refracted. 4

Ray Tracing Mathematically trace actual rays of light in a scene and see where they end up. Provides accurate and realistic results. The downside is that simulating all of those rays is computationally expensive, and usually too slow for real-time rendering. Ray tracing is a global lighting model, which makes it more difficult to implement on GPUs, which inherently perform local operations. It remains an important area of research, and significant progress is being made. 5

Computational Burden Simulating the real-world process of tracing light rays can be considered extremely wasteful! 6

Rasterization Most real-time computer graphics use rasterization instead. It simulates lighting by approximating the result. Given the realism of recent games, rasterization can also look very nice, and it is fast enough for real-time graphics (even on mobile phones). Rasterization is a local lighting model, lending itself to efficient implementation on modern GPUs. OpenGL ES is primarily a rasterization library. 7

Types of Lighting Ambient lighting Diffuse lighting Specular lighting Emissive lighting 8

Ambient lighting Base level of lighting that seems to pervade an entire scene. Does not appear to come from any particular light source since it has bounced around so much before reaching you. Experienced outdoors on an overcast day, or indoors as the cumulative effect of many different light sources. Instead of calculating all of the individual lights, we can just set a base light level for the object or scene. 9

Diffuse lighting Light that reaches your eye after bouncing directly off of an object. The illumination level of the object varies with its angle to the light. Something facing the light head-on is lit more brightly than something facing the light at an angle; this is known as Lambert's cosine law. Also, we perceive the object to be the same brightness no matter which angle we view it from. 10

Specular lighting Unlike diffuse lighting, specular lighting changes as we move relative to the object. This gives shininess to the object and can be seen on smoother surfaces such as glass and other shiny objects. 11

Lighting Component Summary Ambient Light: Light that is scattered without direction, i.e., its direction is impossible to determine. When ambient light hits a surface, it's scattered equally in all directions. Diffuse Light: Light coming from one direction, so it's brighter when it comes down perpendicular to the surface than at an angle. Once it hits the surface, it's scattered equally in all directions. Specular Light: Light coming from one direction that tends to bounce off the surface in a preferred (reflected) direction, e.g., a laser beam. 12

Lighting Components (cont.) Emissive Light: Simulates light coming from the object itself. Adds intensity to the object, but does not introduce additional lighting into the scene. 13

Simulating light Directional lighting Point lighting Spot lighting 14

Directional lighting Directional lighting usually comes from a bright source that is so far away that it lights up the entire scene evenly and to the same brightness. This light source is the simplest type, as the light has the same strength and direction no matter where you are in the scene. 15

Point lighting Point lights can be added to a scene in order to give more varied and realistic lighting. The illumination of a point light falls off with distance. Light rays travel out in all directions, with the point light at the center. 16

Spot lighting In addition to the properties of a point light, spot lights also attenuate the light by direction, usually restricting it to the shape of a cone. 17

Ambient lighting the math Ambient lighting is really indirect diffuse lighting: low-level light which pervades the entire scene. final color = material color * ambient light color Example: the object is red and our ambient light is a dim white. final color = {1, 0, 0} * {0.1, 0.1, 0.1} = {0.1, 0.0, 0.0} The final color of the object will be a dim red, which is what you'd expect if you had a red object illuminated by a dim white light. This is basic ambient lighting, unless you want to get into more advanced techniques such as radiosity (a global lighting algorithm using finite element analysis). 18
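As a concrete illustration (a sketch, not code from the original slides), the per-channel multiply can be written in a few lines of Java; materialColor and ambientLightColor are assumed RGB triples in the 0-1 range:

// Sketch: ambient term computed per color channel.
float[] materialColor     = {1.0f, 0.0f, 0.0f};   // red object
float[] ambientLightColor = {0.1f, 0.1f, 0.1f};   // dim white ambient light
float[] finalColor = new float[3];
for (int i = 0; i < 3; i++) {
    finalColor[i] = materialColor[i] * ambientLightColor[i];
}
// finalColor is now {0.1f, 0.0f, 0.0f}: a dim red, matching the example above.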

Diffuse lighting with point light source For diffuse lighting, we need to add attenuation and a light position. The light position is used to: calculate the angle between the light and the surface, which affects the surface's overall level of lighting; and calculate the distance between the light and the surface, which determines the strength of the light at that point. 19

Diffuse lighting point light source Lambert's cosine law A surface which is facing the light straight-on should be illuminated at full strength. A surface which is slanted should get less illumination. Calculate the angle between the surface and the light, given two vectors: one pointing from a point on the surface toward the light, and the second being the surface normal. Calculate the cosine by first normalizing each vector so that it has a length of one, then calculating the dot product of the two vectors. 20

Diffuse lighting point light source 1) Calculating the Lambert factor (a value between 0 and 1). 1. light vector = light position - object position 2. cosine = dot_product(object_normal, normalize(light_vector)) 3. lambert factor = max(cosine, 0) Normalization is to unit length; assume object_normal is already normalized. The dot product of two normalized vectors gives you the cosine and can have a range of -1 to 1; clamping gives a range of 0 to 1. 21

Diffuse lighting point light source 1) Calculating the Lambert factor Example: Flat plane at the origin, surface normal pointing straight up {0, 1, 0}. Light positioned at {0, 10, -10}, or 10 units up and 10 units straight ahead. We want to calculate the light at the origin. light vector = {0, 10, -10} - {0, 0, 0} = {0, 10, -10} object normal = {0, 1, 0} light vector length = square root(0*0 + 10*10 + -10*-10) = square root(200) = 14.14 normalized light vector = {0, 10/14.14, -10/14.14} = {0, 0.707, -0.707} dot product({0, 1, 0}, {0, 0.707, -0.707}) = (0 * 0) + (1 * 0.707) + (0 * -0.707) = 0 + 0.707 + 0 = 0.707 lambert factor = max(0.707, 0) = 0.707 22
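The same worked example can be checked with a few lines of Java (a sketch with assumed helper arrays, not code from the slides):

// Sketch: Lambert factor for the example above, using plain float arrays.
float[] lightPos   = {0f, 10f, -10f};
float[] surfacePos = {0f,  0f,   0f};
float[] normal     = {0f,  1f,   0f};   // surface normal, already unit length

// light vector = light position - object position
float[] lightVec = {lightPos[0] - surfacePos[0],
                    lightPos[1] - surfacePos[1],
                    lightPos[2] - surfacePos[2]};

// length of the light vector (about 14.14 here)
float len = (float) Math.sqrt(lightVec[0] * lightVec[0]
                            + lightVec[1] * lightVec[1]
                            + lightVec[2] * lightVec[2]);

// cosine = dot(normal, normalized light vector); about 0.707 here
float cosine = (normal[0] * lightVec[0]
              + normal[1] * lightVec[1]
              + normal[2] * lightVec[2]) / len;

// clamp to [0, 1]
float lambertFactor = Math.max(cosine, 0f);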

Diffuse lighting point light source 2) Calculating the attenuation factor Real light attenuation from a point light source follows the inverse square law: luminosity = 1 / (distance * distance) Since we have a distance of 14.14 (from example), our final luminosity is: luminosity = 1 / (14.14*14.14) = 1 / 200 = 0.005 Note: The inverse square law can lead to a strong attenuation over distance. This is how light from a point light source works in the real world, but since our graphics displays have a limited range, it can be useful to dampen this attenuation factor. 23
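In code, the physically correct inverse-square falloff and a dampened variant (the form used in the shader later in these slides) look like this; the 0.25 damping constant is just the value that shader happens to use:

// Sketch: attenuation for the distance of 14.14 from the example.
float distance = 14.14f;
float luminosity       = 1.0f / (distance * distance);                  // ~0.005, inverse square law
float dampedLuminosity = 1.0f / (1.0f + 0.25f * distance * distance);   // ~0.02, gentler falloff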

Diffuse lighting point light source 3) Calculate the final color final color = material color * (light color * lambert factor * luminosity) Example: red material and a full white light source: final color = {1, 0, 0} * ({1, 1, 1} * 0.707 * 0.005) ≈ {0.0035, 0, 0} 24

Diffuse lighting point light source Summary
// Step one
light vector = light position - object position
cosine = dot product(object normal, normalize(light vector))
lambert factor = max(cosine, 0)
// Step two
luminosity = 1 / (distance * distance)
// Step three
final color = material color * (light color * lambert factor * luminosity)
25
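Putting the three steps together, a CPU-side reference version in Java might look like the following; this is a sketch for checking results, not code from the slides, and it assumes the normal passed in is already unit length:

// Sketch: diffuse point-light color for one surface point.
static float[] diffusePointLight(float[] materialColor, float[] lightColor,
                                 float[] lightPos, float[] surfacePos,
                                 float[] unitNormal) {
    // Step one: light vector, distance, and Lambert factor.
    float[] lightVec = new float[3];
    for (int i = 0; i < 3; i++) lightVec[i] = lightPos[i] - surfacePos[i];
    float dist = (float) Math.sqrt(lightVec[0] * lightVec[0]
                                 + lightVec[1] * lightVec[1]
                                 + lightVec[2] * lightVec[2]);
    float cosine = (unitNormal[0] * lightVec[0]
                  + unitNormal[1] * lightVec[1]
                  + unitNormal[2] * lightVec[2]) / dist;
    float lambert = Math.max(cosine, 0f);
    // Step two: inverse-square attenuation.
    float luminosity = 1.0f / (dist * dist);
    // Step three: combine with the material and light colors.
    float[] finalColor = new float[3];
    for (int i = 0; i < 3; i++) {
        finalColor[i] = materialColor[i] * lightColor[i] * lambert * luminosity;
    }
    return finalColor;
}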


Shader
// Transform the vertex into eye space.
+ "   vec3 modelViewVertex = vec3(u_MVMatrix * a_Position);          \n"
// Transform the normal's orientation into eye space.
+ "   vec3 modelViewNormal = vec3(u_MVMatrix * vec4(a_Normal, 0.0)); \n"
// Needed for the attenuation calculation.
+ "   float distance = length(u_LightPos - modelViewVertex);         \n"
// Get a lighting direction vector from the light to the vertex.
+ "   vec3 lightVector = normalize(u_LightPos - modelViewVertex);    \n"
27

Shader (cont.)
// Calculate the dot product of the light vector and vertex normal.
// If the normal and light vectors point in the same direction, the vertex gets maximum illumination.
+ "   float diffuse = max(dot(modelViewNormal, lightVector), 0.1);          \n"
// Attenuate the light based on distance.
+ "   diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));     \n"
// Multiply the color by the illumination level. It will be interpolated across the triangle.
+ "   v_Color = a_Color * diffuse;                                          \n"
// Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
+ "   gl_Position = u_MVPMatrix * a_Position;                               \n"
+ "}                                                                        \n";
28
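The two slides above show only the body of the vertex shader. For context, a complete version with the uniform and attribute declarations the excerpt assumes might look like the following (the names follow the learnopengles.com lesson this material draws on, and should be treated as an assumption rather than part of the original slides):

final String vertexShader =
      "uniform mat4 u_MVPMatrix;      \n"   // model/view/projection matrix
    + "uniform mat4 u_MVMatrix;       \n"   // model/view matrix
    + "uniform vec3 u_LightPos;       \n"   // light position in eye space
    + "attribute vec4 a_Position;     \n"   // per-vertex position
    + "attribute vec4 a_Color;        \n"   // per-vertex color
    + "attribute vec3 a_Normal;       \n"   // per-vertex normal
    + "varying vec4 v_Color;          \n"   // color passed on to the fragment shader
    + "void main()                    \n"
    + "{                              \n"
    + "   vec3 modelViewVertex = vec3(u_MVMatrix * a_Position);              \n"
    + "   vec3 modelViewNormal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));     \n"
    + "   float distance = length(u_LightPos - modelViewVertex);             \n"
    + "   vec3 lightVector = normalize(u_LightPos - modelViewVertex);        \n"
    + "   float diffuse = max(dot(modelViewNormal, lightVector), 0.1);       \n"
    + "   diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));  \n"
    + "   v_Color = a_Color * diffuse;                                       \n"
    + "   gl_Position = u_MVPMatrix * a_Position;                            \n"
    + "}                              \n";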

Pixel Shader pass through
final String fragmentShader =
      "precision mediump float;    \n"   // Set the default precision to medium. We don't need as high of a precision in the fragment shader.
    + "varying vec4 v_Color;       \n"   // This is the color from the vertex shader interpolated across the triangle per fragment.
    + "void main()                 \n"   // The entry point for our fragment shader.
    + "{                           \n"
    + "   gl_FragColor = v_Color;  \n"   // Pass the color directly through the pipeline.
    + "}                           \n";
29

Application
// Use culling to remove back faces.
GLES20.glEnable(GLES20.GL_CULL_FACE);
// Enable depth testing.
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
30
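The application also has to hand the shader its uniforms each frame. A sketch of how that typically looks with android.opengl.GLES20 is shown below; programHandle, mvpMatrix, mvMatrix, and lightPosInEyeSpace are assumed variables, not names from the slides:

// Look up the uniforms declared in the vertex shader.
int mvpMatrixHandle = GLES20.glGetUniformLocation(programHandle, "u_MVPMatrix");
int mvMatrixHandle  = GLES20.glGetUniformLocation(programHandle, "u_MVMatrix");
int lightPosHandle  = GLES20.glGetUniformLocation(programHandle, "u_LightPos");

// Upload the current matrices and the light position (already in eye space).
GLES20.glUniformMatrix4fv(mvpMatrixHandle, 1, false, mvpMatrix, 0);
GLES20.glUniformMatrix4fv(mvMatrixHandle, 1, false, mvMatrix, 0);
GLES20.glUniform3f(lightPosHandle, lightPosInEyeSpace[0],
                   lightPosInEyeSpace[1], lightPosInEyeSpace[2]);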

Per-Vertex versus per-pixel lighting Per-Vertex Lighting For diffuse lighting of objects with smooth surfaces, such as terrain, or for objects with many triangles, this will often be good enough. If your objects don't contain many vertices (such as cubes!) or have sharp corners, vertex lighting can result in artifacts, as the light level is linearly interpolated across the polygon. These artifacts become much more apparent when specular highlights are added to the image. More can be found in the Wikipedia article on Gouraud shading. 31
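For comparison, a per-pixel (per-fragment) variant of the same diffuse calculation is sketched below. It is not code from the slides, but it shows the usual restructuring: the vertex shader passes position, color, and normal through as varyings, and the fragment shader does the lighting math for every pixel. Names mirror the shaders above.

final String perPixelVertexShader =
      "uniform mat4 u_MVPMatrix;                              \n"
    + "uniform mat4 u_MVMatrix;                               \n"
    + "attribute vec4 a_Position;                             \n"
    + "attribute vec4 a_Color;                                \n"
    + "attribute vec3 a_Normal;                               \n"
    + "varying vec3 v_Position;                               \n"
    + "varying vec4 v_Color;                                  \n"
    + "varying vec3 v_Normal;                                 \n"
    + "void main()                                            \n"
    + "{                                                      \n"
    + "   v_Position = vec3(u_MVMatrix * a_Position);         \n"   // eye-space position
    + "   v_Color = a_Color;                                  \n"   // pass the color through
    + "   v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));  \n"   // eye-space normal
    + "   gl_Position = u_MVPMatrix * a_Position;             \n"
    + "}                                                      \n";

final String perPixelFragmentShader =
      "precision mediump float;                                              \n"
    + "uniform vec3 u_LightPos;                                              \n"
    + "varying vec3 v_Position;                                              \n"
    + "varying vec4 v_Color;                                                 \n"
    + "varying vec3 v_Normal;                                                \n"
    + "void main()                                                           \n"
    + "{                                                                     \n"
    + "   float distance = length(u_LightPos - v_Position);                  \n"
    + "   vec3 lightVector = normalize(u_LightPos - v_Position);             \n"
    + "   float diffuse = max(dot(normalize(v_Normal), lightVector), 0.1);   \n"
    + "   diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));  \n"
    + "   gl_FragColor = v_Color * diffuse;                                  \n"
    + "}                                                                     \n";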

Computing Surface Normals In order to display light correctly, you are required to compute a normal vector for each polygon in an object. A normal of a polygon is a vector perpendicular to the polygon's surface. OpenGL does not compute normal vectors for you. 32

Surface Normals All models in your 3D scene will be made out of polygons, so it is convenient to have a function which calculates the normal vector of a polygon. A normal vector of a polygon is the cross product of two vectors located on the surface plane of the polygon. Take any two vectors located on the polygon's plane and calculate their cross product; the result is the normal vector. 33

Calculate Normal Vector
typedef struct vertex_s {
    float x, y, z;
} vertex_t;

void normal(vertex_t v[3], vertex_t *normal)
{
    vertex_t a, b;
    // calculate the vectors A and B from the triangle's vertices (CCW order)
    a.x = v[0].x - v[1].x;
    a.y = v[0].y - v[1].y;
    a.z = v[0].z - v[1].z;
    b.x = v[1].x - v[2].x;
    b.y = v[1].y - v[2].y;
    b.z = v[1].z - v[2].z;
    // calculate the cross product A x B
    normal->x = (a.y * b.z) - (a.z * b.y);
    normal->y = (a.z * b.x) - (a.x * b.z);
    normal->z = (a.x * b.y) - (a.y * b.x);
    // reduce to unit length
    normalize(normal);
}
34

The Need for Normalization To normalize a normal vector means to reduce its length to unit size, i.e., a length of exactly 1. All of the calculated normal vectors are required to have a length of 1 for the lighting calculations to work properly. 35

How to Perform Normalization We need to reduce a given normal vector to a length of 1. First, find the length of the vector: square each of its coordinate components (x, y and z), add the squares together, and take the square root of that sum. The square root is the length of the vector. Then divide each coordinate component of the vector by this length, and you will get a vector which points in the exact same direction but has unit length. 36

Normalize
// normalize() reduces a vertex_t vector to unit length (requires <math.h> for sqrt).
void normalize(vertex_t *v)
{
    // calculate the length of the vector
    float len = (float) sqrt((v->x * v->x) + (v->y * v->y) + (v->z * v->z));
    // avoid division by 0
    if (len == 0.0f)
        len = 1.0f;
    // reduce to unit size
    v->x /= len;
    v->y /= len;
    v->z /= len;
}
37