OpenGL Shading Language Course
Chapter 4: Advanced Shaders
By Jacobo Rodriguez Villar jacobo.rodriguez@typhoonlabs.com
TyphoonLabs GLSL Course


Chapter 4: Advanced Shaders

INDEX

Introduction
Per Pixel Illumination Model
Bump Mapping
Simple Parallax Shader
Mandelbrot Shader (Flow Control)
Advanced Illumination Model

INTRODUCTION

This chapter focuses on advanced illumination algorithms and other miscellaneous items like fractal computing, displacement shaders, and Perlin Noise usage. You should be confident with the material in all previous chapters before reading further.

Per Pixel Illumination Model (point light, ambient, diffuse, specular * gloss)

The theory for this shader was covered in the last two chapters, so here we will only cover the changes our code needs in order to achieve per pixel lighting rather than per vertex lighting. First, fill the property grid values with the same information as in the last shader (the specular * gloss mapping tutorial) and insert two uniforms (the texture sampler and the camera position).

[Vertex shader]

varying vec3 normal;
varying vec3 position;

void main()
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
    normal = gl_Normal.xyz;
    position = gl_Vertex.xyz;
}

It is easy to see that we've removed all code related to the light computations. Only the values needed later by the fragment shader are computed here and passed on.

[Fragment shader]

uniform sampler2D texture;
uniform vec3 CAMERA_POSITION;
varying vec3 normal;
varying vec3 position;

void main()
{
    vec4 specular = vec4(0.0);
    vec4 diffuse;
    vec3 norm = normalize(normal); // Important: after interpolation, the normal's modulus != 1.
    vec3 lightVector = gl_LightSource[0].position.xyz - position;
    float dist = length(lightVector);
    float attenuation = 1.0 / (gl_LightSource[0].constantAttenuation +
                               gl_LightSource[0].linearAttenuation * dist +
                               gl_LightSource[0].quadraticAttenuation * dist * dist);
    lightVector = normalize(lightVector);
    float nxDir = max(0.0, dot(norm, lightVector));
    diffuse = gl_LightSource[0].diffuse * nxDir * attenuation;

    if (nxDir != 0.0)
    {
        vec3 cameraVector = normalize(CAMERA_POSITION - position.xyz);
        vec3 halfVector = normalize(lightVector + cameraVector);
        float nxHalf = max(0.0, dot(norm, halfVector));
        float specularPower = pow(nxHalf, gl_FrontMaterial.shininess);
        specular = gl_LightSource[0].specular * specularPower * attenuation;
    }

    vec4 texColor = texture2D(texture, gl_TexCoord[0].st);
    gl_FragColor = gl_LightSource[0].ambient + (diffuse * vec4(texColor.rgb, 1.0)) + (specular * texColor.a);
}

This fragment shader contains all the computations that were removed from the vertex shader, but there are a few things that are important to mention.

First, in the vertex shader it wasn't necessary to normalize the normals because they arrived already normalized; here, the normals have been interpolated, so their lengths have changed, and normalization is mandatory.

Secondly, we have to pass the interpolated vertex position in order to know the true fragment position, so we can compute the exact vectors from the light or the camera to the fragment.

There are many possible optimizations in this shader. For example, the attenuation factor could be moved back to the vertex shader and passed interpolated.
Moving calculations from the fragment shader to the vertex shader always has the same result: it improves speed (unless you have more vertices than fragments, which is very unusual) and reduces quality.
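The trade-off is easier to reason about with the lighting math in plain code. Below is a minimal CPU-side sketch in Python of the fragment shader's point-light computation (attenuation plus Blinn-Phong specular); the function name and the plain-list vector helpers are our own illustration, not part of the course code:

```python
import math

def point_light(frag_pos, normal, light_pos, camera_pos,
                light_diffuse, light_specular, shininess,
                k_const=1.0, k_linear=0.0, k_quad=0.0):
    """Per-fragment point-light terms, mirroring the fragment shader above.
    Attenuation constants default to OpenGL's defaults (1, 0, 0)."""
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def normalize(a):
        length = math.sqrt(dot(a, a))
        return [x / length for x in a]

    n = normalize(normal)                 # interpolated normals must be re-normalized
    light_vec = sub(light_pos, frag_pos)
    dist = math.sqrt(dot(light_vec, light_vec))
    attenuation = 1.0 / (k_const + k_linear * dist + k_quad * dist * dist)
    l = normalize(light_vec)
    nx_dir = max(0.0, dot(n, l))
    diffuse = [c * nx_dir * attenuation for c in light_diffuse]

    specular = [0.0, 0.0, 0.0]
    if nx_dir != 0.0:                     # only light the front-facing side
        v = normalize(sub(camera_pos, frag_pos))
        half = normalize([a + b for a, b in zip(l, v)])
        spec_power = max(0.0, dot(n, half)) ** shininess
        specular = [c * spec_power * attenuation for c in light_specular]
    return diffuse, specular
```

With the light and the camera directly above a fragment, both terms reach their maxima; moving the light to a grazing angle drives the diffuse term, and with it the specular gate, to zero.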

Let's see the difference between per pixel lighting and per vertex lighting:

[Screenshots: per vertex lighting vs. per pixel lighting]

The difference is obvious, but the best part is that the tessellation of the mesh doesn't matter for per pixel lighting. Still, there is a drawback: the per pixel lighting shader is much more expensive than the per vertex one. Here is the complete shader:

[Screenshot] Notice the nice specular effect, and the dirty surface obscuring some parts.

Here are some screenshots showing how the different light components affect the final render. Settings: Ambient = (51, 51, 51); Diffuse = Specular = (255, 255, 255); Front Material Shininess = 64; Light Position = (0, 0, 2).

Per Vertex Lighting

[Screenshots: Ambient | Diffuse | Ambient + Diffuse | Ambient + Diffuse * texture]

[Screenshots: Specular | Specular * gloss | Diffuse + Specular | Diffuse * texture + Specular]

Per Pixel Lighting

[Screenshots: Ambient | Diffuse | Ambient + Diffuse | Ambient + Diffuse * texture | Specular | Specular * gloss]

[Screenshots: Diffuse + Specular | Diffuse * texture + Specular]

Choose for yourself which lighting model is better.

Bump Mapping

Bump mapping is probably the most famous lighting effect. What is bump mapping? To answer this question, an explanation of shading models is in order. Let's see what happens with each one:

- Flat shading: only the first normal of the triangle is used to compute lighting for the entire triangle.
- Gouraud shading: the light intensity is computed at each vertex and interpolated across the surface.
- Phong shading: normals are interpolated across the surface, and the light is computed at each fragment.
- Bump mapping: like Phong shading, but the normals are stored in a bump map texture and used in place of the interpolated normals.

Until now, we've only implemented Gouraud and Phong shading; now it's time to do some bump mapping. In order to have enough information for our shader to compute bump mapping lighting effects, we need to define a normal map. A normal map is a texture that holds the normals that must be applied to every fragment of a base texture. That is, if we have a base texture, we also need its normal map. The real maths needed to create a normal map are beyond the purposes of this document, so we will explain the process at a high level. A normal map can be built in two ways: from a high poly model (there are plug-ins for the most common DCC applications that can compute the normals of a highly tessellated 3D model and bake them into a texture file), or from a height map of the model (or a height map for a base texture). A height map is

nothing more than a greyscale texture that holds the height of every pixel. For example:

[Screenshots: base texture and its height map]

With some math tricks (i.e., taking derivatives of adjacent texels to get the normal at each texel), a height map can be turned into a normal map.

[Screenshot: normal map]

The normals are scaled and biased (because a normal can have negative components, and a GL_UNSIGNED_BYTE texture can't hold negative values) into the range [0, 255] and then encoded into RGB pixels (RGB ubyte = 127.5 * (normal.xyz + 1.0)). When we read the normal map in a fragment shader (where it arrives automatically mapped to [0, 1], as all textures do), we only need to do this:

vec3 normal = texture2D(myNormalMap, texCoord.st).xyz * 2.0 - 1.0;

and we will have the normal in the range [-1, +1] again, ready to be used. There is one important thing to cover before continuing, which makes this shader rather complex. Those normals are not in the same space as the light vector, so we need to compute the light vector in the vertex shader, translate it to the same space the normals are in, and pass this light vector, interpolated, to the fragment
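This round trip is easy to verify in a few lines of Python (a sketch: `encode_normal` plays the role of the normal-map baker, `decode_normal` of the shader line above; both names are ours):

```python
def encode_normal(n):
    """Scale and bias a unit normal from [-1, 1] into RGB ubytes: 127.5 * (n + 1)."""
    return tuple(int(round(127.5 * (c + 1.0))) for c in n)

def decode_normal(rgb):
    """What the fragment shader does: the texel arrives in [0, 1], then * 2.0 - 1.0."""
    return tuple((c / 255.0) * 2.0 - 1.0 for c in rgb)
```

A flat "up" normal (0, 0, 1) encodes to the light-blue texel (128, 128, 255) that dominates most normal maps, and decodes back to within one quantization step of the original.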

shader. But what is this new space, and how can we translate our light vector into it? It is called tangent space, and it is defined for each face of the mesh. We need this space because it allows us to keep the normals unchanged. For example, if we stored the normals in object space, then whenever we rotated the model we would have to rotate the normals too, to maintain coherency. With the normals relative to each face, this is not needed: those normals are independent of the model's orientation and position.

In order to build this tangent space, we need to define an orthonormal per-vertex basis. To build this basis we need three vectors. We already have one of them: the vertex normal. The second vector has to be tangent to the surface at the vertex (that's why this space is called tangent space); this is the tangent vector. The last one can be obtained with the cross product of the normal and the tangent vector, and is called the binormal vector. The computation of the tangent vector is quite involved and won't be explained here; Shader Designer provides both the tangent and binormal attributes automatically, as well as normals and texture coordinates.

To convert our incoming light vector to tangent space, we build a matrix that holds the three vectors and multiply the vector by it:

           | Tx Ty Tz |
mat3 TBN = | Bx By Bz |
           | Nx Ny Nz |

With S = the surface-local (tangent space) vector and O = the object space vector:

                              | Tx Ty Tz |
(Sx, Sy, Sz) = (Ox, Oy, Oz) * | Bx By Bz |
                              | Nx Ny Nz |

After this multiplication, our object space vector is expressed in tangent space. Now we must set up our variables and textures. First, load two textures with the textures dialog. Place in texture unit 0 the base texture (for example, mur_ambiant.bmp) and place in texture unit 1 the normal map
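Multiplying a row vector by this matrix simply projects the vector onto the three basis vectors. A small Python sketch (the helper name is ours) of what `lightVec *= TBNMatrix` computes:

```python
def to_tangent_space(v, tangent, binormal, normal):
    """Row vector times the TBN matrix: each tangent-space component is the
    dot product of v with one basis vector."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    return (dot(v, tangent), dot(v, binormal), dot(v, normal))
```

With the identity basis the vector is unchanged; with a rotated basis, the same object-space vector is re-expressed relative to the face.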

(mur_normalmap.bmp). Create two uniforms for these two textures, as well as the CAMERA_POSITION predefined uniform. In order to get a noticeable (but not very realistic) specular effect, type 64 into the front material shininess. Next, put the light at (0, 0, 2) and set the diffuse contribution to (255, 255, 128) (slightly yellow). Those values are not very realistic, but they exaggerate the bump effect for teaching purposes. Because Shader Designer automatically computes the tangent and binormal attributes, there are no more steps. Now, let's code our vertex shader for the bump mapping effect.

[Vertex shader]

attribute vec3 tangent;
attribute vec3 binormal;
varying vec3 position;
varying vec3 lightVec;

void main()
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
    position = gl_Vertex.xyz;
    mat3 TBNMatrix = mat3(tangent, binormal, gl_Normal);
    lightVec = gl_LightSource[0].position.xyz - position;
    lightVec *= TBNMatrix;
}

Here, the data is prepared to feed the fragment shader. First we compute the usual variables (vertex position and texture coordinates), then build the TBN matrix from the tangent and binormal attributes. Finally, we compute the light vector in tangent space.

[Fragment shader]

uniform sampler2D base;
uniform sampler2D normalMap;
uniform vec3 CAMERA_POSITION;
varying vec3 position;
varying vec3 lightVec;

void main()
{
    vec3 norm = texture2D(normalMap, gl_TexCoord[0].st).rgb * 2.0 - 1.0;
    vec3 baseColor = texture2D(base, gl_TexCoord[0].st).rgb;
    float dist = length(lightVec);
    vec3 lightVector = normalize(lightVec);
    float nxDir = max(0.0, dot(norm, lightVector));
    vec4 diffuse = gl_LightSource[0].diffuse * nxDir;

    float specularPower = 0.0;
    if (nxDir != 0.0)
    {
        vec3 cameraVector = normalize(CAMERA_POSITION - position.xyz);
        vec3 halfVector = normalize(lightVector + cameraVector);
        float nxHalf = max(0.0, dot(norm, halfVector));
        specularPower = pow(nxHalf, gl_FrontMaterial.shininess);
    }
    vec4 specular = gl_LightSource[0].specular * specularPower;

    gl_FragColor = gl_LightSource[0].ambient + (diffuse * vec4(baseColor.rgb, 1.0)) + specular;
}

The first thing to notice is that the attenuation computation has been removed, because we want this shader to be as simple as possible. The rest is fairly straightforward: we read the base texel and the normal for that texel, scaling and biasing the normal into the range [-1, 1] (* 2.0 - 1.0). The only difference between this shader and the last one (per pixel lighting) is the way we obtain the light vector and the normal: there, the light vector was in object space and the normal was interpolated; here, the light vector is in tangent space and the normal is fetched from a texture. Everything else remains the same.

Here are some screenshots of the bump map shader, though the best way to appreciate this effect is with moving lights. Use the rotate light feature to see it, and play with the diffuse and specular components of gl_FragColor to see how each affects the result individually (for example, gl_FragColor = diffuse * vec4(baseColor, 1.0);).

[Screenshots] Notice the feeling of volume. Looks like plastic? Reduce the specular component and add a gloss map.

Simple Parallax Mapping Shader

We've seen two advanced lighting shaders that can add a great deal of complexity and realism to our scenes, but when the camera gets close to the surface, it becomes evident that the surface is flat. Neither bump mapping nor per pixel lighting can fix that. Here we will learn an amazingly simple and cheap shader that increases the realism of our scenes by giving volume to our surfaces. This shader is known by many names: parallax shader, offset mapping, and texel displacement shader, to name a few. We will call it the parallax shader.

When we sample a texture over a primitive, we take the texture coordinates and map the texture in a one-to-one fashion.

[Diagram: texture mapped onto a triangle; for every texel the rays are parallel, a one-to-one mapping.]

Now, if we had information about the height of every texel, we could use the camera vector to displace higher texels and create a sort of perspective effect. Let's learn how to achieve this. First, we need two things: the base texture and its height map.

[Diagram: an eye ray hits the polygon at point P. Sampling the actual texel To (point A) is incorrect; the displaced texel Tn (point B, offset along the eye vector) is correct with respect to the real surface represented by the height map.]

To achieve the desired effect we have to sample texel B, not texel A, so let's see how to compute this offset in a cheap way. To compute a texture coordinate offset at a point P, the eye vector must first be normalized to produce a normalized eye vector V. The height h at the original texture coordinate To is read from the height map. The height is then scaled by a scale factor s and biased by a bias b, in order to map the height from the range [0.0, 1.0] to a range that better represents the physical properties of the surface being simulated. The new scaled and biased height h_sb is given by:

h_sb = h * s + b

An offset is then computed by tracing a vector parallel to the polygon from the point on the surface directly above P towards the eye vector. This new vector is the offset, and it can be added to To to produce the new texture coordinate Tn:

Tn = To + h_sb * V.xy

The new texture coordinate is used to index a regular texture map and any other maps applied to the surface. This may seem complex, but if you think about it carefully, you will see that it is actually fairly simple. One final consideration: the computations in the fragment shader must be done in tangent space (the eye vector has to be in the same space as the height).

[Vertex shader]

attribute vec3 tangent;
attribute vec3 binormal;
uniform vec3 CAMERA_POSITION;
varying vec3 eyeVec;

void main()
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
    mat3 TBNMatrix = mat3(tangent, binormal, gl_Normal);
    eyeVec = CAMERA_POSITION - gl_Vertex.xyz;
    eyeVec *= TBNMatrix;
}

This vertex shader looks very similar to the bump mapping vertex shader, but here we convert the camera vector, instead of the light vector, to tangent space.
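The offset formula can be sketched in a few lines of Python (our own helper; it follows the fragment shader's `height * scale - bias` form, with the chapter's suggested scale = 0.04 and bias = 0.02 as defaults):

```python
import math

def parallax_offset(texcoord, height, eye_tangent, scale=0.04, bias=0.02):
    """Displace a texture coordinate along the tangent-space eye vector by the
    scaled-and-biased height: Tn = To + h_sb * V.xy."""
    length = math.sqrt(sum(c * c for c in eye_tangent))
    eye = [c / length for c in eye_tangent]       # normalized eye vector V
    h_sb = height * scale - bias                  # scale and bias the height
    return (texcoord[0] + eye[0] * h_sb, texcoord[1] + eye[1] * h_sb)
```

With these defaults, the displacement is zero at the mid-level height 0.5; higher texels shift the lookup towards the eye, lower texels away from it.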

[Fragment shader]

uniform sampler2D baseTex;
uniform sampler2D heightMap;
uniform vec2 scaleBias;
varying vec3 eyeVec;

void main()
{
    float height = texture2D(heightMap, gl_TexCoord[0].st).r; // Our height map has only one color channel.
    float v = height * scaleBias.r - scaleBias.g;
    vec3 eye = normalize(eyeVec);
    vec2 newCoords = gl_TexCoord[0].st + (eye.xy * v);
    vec3 rgb = texture2D(baseTex, newCoords).rgb;
    gl_FragColor = vec4(rgb, 1.0);
}

First, we read the height from the height map texture and perform the scale and bias; then we compute the new texture coordinates and use them to sample the base texture. This is a very pretty shader, and very cheap, but it doesn't include lighting calculations. How can we mix this parallax shader with a bump mapping shader to get a stunning effect? That is easy too: use newCoords to access the bump map and proceed with the usual lighting calculations. These newCoords could be used to access any other map as well, like a gloss map.

One thing we haven't mentioned yet: the scale and bias factors. They should be tuned manually for the best final appearance. Two good values to start with are scale = 0.04 and bias = 0.02. Place a uniform for these two values and play with the slider widget until you find values that look right.

As with the bump mapping shader, this looks better when there is some kind of motion (camera and/or mesh motion); a static screenshot can't convey the effect entirely.

Mandelbrot Fractal Shader (Flow Control)

Until now, we haven't used two of the most important tools in any language: flow control and loops. To illustrate them, we will write a shader that draws the Mandelbrot set. The Mandelbrot set is defined by a recursive function of complex numbers; we will use the ST plane (the one defined by the texture coordinates) to represent those complex numbers. The function itself is:

Z0 = 0 + 0i
Zn+1 = Zn^2 + C

A point C is outside the Mandelbrot set if |Z| ever reaches 2.0 during the iteration (to avoid the square root, we will test Z*Z >= 4.0 instead). If at any moment Z*Z becomes greater than or equal to 4.0, we end the loop and shade the point with a linear interpolation (a gradient) of two colors (uniform variables), based on the number of iterations completed. If we reach the iteration limit (another uniform), the point is considered to be inside the Mandelbrot set and is drawn in black.

[Vertex shader]

varying vec3 position;

void main()
{
    position = vec3(gl_MultiTexCoord0 - 0.5) * 5.0;
    gl_Position = ftransform();
}

We've chosen the texture coordinates to parametrize our surface. By default, the texture coordinates are in the range [0, 1], so we remap them to [-2.5, +2.5].

[Fragment shader]

varying vec3 position;
uniform int maxIterations;
uniform vec2 center;
uniform vec3 outerColor1;
uniform vec3 outerColor2;
uniform float zoom;

void main()
{
    float real = position.x * (1.0 / zoom) + center.x;
    float imag = position.y * (1.0 / zoom) + center.y;
    float cReal = real;
    float cImag = imag;

    float r2 = 0.0;
    int iter;
    for (iter = 0; iter < maxIterations && r2 < 4.0; ++iter)
    {
        float tempReal = real;
        real = (tempReal * tempReal) - (imag * imag) + cReal;
        imag = 2.0 * tempReal * imag + cImag;
        r2 = (real * real) + (imag * imag);
    }

    // Base the color on the number of iterations.
    vec3 color;
    if (r2 < 4.0)
        color = vec3(0.0);
    else
        color = mix(outerColor1, outerColor2, fract(float(iter) * 0.05));

    gl_FragColor = vec4(clamp(color, 0.0, 1.0), 1.0);
}

Uniform variables table:

Uniform name     Initial value    Slider min value    Slider max value
maxIterations    50               1                   128
center           (0, 0)           -2.5                +2.5
outerColor1      (1, 0, 0)        Color widget        Color widget
outerColor2      (0, 1, 0)        Color widget        Color widget
zoom             1                1                   Choose one

The only things that need a mention here are the center and zoom uniforms: they let you pan and zoom around the fractal with the sliders. They are simply another scale and bias.
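The escape-time loop is identical on the CPU. A Python sketch (our own function; like the shader, it seeds Z with the point C itself, which is equivalent to starting from Z0 = 0 and skipping the first trivial iteration):

```python
def escape_iterations(c_real, c_imag, max_iterations=50):
    """Iterate Z = Z^2 + C until |Z|^2 >= 4.0 or the limit is hit.
    Returns the iteration count used for the color gradient, or None when the
    point never escaped and would be drawn black (likely inside the set)."""
    real, imag = c_real, c_imag
    r2 = 0.0
    iters = 0
    while iters < max_iterations and r2 < 4.0:
        # tuple assignment updates both parts from the old values, like tempReal
        real, imag = (real * real) - (imag * imag) + c_real, 2.0 * real * imag + c_imag
        r2 = (real * real) + (imag * imag)
        iters += 1
    return None if r2 < 4.0 else iters
```

The origin never escapes (it belongs to the set), while points outside radius 2 escape on the first iteration.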

Here is a screenshot of one section of the Mandelbrot set computed in real time; it has been translated and scaled.

An Advanced Illumination Model

In this shader, we will mix everything we've learned so far about illumination techniques: we are going to write a shader that combines bump mapping, parallax mapping, and gloss mapping. Sound difficult? Not if you understood the previous shaders. We need at least four maps to compute this effect: the base color, the normal map, the height map, and the gloss map. With a smart arrangement, we can store the base color in the RGB channels of one texture file and the gloss map in its alpha channel; in the other texture, we can store the normals in the RGB channels and the height map in the alpha channel (in the textures folder there are two such files: parallaxgloss.tga for the base + gloss data, and normalheight.tga for the normals + height). Use these lighting values: front material shininess = 64, light position at (0, 0, 2), and the uniforms of the previous shaders (CAMERA_POSITION, the samplers, and scaleBias). We are now ready to begin the shader.

[Vertex shader]

attribute vec3 tangent;
attribute vec3 binormal;
uniform vec3 CAMERA_POSITION;
varying vec3 eyeVec;
varying vec3 lightVec;

void main()
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;

    //----------------- Compute TBN matrix ------------------
    mat3 TBNMatrix = mat3(tangent, binormal, gl_Normal);

    //----------------- Compute eye vector ------------------
    eyeVec = CAMERA_POSITION - gl_Vertex.xyz;
    eyeVec *= TBNMatrix;

    //----------------- Compute light vector ----------------
    lightVec = gl_LightSource[0].position.xyz - gl_Vertex.xyz;
    lightVec *= TBNMatrix;
}

As you can see, we've only mixed and rearranged the parallax and bump mapping shaders.

[Fragment shader]

uniform sampler2D baseAndGloss;
uniform sampler2D normalAndHeight;
uniform vec2 scaleBias;
varying vec3 eyeVec;
varying vec3 lightVec;

void main()
{
    //------------- Compute displaced texture coordinates -------------
    float height = texture2D(normalAndHeight, gl_TexCoord[0].st).a;
    float v = height * scaleBias.r - scaleBias.g;
    vec3 eye = normalize(eyeVec);
    vec2 newCoords = gl_TexCoord[0].st + (v * eye.xy);
    //------------- End of displaced texture coordinates --------------

    vec3 norm = texture2D(normalAndHeight, newCoords).rgb * 2.0 - 1.0;
    vec4 baseColor = texture2D(baseAndGloss, newCoords);
    float dist = length(lightVec);
    vec3 lightVector = normalize(lightVec);
    float nxDir = max(0.0, dot(norm, lightVector));
    vec4 diffuse = gl_LightSource[0].diffuse * nxDir;

    float specularPower = 0.0;
    if (nxDir != 0.0)
    {
        vec3 cameraVector = eye;
        vec3 halfVector = normalize(lightVector + cameraVector);
        float nxHalf = max(0.0, dot(norm, halfVector));
        specularPower = pow(nxHalf, gl_FrontMaterial.shininess);
    }
    vec4 specular = gl_LightSource[0].specular * specularPower;

    gl_FragColor = (diffuse * vec4(baseColor.rgb, 1.0)) + vec4(specular.rgb * baseColor.a, 1.0);
}

That's all there is to it. We've computed the displaced texture coordinates and used them for the remaining texture fetches. We've also used the interpolated tangent space camera vector (normalized into eye) instead of the CAMERA_POSITION-based vector, and at the end we applied the gloss map stored in the alpha channel of the base texture.
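Putting the pieces together on the CPU makes the data flow of this combined shader explicit. This Python sketch (all names are ours; the two sample_* callbacks stand in for the RGBA texture fetches) runs one "fragment" through parallax displacement, normal decoding, and glossed Blinn-Phong lighting:

```python
import math

def shade_fragment(texcoord, eye_t, light_t,
                   sample_base_gloss, sample_normal_height,
                   light_diffuse, light_specular,
                   shininess=64.0, scale=0.04, bias=0.02):
    """Combined parallax + bump + gloss shading for a single fragment.
    eye_t and light_t are the tangent-space eye and light vectors."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def normalize(a):
        length = math.sqrt(dot(a, a))
        return [x / length for x in a]

    # 1. Parallax: displace the texture coordinates along the eye vector.
    eye = normalize(eye_t)
    h = sample_normal_height(texcoord)[3]            # height lives in alpha
    v = h * scale - bias
    new_tc = (texcoord[0] + eye[0] * v, texcoord[1] + eye[1] * v)

    # 2. Bump: decode the tangent-space normal from the displaced texel.
    nh = sample_normal_height(new_tc)
    n = normalize([c * 2.0 - 1.0 for c in nh[:3]])

    # 3. Lighting: diffuse plus specular, the latter masked by the gloss (alpha).
    base = sample_base_gloss(new_tc)                 # rgb = color, a = gloss
    l = normalize(light_t)
    nx_dir = max(0.0, dot(n, l))
    diffuse = [d * nx_dir * b for d, b in zip(light_diffuse, base[:3])]
    specular = [0.0, 0.0, 0.0]
    if nx_dir != 0.0:
        half = normalize([a + b for a, b in zip(l, eye)])
        spec_power = max(0.0, dot(n, half)) ** shininess
        specular = [s * spec_power * base[3] for s in light_specular]
    return [d + s for d, s in zip(diffuse, specular)]
```

Feeding it a flat normal-map texel (0.5, 0.5, 1.0) at the mid-level height reproduces plain Blinn-Phong; setting the gloss alpha to zero removes only the specular term.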

Here are two screenshots of the rendered shader: one with the gloss map, and one without it. Notice that the parallax effect is best seen in motion; in static screenshots it is much less noticeable.

With the gloss map:
gl_FragColor = (diffuse * vec4(baseColor.rgb, 1.0)) + vec4(specular.rgb * baseColor.a, 1.0);

Without the gloss map:
gl_FragColor = (diffuse * vec4(baseColor.rgb, 1.0)) + vec4(specular.rgb, 1.0);