Comp 410/510 Computer Graphics Spring 2017 Overview - Graphics Pipeline

Recall: Basic Graphics System
Graphical objects, input devices, output device. The image is formed in the frame buffer via the process of rasterization.

Image Formation
In computer graphics we form images, which are generally two-dimensional, using a process analogous to how images are formed by physical imaging systems: cameras, microscopes, telescopes, the human visual system. (Figure: viewing volume of the camera model.)

Elements of Image Formation
- Objects
- Viewer
- Light source(s)
- Attributes (that govern how light interacts with the materials in the scene)

Light
Light is the part of the electromagnetic spectrum that causes a reaction in our visual system, generally wavelengths in the range of about 350-750 nm (nanometers). Long wavelengths appear as reds and short wavelengths as blues.

Human Eye as a Spherical Camera
- ~100M sensors in the retina
- Rods sense only intensity; 3 types of cones sense color
- The fovea has tightly packed sensors, more cones; the periphery has more rods
- Focal length is about 20 mm
- The pupil/iris controls light entry

Luminance and Color Images
Luminance image: monochromatic; values are gray levels; analogous to working with black-and-white film or television.
Color image: has the perceptual attributes of hue, saturation, and brightness. Do we have to consider every frequency in the visible spectrum? Not necessarily: we can use three primaries (red, green, and blue) to approximate any color we can perceive.

Color Formation (or Synthesis)
Form a color by adding amounts of three primaries, as in CRTs, projection systems, and positive film. The primaries are red (R), green (G), and blue (B).

Color Systems
R, G, B values are normalized to the (0, 1) interval. In the RGB color cube, humans perceive gray for triples on the diagonal; pure colors lie at the corners.

HSI (or HSV) Color System
Separates out intensity I from the color coding; the other two values (H and S) encode chromaticity. Convenient for designing colors; used in computer graphics and vision algorithms.
- Hue H refers to the perceived color ("purple")
- Saturation S models the purity of the color, that is, its dilution by white light ("light purple")
- I = (R + G + B)/3: conversion to gray level. Computation of H and S is a bit more complicated.
(Figure: starting from pure red, more intensity makes the color brighter and less makes it darker; hue remains constant while the color becomes less and less pure, i.e., less saturated.)
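The slide leaves the H and S formulas unstated; the sketch below shows one common RGB-to-HSI conversion. This particular formula set is an assumption for illustration, not taken from the course notes.

    #include <algorithm>
    #include <cmath>

    struct HSI { double h, s, i; };        // h in degrees [0, 360), s and i in [0, 1]

    // One common RGB -> HSI conversion (r, g, b assumed already normalized to [0, 1]).
    HSI rgbToHsi(double r, double g, double b) {
        const double PI = 3.14159265358979323846;
        double i = (r + g + b) / 3.0;                          // intensity, as on the slide
        double s = (i > 0.0) ? 1.0 - std::min({r, g, b}) / i   // saturation: dilution by white
                             : 0.0;
        double num = 0.5 * ((r - g) + (r - b));
        double den = std::sqrt((r - g) * (r - g) + (r - b) * (g - b));
        double h = 0.0;
        if (den > 0.0) {
            double cosTheta = std::max(-1.0, std::min(1.0, num / den));  // guard against rounding error
            h = std::acos(cosTheta) * 180.0 / PI;
            if (b > g) h = 360.0 - h;                          // lower half of the hue circle
        }
        return {h, s, i};
    }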

Synthetic Camera Model
(Figure: a projector from a point p passes through the center of projection and meets the image plane at the projection of p; two equivalent alternative arrangements of the image plane are shown.)

Pinhole Camera
Use trigonometry to find the projection of a point at (x, y, z):
x_p = -x/(z/d), y_p = -y/(z/d), z_p = d
These are the equations of simple perspective projection.
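A minimal sketch of these equations in C++ (the struct and function names are only illustrative):

    struct Point3 { double x, y, z; };

    // Simple perspective projection onto the image plane z = d, with the center of
    // projection at the origin (assumes p.z != 0).
    Point3 projectPinhole(const Point3& p, double d) {
        return { -p.x / (p.z / d),    // x_p = -x / (z/d)
                 -p.y / (p.z / d),    // y_p = -y / (z/d)
                 d };                 // z_p = d: all projected points lie on the image plane
    }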

Application Programming Interface (API)
The separation of objects, viewer, and light sources leads to a simple software API: specify objects, lights, camera, and attributes, and let the implementation determine the image. It also leads to fast hardware implementations. But how is the API implemented?

How to Model Illumination?
- Some objects are blocked from light
- Light can reflect from object to object
- Some objects may be translucent

Local Illumination
Computes the color or shade independently for each object and for each light source. Does not take into account interactions between objects. Not very realistic, but fast.
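As an illustration (the names and data layout are made up, not from the course notes), a sketch of a local shading loop where each light contributes independently and no other objects in the scene are consulted:

    #include <algorithm>
    #include <vector>

    struct Color { double r, g, b; };
    struct Light { Color color; double nDotL; };   // nDotL: cosine between surface normal and light direction

    // Local illumination: every light source contributes independently, and no other
    // objects are consulted (so no shadows, no inter-object reflection).
    Color shadeLocal(const Color& material, const std::vector<Light>& lights) {
        Color out{0.0, 0.0, 0.0};
        for (const Light& l : lights) {
            double d = std::max(0.0, l.nDotL);     // clamp: no light from behind the surface
            out.r += material.r * l.color.r * d;
            out.g += material.g * l.color.g * d;
            out.b += material.b * l.color.b * d;
        }
        return out;
    }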

Global Illumination - Ray Tracing
For each image pixel, follow rays of light from the center of projection until they are absorbed by objects, go off to infinity, or reach a light source. Can handle global effects (though not exactly): multiple reflections, translucent and reflecting objects, shadows. Slower, and needs the whole scene data at once.
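A minimal, self-contained sketch of the per-pixel loop for a single sphere with simple diffuse shading. This is only an illustration of the idea of casting a ray per pixel, not the course's renderer: it omits the recursive reflection, refraction, and shadow rays that make full ray tracing global.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Vec3 { double x, y, z; };
    Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec3 operator*(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec3 normalize(Vec3 v) { return v * (1.0 / std::sqrt(dot(v, v))); }

    // Closest intersection of the ray origin + t*dir with a sphere; false if the ray misses.
    bool hitSphere(Vec3 origin, Vec3 dir, Vec3 center, double radius, double& t) {
        Vec3 oc = origin - center;
        double a = dot(dir, dir);
        double b = 2.0 * dot(oc, dir);
        double c = dot(oc, oc) - radius * radius;
        double disc = b * b - 4.0 * a * c;
        if (disc < 0.0) return false;
        t = (-b - std::sqrt(disc)) / (2.0 * a);
        return t > 0.0;
    }

    int main() {
        const int W = 256, H = 256;
        std::vector<double> image(W * H, 0.0);       // gray-level frame buffer
        Vec3 eye{0, 0, 0};                           // center of projection
        Vec3 sphereCenter{0, 0, -3};
        double radius = 1.0;
        Vec3 toLight = normalize({1, 1, 1});
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                // Ray from the center of projection through the pixel on the image plane z = -1.
                Vec3 dir = normalize({(x + 0.5) / W - 0.5, (y + 0.5) / H - 0.5, -1.0});
                double t;
                if (hitSphere(eye, dir, sphereCenter, radius, t)) {
                    Vec3 hit = {eye.x + dir.x * t, eye.y + dir.y * t, eye.z + dir.z * t};
                    Vec3 n = normalize(hit - sphereCenter);
                    image[y * W + x] = std::max(0.0, dot(n, toLight));   // simple diffuse shade
                }                                                        // else: ray goes off to infinity
            }
        }
        return 0;                                    // image[] now holds the rendered picture
    }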

Global (Ray Tracing) vs Local Illumination

Ray Tracing Example

Another Approach for Global Illumination: Radiosity
Simulates the propagation of light starting at the light sources, storing illumination values on the surfaces of the objects as the light is propagated. Computes the interactions between lights and objects more accurately. Assumes that light striking a surface is reflected in all directions. The radiosity calculation requires the solution of a large set of equations involving all the surfaces. Models diffuse surfaces well, but not specular surfaces. Very slow.
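For reference, the "large set of equations" has a standard form that is not spelled out on the slide: for each surface patch i,

    B_i = E_i + rho_i * sum_j ( F_ij * B_j )

where B_i is the radiosity of patch i, E_i its emitted light, rho_i its diffuse reflectivity, and F_ij the form factor, i.e., the fraction of light leaving patch i that arrives at patch j. Because every B_i depends on every B_j, all surfaces are coupled, which is why the system is large and slow to solve.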

Radiosity vs Ray tracing Which one is which?

Why not always use global illumination?
It is more physically based, so it could be used to design a graphics system, and it is feasible for simple objects such as polygons and quadrics with simple point light sources. But it is slow and not well suited for interactive applications; it is good for creating realistic movies. Ray tracing with the latest GPUs is now close to real time.

Pipeline Architecture
The practical approach implemented by most APIs, such as OpenGL, Java3D, and DirectX. Considers only local illumination. Objects are processed one at a time in the order they are generated by the application:
application program -> rendering -> display
All steps can be implemented in hardware on the graphics card.

Following the Pipeline: Vertex Processing
Converts object representations from the world coordinate system to camera coordinates and then to screen coordinates. Every change of coordinates is equivalent to a matrix transformation; matrix transformations are also used to transform objects, e.g., to rotate, translate, and scale them. The vertex processor also computes vertex colors.
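As a minimal illustration (not a particular API; the types and function names are made up), a vertex passing through a chain of 4x4 homogeneous matrix transformations:

    #include <array>

    using Mat4 = std::array<std::array<double, 4>, 4>;
    using Vec4 = std::array<double, 4>;               // homogeneous vertex (x, y, z, w)

    // Multiply a 4x4 matrix by a homogeneous vertex.
    Vec4 transform(const Mat4& m, const Vec4& v) {
        Vec4 out{0.0, 0.0, 0.0, 0.0};
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                out[i] += m[i][j] * v[j];
        return out;
    }

    // Vertex processing as a chain of coordinate changes, each a matrix transformation:
    // world coordinates -> camera coordinates (view) -> projected coordinates (projection).
    Vec4 processVertex(const Mat4& view, const Mat4& projection, const Vec4& worldPosition) {
        return transform(projection, transform(view, worldPosition));
    }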

Projection
Must carry out the process that combines the 3D viewer with the 3D objects to produce the 2D image.
- Perspective projection: all projectors meet at the center of projection.
- Parallel projection: projectors are parallel; the center of projection is replaced by a direction of projection.

Primitive Assembly
Vertices must be collected into geometric primitives, such as line segments, polygons, and curves and surfaces, before clipping and rasterization can take place.

Clipping
Just as a real camera cannot see the whole world, the virtual camera can only see part of the world space, the viewing volume. Objects (primitives) that are not within this volume are clipped out of the scene.

Rasterization
If an object is visible in the image, the corresponding pixels in the frame buffer must be assigned colors. The rasterizer produces a set of fragments for each object. Fragments are potential pixels: they have a location in the frame buffer plus color and depth attributes. Vertex attributes are interpolated over objects by the rasterizer.
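One way the rasterizer can interpolate vertex attributes across a triangle is with barycentric weights; the sketch below assumes that scheme (the slide does not prescribe a particular one):

    struct Vec2 { double x, y; };

    // Interpolate a per-vertex attribute (e.g., a color channel or depth) at pixel
    // position p inside the triangle (a, b, c) using barycentric weights.
    double interpolateAttribute(Vec2 p, Vec2 a, Vec2 b, Vec2 c,
                                double attrA, double attrB, double attrC) {
        double area = (b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y);          // 2x signed area of (a,b,c)
        double wA = ((b.x - p.x) * (c.y - p.y) - (c.x - p.x) * (b.y - p.y)) / area;   // weight of vertex a
        double wB = ((c.x - p.x) * (a.y - p.y) - (a.x - p.x) * (c.y - p.y)) / area;   // weight of vertex b
        double wC = 1.0 - wA - wB;                                                    // weights sum to 1
        return wA * attrA + wB * attrB + wC * attrC;
    }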

Fragment Processing
Fragments are processed to determine the color of the corresponding pixel in the frame buffer. Colors can be determined by texture mapping or by interpolation of vertex colors. Fragments may be blocked by other fragments closer to the camera (hidden-surface removal).
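A sketch of how the "blocked by closer fragments" test is commonly done per fragment with a depth (z-) buffer; the data structures here are hypothetical, not a specific API:

    #include <limits>
    #include <vector>

    struct Color { double r, g, b; };

    struct FrameBuffer {
        int width, height;
        std::vector<Color>  colors;
        std::vector<double> depths;                // one depth per pixel, initialized to "infinitely far"
        FrameBuffer(int w, int h)
            : width(w), height(h), colors(w * h, Color{0, 0, 0}),
              depths(w * h, std::numeric_limits<double>::infinity()) {}
    };

    // Depth test: the fragment survives only if it is closer to the camera than
    // whatever fragment already owns that pixel (hidden-surface removal).
    void processFragment(FrameBuffer& fb, int x, int y, double depth, Color color) {
        int idx = y * fb.width + x;
        if (depth < fb.depths[idx]) {
            fb.depths[idx] = depth;
            fb.colors[idx] = color;
        }
    }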

The Programming Interface
The programmer sees the graphics system through a software interface: the Application Programming Interface (API).

API Contents
Functions that specify what we need to form an image:
- Objects
- Viewer
- Light source(s)
- Materials
Other information: input from devices such as mouse and keyboard.

Object Specification
Most APIs support a limited set of primitives, including:
- Points (0D objects)
- Line segments (1D objects)
- Polygons (2D objects)
- Some curves and surfaces: quadrics, parametric polynomials
All are defined through locations in space, or vertices.

Example (old style)

    glBegin(GL_POLYGON);          // type of object; alternatives: GL_POINTS, GL_LINE_STRIP
    glVertex3f(0.0, 0.0, 0.0);    // location of a vertex
    glVertex3f(0.0, 1.0, 0.0);
    glVertex3f(0.0, 0.0, 1.0);
    glEnd();                      // end of object definition

Example (GPU based)
Put the geometric data in an array:

    vec3 points[3];
    points[0] = vec3(0.0, 0.0, 0.0);
    points[1] = vec3(0.0, 1.0, 0.0);
    points[2] = vec3(0.0, 0.0, 1.0);

Send the array to the GPU and tell the GPU to render it as a triangle.
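As a rough sketch of what those last two steps can look like in modern OpenGL, assuming the points array from the slide is in scope; shader and context setup are omitted, and the variable names are illustrative:

    GLuint vao, buffer;
    glGenVertexArrays(1, &vao);                    // container for the vertex state
    glBindVertexArray(vao);

    glGenBuffers(1, &buffer);                      // allocate a buffer object on the GPU...
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(points),  // ...and copy the vertex array into it
                 points, GL_STATIC_DRAW);

    glEnableVertexAttribArray(0);                  // attribute 0 = vertex position
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

    glDrawArrays(GL_TRIANGLES, 0, 3);              // render the 3 vertices as one triangle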

Camera Specification
- Six degrees of freedom: position of the center of projection and orientation
- Lens (focal length)
- Film plane size

Lights and Materials
Types of lights:
- Point sources vs distributed sources
- Spot lights
- Near and far sources
- Color properties
Material properties:
- Absorption: color properties
- Scattering: diffuse, specular