Colorado School of Mines Computer Vision Professor William Hoff


Professor William Hoff
Dept. of Electrical Engineering & Computer Science
http://inside.mines.edu/~whoff/

Sensors and Image Formation
- Imaging sensors and models of image formation
- Coordinate systems

Digital Images
- Digital images are stored as arrays of numbers
- The numbers can represent intensity (gray level, or each color band), range, X-ray absorption coefficient, etc.
(Figure: an example image shown as a grid of gray-level values.)

Intensity Image Sensors
Basic elements of an imaging device:
- Aperture: an opening (or "pupil") that limits the amount of light and the angle of incoming light rays
- Optical system: lenses, whose purpose is to focus light from a scene point to a single image point
- Imaging surface: photosensitive film or sensors, usually a plane (the image plane)
(Figure: aperture, optical system, image plane, and optical axis.)

Biological Vision: A Guide to Developing?
- Unfortunately, it is pre-attentive; difficult to measure and study its detailed function
- Some information comes from experiments with:
  - Animals (electrophysiology)
  - People's perception (psychophysics)
- A great deal of processing is done right in the retina
  - Data reduction from roughly 100 million receptors down to ~1 million optic nerve channels
- Further processing is done in the visual cortex
  - Specialized detectors for motion, shape, color, and binocular disparity
  - Evidence for maps in the cortex in correspondence to the image
- Still, it is an existence proof that vision can be done, and done well
Notes:
- The eyeball is about 20 mm in diameter
- The retina contains both rods and cones
- The fovea is about 1.5 mm in diameter and contains about 337,000 cones
- The focal length is about 17 mm

Digital Camera
- The image plane is a 2D array of sensor elements
- CCD type (charge-coupled device)
  - Charge accumulates during exposure
  - Charges are transferred out to shift registers, digitized, and read out sequentially
- CMOS type (complementary metal-oxide semiconductor)
  - Light affects the conductivity (or gain) of each photodetector
  - Digitized and read out using a multiplexing scheme
- Main design factors: number and size of sensor elements, chip size, ADC resolution

Thin Lens
- Rays parallel to the optical axis are deflected to pass through the focus
- Rays passing through the center are undeflected
- A world point P is in focus at the image point p
- Equation of a thin lens: 1/Z + 1/z' = 1/f
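As a quick numeric check of the thin-lens relation, it can be solved for the image distance z' (a minimal sketch; the 2 m / 50 mm values below are hypothetical, not from the slides):

```python
# Thin-lens equation: 1/Z + 1/z' = 1/f.
# Solve for the image distance z' given object distance Z and focal length f.

def image_distance(Z_m, f_m):
    """Return image distance z' (meters) from the thin-lens equation."""
    return 1.0 / (1.0 / f_m - 1.0 / Z_m)

# Hypothetical example: an object 2 m away, lens focal length 50 mm.
z_prime = image_distance(2.0, 0.050)
print(round(z_prime * 1000, 2))  # -> 51.28 (mm behind the lens)
```

Note that as Z grows large, z' approaches f, which is why distant scenes focus at (approximately) the focal length.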

Pinhole Camera Model
- A good lens can be modeled by a pinhole camera; i.e., each ray from the scene passes undeflected to the image plane
- Simple equations describe the projection of a scene point onto the image plane ("perspective projection")
- We will use the pinhole camera model exclusively, except a little later in the course where we model lens distortion in real cameras
- The pinhole camera ("camera obscura") was used by Renaissance painters to help them understand perspective projection

Perspective Projection Equations
- For convenience (to avoid an inverted image) we treat the image plane as if it were in front of the pinhole
- We define the origin of the camera's coordinate system at the pinhole (note this is a 3D XYZ coordinate frame)
- The XYZ coordinates of a point P(X,Y,Z) are with respect to the camera origin; f = focal length; p(x,y) is its image
- By similar triangles: x = f X/Z, y = f Y/Z
- Field of view (θ)? For an image plane of width w: tan(θ/2) = (w/2)/f
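The projection and field-of-view relations can be sketched in a few lines (an illustration, not part of the original slides; the 5 mm focal length matches the worked examples that follow):

```python
import math

def project(X, Y, Z, f):
    """Pinhole perspective projection of a camera-frame point (X, Y, Z);
    the image-plane result (x, y) is in the same units as f."""
    return (f * X / Z, f * Y / Z)

def field_of_view_deg(w, f):
    """Full field of view (degrees) for image-plane width w and focal length f."""
    return 2 * math.degrees(math.atan((w / 2) / f))

x, y = project(1.0, 2.0, 5.0, 0.005)    # f = 5 mm, point at (1 m, 2 m, 5 m)
print(x * 1000, y * 1000)               # -> 1.0 2.0 (mm)
print(field_of_view_deg(0.010, 0.005))  # 10 mm plane, 5 mm lens -> 90.0 (deg)
```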

Camera vs. Image Plane Coordinates
- Camera coordinate system {C}:
  - A 3D coordinate system (X,Y,Z); units in, say, meters
  - Origin at the center of projection
  - The Z axis points outward along the optical axis; X points right, Y points down
- Image plane coordinate system:
  - A 2D coordinate system (x,y); units in mm
  - Origin at the intersection of the optical axis with the image plane
  - In real systems, this is where the CCD or CMOS plane is
- Projection: x = f X/Z, y = f Y/Z

Examples (assume focal length f = 5 mm)
- A scene point is located at (X,Y,Z) = (1 m, 2 m, 5 m). What are the image plane coordinates (x,y) in mm?
  x = f X/Z = (5 mm)(1 m)/(5 m) = 1 mm
  y = f Y/Z = (5 mm)(2 m)/(5 m) = 2 mm
- If the image plane is 10 mm x 10 mm, what is the field of view?
  tan(θ/2) = (w/2)/f = 5/5 = 1, so θ/2 = 45 deg; the field of view is 90 x 90 deg
- A building is 100 m wide. How far away do we have to be in order that it fills the field of view?
  tan(45 deg) = (W/2)/Z = 50/Z, so Z = 50 m

Image Buffer
- Image plane: the real image is formed on the CCD plane
  - Coordinates (x,y), units in mm
  - Origin at the center (the principal point)
- Image buffer {I}: the digital (or pixel) image
  - (row, col) indices; we can also use (x_im, y_im)
  - Origin at the upper left; row is y_im, column is x_im
  - The image center is at pixel (c_x, c_y)


Conversion Between Real-Image and Pixel-Image Coordinates
Assume:
- The image center (principal point) is located at pixel (c_x, c_y) in the pixel image
- The spacing of the pixels is (s_x, s_y) in millimeters
Then:
  x = (x_im - c_x) s_x        x_im = x/s_x + c_x
  y = (y_im - c_y) s_y        y_im = y/s_y + c_y
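The conversion formulas translate directly to code (a sketch; the principal point and 10 µm pixel spacing below are hypothetical values, not from the slides):

```python
def pixel_to_image(x_im, y_im, cx, cy, sx, sy):
    """Pixel coordinates -> real image-plane coordinates (units of sx, sy)."""
    return ((x_im - cx) * sx, (y_im - cy) * sy)

def image_to_pixel(x, y, cx, cy, sx, sy):
    """Real image-plane coordinates -> pixel coordinates."""
    return (x / sx + cx, y / sy + cy)

# Hypothetical values: principal point (320, 240), 0.010 mm square pixels.
x, y = pixel_to_image(400, 300, 320, 240, 0.010, 0.010)
print(x, y)                                           # -> 0.8 0.6 (mm)
print(image_to_pixel(x, y, 320, 240, 0.010, 0.010))   # round-trips to (400, 300)
```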

Example
A star is located at pixel (r,c) = (0, 200) in a telescope image (rows, columns). What is the 3D unit vector pointing at the star?
Assume:
- The image is 00x00 pixels
- The optical center is at the center of the pixel image
- The CCD plane is mm x mm
- The focal length is 1 m
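Several digits in the transcript of this example are garbled, so here is the same computation sketched with clearly hypothetical numbers (a 1000x1000 image, a 10 mm x 10 mm CCD, and a 1 m focal length): back-project the pixel to the image plane, append the focal length as the Z component, and normalize.

```python
import math

def pixel_ray(r, c, cx, cy, sx, sy, f):
    """Unit vector in the camera frame pointing through pixel (row r, col c).
    cx, cy: principal point (pixels); sx, sy: pixel spacing; f: focal length
    (sx, sy, and f in the same length units)."""
    x = (c - cx) * sx        # image-plane x corresponds to the column index
    y = (r - cy) * sy        # image-plane y corresponds to the row index
    n = math.sqrt(x * x + y * y + f * f)
    return (x / n, y / n, f / n)

# Hypothetical telescope: 1000x1000 image, principal point (500, 500),
# 10 mm x 10 mm CCD (so 10 um pixels), focal length 1 m.
vx, vy, vz = pixel_ray(100, 200, 500.0, 500.0, 1e-5, 1e-5, 1.0)
print(vx, vy, vz)   # a unit vector, nearly along +Z for so long a focal length
```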

Note on Focal Length
Recall
  x = (x_im - c_x) s_x,   y = (y_im - c_y) s_y
or
  x_im = x/s_x + c_x,   y_im = y/s_y + c_y
and
  x = f X/Z,   y = f Y/Z
So
  x_im = (f/s_x) X/Z + c_x
  y_im = (f/s_y) Y/Z + c_y
All we really need are f_x = f/s_x and f_y = f/s_y:
- We don't need to know the actual values of f, s_x, and s_y; just their ratios
- We can alternatively express the focal length in units of pixels
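With the focal length expressed in pixels, projection goes straight from a camera-frame point to pixel coordinates (all values below are hypothetical, chosen only to illustrate the substitution):

```python
# Focal length in pixels: f_x = f / s_x, f_y = f / s_y.
# Hypothetical values: f = 5 mm, square 0.010 mm pixels, principal point (320, 240).
f_mm = 5.0
sx_mm, sy_mm = 0.010, 0.010
fx = f_mm / sx_mm     # 500 pixels
fy = f_mm / sy_mm     # 500 pixels

cx, cy = 320.0, 240.0
X, Y, Z = 1.0, 2.0, 5.0            # camera-frame point (m)
x_im = fx * X / Z + cx             # -> 420.0
y_im = fy * Y / Z + cy             # -> 440.0
print(x_im, y_im)
```

Note that f, s_x, and s_y never appear individually after the first two lines; only the ratios matter, as the slide says.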

Example
- A camera observes a rectangle 1 m away
- The rectangle is known to be 20 cm x cm
- In the image, the rectangle measures 200 x 0 pixels
- What are the focal length in x and the focal length in y (in pixels)?
- If the image size is 640x480 pixels, what is the field of view (horizontal, vertical)?
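Only some digits of this example survive in the transcript (the 1 m distance and the 20 cm / 200-pixel pair), so the sketch below works just the x direction and takes the 640x480 image size as given; treat it as an illustration of the method, not the slide's full answer.

```python
import math

# A 20 cm wide rectangle 1 m away spans 200 pixels.
# Since width_in_pixels = f_px * (W / Z), the focal length in pixels is:
Z = 1.0            # distance to the rectangle (m)
W = 0.20           # rectangle width (m)
w_px = 200         # measured width in the image (pixels)
fx = w_px * Z / W  # -> 1000.0 pixels

# Horizontal and vertical field of view for a 640x480 image:
fov_h = 2 * math.degrees(math.atan(320 / fx))   # ~35.5 deg
fov_v = 2 * math.degrees(math.atan(240 / fx))   # ~27.0 deg
print(fx, round(fov_h, 1), round(fov_v, 1))
```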

Camera Parameters
- Intrinsic parameters: those needed to relate an image point (in pixels) to a direction in the camera frame
  - f_x, f_y, c_x, c_y
  - Also lens distortion parameters (discussed later)
- Extrinsic parameters: define the position and orientation (pose) of the camera in the world
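The four intrinsic parameters are commonly packed into a 3x3 matrix K, so that pixel coordinates come from multiplying a camera-frame point by K and dividing by Z. This is a standard construction rather than something shown on the slide, and the parameter values below are hypothetical:

```python
# Intrinsic matrix K from f_x, f_y, c_x, c_y (hypothetical values):
fx, fy, cx, cy = 500.0, 500.0, 320.0, 240.0
K = [[fx, 0.0, cx],
     [0.0, fy, cy],
     [0.0, 0.0, 1.0]]

def project_pixels(K, P):
    """Apply K to a camera-frame point P = (X, Y, Z), then divide by Z."""
    u = [sum(K[i][j] * P[j] for j in range(3)) for i in range(3)]
    return (u[0] / u[2], u[1] / u[2])

print(project_pixels(K, (1.0, 2.0, 5.0)))   # -> (420.0, 440.0)
```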

Frames of Reference
- Image frames (the frame buffer, or pixel image, and the image plane, or real image) are 2D; the others ({Camera}, {Model}, {World}) are 3D
- The pose (position and orientation) of a 3D rigid body has 6 degrees of freedom