Chapter 1: Machine Vision Systems & Image Processing


1.0 Introduction

While other sensors, such as proximity, touch, and force sensors, play a significant role in the improvement of intelligent systems, vision is recognized as the most powerful sensory capability; it is sometimes called the universal sensor. A machine vision process may be divided into six principal areas:

(1) Sensing. Sensing is the process that yields a visual image.
(2) Preprocessing. Preprocessing deals with techniques such as noise reduction and enhancement of details.
(3) Segmentation. Segmentation is the process that partitions an image into objects of interest.
(4) Description. Description deals with the computation of features (e.g., size, shape) suitable for differentiating one type of object from another.
(5) Recognition. Recognition is the process that identifies these objects (e.g., wrench, bolt, engine block).
(6) Interpretation. Interpretation assigns meaning to an ensemble of recognized objects.

1.1 Sensing - Image Acquisition

1.1.1 Principles

To obtain an image, two important components are needed: (1) a lens to collect and direct the light; (2) a visual sensor (e.g., a CCD) to receive the incoming light.

Lens: Different lens design forms combine elements in different ways, each with its own performance characteristics; a system is not limited to any fixed set of lens combinations.

Common lens formulas. The thin-lens equation relates the focal length to the object and image distances:

    1/f = 1/d_o + 1/d_i

where f is the focal length of the lens, d_o is the object distance measured from the lens, and d_i is the image distance measured from the lens. Magnification is defined as

    M = d_i / d_o = H_i / H_o

where H_i is the image height/size and H_o is the object height/size.
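The two formulas above can be checked with a short Python sketch; the 25 mm focal length and 500 mm object distance below are illustrative values, not taken from the text.

```python
def image_distance(f_mm, d_o_mm):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for d_i (all in mm)."""
    return 1.0 / (1.0 / f_mm - 1.0 / d_o_mm)

def magnification(d_i_mm, d_o_mm):
    """M = d_i / d_o (= H_i / H_o)."""
    return d_i_mm / d_o_mm

f = 25.0       # assumed example: 25 mm lens
d_o = 500.0    # assumed example: object 500 mm in front of the lens
d_i = image_distance(f, d_o)   # image forms about 26.32 mm behind the lens
M = magnification(d_i, d_o)    # about 0.053x: the image is far smaller than the object
```

Note that M < 1 here, as expected for a distant object imaged onto a small sensor.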

CCD Camera: Charge-Coupled Devices (CCDs) are the most common camera sensors used in machine vision applications. A CCD camera uses a small, rectangular piece of silicon rather than a piece of film to receive incoming light. This silicon wafer is a solid-state electronic component that has been micro-manufactured and segmented into an array of individual light-sensitive cells called photosites. Each photosite is one element of the whole picture that is formed; thus it is called a picture element, or pixel. There are several standard CCD sensor sizes: 1/4", 1/3", 1/2", 2/3", and 1" (see Figure 1). All of these standards maintain a 4:3 (horizontal:vertical) aspect ratio, e.g., a 1/4" CCD with a 320x240-pixel array.

Fig. 1: Standard CCD sensor sizes

The size of the sensor's active area is important in determining the system's field of view. Given a fixed primary magnification (determined by the lens), larger sensors yield greater FOVs. Another issue is the ability of the lens to support certain CCD chip sizes. If the chip is too large for the lens design, the resulting image may fade and degrade toward the edges because of vignetting (the extinction of rays that pass through the outer edges of the lens). The popularity of CCDs can be linked to their characteristically small size and light weight. Additionally, CCDs have an impressive dynamic range and yield a highly linear relationship between incoming energy and outgoing signal, making them ideal for metrology. The CCD silicon chip is an analog component, meaning that the pixel values are collected by means of sampling. The signal processor and encoder convert this information into an analog signal, which can be transferred to a monitor. In digital cameras, digitizing occurs as the signal is collected from the chip. Once digitized, processing and image enhancement can be done with little loss to the signal.
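For reference, the active-area dimensions commonly quoted for these formats can be tabulated in code. The millimetre values below are typical published figures, not taken from this text, and should be treated as approximate: the inch designations are historical tube-format names and do not equal the physical diagonal.

```python
# Approximate active-area sizes (width mm, height mm) commonly quoted for the
# standard CCD formats listed above; all keep the 4:3 aspect ratio.
SENSOR_SIZES_MM = {
    '1/4"': (3.2, 2.4),
    '1/3"': (4.8, 3.6),
    '1/2"': (6.4, 4.8),
    '2/3"': (8.8, 6.6),
    '1"': (12.8, 9.6),
}

for fmt, (w, h) in SENSOR_SIZES_MM.items():
    # Verify the 4:3 (horizontal:vertical) aspect ratio stated in the text.
    assert abs(w / h - 4 / 3) < 1e-9, fmt
```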
Many digital CCD cameras, such as the Duncan Tech cameras, allow these characteristics to be controlled digitally through an RS-232 port.

1.1.2 Fundamental Parameters of Vision Systems

Field of View (FOV): The viewable area of the object under inspection; in other words, the portion of the object that fills the camera's sensor.

Working Distance (WD): The distance from the front of the lens to the object under inspection.

Resolution: The minimum feature size of the object that can be distinguished by the vision system.

Depth of Field (DOF): The maximum object depth that can be maintained entirely in focus. DOF is also the amount of object movement (in and out of best focus) allowable while maintaining the desired degree of focus.

Sensor Size: The size of a camera sensor's active area, typically specified in the horizontal dimension. This parameter is important in determining the proper lens magnification required to obtain a desired field of view.

Primary Magnification (PMAG): The ratio between the sensor size and the FOV. Although sensor size and field of view are fundamental parameters, it is important to realize that PMAG is not. The following formula calculates primary magnification:

    PMAG = Sensor Size (mm) / Field of View (mm)

Fig. 2: Illustration of the fundamental parameters of an imaging system

Fig. 3: Illustration of primary magnification and the relationship between sensor size and FOV
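The PMAG formula inverts directly, so either quantity can be computed from the other. A minimal sketch follows; the 6.4 mm horizontal size of a 1/2" sensor used below is an assumed example value.

```python
def pmag(sensor_size_mm, fov_mm):
    """Primary magnification: PMAG = sensor size / field of view."""
    return sensor_size_mm / fov_mm

def field_of_view(sensor_size_mm, pmag_value):
    """Rearranged form: FOV = sensor size / PMAG."""
    return sensor_size_mm / pmag_value

sensor_h = 6.4                      # horizontal size of a 1/2" sensor, mm (assumed)
m = pmag(sensor_h, 64.0)            # a 64 mm horizontal FOV needs 0.1x PMAG
w = field_of_view(sensor_h, 0.5)    # at 0.5x PMAG, the FOV shrinks to 12.8 mm
```

As the FOV gets smaller for a fixed sensor, PMAG rises, which is why high-magnification setups view only a small region of the object.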

1.1.3 Image Quality

An imaging system should produce sufficient image quality to allow the desired information about the object to be extracted from the image. Note that image quality adequate for one application may prove inadequate for another. A variety of factors contribute to overall image quality, including resolution, image contrast, depth of field, perspective errors, and geometric errors (distortion).

Resolution: Resolution is a measure of the imaging system's ability to reproduce object detail. A low-resolution image is usually blurry and lacking in detail.

Contrast: Contrast is the separation in intensity between the light and dark regions of an image.

Fig. 4: Contrast can be illustrated by a square wave

Components that affect image quality:

(1) Lens aperture (f/#): determines the amount of light incident on the camera. Illumination should be increased as the lens aperture is closed (i.e., at higher f/#).
(2) High-power lenses usually require more illumination, since the smaller areas they view reflect less light back into the lens.
(3) The camera's minimum sensitivity is also important in determining the minimum amount of light required in the system.
(4) CCD camera settings such as gain and shutter speed affect the sensor's sensitivity.
(5) Fiber-optic illumination usually involves an illuminator and a light guide, each of which should be integrated to optimize lighting at the object.

Desired image quality can often be achieved by improving a system's illumination rather than by investing in higher-resolution detectors, imaging lenses, and software.

1.1.4 Illumination

Why correct illumination is critical to an imaging system: Illumination plays an important role in a machine vision system since it often determines the complexity of the vision algorithms. Arbitrary lighting of the environment is often unacceptable because it can result in low-contrast images, specular reflections (hot spots, blooming), shadows, and extraneous details. A well-designed lighting system illuminates the scene so that the complexity of the resulting image is minimized, while the information required for object detection and extraction is enhanced.

The consequences poor illumination may cause:

Low contrast: increases the complexity of vision algorithms.
Specular reflections (hot spots or blooming): can hide important image information.
Shadowing: can hide important image information, cause false edge detection, and result in inaccurate measurements.

Types of Illumination: Table 1 summarizes the four basic illumination schemes.

Table 1: Four Basic Illumination Schemes

Diffuse Lighting: For objects characterized by smooth, regular surfaces; applied where surface characteristics are important.

Backlighting: Produces a black-and-white (binary) image; ideally suited for applications in which silhouettes of objects are sufficient for recognition.

Structured Lighting: Consists of projecting points, stripes, or grids onto the work surface; by establishing a known light pattern on the workspace, disturbances of this pattern indicate the presence of an object.

Directional Lighting: Useful for inspection of object surfaces. Defects on the surface, such as pits and scratches, can be detected by using a highly directed light beam.
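The backlighting scheme from Table 1 can be illustrated with a tiny synthetic frame: a dark silhouette on a bright field thresholds cleanly into a binary image, and the same frame quantifies the contrast idea of Fig. 4. The pixel values, threshold, and the Michelson (modulation) contrast definition used here are assumptions for illustration, not taken from the text.

```python
# Synthetic 6x6 backlit frame: bright field (240) with a dark 3x3 silhouette (20).
frame = [[240] * 6 for _ in range(6)]
for r in range(2, 5):
    for c in range(1, 4):
        frame[r][c] = 20

# Backlighting yields a near-binary scene, so one global threshold suffices.
THRESHOLD = 128
binary = [[px < THRESHOLD for px in row] for row in frame]  # True = object
area_px = sum(px for row in binary for px in row)           # silhouette area

# Michelson (modulation) contrast between the bright and dark regions.
flat = [px for row in frame for px in row]
i_max, i_min = max(flat), min(flat)
contrast = (i_max - i_min) / (i_max + i_min)

print(area_px, round(contrast, 3))   # 3x3 silhouette -> 9 pixels; contrast ~ 0.846
```

With a real backlit part, the silhouette pixel count converts directly to object area once the mm-per-pixel scale (from PMAG and the sensor size) is known.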