CSCI 445, Amin Atrash. Ultrasound, Laser and Vision Sensors. Introduction to Robotics, L. Itti & M. J. Mataric.


1 Introduction to Robotics, CSCI 445 (Amin Atrash): Ultrasound, Laser and Vision Sensors

2 Today's Lecture Outline: Ultrasound (sonar); laser range finders (ladar, not lidar); vision; stereo vision.

3 Ultrasound/Sonar. Ultrasound = sonar. "Ultrasound" comes from ultra ("far beyond" in Latin) + sound; "sonar" comes from so(und) + na(vigation and) + r(anging).

4 Ultrasound: Time of Flight. Ultrasound (sonar) range sensing is based on the time-of-flight principle: the emitter produces a "chirp" of sound (at a very high frequency); the sound travels away from the emitter, bounces off object(s), and eventually returns to the detector; the elapsed time is measured. Why use high-frequency sound?

5 Measuring Distance. How do we get distance from the measured time-of-flight? Sound travels at a roughly constant speed that varies slightly with ambient temperature: about 343 m/sec at room temperature (331 m/sec at 0 °C), i.e., roughly 34 cm/msec. Because the measured time t covers the round trip from emitter to object and back, d = v*t/2.
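A minimal sketch of this computation, assuming the standard linear approximation for the speed of sound in air; the function name and the 6 ms example are illustrative, not from the slides:

```python
# Sketch: range from an ultrasonic time-of-flight measurement.
# Assumption: speed of sound in air ~ 331.3 + 0.606 * T_celsius (m/s),
# the standard linear approximation (not taken from the slides).

def sonar_range(time_of_flight_s, temp_c=20.0):
    """Return distance (m) to the reflecting object.

    time_of_flight_s: time between emitting the chirp and detecting its echo (s).
    temp_c:           ambient temperature (degrees C).
    """
    v = 331.3 + 0.606 * temp_c          # speed of sound in air (m/s)
    return v * time_of_flight_s / 2.0   # halve: the sound travels out and back

# Example: an echo arriving 6 ms after the chirp, at 20 C
print(sonar_range(0.006))               # ~1.03 m
```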

6 Sonar in Biology. The process of locating objects (and oneself) from sound echoes is called echolocation ("echo-finding"). Inspiration for ultrasound sensing comes from nature: sound travels well in water, and dolphins and whales use it to communicate over long distances. Bats are best known for using sonar instead of vision (why do they use it?)

7 Bat Sonar. Bats use sonar because they live in very dark caves where vision would be useless. Bat sonar is extremely complex compared to synthetic sonar sensors: bats use numerous frequencies for finding tiny, fast-flying prey, avoiding cave walls and other bats, finding mates, and communicating with other bats.

8 Sonar Sensor Construction. Some sensors have a separate emitter/detector pair; others share a single transducer for both emitting and detecting. Pros/cons of each?

9 Angular Resolution. Typical sonar sensors have 30-degree angular resolution. E.g., the Polaroid transducer: frequency 50 kHz, wavelength 7 mm, sensor diameter ~40 mm, max range ~10 m.
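As a rough sanity check of these figures, the sketch below recomputes the wavelength from the frequency and estimates the beam width with the 1.22*lambda/D first-null rule for a circular aperture (a standard diffraction approximation added here as an assumption, not stated on the slide):

```python
# Sketch: sanity-check the Polaroid transducer numbers above.
import math

v = 343.0    # speed of sound in air at ~20 C (m/s)
f = 50e3     # chirp frequency (Hz)
D = 0.040    # transducer diameter (m)

wavelength = v / f                             # ~6.9 mm, matching the ~7 mm above
half_angle = math.asin(1.22 * wavelength / D)  # half-angle to the first null (rad)
print(f"wavelength ~ {wavelength * 1e3:.1f} mm")
print(f"beam width ~ {2 * math.degrees(half_angle):.0f} degrees")  # ~24 deg, same order as the 30-degree figure
```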

10 Surface Properties. Ultrasonic range sensors depend on reflections from objects in the environment, so surface properties matter: specular vs. diffuse reflection, and the angle of incidence.

11 Specularity vs. Diffusion. Surfaces generate two forms of reflection. Specular: angle of incidence = angle of reflection. Diffuse: energy is absorbed and re-emitted over a broad range of angles. Specular reflections are strong but unlikely to return to the detector; diffuse reflections are weak but likely to return to the detector.

12 Angle of Incidence. Increasing angle of incidence => decreasing strength of the reflected return; i.e., grazing rays generate no detectable return. Objects may appear invisible, or may act as mirrors (multiple specular reflections).

13 Overcoming Sonar Limitations. Increase diffuse reflections: add texture to objects (at a spatial scale similar to the sonar wavelength). Use clever post-processing: build maps from sonar readings taken from multiple viewpoints. Use more advanced sensors: phased-array emitters/detectors, higher frequencies.

14 Uses of Sonar Sensors. In spite of specular reflection, ultrasound/sonar sensors are used very successfully. Robotics applications: obstacle avoidance, mapping. Note: sonars require high power (current) => bigger capacitors and batteries.

15 Laser Range Finders (ladar). Sonar sensors measure range using the time-of-flight of sound (roughly 340 m/sec in air). Laser range finders measure the time-of-flight of light (300,000,000 m/sec in vacuum). Ladar = l(ight r)adar; the lasers are infrared (invisible to the human eye).

16 Ladar vs. Sonar. Pros: small spot size => good angular resolution; high speed => high sampling rate; short wavelength => fewer specular reflections. Cons: small spot size => less coverage; large/heavy; complex/expensive.

17 Scanning Ladars. Scanning laser rangefinder = laser range finder + rotating mirror(s). E.g., SICK: planar (2D) scan, 180-degree FOV. E.g., Riegl: 3D volume scan, 360/80-degree FOV.

18 3D Laser Mapping

19 Machine Vision. Cameras (try to) model biological eyes. Machine vision systems are complex; machine vision has historically been a separate branch of Artificial Intelligence. Vision is computationally expensive: very large portions of the human brain are devoted to it (how much?)

20 Human Vision

21 The Physics of Vision. The general principle of a camera: light scattered from objects in the environment (the scene) goes through an opening (the iris; in the simplest case a pinhole, in the more sophisticated case a lens), impinges on the retina (in biology) or the image plane (in a camera), and is then processed further.

22 Cameras for Machine Vision. Components: lens/iris; photosensitive pixel array (CCD, CMOS); signal processing; A/D converters (frame grabber). Historically, cameras had analog output and required a frame grabber in the computer; modern cameras have digital output (USB, IEEE 1394).

23 Steps in Visual Processing. Processing is usually divided into discrete stages: early, middle, and late (low-level, mid-level, and high-level). Low-level processing: smoothing, edge detection. High-level processing: scene reconstruction, object recognition.

24 Low-level Processing. Smoothing: images have noise (pixel variance); replace pixel values with local averages, e.g., a mean filter or median filter. Edge detection: strong intensity changes often correspond to object edges; replace pixel values with the local gradient, e.g., the Canny edge detector.
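A minimal NumPy sketch of both low-level steps: a 3x3 mean filter for smoothing and a finite-difference gradient magnitude as a crude stand-in for a full Canny detector (the array names and sizes are illustrative):

```python
import numpy as np

def mean_filter(img, k=3):
    """Replace each pixel with the average of its k x k neighborhood."""
    pad = k // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def edge_magnitude(img):
    """Approximate the local gradient magnitude with central differences."""
    gy, gx = np.gradient(np.asarray(img, dtype=float))
    return np.hypot(gx, gy)

noisy = np.random.rand(64, 64)               # placeholder image
edges = edge_magnitude(mean_filter(noisy))   # smooth first, then detect edges
```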

25 Mid-level Processing. Segmentation: divide the image into components, e.g., foreground/background, contours (connected edges), or regions of constant color or texture.
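As one deliberately simple example of segmentation, the sketch below splits an intensity image into foreground and background with a global threshold; the mean-based threshold is a placeholder assumption (Otsu's method or region growing are common refinements):

```python
import numpy as np

def segment_foreground(img, thresh=None):
    """Return a boolean mask that is True where a pixel is labeled foreground."""
    img = np.asarray(img, dtype=float)
    if thresh is None:
        thresh = img.mean()      # crude automatic threshold
    return img > thresh

mask = segment_foreground(np.random.rand(64, 64))
print(mask.sum(), "foreground pixels")
```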

26 High-level Processing. Scene reconstruction: what world produced this image? This is an underdetermined problem (many different worlds can produce the same image). Object recognition: given a model of an object, try to find that object in the image, e.g., by matching invariant features.
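For illustration only, the sketch below does the simplest possible version of "find the object in the image": brute-force template matching by sum of squared differences. Real recognition systems match invariant features instead, and the arrays here are placeholders:

```python
import numpy as np

def match_template_ssd(image, template):
    """Return the (row, col) of the best-matching window and its SSD score."""
    H, W = image.shape
    h, w = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = np.sum((image[r:r + h, c:c + w] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos, best

img = np.random.rand(40, 40)
tmpl = img[10:18, 20:28].copy()          # take a patch so a perfect match exists
print(match_template_ssd(img, tmpl))     # -> ((10, 20), 0.0)
```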

27 Practical Tips. Use color. Use a smaller image plane. Use other sensors to complement vision (IR, sonar, contact, etc.). Use task-specific information (e.g., for driving, look for white lines on roads).

28 Stereo Vision. Stereo vision (visual ranging): use two cameras with a known spatial offset. Each camera records a slightly different image => disparity. Infer the range of objects that appear in both images from the disparity (triangulation). Outputs: a disparity map (per-pixel disparity) and a range image (per-pixel range).

29 Epipolar Geometry. The epipolar plane is the plane defined by a 3D point and the two optical centers C and C'. The epipolar line is the straight line of intersection of the epipolar plane with the image plane; it passes through the projected image point and the epipole, and all epipolar lines in an image intersect at that epipole. The epipole is the image, in one camera, of the optical center of the other camera.

30 Stereo Vision. Distance from triangulation: z = f*T/d, where f is the focal length, T the baseline between the two cameras, and d = x - x' the disparity. The correspondence (correlation) problem: find which pixels in the two images match. Rectification aligns the epipolar lines and reduces the search to 1D. Internal and external camera parameters are needed, so calibration is required.
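A minimal sketch of the z = f*T/d conversion for a rectified pair; the focal length, baseline, and disparity values in the example are made up for illustration:

```python
import numpy as np

def depth_from_disparity(disparity_px, f_px, baseline_m):
    """Convert disparity (pixels) into depth (meters) for rectified cameras.

    disparity_px: per-pixel disparity d = x_left - x_right, in pixels.
    f_px:         focal length expressed in pixels.
    baseline_m:   distance T between the two optical centers, in meters.
    """
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, f_px * baseline_m / d, np.inf)  # d = 0 means "at infinity"

# Example: 700-pixel focal length, 12 cm baseline, 20-pixel disparity
print(depth_from_disparity(20.0, f_px=700.0, baseline_m=0.12))   # ~4.2 m
```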

31 Stereo Correspondence?

32 Disparity Map

33 Stereo Vision vs. Ladar. Pros of stereo vision: dense 3D range data; high sampling rate. Cons of stereo vision: requires textured surfaces; incomplete range maps; limited range and depth of field.

34 Textbook Readings MM: Chapter 9
