Sensors: Introduction
- Basil Parks
- 7 years ago
Sensors: Introduction

A sensor is a device that allows a robot to interact with the world. Sensing must be considered as a module consisting of:
1. The physical sensor
2. Software to extract the relevant information from the signal

The module is incorporated as a perceptual schema. The process is depicted in a diagram (not reproduced in this transcription). Note that a percept is defined as a data structure; the task of the perceptual schema is to extract the appropriate information from the sensory input and store it in the percept.

The following terminology is used to distinguish among sensor types:
1. Sensor: a device that measures some attribute of the world
2. Transducer: the part of the sensor that converts what is being measured into another form of energy
3. Passive sensor: relies on the environment to produce the energy being sensed
4. Active sensor: generates the energy being sensed
5. Modality: the type of energy being sensed
Sensors: Introduction (2)

6. Logical sensor: an abstract sensor independent of any physical implementation
   - Entails all available alternative methods of obtaining the percept
   - Methods may vary in terms of:
     - The conditions under which they operate
     - The time to produce the signal
     - The modality of operation, ...
   - All are logically equivalent
   - Includes an algorithm for selecting the appropriate method
   - The definition has been generalized to the standard meaning of "logical": a black box that performs a given task
   - A perceptual schema is a logical sensor with respect to this definition
   - The implication is that one physical sensor can be used as many logical sensors
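The method-selection idea above can be sketched in C. This is an illustration, not code from the notes: the `Percept` field, the `sonar_method` stand-in, and the try-in-order selection policy are all hypothetical choices.

```c
#include <stddef.h>

/* Hypothetical percept: distance to the nearest object. */
typedef struct {
    double range_m;
} Percept;

/* Each alternative method fills the percept; returns 1 on success,
   0 when its operating conditions do not currently hold. */
typedef int (*Method)(Percept *out);

typedef struct {
    Method methods[4];   /* logically equivalent alternatives */
    int    n_methods;
} LogicalSensor;

/* Selection algorithm: try each method until one is applicable. */
int read_logical_sensor(const LogicalSensor *s, Percept *out)
{
    for (int i = 0; i < s->n_methods; i++)
        if (s->methods[i](out))
            return 1;
    return 0;            /* no method currently applicable */
}

/* Example method standing in for a physical sonar driver. */
int sonar_method(Percept *out)
{
    out->range_m = 1.5;  /* placeholder reading */
    return 1;
}
```

Because every method returns the same percept, callers need not know which physical device produced it, which is exactly what makes the sensor "logical".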
Sensors: Sensor Fusion

Sensor fusion: the process of combining input from multiple sensors into a single percept.

Basic ways of combining sensors:
1. Redundancy/competition
   - Use multiple devices to sense the same percept
   - Useful in noisy environments and with sensors that are imprecise
   - Types of redundancy:
     - Physical redundancy: the sensors have the same modality
     - Logical redundancy: the sensors use different modalities to sense the same percept
   - Types of sensor misreadings:
     - False positive: sensing a signal that isn't there
     - False negative: failure to detect a signal
2. Complementation
   - Use multiple sensors, each identifying a different aspect of the percept
3. Coordination
   - Use sensors in sequence
   - Often used to focus attention
Sensors: Sensor Fusion (2)

Fusion can be incorporated into behaviors in a number of ways:
1. Sensor fission
   - Term coined by Brooks
   - He argued that fusion does not occur at the behavioral level
   - The sensors' outputs are not combined; rather, the overall behavior is the result of primitive competing behaviors
   - Uses one sensor per behavior
   - Behaviors can independently share sensors
   - The resultant behavior is the result of competition among the triggered behaviors
2. Action-oriented sensor fusion
   - Cognitive psychology shows that behavioral fusion does indeed occur
   - Stimuli travel over separate paths to multiple behavior areas of the brain
   - Fusion of sensory inputs occurs at a particular behavior
   - The result is a common representation
   - Fusion occurs to support a particular action associated with a particular behavior
Sensors: Sensor Fusion (3)

3. Sensor fashion
   - Refers to coordination of sensors (addressed by neither fission nor fusion)
   - The percept chosen depends on circumstances
Sensors: Sensor Suites

Sensor suite: a set of sensors. The operating environment determines the set of sensors a robot needs.

Sensors can be categorized as:
1. Proprioceptive: measure movements of the robot with respect to an internal frame of reference
2. Exteroceptive: measure the environment with respect to an internal frame of reference
3. Exproprioceptive: measure the position of the robot with respect to the environment

Reactive robots always have exteroceptive sensors.

Hardware/sensor characteristics to be considered:
1. Field of view
2. Range
3. Accuracy
   - Inaccuracy is usable if readings are always off by a consistent amount
4. Repeatability
5. Resolution
6. Responsiveness (in the domain)
   - Must have an adequate signal-to-noise ratio
7. Power consumption
   - Hotel load: the amount of power needed to support the sensor suite
   - Locomotion load: the amount of power needed to move the robot
8. Hardware reliability
9. Size
   - A factor of the operating environment
   - Directly affects power requirements
Sensors: Sensor Suites (2)

Software characteristics to be considered:
1. Computational complexity
2. Interpretation reliability: how reliable is the algorithm?

Desirable characteristics:
1. Simplicity: implies ease of maintenance and interpretation
2. Modularity: implies ease of reconfiguration
3. Redundancy: implies reliability
   - A major problem is for the robot to identify when a sensor has failed
   - Fault tolerance: the ability to survive a failure
Sensors: Types of Sensors

1. Location sensors
   (a) Shaft encoders
       - Proprioceptive
       - Measure distance
       - Associated with the motor
       - Accuracy dependent on locomotion method and terrain
       - May affect traction
   (b) Inertial navigation system
       - Exteroceptive
       - Based on accelerometers
       - Large and costly
       - Accuracy dependent on smoothness of motion
   (c) GPS
       - Uses triangulation with respect to 4 satellites
       - Not reliable in cities
       - Military and civilian versions
       - Neither proprioceptive nor exteroceptive
Sensors: Types of Sensors (2)

2. Proximity sensors
   (a) Sonar (echolocation)
       - Field of view of 30° (5-foot range)
       - Produces a main lobe (the primary sound wave) and side lobes (secondary waves)
       - Generally assume the echo is produced by the main lobe
       - Generally consider the main lobe with FOV = 8-15 degrees
       - Cannot detect objects within 11 inches due to the time needed to damp vibrations
       - 4 regions of interest:
         i. Inside range
         ii. Outside range
         iii. Within range
         iv. Behind object
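The range computation behind sonar is time of flight. A minimal sketch, not from the notes: the pulse travels out and back, so distance is half the round trip, and the 343 m/s speed of sound (air, roughly 20 °C) is an assumed constant.

```c
/* Sonar range from echo time of flight: the ping travels to the
   obstacle and back, so range is (speed * time) / 2. */
double sonar_range_m(double echo_time_s)
{
    const double speed_of_sound = 343.0; /* m/s, assumed for air */
    return speed_of_sound * echo_time_s / 2.0;
}
```

A 10 ms echo thus corresponds to about 1.7 m, comfortably outside the roughly 11-inch dead zone noted above.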
Sensors: Types of Sensors (3)

       Problems with sonar:
       i. Specular reflection: reflection at an acute angle
          - The signal is not returned
       ii. Multiple reflection (a result of specular reflection)
          - False distance (farther than actual)
       iii. Foreshortening: false distance based on the assumption that the echo comes from dead ahead
       iv. Crosstalk: interference from multiple sensors
          - Eliminated by firing the sensors in sequence
       v. Size (of obstacles)
          - Small objects do not generate sufficient echo
          - As distance increases, obstacles produce a reduced signal
       vi. Power consumption
          - Generally high
          - Need a strong signal
Sensors: Types of Sensors (4)

   (b) IR
       Problems:
       i. Bright environments
       ii. Dark objects
       iii. Short range (3-5 inches)
   (c) Touch sensors (tactile sensors)
       Problems:
       i. Placement is important: must ensure parts of the robot do not extend beyond the sensor, or objects may be missed
Sensors: Computer Vision - Introduction

- Based on electromagnetic energy
- Maps multiple readings from 3D space onto a 2D grid; each grid cell is called a pixel
- The modality of the camera determines what the image measures: a light (visual) image, a heat map, ...
- Computer vision is a separate field
- A variety of algorithms have been developed for edge detection, noise filtering, image enhancement, ...
- These are generally not amenable to mobile robots due to their complexity and memory requirements
Sensors: Computer Vision - Devices

Video camera:
- Consists of an array of charge-coupled devices (CCDs)
- Each detector is a metal-oxide semiconductor (MOS) capacitor
- Responsive to visible light
- Each represents a rectangular pixel
- The signal is analog
- The image can be captured one row at a time (line transfer) or the entire array at a time (frame transfer)
- The signal must be digitized
  - Uses an A/D converter
  - The process is too slow for real-time control
  - To compensate: use multiple buffers to create a pipeline of frames, or use a low frame rate

Frame grabber:
- Digitizes the analog image
Sensors: Computer Vision - Color

Grayscale:
- Monochrome
- 8-bit representation

Color (RGB):
- Additive primaries
- Used for CRTs
- Color cube
  - Plotted on a Cartesian coordinate system
  - Black at the origin, white at the FUR (far-upper-right) corner
  - The main diagonal represents the grays
- Usually represented using 8 bits per primary

There are 2 ways to represent color internally:

1. Interleaved: each pixel is represented as 3 contiguous values:

       #define RED   0
       #define GREEN 1
       #define BLUE  2

       int image[ROWS][COLS][COLOR_PLANES];

       red   = image[row][col][RED];
       green = image[row][col][GREEN];
       blue  = image[row][col][BLUE];
       displaycolor(red, green, blue);
Sensors: Computer Vision - Color (2)

2. Separately: each pixel is represented as an entry across 3 color arrays:

       int imagered[ROWS][COLS];
       int imagegreen[ROWS][COLS];
       int imageblue[ROWS][COLS];

       red   = imagered[row][col];
       green = imagegreen[row][col];
       blue  = imageblue[row][col];
       displaycolor(red, green, blue);

Problems for robotics:
1. Perceived color depends on:
   (a) the color of the illumination source
   (b) the surface characteristics of the object
   (c) the sensitivity of the camera
2. Visual erosion: the robot's distance from the object and its angular relation to the illumination source affect the perceived intensity
3. CCD devices are less sensitive to red than to green and blue
Sensors: Computer Vision - Color (3)

HSV: based on
- Hue: absolute color
- Saturation: purity of the color
- Value: intensity

Color cone:
- Cylindrical coordinate system
- The vertical axis V (value) represents lightness
- H (hue) is measured as an angle around the V axis, with R (red) at 0°
- S (saturation) is measured as the distance from the V axis to the edge; the edge represents S = 1
- The apex of the cone is at the origin
- Black at V = 0, S = 0; white at V = 1, S = 0; pure colors at V = 1, S = 1

Advantages over the RGB model:
- Hue measures the absolute wavelength of the color
- Easy to represent internally

Problems:
1. Needs special cameras and frame grabbers, which are expensive (standard models use RGB)
2. Conversion to HSV from RGB
   - Computationally expensive
   - Singularities exist where R == G == B
   - Since CCD devices have a flat response to R, this increases the likelihood of singularities
Sensors: Computer Vision - Color (4)

SCT: Spherical Coordinate Transform
- Based on the RGB color cube
- Represents colors in terms of a polar coordinate system:
  - A vector r represents the color
  - The angular coordinates represent hue
  - The length represents intensity
Sensors: Computer Vision - Region Segmentation

A major task of behavior-based robots is to identify objects based on color. The general process is called region segmentation. We need to separate the image into:
1. Foreground: the pixels that represent the object of interest
2. Background: the remaining pixels

Two general steps are involved:
1. Thresholding: identify the pixels in a region with the same color
2. Region growing: generate clusters of those pixels

Thresholding:
- In the simplest case, assume a binary image: pixels either have the color of interest or they do not
- Code:

      for (i = 0; i < rows; i++)
          for (j = 0; j < cols; j++)
              if ((image[i][j][RED]   == redvalue) &&
                  (image[i][j][GREEN] == greenvalue) &&
                  (image[i][j][BLUE]  == bluevalue))
                  imageout[i][j] = 255;
              else
                  imageout[i][j] = 0;

- Use 0 and 255 because the eye cannot distinguish between output levels of 0 and 1 on devices using 8+ bits per pixel
Sensors: Computer Vision - Region Segmentation (2)

Use a multivalued image:
- Since an object's perceived color is sensitive to the environment, the object will not be seen as a blob of constant color
- Use thresholds to identify the pixels of interest:

      for (i = 0; i < rows; i++)
          for (j = 0; j < cols; j++)
              if ((image[i][j][RED]   >= redlowvalue)   &&
                  (image[i][j][RED]   <= redhighvalue)  &&
                  (image[i][j][GREEN] >= greenlowvalue) &&
                  (image[i][j][GREEN] <= greenhighvalue) &&
                  (image[i][j][BLUE]  >= bluelowvalue)  &&
                  (image[i][j][BLUE]  <= bluehighvalue))
                  imageout[i][j] = 255;
              else
                  imageout[i][j] = 0;

Region growing:
- Want to identify the center of the region of interest
- Performed by the perceptual schema
- Approaches:
  1. Use the weighted centroid of all pixels of interest
  2. Find the largest area of adjacent pixels of interest and compute its centroid
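The centroid approach (option 1 above) can be sketched as follows, assuming the 255/0 output image produced by the thresholding step; the flat row-major layout is an implementation choice, not from the notes.

```c
/* Centroid of all foreground pixels in a thresholded image
   (255 = pixel of interest, 0 = background), stored row-major.
   Returns 1 and fills (*crow, *ccol) on success; 0 if the image
   contains no foreground pixels. */
int region_centroid(const unsigned char *imageout, int rows, int cols,
                    double *crow, double *ccol)
{
    long sum_r = 0, sum_c = 0, count = 0;
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++)
            if (imageout[i * cols + j] == 255) {
                sum_r += i;
                sum_c += j;
                count++;
            }
    if (count == 0)
        return 0;
    *crow = (double)sum_r / count;
    *ccol = (double)sum_c / count;
    return 1;
}
```

This gives the perceptual schema a single (row, col) point to report as the object's location, e.g. for steering toward a colored target.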
Sensors: Computer Vision - Histogramming

- Supports thresholding based on multicolored objects
- A histogram consists of buckets; each bucket contains the number of pixels that fall within a specified range
- Implementation for grayscale and HSV is simple
- Implementation for RGB is more difficult: need 1 histogram per color plane
- Recognition is achieved via histogram intersection
  - Simply subtract one histogram from another on a bucket-by-bucket basis
  - The difference is the percentage of pixels that do not match:

        intersection = (Σ_{j=1}^{b} |I_j − E_j|) / (Σ_{j=1}^{b} E_j)

    where b is the number of buckets and I and E are the two histograms
- Can be used to implement releasers:
  - Store the histogram of the releasing image in the perceptual schema
  - Intersect the current image's histogram with it
  - If the difference is small enough, set the releaser
  - This does not violate the principles of behavior-based robots
- The intersection is also useful to represent stimulus strength
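The intersection formula above translates directly into code; a sketch, assuming integer bucket counts (`img_hist` plays the role of I, the current image, and `ex_hist` the role of E, the stored exemplar):

```c
#include <stdlib.h>

/* Histogram intersection as defined above:
   sum_j |I_j - E_j| / sum_j E_j, over b buckets.
   Small values mean a close match; returns 0 if E is empty. */
double histogram_intersection(const int *img_hist, const int *ex_hist, int b)
{
    long diff = 0, total = 0;
    for (int j = 0; j < b; j++) {
        diff  += labs((long)img_hist[j] - ex_hist[j]);  /* |I_j - E_j| */
        total += ex_hist[j];                            /* sum of E_j  */
    }
    return (total > 0) ? (double)diff / total : 0.0;
}
```

A releaser would then fire when the returned fraction falls below some threshold chosen for the task.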
Sensors: Computer Vision - Range Finding: Stereopsis

Cameras can be used in several ways to determine the distance of the robot from an object:
1. Stereopsis
2. Light striping
3. Laser ranging

Stereopsis (stereo disparity, binocular vision) uses 2 cameras in tandem:
- Called a stereo pair
- The cameras are mounted on a fixed base; the distance between them is called the baseline
- The result is a depth map: a 2D grayscale image in which intensity represents distance

There are 2 general approaches:
Sensors: Computer Vision - Range Finding: Stereopsis (2)

1. Vergence
   - Each camera can rotate about a fixed axis
   - The cameras are aimed at the same point
   - The angles between each line of sight and the main axis are determined
   - Distance can then be determined by triangulation
   - Vergence refers to the process of focusing the cameras on the same point
   - Problems:
     (a) Expensive
     (b) Correspondence problem: ensuring both cameras are aimed at the same point in space
         - Interest operator: an algorithm for identifying pixels of interest (those with unique values)
         - Difficult due to the problems associated with perceived color
         - The interest operator usually identifies a set of pixels, and a matching algorithm tries to find the best correlation between the left and right projections
         - Once aligned, depth values are assigned
Sensors: Computer Vision - Range Finding: Stereopsis (3)

2. Fixed stereo pair
   - The cameras are aimed straight ahead with their optic axes parallel
   - The projection of a pixel of interest intersects the image planes at different points
   - Disparity: the difference between these points
   - Rectified images: the images after they have been correlated; a pixel of interest should appear in the same row in each image
   - Epipolar lines: the paired lines of the images
     - Identification of corresponding points is computationally easy, since only points in these lines are compared, not the whole 2D image
   - Problems:
     (a) Must have perfect alignment; a calibration process is frequently used to ensure this

Stereopsis is expensive:
- Stereo matching algorithms are O(n² m²) for an n × m grid
- Color segmentation is O(nm)
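Once a correspondence is found on an epipolar line, depth follows directly from disparity. This standard relation is not spelled out in the notes: with focal length f (in pixels) and baseline B, depth is Z = f·B / d.

```c
/* Depth from disparity for a rectified fixed stereo pair.
   f_pixels: focal length expressed in pixels
   baseline_m: distance between the optical centers (meters)
   disparity_pixels: column difference of the matched point
   Returns -1.0 for zero/negative disparity (point at infinity
   or an invalid match). */
double stereo_depth(double f_pixels, double baseline_m,
                    double disparity_pixels)
{
    if (disparity_pixels <= 0.0)
        return -1.0;
    return f_pixels * baseline_m / disparity_pixels;
}
```

Note that depth resolution degrades with distance: the same one-pixel disparity error matters far more for distant points, where d is small.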
Sensors: Computer Vision - Range Finding: Light Striping

- The principle is to cast a light pattern onto a surface
- A flat surface results in a continuous image; observed discontinuities provide depth, size, and shape information
- The granularity of the pattern determines the amount of depth information extracted
- Relatively cheap:
  1. The devices do not need to be as precise as stereopsis devices
  2. Can use regular light
  3. Processing is cheaper, since not every pixel is considered
- Problems:
  1. Well suited for recognition, but not so much for reactive robots
  2. Difficulty when the reflected light and the object's color are similar
  3. Environmental light may interfere with the reflection
Sensors: Computer Vision - Range Finding: Laser Ranging

- Also called laser radar, ladar, or lidar
- Based on the same principle as the sonar range finder
- The field of view is much narrower than for sonar
- Scans the environment like a CRT
- Can generate a frame at rates of pixels per second
- Two grayscale images are generated:
  1. Depth
  2. Intensity
- Problems:
  1. Differences in the reflectivity of different surfaces
  2. Expensive
  3. Generally need 2
More informationCorrecting the Lateral Response Artifact in Radiochromic Film Images from Flatbed Scanners
Correcting the Lateral Response Artifact in Radiochromic Film Images from Flatbed Scanners Background The lateral response artifact (LRA) in radiochromic film images from flatbed scanners was first pointed
More informationPDF Created with deskpdf PDF Writer - Trial :: http://www.docudesk.com
CCTV Lens Calculator For a quick 1/3" CCD Camera you can work out the lens required using this simple method: Distance from object multiplied by 4.8, divided by horizontal or vertical area equals the lens
More informationData Sheet. definiti 3D Stereo Theaters + definiti 3D Stereo Projection for Full Dome. S7a1801
S7a1801 OVERVIEW In definiti 3D theaters, the audience wears special lightweight glasses to see the world projected onto the giant dome screen with real depth perception called 3D stereo. The effect allows
More informationSensors and Cellphones
Sensors and Cellphones What is a sensor? A converter that measures a physical quantity and converts it into a signal which can be read by an observer or by an instrument What are some sensors we use every
More informationFixplot Instruction Manual. (data plotting program)
Fixplot Instruction Manual (data plotting program) MANUAL VERSION2 2004 1 1. Introduction The Fixplot program is a component program of Eyenal that allows the user to plot eye position data collected with
More informationEPSON SCANNING TIPS AND TROUBLESHOOTING GUIDE Epson Perfection 3170 Scanner
EPSON SCANNING TIPS AND TROUBLESHOOTING GUIDE Epson Perfection 3170 Scanner SELECT A SUITABLE RESOLUTION The best scanning resolution depends on the purpose of the scan. When you specify a high resolution,
More informationIntroduction to Robotics Analysis, Systems, Applications
Introduction to Robotics Analysis, Systems, Applications Saeed B. Niku Mechanical Engineering Department California Polytechnic State University San Luis Obispo Technische Urw/carsMt Darmstadt FACHBEREfCH
More informationRobust and accurate global vision system for real time tracking of multiple mobile robots
Robust and accurate global vision system for real time tracking of multiple mobile robots Mišel Brezak Ivan Petrović Edouard Ivanjko Department of Control and Computer Engineering, Faculty of Electrical
More informationMultisensor Data Fusion and Applications
Multisensor Data Fusion and Applications Pramod K. Varshney Department of Electrical Engineering and Computer Science Syracuse University 121 Link Hall Syracuse, New York 13244 USA E-mail: varshney@syr.edu
More informationEndoscope Optics. Chapter 8. 8.1 Introduction
Chapter 8 Endoscope Optics Endoscopes are used to observe otherwise inaccessible areas within the human body either noninvasively or minimally invasively. Endoscopes have unparalleled ability to visualize
More informationUsing angular speed measurement with Hall effect sensors to observe grinding operation with flexible robot.
Using angular speed measurement with Hall effect sensors to observe grinding operation with flexible robot. François Girardin 1, Farzad Rafieian 1, Zhaoheng Liu 1, Marc Thomas 1 and Bruce Hazel 2 1 Laboratoire
More informationDifferentiation of 3D scanners and their positioning method when applied to pipeline integrity
11th European Conference on Non-Destructive Testing (ECNDT 2014), October 6-10, 2014, Prague, Czech Republic More Info at Open Access Database www.ndt.net/?id=16317 Differentiation of 3D scanners and their
More informationUsing Image J to Measure the Brightness of Stars (Written by Do H. Kim)
Using Image J to Measure the Brightness of Stars (Written by Do H. Kim) What is Image J? Image J is Java-based image processing program developed at the National Institutes of Health. Image J runs on everywhere,
More informationTutorial for Tracker and Supporting Software By David Chandler
Tutorial for Tracker and Supporting Software By David Chandler I use a number of free, open source programs to do video analysis. 1. Avidemux, to exerpt the video clip, read the video properties, and save
More informationNational Performance Evaluation Facility for LADARs
National Performance Evaluation Facility for LADARs Kamel S. Saidi (presenter) Geraldine S. Cheok William C. Stone The National Institute of Standards and Technology Construction Metrology and Automation
More informationHigh Resolution Planetary Imaging Workflow
High Resolution Planetary Imaging Workflow Fighting the Atmosphere Getting out of the Atmosphere Adaptive Optics Lucky Imaging Lucky Imaging is the process of capturing planets using a CCD video camera.
More informationInterference. Physics 102 Workshop #3. General Instructions
Interference Physics 102 Workshop #3 Name: Lab Partner(s): Instructor: Time of Workshop: General Instructions Workshop exercises are to be carried out in groups of three. One report per group is due by
More informationIntegrated sensors for robotic laser welding
Proceedings of the Third International WLT-Conference on Lasers in Manufacturing 2005,Munich, June 2005 Integrated sensors for robotic laser welding D. Iakovou *, R.G.K.M Aarts, J. Meijer University of
More information1. Introduction to image processing
1 1. Introduction to image processing 1.1 What is an image? An image is an array, or a matrix, of square pixels (picture elements) arranged in columns and rows. Figure 1: An image an array or a matrix
More informationImplementation of Canny Edge Detector of color images on CELL/B.E. Architecture.
Implementation of Canny Edge Detector of color images on CELL/B.E. Architecture. Chirag Gupta,Sumod Mohan K cgupta@clemson.edu, sumodm@clemson.edu Abstract In this project we propose a method to improve
More informationSensor Modeling for a Walking Robot Simulation. 1 Introduction
Sensor Modeling for a Walking Robot Simulation L. France, A. Girault, J-D. Gascuel, B. Espiau INRIA, Grenoble, FRANCE imagis, GRAVIR/IMAG, Grenoble, FRANCE Abstract This paper proposes models of short-range
More informationColour Image Segmentation Technique for Screen Printing
60 R.U. Hewage and D.U.J. Sonnadara Department of Physics, University of Colombo, Sri Lanka ABSTRACT Screen-printing is an industry with a large number of applications ranging from printing mobile phone
More informationChoosing a digital camera for your microscope John C. Russ, Materials Science and Engineering Dept., North Carolina State Univ.
Choosing a digital camera for your microscope John C. Russ, Materials Science and Engineering Dept., North Carolina State Univ., Raleigh, NC One vital step is to choose a transfer lens matched to your
More informationVideo Conferencing Display System Sizing and Location
Video Conferencing Display System Sizing and Location As video conferencing systems become more widely installed, there are often questions about what size monitors and how many are required. While fixed
More informationHow To Fuse A Point Cloud With A Laser And Image Data From A Pointcloud
REAL TIME 3D FUSION OF IMAGERY AND MOBILE LIDAR Paul Mrstik, Vice President Technology Kresimir Kusevic, R&D Engineer Terrapoint Inc. 140-1 Antares Dr. Ottawa, Ontario K2E 8C4 Canada paul.mrstik@terrapoint.com
More information3D/4D acquisition. 3D acquisition taxonomy 22.10.2014. Computer Vision. Computer Vision. 3D acquisition methods. passive. active.
Das Bild kann zurzeit nicht angezeigt werden. 22.10.2014 3D/4D acquisition 3D acquisition taxonomy 3D acquisition methods passive active uni-directional multi-directional uni-directional multi-directional
More informationDisplays. Cathode Ray Tube. Semiconductor Elements. Basic applications. Oscilloscope TV Old monitors. 2009, Associate Professor PhD. T.
Displays Semiconductor Elements 1 Cathode Ray Tube Basic applications Oscilloscope TV Old monitors 2 1 Idea of Electrostatic Deflection 3 Inside an Electrostatic Deflection Cathode Ray Tube Gun creates
More informationPoker Vision: Playing Cards and Chips Identification based on Image Processing
Poker Vision: Playing Cards and Chips Identification based on Image Processing Paulo Martins 1, Luís Paulo Reis 2, and Luís Teófilo 2 1 DEEC Electrical Engineering Department 2 LIACC Artificial Intelligence
More informationParticles, Flocks, Herds, Schools
CS 4732: Computer Animation Particles, Flocks, Herds, Schools Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Control vs. Automation Director's
More informationFRC WPI Robotics Library Overview
FRC WPI Robotics Library Overview Contents 1.1 Introduction 1.2 RobotDrive 1.3 Sensors 1.4 Actuators 1.5 I/O 1.6 Driver Station 1.7 Compressor 1.8 Camera 1.9 Utilities 1.10 Conclusion Introduction In this
More informationOverview. Proven Image Quality and Easy to Use Without a Frame Grabber. Your benefits include:
Basler runner Line Scan Cameras High-quality line scan technology meets a cost-effective GigE interface Real color support in a compact housing size Shading correction compensates for difficult lighting
More informationCanalis. CANALIS Principles and Techniques of Speaker Placement
Canalis CANALIS Principles and Techniques of Speaker Placement After assembling a high-quality music system, the room becomes the limiting factor in sonic performance. There are many articles and theories
More informationAnalog control unit for mobile robots
Analog control unit for mobile robots Soldering kit for experimentation For Fischertechnik robots and others Most diverse functions Requires no programming Patented sensor technology Summary We are pleased
More information