Robotics. Lecture 3: Sensors. See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information.




Andrew Davison, Department of Computing, Imperial College London

Review: Locomotion Practical from Lecture 2. Gradual motion drift from the perfect square. Causes: initial alignment errors; wheel slip; miscalibration; unequal left/right motors; others...? Using synchronised PID control improves matters, but we will never achieve a perfect result every time in this experiment.

A Well-Calibrated Robot. After careful calibration the robot should on average return to the desired location, but scatter will remain due to uncontrollable factors (variable wheel slip, surface, air currents!?...). With systematic error removed, what remains are zero-mean errors. We can model these zero-mean errors probabilistically: in many cases a Gaussian (normal) distribution is suitable. The spread of the error distribution will grow as the robot moves further around the square.

What Does This Mean for Estimating Motion? Perfect motion integration from odometry is not possible. [Figure: coordinate frame R carried with the robot; fixed world coordinate frame W.] Recall the state update equations from the last lecture. During a straight-line period of motion of distance D:

x_new = x + D cos θ
y_new = y + D sin θ
θ_new = θ

During a pure rotation of angle α:

x_new = x
y_new = y
θ_new = θ + α
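The deterministic state update above can be sketched directly in code. This is a minimal illustration (function names are my own, not from the lectures); driving a perfect unit square with four forward/turn pairs returns the robot exactly to its starting position:

```python
import math

def straight_line_update(x, y, theta, D):
    """Ideal (noise-free) state update after driving distance D."""
    return x + D * math.cos(theta), y + D * math.sin(theta), theta

def rotation_update(x, y, theta, alpha):
    """Ideal (noise-free) state update after a pure rotation by alpha."""
    return x, y, theta + alpha

# Drive a perfect 1 m square: four (forward, turn 90 degrees) pairs.
state = (0.0, 0.0, 0.0)
for _ in range(4):
    state = straight_line_update(*state, 1.0)
    state = rotation_update(*state, math.pi / 2)
# With ideal motion, the robot ends back at the origin (up to
# floating-point rounding) -- unlike the real robot in the practical.
```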

Uncertainty in Motion. A better model acknowledges that this ideal trajectory is affected by uncertain perturbations ("motion noise"). For example, we could use this simple model. During a straight-line period of motion of distance D:

x_new = x + (D + e) cos θ
y_new = y + (D + e) sin θ
θ_new = θ + f

During a pure rotation of angle α:

x_new = x
y_new = y
θ_new = θ + α + g

Here e, f and g are noise terms with zero mean and a Gaussian distribution, describing how actual motion might deviate from the ideal trajectory. Adding these terms won't help us to move a robot more accurately when it is guided by odometry alone; but they are important later, when we probabilistically combine odometry with other sensing.
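Sampling from this noisy motion model is straightforward; a sketch follows, where the noise standard deviations are illustrative placeholders (in practice they would come from calibrating a particular robot):

```python
import math
import random

def noisy_straight_line(x, y, theta, D, sigma_e=0.02, sigma_f=0.01):
    """Sample a new state after driving distance D.
    e perturbs the distance, f perturbs the heading; both are
    zero-mean Gaussian. Sigma values here are illustrative only."""
    e = random.gauss(0.0, sigma_e)
    f = random.gauss(0.0, sigma_f)
    return (x + (D + e) * math.cos(theta),
            y + (D + e) * math.sin(theta),
            theta + f)

def noisy_rotation(x, y, theta, alpha, sigma_g=0.01):
    """Sample a new state after rotating by alpha, with zero-mean
    Gaussian noise g on the achieved rotation."""
    g = random.gauss(0.0, sigma_g)
    return x, y, theta + alpha + g

# Many samples scatter around the ideal outcome with zero-mean error.
random.seed(0)
xs = [noisy_straight_line(0.0, 0.0, 0.0, 1.0)[0] for _ in range(2000)]
mean_x = sum(xs) / len(xs)
```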

Sensors: Proprioceptive and Outward-Looking. As we saw in the first lecture, sensors are either proprioceptive (literally "self-sensing") or outward-looking. Proprioceptive sensors (such as motor encoders or internal force sensors) will improve a robot's sense of its own internal state and motion. But without outward-looking sensors a mobile robot is moving blindly. It needs them to, for example: localise without drift with respect to a map; recognise places and objects it has seen before; map out free space and avoid obstacles; interact with objects and people; and, in general, be aware of its environment.

Sensor Measurements: Proprioceptive. Sensors gather numerical readings or measurements. In the case of proprioceptive sensors, the value of the measurement z_p will depend on (be a function of) just the state of the robot x: z_p = z_p(x). The state of a robot is a vector of variables we use to describe its current status; e.g. for a simple robot moving on a flat ground plane we might have x = (x, y, θ)^T. More generally, a proprioceptive measurement might depend not just on the current state but also on previous states in the robot's history, or on the current rate of change of state; e.g. wheel odometry will report a reading depending on the difference between the current and previous state, and a gyro will report a reading depending on the current rate of rotation.

Sensor Measurements: Outward-Looking. A measurement from an outward-looking sensor will depend both on the state of the robot x and the state of the world around it y: z_o = z_o(x, y). The state of the world might be parameterised in many ways, e.g. as a list of the geometric coordinates of walls or landmarks, and may either be uncertain or perfectly known.

Single and Multiple Value Sensors Touch, light and sonar sensors each return a single value within a given range. Sensors such as a camera or laser range-finder return an array of values. This can be achieved by scanning a single sensing element (as in a laser range-finder) or by having an array of sensing elements (such as the pixels of a camera s CCD chip).

Touch Sensors. Binary on/off state: no processing required. Switch open: no current flows. Switch closed: current flows (hit).

Touch Sensors for Bump Detection. The sensor detects when an obstacle has been hit (the last line of defence). This demands an immediate reaction: an evasive manoeuvre, or at least stopping forward motion. Two bump sensors which should never be activated at the same time could be wired in parallel to the same input.

Multiple Touch Sensors. Where was I hit? Touch sensors mounted inside a floating skirt around a circular robot: four sensors give the ability to measure eight bump directions.

Strategies After a Collision. Explore: reverse, and try to go around (e.g. turn to a fixed angle from the hit and proceed). Moving object? Follow (turn towards the hit, wait, then drive forward), or run away (turn away from the hit and drive forward).

Light Sensors. Detect the intensity of light incident from a single forward direction. Multiple sensors pointing in different directions can guide steering behaviours. The Lego sensors also have a mode where they emit their own light, which will reflect off close targets; this can be used for following a line on the floor, or for quite effective short-range obstacle avoidance.

Sonar (Ultrasonic) Sensors. Measure depth by emitting an ultrasonic pulse and timing the interval until the echo returns. Fairly accurate depth measurement in one direction, but can give noisy measurements in the presence of complicated shapes. Robots sometimes have a ring of sonar sensors for obstacle detection. Useful underwater, where sonar is the only serious option beyond very short ranges.
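The time-of-flight calculation behind a sonar reading is simple enough to show directly. A sketch (the speed-of-sound constant is a rough room-temperature value, and the function name is my own):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def sonar_depth(echo_time_s):
    """Convert a round-trip echo time (seconds) into a one-way depth
    estimate (metres). The pulse travels out and back, so halve the
    total distance travelled."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A 10 ms round trip corresponds to a surface about 1.7 m away.
depth = sonar_depth(0.01)
```

Note the implication for resolution: timing jitter of even a tenth of a millisecond shifts the depth estimate by a few centimetres.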

Probabilistic Sensor Modelling. Like robot motion, robot sensing is also fundamentally uncertain. Real sensors do not report the exact truth of the quantities they are measuring, but a perturbed version. Having characterised (modelled, calibrated) a sensor and understood the uncertainty in its measurements, we can build a probabilistic measurement model for how it works. This will be a probability distribution (specifically a likelihood function) of the form p(z_o | x, y). Such a distribution will often have a Gaussian shape.

Likelihood Functions p(z|v). A likelihood function fully describes a sensor's performance. p(z|v) is a function of both the measurement variable z and the ground truth v, and can be plotted as a probability surface. [Figure: example probability surfaces p(z|v) for a depth sensor: constant uncertainty around z = v; uncertainty growing with v; and systematic error (biased), e.g. peaked around z = 1.1v.]

Robust Likelihood for Sonar Update. We will return to this in later lectures on probabilistic robotics, but a suitable likelihood function for a sonar sensor is as follows: it says, what is the probability of obtaining sensor measurement z, given that the ground truth value I expect is m? This distribution has a narrow Gaussian band around the expected value, plus a constant additive band representing a fixed percentage of garbage measurements:

p(z|m) ∝ e^(−(z−m)² / 2σ_s²) + K

[Figure: p(z|m) plotted against z, with a Gaussian peak at z = m on top of a constant floor K.]
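This robust likelihood is easy to evaluate in code. A sketch follows; the values of σ_s and K are illustrative placeholders, not calibrated figures from the course:

```python
import math

def sonar_likelihood(z, m, sigma_s=0.03, K=0.05):
    """Unnormalised robust likelihood p(z|m): a narrow Gaussian around
    the expected depth m, plus a constant floor K that accounts for a
    fixed fraction of garbage measurements. sigma_s and K are
    illustrative values only."""
    return math.exp(-(z - m) ** 2 / (2.0 * sigma_s ** 2)) + K
```

The effect of K is that a wildly wrong measurement is merely improbable rather than impossible, so one outlier cannot drive the probability of a correct state hypothesis to zero.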

External Sensing: Laser Range-Finder. Returns an array of depth measurements from its scanning beam. Very accurate measurement of both depth and direction, from time-of-flight measurement of a scanning laser beam. Normally scans in a 2D plane, but 3D versions are also available. Rather bulky (and expensive) for small robots.

External Sensing: Vision. A camera measures light intensity in many directions simultaneously, by directing incident light onto a sensing chip with an array of light-sensitive elements. Normally returns a large, rectangular array of measurements. A single camera measures angles, but not depth; 3D information comes from either multiple cameras or motion. Vision is a vast research area: object recognition, location recognition, tracking, 3D reconstruction, etc. There is huge motivation from biology, and vision is highly attractive for general-purpose, low-cost robotics.

Servoing. Servoing is a robot control technique where control parameters (such as the desired speed of a motor) are coupled directly to a sensor reading and updated regularly in a negative feedback loop. It is also sometimes known as closed-loop control. Servoing needs a high-frequency update of the sensor/control cycle, or motion may oscillate. The Mindstorms NXT motors, and the servos used in model radio-controlled cars and aircraft, use internal position encoders to perform position control.

Proportional Control using an External Sensor: Servoing. In servoing, a control demand is set which over time aims to bring the current value of a sensor reading into agreement with a desired value. Proportional control: set the demand proportional to the negative error (the difference between the desired sensor value and the actual sensor value); e.g. set velocity proportional to error: v = k_p (z_desired − z_actual), where k_p is the proportional gain constant. (Note that since this is a different control loop, k_p will not have the same value as in last week's motor tuning, but will need to be individually adjusted through trial and error.) Proportional control is a special case of more general PID control (Proportional, Integral, Derivative).
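The proportional law and its convergence behaviour can be sketched with a toy simulation, where the plant is idealised as a sensor reading that changes exactly at the commanded velocity; the gain and loop period here are placeholder values to be tuned by trial and error on a real robot:

```python
def proportional_velocity(z_desired, z_actual, k_p=0.8):
    """Velocity demand proportional to the error between the desired
    and actual sensor readings. k_p = 0.8 is a placeholder gain."""
    return k_p * (z_desired - z_actual)

# Toy closed loop: the "sensor reading" z moves at the commanded
# velocity each cycle, so the error shrinks geometrically.
z = 0.0        # actual sensor value
target = 10.0  # desired sensor value
dt = 0.05      # control-loop period (s)
for _ in range(200):
    v = proportional_velocity(target, z)
    z += v * dt  # idealised plant response
```

In this idealised loop the error is multiplied by (1 − k_p·dt) every cycle, which also hints at why too large a gain or too slow an update rate causes oscillation: once k_p·dt exceeds 2, the error grows instead of shrinking.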

Outward-Looking Sensor Servoing: Some Details. In the previous slide we saw a proportional law to control velocity; but in our robots, when we set a velocity demand there is another level of PID control underneath (as we looked at last week) which monitors the motor encoders and translates velocity demands into a sequence of PWM values which are actually sent to the motors. This is an example of cascade control, where the output of one control loop is used to set the input to another. While in principle both controllers could be combined into a unified whole, this is a very useful abstraction which hides the details of motor/encoder control from the upper-level work with sensor control. Problems will not arise unless the top-level controller requests velocity changes too rapidly, exceeding the bandwidth of the lower-level controller. Outward-looking sensors such as sonar or cameras will sometimes produce garbage readings, for a number of different possible reasons. It can often be sensible to use a strategy to try to remove these outliers, such as median filtering, which doesn't use the most recent sensor value in the control law but a smoothed value such as the median of the past n measurements. Note, though, that this will also reduce the responsiveness of the system.
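Median filtering over a sliding window is a few lines of code. A minimal sketch (the class name is my own choice) using a fixed-length window of the past n readings:

```python
from collections import deque
from statistics import median

class MedianFilter:
    """Keep the last n sensor readings and return their median,
    suppressing occasional garbage values at the cost of some lag."""

    def __init__(self, n=5):
        # deque with maxlen automatically discards the oldest reading.
        self.window = deque(maxlen=n)

    def update(self, reading):
        self.window.append(reading)
        return median(self.window)

# A single garbage spike of 99.0 never reaches the control law.
f = MedianFilter(n=5)
outputs = [f.update(z) for z in [1.0, 1.0, 99.0, 1.0, 1.0]]
```

The trade-off mentioned above is visible here: a genuine step change in the sensor value is also delayed by roughly half the window length before it appears in the filter output.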

Very High Speed Visual Servoing Research from the University of Tokyo, late 1990s, using a custom 1000Hz camera and high performance robotics.

Visual Servoing to Control Steering. For a robot with a tricycle or car-type wheel configuration. A simple steering law which will guide the robot to collide with a target at bearing α: s = k_p α. A steering law which will guide the robot to avoid an obstacle at a safe radius R instead subtracts an offset: s = k_p (α − sin⁻¹(R/D)), where D is the distance to the obstacle. [Figure: geometry of the obstacle O, safe radius R, distance D, bearing α, velocity v and steering demand s.]
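The avoidance law can be sketched as a short function. The clamping of R/D is my own defensive addition (sin⁻¹ is undefined for arguments above 1, i.e. when the robot is already inside the safe radius), and the gain is a placeholder:

```python
import math

def steering_demand(alpha, D, R, k_p=1.0):
    """Steering demand s = k_p * (alpha - asin(R/D)) to pass an
    obstacle at bearing alpha (radians) and distance D at a safe
    radius R. Clamps R/D to 1 so asin stays in its domain when the
    robot is at or inside the safe radius. k_p is a placeholder gain."""
    offset = math.asin(min(1.0, R / D))
    return k_p * (alpha - offset)
```

For example, heading straight at the obstacle (α = 0) at D = 2 m with R = 1 m gives s = −sin⁻¹(0.5), a steady demand to steer away until the predicted miss distance reaches R.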

Visual Servoing Trajectories. [Figure: example trajectories in the x-z plane produced by the avoidance steering law.] Compare with the simpler steering law which will guide the robot to collide with the target: s = k_p α.

Combining Sensors: World Model Approach. Capture data; store and manipulate it using symbolic representations. Plan a sequence of actions to achieve a given goal. Execute the plan; if the world changes during execution, stop and re-plan. Powerful, but computationally expensive and complicated! Shakey was one of the first mobile robots to work this way. Probabilistic state inference and planning is the modern version of this approach, able to cope with uncertainty in sensors.

Combining: Sensing/Action Loops. No modelling and planning! Consider each local servo-like sensing-action loop as a behaviour. [Figure: Sense → Act loop.] The challenge is in combining many behaviours into useful overall activity: see TR Programs, Subsumption, Braitenberg vehicles.

This week's practical: Simple Sensor Control Loops and Wall Following. Example wall follower: https://www.youtube.com/watch?v=bu9k5z0ckjs. We will try to do even better with smooth proportional gain!