BEST2015 Autonomous Mobile Robots Lecture 3: Mobile Robot Localization
1 BEST2015 Autonomous Mobile Robots Lecture 3: Mobile Robot Localization. Renaud Ronsse, École polytechnique de Louvain, UCLouvain, July. Introduction. c Siegwart et al., 2011, Fig. 6.27, p. 417. Localization is about determining the robot's position within the environment.
2 Do we need to localize or not? To go from A to B, does the robot need to know where it is? c Siegwart et al., 2011, Fig. 5.6, p. 276. Navigation without hitting obstacles + detection of the goal location is possible by always following the left wall, but how can the robot detect that the goal is reached? And what about efficiency?

Outline: 1 Introduction, 2 The Challenge of Localization, 3 Odometry, 4 Belief Representation, 5 Probabilistic Map-Based Localization, 6 Kalman Filter Localization, 7 References.
3 Global localization requires external devices. Example 1: GPS... but this is not accurate! c Siegwart et al., 2011, ETH-Z lecture slides. GPS cannot function indoors or in obstructed areas.

Global localization requires external devices. Example 2: Beacon-Based Localization Systems. c Siegwart et al., 2011, Fig. 5.41, p. 347. Ex 1: poles with a highly reflective surface and a laser for detecting them. Ex 2: colored beacons and an omnidirectional camera for detecting them. Advanced hardware!
4 Global localization requires external devices. Example 3: Motion Capture Systems. c Siegwart et al., 2011, ETH-Z lecture slides. High resolution (from VGA up to 16 Mpixels), very high frame rate (several hundreds of Hz). Good for ground-truth reference and multi-robot control strategies. Popular brands: VICON, OptoTrak. Very expensive!

Beacon-Based Localization Systems (1/2). Objective: localize the robot w.r.t. the beacon... or the beacon w.r.t. the robot! A rotating laser on the BEACON detects the ROBOT at two angles θ_1 and θ_2. What is the orientation θ and the distance d?

θ = (θ_1 + θ_2)/2,    d = R sin β / sin(α/2) = R sin β / sin((θ_2 − θ_1)/2)
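The two formulas above can be evaluated directly; a minimal numeric sketch (the function name is illustrative, and the parameters R and β follow the beacon-reflector geometry of the slide):

```python
import math

def beacon_bearing_range(theta1, theta2, R, beta):
    """Orientation and distance from the two detection angles of a
    rotating-laser beacon (R and beta: reflector geometry parameters)."""
    theta = 0.5 * (theta1 + theta2)  # bearing = mid-angle
    # law-of-sines range, with alpha = theta2 - theta1
    d = R * math.sin(beta) / math.sin(0.5 * (theta2 - theta1))
    return theta, d
```

For small subtended angles, d grows roughly like 1/(θ_2 − θ_1), which is why distant targets are measured with larger relative error.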
5 Beacon-Based Localization Systems (1/2). Then, absolute localization can be obtained by triangulation, with at least 3 beacons whose absolute localization is known. c Pierlot and Van Droogenbroeck, IEEE T-RO 2013. Actually, only the angles θ are necessary...

Beacon-Based Localization Systems (2/2). KIVA Systems, Boston (MA): Warehouse Robots at Work (YouTube). Unique markers with known absolute 2D position in the map.
6 Noise and Aliasing. Perception and motion play an important role in localization.
- Sensor noise: limitation on the consistency of sensor readings in the same environmental state. Examples: effect of luminosity (e.g. the sun) on color-based vision sensors; changes in material reflection properties for sonars; interferences;... Solution? Temporal and/or multi-sensory fusion.
- Sensor aliasing: non-uniqueness of sensor readings: several environment states trigger the same sensory readings!
- Effector noise: a single action taken by a mobile robot may have several different possible results. Error in motion is viewed as an error in odometry. The source of error generally lies in an incomplete model of the environment: slopes, slip, (human) perturbations, etc.

Odometry. Position update based on proprioceptive sensors (i.e. no external devices). Position error accumulates over time. Major error sources in odometry:
- Limited resolution during integration (time increments, measurement resolution)
- Misalignment of the wheels (deterministic)
- Unequal wheel diameter (deterministic)
- Variation in the contact point of the wheel (non-deterministic)
- Unequal floor contact (slipping, non-planar floor,...) (non-deterministic)
Deterministic (systematic) errors can be eliminated by proper calibration of the system. Non-deterministic (non-systematic) errors have to be described by error models and will always lead to an uncertain position estimate.
7 Odometry of a differential-drive vehicle. c Siegwart et al., 2011, Fig. 5.3, p. 271. Assuming rolling without sliding:

ω(r + d) = v_l,    ω(r − d) = v_r

c Siciliano and Khatib (Eds.), 2008, Fig. 20.1, p. 477. Such that V = ωr = (v_r + v_l)/2 and

x(t) = ∫ V(t) cos(θ(t)) dt,    y(t) = ∫ V(t) sin(θ(t)) dt,    θ(t) = ∫ ω(t) dt

Error model for odometric position estimation. Incremental travel distances Δs_r and Δs_l of the right and left wheels, with Δs = (Δs_r + Δs_l)/2 the travel of the wheel-axis midpoint, Δθ = (Δs_r − Δs_l)/b the heading change, and b the distance between the two wheels:

p' = [x'; y'; θ'] = [x; y; θ] + [Δs cos(θ + Δθ/2); Δs sin(θ + Δθ/2); Δθ]
   = [x; y; θ] + [((Δs_r + Δs_l)/2) cos(θ + (Δs_r − Δs_l)/(2b)); ((Δs_r + Δs_l)/2) sin(θ + (Δs_r − Δs_l)/(2b)); (Δs_r − Δs_l)/b]

So, p' = f(x, y, θ, Δs_r, Δs_l).
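One dead-reckoning step of this kind can be sketched in a few lines (illustrative helper; b is taken as the distance between the two wheels, with the heading increment Δθ = (Δs_r − Δs_l)/b):

```python
import math

def odometry_step(x, y, theta, ds_r, ds_l, b):
    """One dead-reckoning pose update of a differential-drive robot from
    the incremental wheel travels ds_r, ds_l (b: wheel separation)."""
    ds = 0.5 * (ds_r + ds_l)    # travel of the wheel-axis midpoint
    dtheta = (ds_r - ds_l) / b  # heading change over the step
    # integrate at the mid-step heading (second-order accurate)
    x += ds * math.cos(theta + 0.5 * dtheta)
    y += ds * math.sin(theta + 0.5 * dtheta)
    theta += dtheta
    return x, y, theta
```

Called in a loop over encoder increments, this is exactly the position update that accumulates error over time.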
8 Error model for odometric position estimation. So, p' = f(x, y, θ, Δs_r, Δs_l). Assuming now an error model for the wheel increments:

Σ_Δ = covar(Δs_r, Δs_l) = [k_r|Δs_r|  0; 0  k_l|Δs_l|]

i.e. the two errors of the individually driven wheels are independent, and the variances of the errors (right and left wheels) are proportional to the absolute values of the traveled distances. The error propagation law gives:

Σ_p' = ∇_p f · Σ_p · (∇_p f)ᵀ + ∇_Δrl f · Σ_Δ · (∇_Δrl f)ᵀ

where ∇_p f = [∂f/∂x, ∂f/∂y, ∂f/∂θ] and ∇_Δrl f = [∂f/∂Δs_r, ∂f/∂Δs_l] are the function gradients (Jacobians).

Error model for odometric position estimation. With the shorthands cθ = cos(θ + Δθ/2) and sθ = sin(θ + Δθ/2), Δs = (Δs_r + Δs_l)/2, and b the distance between the two wheels:

∇_p f = [1  0  −Δs·sθ; 0  1  Δs·cθ; 0  0  1]

∇_Δrl f = [(1/2)cθ − (Δs/2b)sθ   (1/2)cθ + (Δs/2b)sθ; (1/2)sθ + (Δs/2b)cθ   (1/2)sθ − (Δs/2b)cθ; 1/b   −1/b]
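The error propagation law translates directly into code; a minimal NumPy sketch (function and parameter names are illustrative, using the same Jacobians, with b the wheel separation, Δθ = (Δs_r − Δs_l)/b, and the diagonal wheel-increment covariance):

```python
import math
import numpy as np

def odometry_covariance_step(pose, P, ds_r, ds_l, b, k_r, k_l):
    """Propagate the pose covariance P through one odometry step with
    the first-order error propagation law (k_r, k_l: error constants)."""
    x, y, theta = pose
    ds = 0.5 * (ds_r + ds_l)
    dtheta = (ds_r - ds_l) / b
    c = math.cos(theta + 0.5 * dtheta)
    s = math.sin(theta + 0.5 * dtheta)
    # Jacobian of f w.r.t. the pose (x, y, theta)
    Fp = np.array([[1.0, 0.0, -ds * s],
                   [0.0, 1.0,  ds * c],
                   [0.0, 0.0,  1.0]])
    # Jacobian of f w.r.t. the wheel increments (ds_r, ds_l)
    Frl = np.array([[0.5 * c - ds / (2 * b) * s, 0.5 * c + ds / (2 * b) * s],
                    [0.5 * s + ds / (2 * b) * c, 0.5 * s - ds / (2 * b) * c],
                    [1.0 / b, -1.0 / b]])
    # wheel-increment covariance: proportional to |traveled distance|
    Q = np.diag([k_r * abs(ds_r), k_l * abs(ds_l)])
    return Fp @ P @ Fp.T + Frl @ Q @ Frl.T
```

Starting from a zero covariance and driving straight along x, the returned matrix already shows the effect of the next slide: the cross-track variance (y) grows faster than the along-track one (x).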
9 Evolution of pose uncertainty: straight-line movement vs. circular movement. c Siegwart et al., 2011, Figs. 5.4 and 5.5, pp. Errors perpendicular to the direction of movement grow much faster! The error ellipse does not remain perpendicular to the direction of movement!

Evolution of pose uncertainty. c Siegwart et al., 2011, ETH-Z lecture slides, from [Fox, Thrun, Burgard, Dellaert, 2000]. Note: in reality, errors are not shaped like ellipses!
10 Outline: 1 Introduction, 2 The Challenge of Localization, 3 Odometry, 4 Belief Representation, 5 Probabilistic Map-Based Localization, 6 Kalman Filter Localization, 7 References.

Belief Representation. How do we represent the robot position, i.e. where the robot believes itself to be?
a) Continuous map with single-hypothesis probability distribution
b) Continuous map with multiple-hypotheses probability distribution
c) Discretized map with probability distribution
d) Discretized topological map with probability distribution
c Siegwart et al., 2011, Fig. 5.9, p.
11 Single-hypothesis Belief: Continuous Line-Map. Single hypothesis: the robot postulates its unique position. c Siegwart et al., 2011, Fig. 5.10, p. 281. The robot represents its position as a single point in the continuous set: (x, y, θ).

Single-hypothesis Belief: Grid and Topological Map. c Siegwart et al., 2011, Fig. 5.10, p. 281. c) The position is noted with the same level of fidelity as the map cell size. d) The map is abstract, and identifying the position is equivalent to identifying a single node i in the graph. Single belief = no ambiguity, but challenging!
12 Multi-hypotheses Belief: Grid-based Representation. The robot tracks a possibly infinite set of positions. It is convenient to sort these possible positions by providing a preference ordering, e.g. by using a mathematical distribution (continuous or discrete). c Siegwart et al., 2011, Figs. 5.9 and 5.11, pp. 279 and 282. Multi belief = the robot can maintain uncertainty about its position. This can also be incorporated in planning (i.e. choosing the path minimizing localization uncertainty). Challenge: how to make decisions?

Representation of the Environment: Environment Modeling. c Siegwart et al., 2011, Fig. 5.10, p. 281.
- Raw sensor data, e.g. laser range data, grayscale images: large volume of data, low distinctiveness on the level of individual values; makes use of all acquired information.
- Low-level features, e.g. lines and other geometric features: medium volume of data, average distinctiveness; filters out the useful information, but ambiguities remain.
- High-level features, e.g. doors, a car, the Eiffel tower: low volume of data, high distinctiveness; filters out the useful information, few/no ambiguities, but possibly not enough information.
13 Map representation. Often the fidelity of the position representation is bounded by the fidelity of the map:
1 The precision of the map must appropriately match the precision with which the robot needs to achieve its goals.
2 The precision of the map and the type of features represented must match the precision and data types returned by the robot's sensors.
3 The complexity of the map representation has a direct impact on the computational complexity of reasoning about mapping, localization, and navigation.

Map representation. Two approaches:
- Continuous representation = exact. c Siegwart et al., 2011, Fig. 5.12, p. 285.
- Decomposition strategies. c Siegwart et al., 2011, Figs. 5.14, 5.16 and 5.18, pp. 288, 290 and
14 Outline: 1 Introduction, 2 The Challenge of Localization, 3 Odometry, 4 Belief Representation, 5 Probabilistic Map-Based Localization, 6 Kalman Filter Localization, 7 References.

Problem statement. Consider a mobile robot moving in a known environment:
- As it starts moving (say from a precisely known location), it can keep track of its location using wheel odometry.
- After a certain movement, the robot will get very uncertain about its position. To keep the position uncertainty from growing unbounded, the robot must localize itself in relation to its environment map.
- To localize, the robot uses its on-board exteroceptive sensors (e.g. ultrasonic, laser, vision sensors) to make observations of its environment.
- Observations lead to an estimate of the robot position, which can then be fused with the odometric estimation to get the best possible update of the actual robot's position. Its uncertainty shrinks.
c Siegwart et al., 2011, ETH-Z lecture slides.
15 Illustration: Improving the belief state by moving.
- The robot is placed somewhere in the environment but is not told its location.
- The robot queries its sensors and finds it is next to a pillar.
- The robot moves one meter forward. To account for the inherent noise in robot motion, the new belief is smoother.
- The robot queries its sensors and again finds itself next to a pillar.
- Finally, it updates its belief by combining this information with its previous belief.
c Siegwart et al., 2011, Fig. 5.22, p.

Why a probabilistic approach for localization? Because the data coming from the robot sensors are affected by measurement errors, we can only compute the probability that the robot is in a given configuration. The key idea in probabilistic robotics is to represent uncertainty using probability theory: instead of giving a single best estimate of the current robot configuration, probabilistic robotics represents the robot configuration as a probability distribution over all possible robot poses. This probability distribution is called the belief.
16 Examples of probability distributions. Example 1: uniform distribution: the robot does not know where it is; the information about the robot configuration is null. c Siegwart et al., 2011, ETH-Z lecture slides. Example 2: multimodal distribution: the robot is in the region around a or b. c Siegwart et al., 2011, ETH-Z lecture slides.

Examples of probability distributions. Example 3: Dirac distribution: the robot has a probability of 1.0 (i.e. 100%) to be at a given position. c Siegwart et al., 2011, ETH-Z lecture slides.

δ(x) = ∞ if x = 0, and 0 otherwise,    with ∫_{−∞}^{+∞} δ(x) dx = 1.
17 Examples of probability distributions. Example 4: Gaussian distribution, also called normal distribution N(µ, σ). c Siegwart et al., 2011, ETH-Z lecture slides.

p(x) = (1 / (σ√(2π))) · e^(−(x − µ)² / (2σ²))

where µ is the mean value and σ is the standard deviation.

Action and perception updates. In robot localization, we distinguish two update steps:
- ACTION (or prediction) update: the robot moves and estimates its position through its proprioceptive sensors. The robot uncertainty grows.
- PERCEPTION (or measurement) update: the robot makes an observation using its exteroceptive sensors and corrects its position by combining its belief before the observation with the probability of making exactly that observation. The robot uncertainty shrinks.
c Siegwart et al., 2011, Fig. 5.20, p.
18 Solution to the probabilistic localization problem. Solving the probabilistic localization problem consists in solving separately the Action and Perception updates. A probabilistic approach to the mobile robot localization problem is a method able to compute the probability distribution of the robot configuration during each Action and Perception step.

Solution to the probabilistic localization problem. The ingredients are:
1 The initial probability distribution.
2 The statistical error model of the proprioceptive sensors, e.g. wheel encoders: Σ_Δ = covar(Δs_r, Δs_l) = [k_r|Δs_r|  0; 0  k_l|Δs_l|].
3 The statistical error model of the exteroceptive sensors, e.g. laser, sonar, camera. c Siegwart et al., 2011, ETH-Z lecture slides.
4 A map of the environment. (If the map is not known a priori, the robot needs to build a map of the environment and then localize in it. This is called SLAM, Simultaneous Localization And Mapping.)
19 Basic concepts of probability theory.
Theorem of total probability (used for the action update): p(x) = ∫ p(x|y) p(y) dy, where p(x|y) = p(x, y)/p(y) is the conditional (or posterior) probability.
Bayes rule (used for the perception update): p(x|y) = p(y|x) p(x) / p(y).

Principles of Action and Perception updates (Markov localization).
Action (= prediction) update: the robot estimates its current position bel‾(x_t) based on the knowledge of the previous position bel(x_{t−1}) and the odometric input u_t (Markov assumption):

bel‾(x_t) = ∫ p(x_t | u_t, x_{t−1}) bel(x_{t−1}) dx_{t−1} = Σ_{x_{t−1}} p(x_t | u_t, x_{t−1}) bel(x_{t−1}) (discrete case)

Perception (= measurement) update: the robot corrects its predicted position (i.e. its former belief) by opportunely combining it with the information from its exteroceptive sensors z_t. From Bayes rule, p(x_t|z_t) = p(z_t|x_t) p(x_t) / p(z_t), which gives:

bel(x_t) = η p(z_t | x_t, M) bel‾(x_t)

where η is a normalization factor which makes the probability integrate to 1, and M is the map. This localization algorithm is also called a Bayes filter.
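On a discretized (grid) map, the two updates reduce to a convolution and a pointwise product with renormalization. A minimal sketch for a cyclic 1-D toy world (names and the motion-model format, a {shift: probability} dictionary, are illustrative):

```python
import numpy as np

def action_update(bel, motion_kernel):
    """Prediction: convolve the belief with the motion model p(x_t | u_t, x_{t-1}).
    motion_kernel maps a cell shift to its probability; the world is cyclic."""
    new_bel = np.zeros_like(bel)
    for shift, p_shift in motion_kernel.items():
        new_bel += p_shift * np.roll(bel, shift)
    return new_bel

def perception_update(bel_bar, likelihood):
    """Correction: multiply by p(z_t | x_t, M) and renormalize (Bayes rule)."""
    posterior = bel_bar * likelihood
    return posterior / posterior.sum()  # eta = 1 / sum
```

Alternating the two functions on a uniform initial belief reproduces the pillar illustration above: each perception sharpens the belief, each action smooths and shifts it.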
20 Illustration of probabilistic map-based localization.
- Initial probability distribution: bel(x_0).
- Perception update: bel(x_t) = η p(z_t | x_t, M) bel‾(x_t).
- Action update: bel‾(x_t) = ∫ p(x_t | u_t, x_{t−1}) bel(x_{t−1}) dx_{t−1}.
- Next perception update, with likelihood p(z_t | x_t): bel(x_t) = η p(z_t | x_t, M) bel‾(x_t).
c Siegwart et al., 2011, Fig. 5.22, p.

Markov versus Kalman localization. Two approaches exist to represent the probability distribution and to compute the total probability and Bayes rule during the Action and Perception phases. c Siegwart et al., 2011, ETH-Z lecture slides. Markov localization allows for localization starting from any unknown position and can thus recover from ambiguous situations: position tracking and global localization. Kalman filter localization tracks the robot from an initially known position: position tracking only.
21 Outline: 1 Introduction, 2 The Challenge of Localization, 3 Odometry, 4 Belief Representation, 5 Probabilistic Map-Based Localization, 6 Kalman Filter Localization, 7 References.

Kalman Filter Localization: principle. Special case of Markov localization: the robot belief bel(x_t) is assumed to be Gaussian. Main computation: update the mean and the covariance, which is very efficient. c Siegwart et al., 2011, Fig. 5.29, p.
22 Probability theory with Gaussian noise. Assume x_1 = N(µ_1, σ_1²) and x_2 = N(µ_2, σ_2²) are two independent and normally distributed variables. If y = A x_1 + B x_2 is a linear function of x_1 and x_2, then:

µ_y = A µ_1 + B µ_2,    σ_y² = A² σ_1² + B² σ_2²

or, if x_1 and x_2 are vectors (Σ_i denotes the covariance of x_i):

Σ_y = A Σ_1 Aᵀ + B Σ_2 Bᵀ

If y = f(x_1, x_2) is not a linear function of x_1 and x_2, then it is practical to consider its first-order approximation:

y ≈ f(µ_1, µ_2) + F_{x_1}(x_1 − µ_1) + F_{x_2}(x_2 − µ_2)

such that:

µ_y = f(µ_1, µ_2),    Σ_y = F_{x_1} Σ_1 F_{x_1}ᵀ + F_{x_2} Σ_2 F_{x_2}ᵀ

Bayes rule and the maximum likelihood principle. Assume p_1(q) = N(q̂_1, σ_1²) (e.g. the prediction-based belief) and p_2(q) = N(q̂_2, σ_2²) (e.g. the measurement-based belief):

p_1(q) p_2(q) = (1/(σ_1√(2π))) e^(−(q − q̂_1)²/(2σ_1²)) · (1/(σ_2√(2π))) e^(−(q − q̂_2)²/(2σ_2²)) = Ω e^(−(q − q̂)²/(2σ²))

with Ω = ... and

q̂ = (σ_2² q̂_1 + σ_1² q̂_2) / (σ_1² + σ_2²),    σ² = σ_1² σ_2² / (σ_1² + σ_2²)

σ ≤ min(σ_1, σ_2): uncertainty decreases by combining two measurements. c Siegwart et al., 2011, Fig. 5.31, p.
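The fused mean and variance above give a one-line fusion rule; a small sketch (the helper name is illustrative):

```python
def fuse_gaussians(q1, var1, q2, var2):
    """Fuse two scalar Gaussian estimates (e.g. prediction and measurement)
    by multiplying their densities; returns the fused mean and variance."""
    q = (var2 * q1 + var1 * q2) / (var1 + var2)  # precision-weighted mean
    var = var1 * var2 / (var1 + var2)            # always <= min(var1, var2)
    return q, var
```

Rearranged, the fused mean is q̂ = q̂_1 + K(q̂_2 − q̂_1) with K = σ_1²/(σ_1² + σ_2²): the scalar form of the Kalman measurement update used below.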
23 Application to mobile robotics. c Siegwart et al., 2011, Fig. 5.32, p. 332. Fusion of the prediction update and of the measurement update: the prediction update applies the theorem of total probability; the measurement update applies 4 successive steps.

Prediction update. The robot position at time t is inferred from the old localization and the control input: x̂_t = f(x_{t−1}, u_t). Therefore:

P̂_t = F_x P_{t−1} F_xᵀ + F_u Q_t F_uᵀ

Remember the odometry update equation! c Siegwart et al., 2011, Fig. 5.33, p.
24 Measurement update 1/4: Observation. The second step is to obtain the observation Z (measurements) from the robot's sensors at the new location. The observation usually consists of a set of n single observations z_j extracted from the different sensor signals. It represents features like lines, doors or any kind of landmarks. The parameters of the targets are usually observed in the sensor frame {S} (local to the robot). Therefore the features in the map (in the world frame {W}) have to be transformed into the sensor frame {S}. This transformation is specified in the function h_j (see later).

Measurement update 1/4: Observation. c Siegwart et al., 2011, Fig. 5.34, p. 339. Each line feature is extracted from the sensors and consists of two parameters (distance and angle) and a 2×2 covariance matrix R_t^i.
25 Measurement update 2/4: Measurement prediction. The predicted robot position x̂_t and the map are used to generate multiple predicted observations ẑ_j, which have to be transformed into the sensor frame:

ẑ_t^j = h^j(x̂_t, m^j)

where m^j represents the j-th feature of the map. We can now define the measurement prediction as the set containing all n_i predicted observations:

Ẑ = { ẑ_t^j | 1 ≤ j ≤ n_i }

The function h^j is mainly the coordinate transformation between the world frame and the sensor frame.

Measurement update 2/4: Measurement prediction. For the prediction, only the walls that are in the field of view of the robot are selected. c Siegwart et al., 2011, Figs. and 5.36, pp. The measurement predictions are uncertain, because the prediction of the robot position is uncertain.
26 Measurement update 3/4: Matching. Identify all of the single observations that match specific predicted features well enough to be used during the estimation. For each measurement prediction for which a corresponding observation is found, the innovation is calculated:

v_t^{ij} = z_t^i − ẑ_t^j = z_t^i − h^j(x̂_t, m^j)

and its innovation covariance is found by applying the error propagation law:

Σ_{IN,t}^{ij} = H^j P̂_t (H^j)ᵀ + R_t^i

where H^j = ∇h^j is the Jacobian of h^j and R_t^i represents the covariance (noise) of the actual observation z_t^i. The validity of the correspondence between measurement and prediction can e.g. be evaluated through the Mahalanobis distance:

(v_t^{ij})ᵀ (Σ_{IN,t}^{ij})^{−1} v_t^{ij} ≤ g²

where g is called the validation gate.

Measurement update 3/4: Matching. c Siegwart et al., 2011, Fig. 5.37, p. 342. If one observation falls in multiple validation gates, the best matching candidate is selected, or multiple hypotheses are tracked. Observations that do not fall in any validation gate are simply ignored for localization.
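The validation-gate test can be sketched as follows (illustrative helper; S stands for the innovation covariance Σ_IN of a candidate pairing):

```python
import numpy as np

def matches(z, z_hat, S, g):
    """Accept the (observation, prediction) pair if the innovation's
    Mahalanobis distance is within the validation gate g."""
    v = z - z_hat                   # innovation
    d2 = v @ np.linalg.solve(S, v)  # v^T S^-1 v, without an explicit inverse
    return d2 <= g ** 2
```

Running this test for every (i, j) pair yields the candidate associations; ambiguous observations falling in several gates still need the best-candidate (or multi-hypothesis) policy of the slide.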
27 Measurement update 4/4: Estimation. Applying the Bayes rule, the new robot position x_t (red) is obtained by fusing the prediction of the robot position (magenta) with the innovation gained by the measurements (green):

x_t = x̂_t + K_t v_t
P_t = P̂_t − K_t Σ_{IN,t} K_tᵀ

where K_t = P̂_t H_tᵀ (Σ_{IN,t})^{−1} is the Kalman gain. c Siegwart et al., 2011, Fig. 5.38, p.

Markov versus Kalman: summary. Both methods solve the convolution and Bayes rules for the action and the perception updates respectively, but: c Siegwart et al., 2011, ETH-Z lecture slides.
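The estimation step above can be sketched in a few NumPy lines (illustrative helper; v is the stacked innovation of the matched features, H the corresponding measurement Jacobian, R the measurement noise):

```python
import numpy as np

def kalman_update(x_hat, P_hat, v, H, R):
    """Fuse the predicted state (x_hat, P_hat) with the innovation v."""
    S = H @ P_hat @ H.T + R             # innovation covariance Sigma_IN
    K = P_hat @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x_hat + K @ v                   # corrected state
    P = P_hat - K @ S @ K.T             # corrected (shrunk) covariance
    return x, P
```

In the scalar case this reduces exactly to the Gaussian fusion rule of the maximum-likelihood slide, with K = σ_1²/(σ_1² + σ_2²).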
28 Summary.
- Behavior-based navigation and map-based navigation are two approaches to solve the localization problem. Usually, the second one is more robust and more flexible to strategic changes.
- Global localization requires external devices, which are usually complex to deploy and expensive.
- Perception and motion both bring noise to the localization challenge.
- Odometry is based on internal and proprioceptive sensors. The localization error accumulates over time, faster in the direction perpendicular to the movement.
- Belief can be represented in a continuous or discrete framework, with single or multiple hypotheses.
- Probabilistic map-based localization consists in fusing observation-based estimates of the robot position with the odometric estimation, solving separately the Action and Perception updates.
- Two approaches exist for probabilistic map-based localization: Markov and Kalman, with their respective pros and cons.

References: Autonomous Mobile Robots (2nd Edition), Siegwart et al.; The MIT Press, 2011. Chapter 5.
More informationLocalization of Mobile Robots Using Odometry and an External Vision Sensor
Sensors 21, 1, 3655-368; doi:1.339/s143655 OPEN ACCESS sensors ISSN 1424-822 www.mdpi.com/journal/sensors Article Localization of Mobile Robots Using Odometry and an External Vision Sensor Daniel Pizarro,
More informationTracking of Moving Objects from a Moving Vehicle Using a Scanning Laser Rangefinder
Tracking of Moving Objects from a Moving Vehicle Using a Scanning Laser Rangefinder Robert A. MacLachlan, Member, IEEE, Christoph Mertz Abstract The capability to use a moving sensor to detect moving objects
More informationSensor Data Consistency Monitoring for the Prevention of Perceptual Failures in Outdoor Robotics
Sensor Data Consistency Monitoring for the Prevention of Perceptual Failures in Outdoor Robotics Thierry Peynot, James Underwood and Abdallah Kassir Abstract This work aims to promote integrity in autonomous
More information3D Vision An enabling Technology for Advanced Driver Assistance and Autonomous Offroad Driving
3D Vision An enabling Technology for Advanced Driver Assistance and Autonomous Offroad Driving AIT Austrian Institute of Technology Safety & Security Department Christian Zinner Safe and Autonomous Systems
More information226-332 Basic CAD/CAM. CHAPTER 5: Geometric Transformation
226-332 Basic CAD/CAM CHAPTER 5: Geometric Transformation 1 Geometric transformation is a change in geometric characteristics such as position, orientation, and size of a geometric entity (point, line,
More informationMetrics on SO(3) and Inverse Kinematics
Mathematical Foundations of Computer Graphics and Vision Metrics on SO(3) and Inverse Kinematics Luca Ballan Institute of Visual Computing Optimization on Manifolds Descent approach d is a ascent direction
More informationActive Exploration Planning for SLAM using Extended Information Filters
Active Exploration Planning for SLAM using Extended Information Filters Robert Sim Department of Computer Science University of British Columbia 2366 Main Mall Vancouver, BC V6T 1Z4 simra@cs.ubc.ca Nicholas
More informationMultisensor Data Fusion and Applications
Multisensor Data Fusion and Applications Pramod K. Varshney Department of Electrical Engineering and Computer Science Syracuse University 121 Link Hall Syracuse, New York 13244 USA E-mail: varshney@syr.edu
More informationJiří Matas. Hough Transform
Hough Transform Jiří Matas Center for Machine Perception Department of Cybernetics, Faculty of Electrical Engineering Czech Technical University, Prague Many slides thanks to Kristen Grauman and Bastian
More information3D Scanner using Line Laser. 1. Introduction. 2. Theory
. Introduction 3D Scanner using Line Laser Di Lu Electrical, Computer, and Systems Engineering Rensselaer Polytechnic Institute The goal of 3D reconstruction is to recover the 3D properties of a geometric
More information8. Linear least-squares
8. Linear least-squares EE13 (Fall 211-12) definition examples and applications solution of a least-squares problem, normal equations 8-1 Definition overdetermined linear equations if b range(a), cannot
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.cs.toronto.edu/~rsalakhu/ Lecture 6 Three Approaches to Classification Construct
More informationLecture 9: Introduction to Pattern Analysis
Lecture 9: Introduction to Pattern Analysis g Features, patterns and classifiers g Components of a PR system g An example g Probability definitions g Bayes Theorem g Gaussian densities Features, patterns
More informationBlind Deconvolution of Barcodes via Dictionary Analysis and Wiener Filter of Barcode Subsections
Blind Deconvolution of Barcodes via Dictionary Analysis and Wiener Filter of Barcode Subsections Maximilian Hung, Bohyun B. Kim, Xiling Zhang August 17, 2013 Abstract While current systems already provide
More informationLecture 8: Signal Detection and Noise Assumption
ECE 83 Fall Statistical Signal Processing instructor: R. Nowak, scribe: Feng Ju Lecture 8: Signal Detection and Noise Assumption Signal Detection : X = W H : X = S + W where W N(, σ I n n and S = [s, s,...,
More informationDefinitions. A [non-living] physical agent that performs tasks by manipulating the physical world. Categories of robots
Definitions A robot is A programmable, multifunction manipulator designed to move material, parts, tools, or specific devices through variable programmed motions for the performance of a variety of tasks.
More informationRobust and accurate global vision system for real time tracking of multiple mobile robots
Robust and accurate global vision system for real time tracking of multiple mobile robots Mišel Brezak Ivan Petrović Edouard Ivanjko Department of Control and Computer Engineering, Faculty of Electrical
More informationParticle Filters and Their Applications
Particle Filters and Their Applications Kaijen Hsiao Henry de Plinval-Salgues Jason Miller Cognitive Robotics April 11, 2005 1 Why Particle Filters? Tool for tracking the state of a dynamic system modeled
More informationPosition Estimation Using Principal Components of Range Data
Position Estimation Using Principal Components of Range Data James L. Crowley, Frank Wallner and Bernt Schiele Project PRIMA-IMAG, INRIA Rhône-Alpes 655, avenue de l'europe 38330 Montbonnot Saint Martin,
More informationLeast-Squares Intersection of Lines
Least-Squares Intersection of Lines Johannes Traa - UIUC 2013 This write-up derives the least-squares solution for the intersection of lines. In the general case, a set of lines will not intersect at a
More informationPath Tracking for a Miniature Robot
Path Tracking for a Miniature Robot By Martin Lundgren Excerpt from Master s thesis 003 Supervisor: Thomas Hellström Department of Computing Science Umeå University Sweden 1 Path Tracking Path tracking
More informationAnalecta Vol. 8, No. 2 ISSN 2064-7964
EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,
More informationMulti-Robot Tracking of a Moving Object Using Directional Sensors
Multi-Robot Tracking of a Moving Object Using Directional Sensors Manuel Mazo Jr., Alberto Speranzon, Karl H. Johansson Dept. of Signals, Sensors & Systems Royal Institute of Technology SE- 44 Stockholm,
More informationSpatial Accuracy Assessment of Digital Surface Models: A Probabilistic Approach
Spatial Accuracy Assessment of Digital Surface Models: A Probabilistic Approach André Jalobeanu Centro de Geofisica de Evora, Portugal part of the SPACEFUSION Project (ANR, France) Outline Goals and objectives
More informationWhat is a robot? Lecture 2: Robot Basics. Remember the Amigobot? Describing the Amigobot. The Unicycle Model. Modeling Robot Interaction.
What is a robot? Lectre 2: Basics CS 344R/393R: ics Benjamin Kipers A robot is an intelligent sstem that interacts with the phsical environment throgh sensors and effectors. Toda we discss: Abstraction
More information3D Arm Motion Tracking for Home-based Rehabilitation
hapter 13 3D Arm Motion Tracking for Home-based Rehabilitation Y. Tao and H. Hu 13.1 Introduction This paper presents a real-time hbrid solution to articulated 3D arm motion tracking for home-based rehabilitation
More information2. Colour based particle filter for 3-D tracking
MULTI-CAMERA 3-D TRACKING USING PARTICLE FILTER Pablo Barrera, José M. Cañas, Vicente Matellán, and Francisco Martín Grupo de Robótica, Universidad Rey Juan Carlos Móstoles, Madrid (Spain) {barrera,jmplaza,vmo,fmartin}@gsyc.escet.urjc.es
More informationAutomatic parameter regulation for a tracking system with an auto-critical function
Automatic parameter regulation for a tracking system with an auto-critical function Daniela Hall INRIA Rhône-Alpes, St. Ismier, France Email: Daniela.Hall@inrialpes.fr Abstract In this article we propose
More information1 Review of Least Squares Solutions to Overdetermined Systems
cs4: introduction to numerical analysis /9/0 Lecture 7: Rectangular Systems and Numerical Integration Instructor: Professor Amos Ron Scribes: Mark Cowlishaw, Nathanael Fillmore Review of Least Squares
More informationBayesian probability theory
Bayesian probability theory Bruno A. Olshausen arch 1, 2004 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using probability. The foundations
More informationBayesian Filters for Location Estimation
Bayesian Filters for Location Estimation Dieter Fo Jeffrey Hightower, Lin Liao Dirk Schulz Gaetano Borriello, University of Washington, Dept. of Computer Science & Engineering, Seattle, WA Intel Research
More informationSafe robot motion planning in dynamic, uncertain environments
Safe robot motion planning in dynamic, uncertain environments RSS 2011 Workshop: Guaranteeing Motion Safety for Robots June 27, 2011 Noel du Toit and Joel Burdick California Institute of Technology Dynamic,
More informationBayesian Image Super-Resolution
Bayesian Image Super-Resolution Michael E. Tipping and Christopher M. Bishop Microsoft Research, Cambridge, U.K..................................................................... Published as: Bayesian
More informationAre we ready for Autonomous Driving? The KITTI Vision Benchmark Suite
Are we ready for Autonomous Driving? The KITTI Vision Benchmark Suite Philip Lenz 1 Andreas Geiger 2 Christoph Stiller 1 Raquel Urtasun 3 1 KARLSRUHE INSTITUTE OF TECHNOLOGY 2 MAX-PLANCK-INSTITUTE IS 3
More informationTracking in flussi video 3D. Ing. Samuele Salti
Seminari XXIII ciclo Tracking in flussi video 3D Ing. Tutors: Prof. Tullio Salmon Cinotti Prof. Luigi Di Stefano The Tracking problem Detection Object model, Track initiation, Track termination, Tracking
More informationEpipolar Geometry. Readings: See Sections 10.1 and 15.6 of Forsyth and Ponce. Right Image. Left Image. e(p ) Epipolar Lines. e(q ) q R.
Epipolar Geometry We consider two perspective images of a scene as taken from a stereo pair of cameras (or equivalently, assume the scene is rigid and imaged with a single camera from two different locations).
More informationPractical Tour of Visual tracking. David Fleet and Allan Jepson January, 2006
Practical Tour of Visual tracking David Fleet and Allan Jepson January, 2006 Designing a Visual Tracker: What is the state? pose and motion (position, velocity, acceleration, ) shape (size, deformation,
More informationCrater detection with segmentation-based image processing algorithm
Template reference : 100181708K-EN Crater detection with segmentation-based image processing algorithm M. Spigai, S. Clerc (Thales Alenia Space-France) V. Simard-Bilodeau (U. Sherbrooke and NGC Aerospace,
More informationLeast Squares Estimation
Least Squares Estimation SARA A VAN DE GEER Volume 2, pp 1041 1045 in Encyclopedia of Statistics in Behavioral Science ISBN-13: 978-0-470-86080-9 ISBN-10: 0-470-86080-4 Editors Brian S Everitt & David
More informationTracking of Small Unmanned Aerial Vehicles
Tracking of Small Unmanned Aerial Vehicles Steven Krukowski Adrien Perkins Aeronautics and Astronautics Stanford University Stanford, CA 94305 Email: spk170@stanford.edu Aeronautics and Astronautics Stanford
More informationRobot coined by Karel Capek in a 1921 science-fiction Czech play
Robotics Robot coined by Karel Capek in a 1921 science-fiction Czech play Definition: A robot is a reprogrammable, multifunctional manipulator designed to move material, parts, tools, or specialized devices
More informationEECS 556 Image Processing W 09. Interpolation. Interpolation techniques B splines
EECS 556 Image Processing W 09 Interpolation Interpolation techniques B splines What is image processing? Image processing is the application of 2D signal processing methods to images Image representation
More informationCourse 8. An Introduction to the Kalman Filter
Course 8 An Introduction to the Kalman Filter Speakers Greg Welch Gary Bishop Kalman Filters in 2 hours? Hah! No magic. Pretty simple to apply. Tolerant of abuse. Notes are a standalone reference. These
More informationLinear Threshold Units
Linear Threshold Units w x hx (... w n x n w We assume that each feature x j and each weight w j is a real number (we will relax this later) We will study three different algorithms for learning linear
More informationA Study on SURF Algorithm and Real-Time Tracking Objects Using Optical Flow
, pp.233-237 http://dx.doi.org/10.14257/astl.2014.51.53 A Study on SURF Algorithm and Real-Time Tracking Objects Using Optical Flow Giwoo Kim 1, Hye-Youn Lim 1 and Dae-Seong Kang 1, 1 Department of electronices
More informationVehicle Tracking in Occlusion and Clutter
Vehicle Tracking in Occlusion and Clutter by KURTIS NORMAN MCBRIDE A thesis presented to the University of Waterloo in fulfilment of the thesis requirement for the degree of Master of Applied Science in
More informationData Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber 2011 1
Data Modeling & Analysis Techniques Probability & Statistics Manfred Huber 2011 1 Probability and Statistics Probability and statistics are often used interchangeably but are different, related fields
More informationOnline Learning for Offroad Robots: Using Spatial Label Propagation to Learn Long Range Traversability
Online Learning for Offroad Robots: Using Spatial Label Propagation to Learn Long Range Traversability Raia Hadsell1, Pierre Sermanet1,2, Jan Ben2, Ayse Naz Erkan1, Jefferson Han1, Beat Flepp2, Urs Muller2,
More informationLecture 5 Least-squares
EE263 Autumn 2007-08 Stephen Boyd Lecture 5 Least-squares least-squares (approximate) solution of overdetermined equations projection and orthogonality principle least-squares estimation BLUE property
More informationSensor Coverage using Mobile Robots and Stationary Nodes
In Procedings of the SPIE, volume 4868 (SPIE2002) pp. 269-276, Boston, MA, August 2002 Sensor Coverage using Mobile Robots and Stationary Nodes Maxim A. Batalin a and Gaurav S. Sukhatme a a University
More informationWe can display an object on a monitor screen in three different computer-model forms: Wireframe model Surface Model Solid model
CHAPTER 4 CURVES 4.1 Introduction In order to understand the significance of curves, we should look into the types of model representations that are used in geometric modeling. Curves play a very significant
More informationThe Basics of Graphical Models
The Basics of Graphical Models David M. Blei Columbia University October 3, 2015 Introduction These notes follow Chapter 2 of An Introduction to Probabilistic Graphical Models by Michael Jordan. Many figures
More informationVISION-BASED POSITION ESTIMATION IN MULTIPLE QUADROTOR SYSTEMS WITH APPLICATION TO FAULT DETECTION AND RECONFIGURATION
VISION-BASED POSITION ESTIMATION IN MULTIPLE QUADROTOR SYSTEMS WITH APPLICATION TO FAULT DETECTION AND RECONFIGURATION MASTER THESIS, 212-213 SCHOOL OF ENGINEERS, UNIVERSITY OF SEVILLE Author Alejandro
More informationHow To Analyze Ball Blur On A Ball Image
Single Image 3D Reconstruction of Ball Motion and Spin From Motion Blur An Experiment in Motion from Blur Giacomo Boracchi, Vincenzo Caglioti, Alessandro Giusti Objective From a single image, reconstruct:
More information