UAV Pose Estimation using POSIT Algorithm
International Journal of Digital Content Technology and its Applications. Volume 5, Number 4, April 2011

*1 M. He, 2 C. Ratanasawanya, 3 M. Mehrandezh, 4 R. Paranjape
*1 College of Electrical & Information Engineering, Hunan University, China, hemin67@163.com
2 Electronic Systems Engineering, University of Regina, Canada, ratanasc@uregina.ca
3 Associate Professor, Industrial Systems Engineering, University of Regina, Canada, mehran.mehrandezh@uregina.ca
4 Professor, Electronic Systems Engineering, University of Regina, Canada, raman.paranjape@uregina.ca
doi:10.4156/jdcta.vol5.issue4.19

Abstract

Vision-based pose estimation is widely employed on Mini Unmanned Aerial Vehicles (MUAVs) with limited payloads. Pose from Orthography and Scaling with ITerations (POSIT) is one of the most important solutions for estimating pose from 2-D images and a 3-D model of an object. In order to evaluate the performance of the POSIT algorithm, a test platform consisting of a MUAV, a wireless camera, a computer workstation, and a motion capture (Optitrack) system was developed. The pose of the MUAV is calculated by the POSIT algorithm from a set of 2-D images captured by the on-board camera, and the calculated pose is compared to the actual pose read from the Optitrack system. The experimental results demonstrate that the error remains within acceptable bounds and that POSIT is a useful alternative for pose estimation of a MUAV.

Keywords: Pose Estimation, MUAV, Optitrack System, POSIT

1. Introduction

UAVs (Unmanned Aerial Vehicles) have recently drawn a great deal of attention within the public and private sectors as a useful tool for mitigation, prevention, and timely response to emergency situations. The problem of estimating an object's position and orientation (a.k.a. pose) arises in several application domains such as localization, visual servoing, and object tracking [1-5].
Compared to typical inertial, sonar, atmospheric, and GPS-based sensors, a camera appears to be an ideal sensor for deployment on small UAVs with limited payloads, owing to its compact size and the abundant information contained in captured images. As a result, vision-based pose estimation methods have received extensive attention in the literature [6-8]. Current pose estimation algorithms can be classified into model-based [9] and model-free [10] methods, depending on whether knowledge of the 3D target model and the camera parameters is required. With knowledge of both the 3D model of a target object and the feature correspondence between the object and its 2D image, model-based methods estimate the pose of the camera relative to the object from a single view. For this class of methods, pose estimation from image points is the best-known technique. For example, RANSAC (Random Sample Consensus) solved the Location Determination Problem from three or four coplanar feature points, or six points in general position; unfortunately, no general result was given about the uniqueness of the solution [11]. Lowe's algorithm defined an error function expressing the distance between image features and the projection of the corresponding model points at the current camera location, and used an iterative process to correct the projection error. More accurate results can be obtained with Lowe's algorithm; however, it is more complex and computationally demanding, and an approximate pose is needed to initiate the iteration [12]. The POSIT algorithm estimates the pose of the camera with respect to an object and optimizes the error through an iterative process. POSIT avoids both an initial pose estimate and matrix inversion while providing accurate pose estimation; it is therefore a simple, efficient, and suitable alternative for real-time applications [13].
With the original version of the POSIT algorithm, performance evaluation was carried out using synthetic images of a tetrahedron and a cube: the poses used to produce the images were compared with the poses computed by POSIT from those synthetic images. However, the practical validity of the algorithm, in terms of pose estimation accuracy, remained an open question. In order to evaluate the performance of POSIT for real-time pose estimation of UAVs, we developed a test platform which consists of a Mini UAV (MUAV), a wireless camera, a computer workstation, and a motion capture (Optitrack) system. The pose of the MUAV is calculated by the POSIT algorithm using a set of images captured by the on-board camera, and the calculated pose is compared to the actual pose read from the Optitrack system.

This paper is organized as follows. Section 2 presents the components of the method, including the test platform setup, the POSIT algorithm, and the homogeneous transformations used to compare the pose estimation results. Test results can be found in Section 3. Section 4 concludes the paper with a brief discussion.

2. Method and Algorithm

2.1. Experimental setup

The POSIT-based UAV pose estimation test platform is shown in Figure 1. The platform consists of a MUAV, the Optitrack system, a computer workstation, a wireless video camera, and a 3D object, which is a white cardboard box.

Figure 1. Mini UAV test setup: (a) Optitrack system and Qball-X4, (b) Qball-X4 and the target object

The Qball-X4 helicopter is selected as the MUAV [14]. The Qball-X4 is an innovative quadrotor helicopter suitable for a wide variety of UAV research applications. With the help of four motors fitted with 10-inch propellers, it is able to fly with 6 degrees of freedom (DOF): 3 translational DOF and 3 rotational DOF (roll, pitch, and yaw). The entire quadrotor is enclosed within a protective carbon fiber cage, which gives the Qball-X4 a decisive advantage over other vehicles that would suffer significant damage if contact occurred between the vehicle and an obstacle.
A lightweight wireless camera is attached to the Qball-X4 body to provide real-time images of the target object. The box is attached to the wall in the field of view of the wireless camera. The background is black for the best contrast to the color of the box in Red-Green-Blue (RGB) color space, which makes it much easier to identify the corners of the box as image feature points. The Optitrack is a motion capture system which tracks the movement of Infrared (IR) reflectors attached to any object in the workspace using 6 IR cameras [15]. These IR cameras are arranged around an approximately 6 cubic meter workspace in which the Qball-X4 moves. The Optitrack system is able to provide the pose of an object, defined by a group of IR reflectors, relative to the origin of the system's coordinates. In our experiment, we chose the coordinate frame of the Optitrack system as the world (reference) frame, {W}. The origin of the workspace coordinates must be defined during camera calibration for the six infrared cameras. To define the Qball-X4 as a trackable object for the Optitrack system, three reflectors are attached to the ends of the two cross bars, except the front end where the wireless camera is attached, as shown in Figure 2(a). Figure 2(b) shows the IR reflectors as seen by the cameras (blue points); the virtual center of gravity (c.g.) of the trackable object is defined during signal processing (red point). The pose of the object is given at the c.g., expressed in the world frame.
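The feature points are selected manually in this work, but the black background explains why segmenting the white box is easy. As a rough illustration only (a hypothetical synthetic image, not the authors' pipeline), a numpy-only sketch can threshold a bright region on a dark background and take extreme points of the region as corner candidates:

```python
import numpy as np

def candidate_corners(gray, thresh=128):
    """Threshold a bright object on a dark background and return extreme
    points of the segmented region as corner candidates (x, y)."""
    ys, xs = np.nonzero(gray > thresh)          # pixels belonging to the box
    pts = np.column_stack([xs, ys])
    # Extremes of x+y and x-y pick out the four outer corners of the blob
    s, d = pts.sum(axis=1), pts[:, 0] - pts[:, 1]
    return {
        "top_left": tuple(pts[np.argmin(s)]),
        "bottom_right": tuple(pts[np.argmax(s)]),
        "top_right": tuple(pts[np.argmax(d)]),
        "bottom_left": tuple(pts[np.argmin(d)]),
    }

# Synthetic 100x100 image with a bright rectangle standing in for the box
img = np.zeros((100, 100), dtype=np.uint8)
img[40:60, 30:60] = 255
corners = candidate_corners(img)
```

This only works for an axis-aligned convex blob; the paper's manual selection avoids such fragility.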
All signal processing is carried out by the workstation, including Optitrack camera and wireless camera calibration, manual feature point selection, pose computation by the POSIT algorithm, and so on.

Figure 2. Trackable object definition for the Qball-X4: (a) three reflectors attached on the Qball-X4, (b) image of IR reflectors and the virtual c.g.

2.2. POSIT Algorithm

The POSIT algorithm was proposed for finding the pose of an object relative to the camera from non-coplanar feature points contained in a single image. It is the combination of two algorithms, namely POS (Pose from Orthography and Scaling) and IT (ITerations). The POS algorithm approximates the perspective projection with a scaled orthographic projection (SOP) to find the transformation (rotation and translation) between a coordinate frame attached to the object (object frame) and a coordinate frame attached to the center of projection of the camera (camera frame) by solving a linear system; the IT algorithm is an iterative error-optimization step that updates the parameters of the approximate pose found in the previous step and repeats the POS algorithm several times in order to compute better scaled orthographic projections of the feature points. Given a 3D model of the target object, the camera intrinsic parameters, and a minimum of four non-coplanar image feature points whose relative geometry is matched with the corresponding points in the 3D model, the POSIT algorithm calculates the rotation matrix and translation vector of the object with respect to the camera. In other words, the POSIT algorithm supplies the transformation of a point expressed in the object (box) frame, {B}, with respect to the camera frame, {C}. Frame {C} is attached to the center of projection, with the z-axis pointing outwards from the camera. Figure 3 shows the diagram of the POSIT algorithm.
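The POS/IT loop described above can be sketched in a few lines of numpy. This is a minimal re-implementation of DeMenthon's scheme for illustration, not the authors' code; it assumes image coordinates are already expressed relative to the principal point:

```python
import numpy as np

def posit(object_pts, image_pts, focal, n_iter=20):
    """POSIT: pose of an object from >= 4 non-coplanar model points and
    their projections in a single image. The first point is the reference;
    image coordinates are relative to the image center (principal point).
    Returns the rotation matrix R and translation vector t (camera frame)."""
    M = np.asarray(object_pts, dtype=float)
    p = np.asarray(image_pts, dtype=float)
    A = M[1:] - M[0]                    # object vectors from the reference point
    B = np.linalg.pinv(A)               # "object matrix" pseudoinverse
    eps = np.zeros(len(M) - 1)          # SOP corrections; 0 on first pass (pure POS)
    for _ in range(n_iter):
        xp = p[1:, 0] * (1.0 + eps) - p[0, 0]   # corrected image vectors
        yp = p[1:, 1] * (1.0 + eps) - p[0, 1]
        I, J = B @ xp, B @ yp           # solve the linear POS system
        s1, s2 = np.linalg.norm(I), np.linalg.norm(J)
        i, j = I / s1, J / s2           # first two rows of the rotation
        k = np.cross(i, j)
        k /= np.linalg.norm(k)
        Z0 = focal / ((s1 + s2) / 2.0)  # depth of the reference point
        eps = (A @ k) / Z0              # IT step: refine the SOP corrections
    R = np.vstack([i, j, k])
    t = np.array([p[0, 0] * Z0 / focal, p[0, 1] * Z0 / focal, Z0])
    return R, t
```

On exact synthetic projections the loop converges in a handful of iterations whenever the object's depth is large relative to its size, which matches the original paper's observations.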
More details about POSIT can be found in the original paper by DeMenthon and Davis [13].

Figure 3. Schematic Diagram of the POSIT Algorithm

2.3. Homogeneous transformation

The pose of the Qball-X4 is calculated using the results from the POSIT algorithm, homogeneous transformations, and inverse kinematics. As shown in Figure 4, there are four coordinate frames in the test setup: the world frame, {W}; the Qball-X4 frame, {Q}, which is attached rigidly to the MUAV; the camera frame, {C}; and the object (box) frame, {B}, attached to the lower front-left corner of the box. In order to compare the pose estimation result of POSIT, expressed in frame {C}, to the pose reading from the
Optitrack system, expressed in frame {W}, the two coordinate frames need to be aligned using homogeneous transformations. A homogeneous transformation matrix, ^A T_B, describes how coordinate frame {B} is posed with respect to frame {A}; it is also used to convert the location of a point between the two frames.

Figure 4. The different coordinate frames in our system

The four coordinate frames in the experiment are related as follows:

    ^W T_B = ^W T_Q  ^Q T_C  ^C T_B                                   (1)

where ^W T_B is the homogeneous transformation matrix from the box frame to the world frame, ^W T_Q is the transformation matrix from the UAV frame to the world frame, ^Q T_C is the homogeneous transformation matrix from the camera frame to the UAV frame, and ^C T_B is the homogeneous transformation matrix from the box frame to the camera frame. In (1), ^W T_Q is the unknown, therefore:

    ^W T_Q = ^W T_B (^C T_B)^-1 (^Q T_C)^-1                           (2)

where (*)^-1 is the inverse operator. The coordinate frame of the box was assumed to have the same orientation as the world frame, differing only in translational position, so the homogeneous transformation between them is made up only of the translational component:

    ^W T_B = [ I3  t_B ]
             [ 0    1  ]   (t_B measured in cm)                       (3)

where I3 is the 3x3 identity matrix and t_B is the measured position of the box origin in the world frame. Because the camera is rigidly mounted on the Qball-X4, the transformation between frames {Q} and {C} is a known constant:

    ^Q T_C = [ R_QC  t_QC ]
             [ 0      1   ]   (t_QC measured in cm)                   (4)

where R_QC is a fixed axis-permutation rotation (entries 0 and ±1) determined by the camera mounting and t_QC is the fixed offset of the camera from the Qball-X4 frame.
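Equation (2) amounts to composing and inverting 4x4 homogeneous matrices. A short numpy sketch, using illustrative frame values rather than the measured ones from the experiment:

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a homogeneous transform: inv([R t; 0 1]) = [R' -R't; 0 1]."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

# Illustrative frames (hypothetical values, not the paper's measurements):
T_WB = make_T(np.eye(3), [120.0, 0.0, 50.0])   # box in world: translation only, Eq. (3)
T_QC = make_T(np.array([[0, 0, 1],
                        [-1, 0, 0],
                        [0, -1, 0]], float),
              [10.0, 0.0, 0.0])                # camera on the UAV, Eq. (4)
T_CB = make_T(np.eye(3), [0.0, 0.0, 100.0])    # POSIT output: box 1 m ahead of camera

# Equation (2): pose of the UAV in the world frame
T_WQ = T_WB @ inv_T(T_CB) @ inv_T(T_QC)
```

Inverting with the closed form [R' -R't; 0 1] avoids a general 4x4 matrix inversion and keeps the bottom row exact.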
With the result ^C T_B from POSIT, we can obtain the pose of the Qball-X4 in the world frame, ^W T_Q; the corresponding translation vector and rotation angles are then extracted using the inverse kinematics formulas.

3. Experimental Results

In order to test the performance of the POSIT algorithm for pose estimation of the Qball-X4, the MUAV was moved around the workspace and placed randomly at 17 different locations, with the box always kept in the view of the camera at those locations. At each location, the attached camera recorded an image of the box while the Optitrack system captured the pose of the Qball-X4. Five corners of the white box were manually selected by the user as the non-coplanar feature points needed by the POSIT algorithm: the bottom front-left corner is the reference point, and the other four are corners of the top side. Since the structure of the box is known a priori, the pose of the camera relative to the box is calculated from the 3D model configuration of the feature points and their corresponding 2D image coordinates. With the help of homogeneous transformations and inverse kinematics, the Qball-X4 pose is calculated using Equation (2), and the results are compared to the Optitrack readings. The x, y, and z coordinates of the Qball-X4 are shown in Figure 5, as well as the roll angle around the z axis, the pitch angle around the x axis, and the yaw angle around the y axis. The pink square points are the results from Equation (2), i.e. the values obtained via the POSIT algorithm; the blue diamond points are the measurements from the Optitrack system.

Figure 5. Comparison of Qball-X4 pose estimation results (x, y, z in cm; roll, pitch, yaw in degrees; POSIT results vs. Optitrack measurements)

The error comparison results are listed in Table 1, where the maximum, minimum, and mean errors of the x, y, and z coordinates and of the roll, yaw, and pitch angles can be found. Compared to the readings of the Optitrack system, the POSIT algorithm gives the pose of the camera, and hence the pose of the MUAV, with a mean rotation error of less than four degrees and a mean position error of less than 7 cm.
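Extracting the angles from the recovered rotation matrix is the "inverse kinematics" step mentioned above. A common sketch, using the conventional Z-Y-X (yaw-pitch-roll) factorization rather than the paper's axis naming:

```python
import numpy as np

def euler_zyx(R):
    """Recover (yaw, pitch, roll) in degrees from a rotation matrix,
    assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll), away from gimbal lock."""
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return yaw, pitch, roll
```

Near pitch = ±90 degrees the factorization degenerates (gimbal lock), which is not a concern here since the quadrotor hovers close to level.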
Table 1. Relative position errors of 6-DOF parameters

                  x (cm)   y (cm)   z (cm)   roll (deg)   yaw (deg)   pitch (deg)
  Maximum error
  Minimum error
  Mean error

4. Conclusion

The POSIT algorithm was tested for pose estimation of a MUAV from a set of images containing non-coplanar feature points of a box. The performance of POSIT was evaluated by comparison against the recorded results from the Optitrack system. The experimental error remains within reasonable bounds, and POSIT proves to be a useful alternative for pose estimation of a MUAV. Some possible causes of the remaining error are the Optitrack measurement accuracy of 4 cm, and the fact that the virtual c.g. of the Qball-X4 trackable object does not correspond exactly to the actual c.g. of the MUAV used to define ^Q T_C, the homogeneous transformation matrix from the camera frame to the MUAV frame.

5. Acknowledgements

We are grateful for the support of the Natural Sciences and Engineering Research Council of Canada (NSERC), the Hunan Provincial Natural Science Foundation of China (No. 1JJ386), and the Fundamental Research Funds for the Central Universities of China.

6. References

[1] G. Chesi and K. Hashimoto, "A simple technique for improving camera displacement estimation in eye-in-hand visual servoing," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, 2004.
[2] T. Gramegna, L. Venturino, G. Cicirelli, G. Attolico and A. Distante, "Optimization of the POSIT algorithm for indoor autonomous navigation," Robotics and Autonomous Systems, vol. 48, 2004.
[3] T. Hamel and R. Mahony, "Image based visual servo control for a class of aerial robotic systems," Automatica, vol. 43, 2007.
[4] L. Wei and E.-J. Lee, "Multi-pose Face Recognition Using Head Pose Estimation and PCA Approach," JDCTA: International Journal of Digital Content Technology and its Applications, vol. 4, 2010.
[5] Y. Zhang and L. Wu, "Face Pose Estimation by Chaotic Artificial Bee Colony," JDCTA: International Journal of Digital Content Technology and its Applications, vol. 5, 2011.
[6] J. Courbon, Y. Mezouar, N. Guénard and P. Martinet, "Vision-based navigation of unmanned aerial vehicles," Control Engineering Practice, vol. 18, 2010.
[7] G. Xu, Y. Zhang, S. Ji, Y. Cheng and Y. Tian, "Research on computer vision-based for UAV autonomous landing on a ship," Pattern Recognition Letters, vol. 30, 2009.
[8] Y. K. Yu, K. H. Wong and M. M. Y. Chang, "Pose estimation for augmented reality applications using genetic algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 35, 2005.
[9] C. Ünsalan, "A model based approach for pose estimation and rotation invariant object matching," Pattern Recognition Letters, vol. 28, 2007.
[10] E. Malis and F. Chaumette, "Theoretical improvements in the stability analysis of a new class of model-free visual servoing methods," IEEE Transactions on Robotics and Automation, vol. 18, 2002.
[11] M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, vol. 24, 1981.
[12] D. G. Lowe, "Three-dimensional object recognition from single two-dimensional images," Artificial Intelligence, vol. 31, 1987.
[13] D. F. DeMenthon and L. S. Davis, "Model-based object pose in 25 lines of code," International Journal of Computer Vision, vol. 15, 1995.
[14] Quanser Inc., "Quanser Qball-X4 User Manual," Toronto, Canada, 2010.
[15] Natural Point Inc., "NaturalPoint Tracking Tools User's Manual," Corvallis, Oregon, USA.
More informationMetrics on SO(3) and Inverse Kinematics
Mathematical Foundations of Computer Graphics and Vision Metrics on SO(3) and Inverse Kinematics Luca Ballan Institute of Visual Computing Optimization on Manifolds Descent approach d is a ascent direction
More informationExmoR A Testing Tool for Control Algorithms on Mobile Robots
ExmoR A Testing Tool for Control Algorithms on Mobile Robots F. Lehmann, M. Ritzschke and B. Meffert Institute of Informatics, Humboldt University, Unter den Linden 6, 10099 Berlin, Germany E-mail: falk.lehmann@gmx.de,
More informationA non-contact optical technique for vehicle tracking along bounded trajectories
Home Search Collections Journals About Contact us My IOPscience A non-contact optical technique for vehicle tracking along bounded trajectories This content has been downloaded from IOPscience. Please
More informationRelating Vanishing Points to Catadioptric Camera Calibration
Relating Vanishing Points to Catadioptric Camera Calibration Wenting Duan* a, Hui Zhang b, Nigel M. Allinson a a Laboratory of Vision Engineering, University of Lincoln, Brayford Pool, Lincoln, U.K. LN6
More informationPID, LQR and LQR-PID on a Quadcopter Platform
PID, LQR and LQR-PID on a Quadcopter Platform Lucas M. Argentim unielargentim@fei.edu.br Willian C. Rezende uniewrezende@fei.edu.br Paulo E. Santos psantos@fei.edu.br Renato A. Aguiar preaguiar@fei.edu.br
More informationClassifying Manipulation Primitives from Visual Data
Classifying Manipulation Primitives from Visual Data Sandy Huang and Dylan Hadfield-Menell Abstract One approach to learning from demonstrations in robotics is to make use of a classifier to predict if
More informationHuman-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database
Human-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database Seungsu Kim, ChangHwan Kim and Jong Hyeon Park School of Mechanical Engineering Hanyang University, Seoul, 133-791, Korea.
More informationHow To Analyze Ball Blur On A Ball Image
Single Image 3D Reconstruction of Ball Motion and Spin From Motion Blur An Experiment in Motion from Blur Giacomo Boracchi, Vincenzo Caglioti, Alessandro Giusti Objective From a single image, reconstruct:
More information3D Annotation and Manipulation of Medical Anatomical Structures
3D Annotation and Manipulation of Medical Anatomical Structures Dime Vitanovski, Christian Schaller, Dieter Hahn, Volker Daum, Joachim Hornegger Chair of Pattern Recognition, Martensstr. 3, 91058 Erlangen,
More informationVirtual CRASH 3.0 Staging a Car Crash
Virtual CRASH 3.0 Staging a Car Crash Virtual CRASH Virtual CRASH 3.0 Staging a Car Crash Changes are periodically made to the information herein; these changes will be incorporated in new editions of
More informationSensor Based Control of Autonomous Wheeled Mobile Robots
Sensor Based Control of Autonomous Wheeled Mobile Robots Gyula Mester University of Szeged, Department of Informatics e-mail: gmester@inf.u-szeged.hu Abstract The paper deals with the wireless sensor-based
More informationAttitude and Position Control Using Real-Time Color Tracking
Attitude and Position Control Using Real-Time Color Tracking David P. Miller, Anne Wright and Randy Sargent KISS Institute for Practical Robotics Reston, Virginia USA Voice: 703/620-0551 FAX: 703/860-1802
More informationVISION ALGORITHM FOR SEAM TRACKING IN AUTOMATIC WELDING SYSTEM Arun Prakash 1
VISION ALGORITHM FOR SEAM TRACKING IN AUTOMATIC WELDING SYSTEM Arun Prakash 1 1 Assistant Professor, Department of Mechanical Engineering, SSN College of Engineering, Chennai, India ABSTRACT Arc welding
More informationAbstract. Introduction
SPACECRAFT APPLICATIONS USING THE MICROSOFT KINECT Matthew Undergraduate Student Advisor: Dr. Troy Henderson Aerospace and Ocean Engineering Department Virginia Tech Abstract This experimental study involves
More information5-Axis Test-Piece Influence of Machining Position
5-Axis Test-Piece Influence of Machining Position Michael Gebhardt, Wolfgang Knapp, Konrad Wegener Institute of Machine Tools and Manufacturing (IWF), Swiss Federal Institute of Technology (ETH), Zurich,
More informationHAND GESTURE BASEDOPERATINGSYSTEM CONTROL
HAND GESTURE BASEDOPERATINGSYSTEM CONTROL Garkal Bramhraj 1, palve Atul 2, Ghule Supriya 3, Misal sonali 4 1 Garkal Bramhraj mahadeo, 2 Palve Atule Vasant, 3 Ghule Supriya Shivram, 4 Misal Sonali Babasaheb,
More informationGeometric Camera Parameters
Geometric Camera Parameters What assumptions have we made so far? -All equations we have derived for far are written in the camera reference frames. -These equations are valid only when: () all distances
More informationBuilding an Advanced Invariant Real-Time Human Tracking System
UDC 004.41 Building an Advanced Invariant Real-Time Human Tracking System Fayez Idris 1, Mazen Abu_Zaher 2, Rashad J. Rasras 3, and Ibrahiem M. M. El Emary 4 1 School of Informatics and Computing, German-Jordanian
More informationStabilizing a Gimbal Platform using Self-Tuning Fuzzy PID Controller
Stabilizing a Gimbal Platform using Self-Tuning Fuzzy PID Controller Nourallah Ghaeminezhad Collage Of Automation Engineering Nuaa Nanjing China Wang Daobo Collage Of Automation Engineering Nuaa Nanjing
More informationStirling Paatz of robot integrators Barr & Paatz describes the anatomy of an industrial robot.
Ref BP128 Anatomy Of A Robot Stirling Paatz of robot integrators Barr & Paatz describes the anatomy of an industrial robot. The term robot stems from the Czech word robota, which translates roughly as
More informationRS platforms. Fabio Dell Acqua - Gruppo di Telerilevamento
RS platforms Platform vs. instrument Sensor Platform Instrument The remote sensor can be ideally represented as an instrument carried by a platform Platforms Remote Sensing: Ground-based air-borne space-borne
More informationTHE CONTROL OF A ROBOT END-EFFECTOR USING PHOTOGRAMMETRY
THE CONTROL OF A ROBOT END-EFFECTOR USING PHOTOGRAMMETRY Dr. T. Clarke & Dr. X. Wang Optical Metrology Centre, City University, Northampton Square, London, EC1V 0HB, UK t.a.clarke@city.ac.uk, x.wang@city.ac.uk
More informationProjection Center Calibration for a Co-located Projector Camera System
Projection Center Calibration for a Co-located Camera System Toshiyuki Amano Department of Computer and Communication Science Faculty of Systems Engineering, Wakayama University Sakaedani 930, Wakayama,
More informationFlight Controller. Mini Fun Fly
Flight Controller Mini Fun Fly Create by AbuseMarK 0 Mini FunFly Flight Controller Naze ( Introduction 6x6mm. 6 grams (no headers, 8 grams with). 000 degrees/second -axis MEMS gyro. auto-level capable
More informationSIX DEGREE-OF-FREEDOM MODELING OF AN UNINHABITED AERIAL VEHICLE. A thesis presented to. the faculty of
SIX DEGREE-OF-FREEDOM MODELING OF AN UNINHABITED AERIAL VEHICLE A thesis presented to the faculty of the Russ College of Engineering and Technology of Ohio University In partial fulfillment of the requirement
More informationThe Basics of Robot Mazes Teacher Notes
The Basics of Robot Mazes Teacher Notes Why do robots solve Mazes? A maze is a simple environment with simple rules. Solving it is a task that beginners can do successfully while learning the essentials
More informationsonobot autonomous hydrographic survey vehicle product information guide
sonobot autonomous hydrographic survey vehicle product information guide EvoLogics Sonobot an autonomous unmanned surface vehicle for hydrographic surveys High Precision Differential GPS for high-accuracy
More informationControl Design of Unmanned Aerial Vehicles (UAVs)
Control Design of Unmanned Aerial Vehicles (UAVs) Roberto Tempo CNR-IEIIT Consiglio Nazionale delle Ricerche Politecnico di Torino tempo@polito.it Control of UAVs UAVs: Unmanned aerial vehicles of different
More informationAnalecta Vol. 8, No. 2 ISSN 2064-7964
EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,
More informationMechanics lecture 7 Moment of a force, torque, equilibrium of a body
G.1 EE1.el3 (EEE1023): Electronics III Mechanics lecture 7 Moment of a force, torque, equilibrium of a body Dr Philip Jackson http://www.ee.surrey.ac.uk/teaching/courses/ee1.el3/ G.2 Moments, torque and
More informationSimulation of Trajectories and Comparison of Joint Variables for Robotic Manipulator Using Multibody Dynamics (MBD)
Simulation of Trajectories and Comparison of Joint Variables for Robotic Manipulator Using Multibody Dynamics (MBD) Jatin Dave Assistant Professor Nirma University Mechanical Engineering Department, Institute
More informationExperimental Results from TelOpTrak - Precision Indoor Tracking of Tele-operated UGVs
Experimental Results from TelOpTrak - Precision Indoor Tracking of Tele-operated UGVs Johann Borenstein*, Adam Borrell, Russ Miller, David Thomas All authors are with the University of Michigan, Dept of
More informationDesign-Simulation-Optimization Package for a Generic 6-DOF Manipulator with a Spherical Wrist
Design-Simulation-Optimization Package for a Generic 6-DOF Manipulator with a Spherical Wrist MHER GRIGORIAN, TAREK SOBH Department of Computer Science and Engineering, U. of Bridgeport, USA ABSTRACT Robot
More informationEssential Mathematics for Computer Graphics fast
John Vince Essential Mathematics for Computer Graphics fast Springer Contents 1. MATHEMATICS 1 Is mathematics difficult? 3 Who should read this book? 4 Aims and objectives of this book 4 Assumptions made
More informationMotion tracking using Matlab, a Nintendo Wii Remote, and infrared LEDs.
Motion tracking using Matlab, a Nintendo Wii Remote, and infrared LEDs. Dr W. Owen Brimijoin MRC Institute of Hearing Research (Scottish Section) Glasgow Royal Infirmary 16 Alexandra Parade Glasgow G31
More informationSolving Simultaneous Equations and Matrices
Solving Simultaneous Equations and Matrices The following represents a systematic investigation for the steps used to solve two simultaneous linear equations in two unknowns. The motivation for considering
More informationFRC WPI Robotics Library Overview
FRC WPI Robotics Library Overview Contents 1.1 Introduction 1.2 RobotDrive 1.3 Sensors 1.4 Actuators 1.5 I/O 1.6 Driver Station 1.7 Compressor 1.8 Camera 1.9 Utilities 1.10 Conclusion Introduction In this
More informationOptical Tracking Using Projective Invariant Marker Pattern Properties
Optical Tracking Using Projective Invariant Marker Pattern Properties Robert van Liere, Jurriaan D. Mulder Department of Information Systems Center for Mathematics and Computer Science Amsterdam, the Netherlands
More informationSimultaneous Gamma Correction and Registration in the Frequency Domain
Simultaneous Gamma Correction and Registration in the Frequency Domain Alexander Wong a28wong@uwaterloo.ca William Bishop wdbishop@uwaterloo.ca Department of Electrical and Computer Engineering University
More informationDefinitions. A [non-living] physical agent that performs tasks by manipulating the physical world. Categories of robots
Definitions A robot is A programmable, multifunction manipulator designed to move material, parts, tools, or specific devices through variable programmed motions for the performance of a variety of tasks.
More informationSynthetic Sensing: Proximity / Distance Sensors
Synthetic Sensing: Proximity / Distance Sensors MediaRobotics Lab, February 2010 Proximity detection is dependent on the object of interest. One size does not fit all For non-contact distance measurement,
More informationLimitations of Human Vision. What is computer vision? What is computer vision (cont d)?
What is computer vision? Limitations of Human Vision Slide 1 Computer vision (image understanding) is a discipline that studies how to reconstruct, interpret and understand a 3D scene from its 2D images
More informationJean François Aumont (1), Bastien Mancini (2). (1) Image processing manager, Delair-Tech, (2) Managing director, Delair-Tech. Project : Authors:
Jean François Aumont (1), Bastien Mancini (2). (1) Image processing manager, Delair-Tech, (2) Managing director, Delair-Tech. White paper DT26X DT-3BANDS XL APXL Project : Authors: Date:
More informationA PAIR OF MEASURES OF ROTATIONAL ERROR FOR AXISYMMETRIC ROBOT END-EFFECTORS
A PAIR OF MEASURES OF ROTATIONAL ERROR FOR AXISYMMETRIC ROBOT END-EFFECTORS Sébastien Briot, Ilian A. Bonev Department of Automated Manufacturing Engineering École de technologie supérieure (ÉTS), Montreal,
More information