Force and Visual Control for Safe Human Robot Interaction




Force and Visual Control for Safe Human Robot Interaction
Bruno Siciliano
www.prisma.unina.it

PRISMA Team (2/35)
Bruno Siciliano, Luigi Villani, Vincenzo Lippiello, Alberto Finzi, Silvia Rossi, Franco Cutugno, Fanny Ficuciello, Fabio Ruggiero, Agostino De Santis, Rafik Mebarki, Antoine Petit, Jun Nishiyama, Alejandro Donaire, Aykut Satıcı, Francesca Cordella, Mariacarla Staffa, Daniela D'Auria, Luca Buonocore, Jonathan Cacace, Diana Serra, Mahdi Momeni, Andrea Fontanelli, Valeria Federico, Jonathan van der Meer

Our Research Portfolio (3/35)
- Aerial Robotics
- Assistive Robotics
- Cognitive Control of Robotic Systems
- Dual-Arm/Hand Manipulation
- Force Control
- Human Robot Interaction
- Inspection Robotics
- Lightweight Flexible Arms
- Mobile Dynamic Non-prehensile Manipulation
- Motion Generation and Biomechanical Analysis for Human Figures
- Redundant Manipulators
- Robotic Surgery Technology
- Sensory-Motor Synergies of Anthropomorphic Hands
- Service Robotics
- Visual Servoing
Funding of 8.5 M

Outline (4/35)
- Physical human-robot interaction
- Force control
  - Impedance control
  - Force control
- Visual control
  - Position-based visual servoing
  - Pose estimation using vision
- Interaction control using vision and force
  - Position-based visual impedance control
  - Pose estimation using vision and force
- Experiments

Human Robot Interaction (5/35)
- Traditional industrial robotics: segmented workspace for machines and humans
- Intelligent machines working in contact with humans: haptic interfaces and teleoperators, cooperative material handling, power extenders, rehabilitation and physical training, entertainment
- Human-robot interaction (HRI)
  - Cognitive issues [cHRI]: perception, awareness, mental models
  - Physical issues [pHRI]: safety, dependability
  - Ethical issues: motivations for and criticism of HRI, acceptability

Physical Human Robot Interaction (6/35)
- Risks of robots interacting with humans: heavy moving parts and transported objects, reliability of sensory data, level of autonomy and unpredictable behaviours
- Solutions for collaborative human-robot operation: design of non-conventional actuators (passive safety), interaction control (active safety), dependable algorithms for supervision and planning, fault tolerance
- Need for quantitative metrics

Force Control (7/35)
- Motion control vs. force control
- Advanced industrial robotics and service robotics demand control of the interaction between the robot manipulator and humans
- A purely motion-based control strategy is likely to fail (limited task planning accuracy)
- Control of the contact force (compliant behaviour)
- Use of a force/torque sensor (interfaced with the robot control unit)

Taxonomy (8/35)
- Indirect vs. direct force control
- Indirect force control: force is controlled via motion control, without explicit closure of a force feedback loop
- Direct force control: force is controlled to a desired value, with closure of a force feedback loop

Indirect Force Control (9/35)
- Impedance control with an inner motion control loop
- Force/torque measurements used to obtain a linear and decoupled impedance
- Compliant frame between the desired frame and the end-effector frame (disturbance rejection)
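The impedance behaviour above can be sketched in one dimension: the compliant frame obeys a mass-spring-damper law M x" + D x' + K (x - x_d) = f_ext, and the resulting compliant position is passed as reference to the inner motion loop. This is a minimal simulation of that law; the gains and the 1-DOF reduction are illustrative choices, not values from the talk.

```python
def impedance_step(x, dx, f_ext, x_d, M=1.0, D=50.0, K=500.0, dt=0.001):
    """One semi-implicit Euler step of the 1-DOF impedance
    M*ddx + D*dx + K*(x - x_d) = f_ext.
    Returns the updated compliant position and velocity; x is what the
    inner motion control loop would track. Gains are hypothetical."""
    ddx = (f_ext - D * dx - K * (x - x_d)) / M
    dx = dx + ddx * dt
    x = x + dx * dt
    return x, dx

# A constant 5 N contact force pushes the compliant frame away from x_d = 0;
# at steady state the offset approaches f_ext / K = 5 / 500 = 0.01 m.
x, dx = 0.0, 0.0
for _ in range(20000):
    x, dx = impedance_step(x, dx, f_ext=5.0, x_d=0.0)
```

Lowering K (higher compliance) enlarges the steady-state offset under the same force, which is exactly the compliance/damping trade-off explored in the experiments below.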

Experiments (10/35)
- Set-up: COMAU Smart 3-S robot, open control architecture, ATI force/torque sensor
- 6-DOF spatial impedance in surface contact: low compliance / high damping, high compliance / low damping, high compliance / high damping

Extension to Cooperating Robots (11/35)
- Dual-arm set-up: one 6-axis robot and one 7-axis robot
- Peg-in-hole assembly: 6-DOF impedance control of absolute motion and internal forces (absolute & relative impedance)
- Human-object interaction: absolute impedance

Service Tasks (12/35)
- Set-up at the ARTS Lab, SSSA Pisa: DEXTER cable-actuated robot arm
- Compliance control
- Tasks of assisting disabled people

Visual Control (13/35)
- Artificial intelligence vs. automatic control
- Vision capabilities bring robots nearer to human skills
- The information that can be extracted from images is very rich and can be used at different levels of a hierarchical architecture
- Highest level (artificial intelligence): visual information is used to understand the world and decide how to act
- Lowest level (automatic control): visual measurements are used directly in the control law (visual servoing)
- Moving from the highest to the lowest level, the required bandwidth increases while the data complexity decreases

Taxonomy (14/35)
- Direct visual servoing: vision-based control drives the robot directly from visual feedback (camera loop at 100 Hz)
- Indirect visual servoing (dynamic look-and-move): vision-based control at 20 Hz provides references to an inner motion control loop closed on joint feedback at 100 Hz

Indirect Visual Servoing (15/35)
[Block diagram: the vision system on a computer estimates the pose of the 3D object and sends it over a bus to the control unit, which issues the MOVE command to the robot.]

Position-based Visual Servoing (PBVS) (16/35)
- Position-based (3D) visual servoing scheme: the desired pose feeds a pose controller acting on the robot with its motion control; the camera images go through feature extraction and pose estimation, which close the loop
- Calibration dependent

PBVS (17/35)
- Pose control and pose estimation
- Calibrated cameras
- Target objects of known geometry
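A standard way to realize the pose control block above is a proportional law on the estimated pose error: the translational velocity is proportional to the position error, and the angular velocity to the angle-axis representation of the orientation error. The sketch below follows that textbook form; the gain `lam` and the function name are illustrative, not taken from the talk.

```python
import numpy as np

def pbvs_velocity(p, R, p_d, R_d, lam=1.0):
    """Proportional PBVS law (textbook sketch).
    p, R   : current end-effector position (3,) and rotation matrix (3, 3)
    p_d, R_d : desired position and rotation
    Returns the commanded linear velocity v and angular velocity w."""
    v = -lam * (p - p_d)
    R_err = R_d @ R.T                       # rotation still to be performed
    # Angle-axis extraction from the error rotation matrix.
    angle = np.arccos(np.clip((np.trace(R_err) - 1) / 2, -1.0, 1.0))
    if angle < 1e-9:
        w = np.zeros(3)
    else:
        axis = np.array([R_err[2, 1] - R_err[1, 2],
                         R_err[0, 2] - R_err[2, 0],
                         R_err[1, 0] - R_err[0, 1]]) / (2 * np.sin(angle))
        w = lam * angle * axis
    return v, w
```

Because the error is expressed in Cartesian (3D) space, the resulting camera/end-effector trajectory is a straight line in position, which is the hallmark of position-based schemes.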

Pose Estimation (18/35)
- Reconstruction of the object pose from the observed workspace
- Pipeline: image acquisition, selection and segmentation of image regions, image feature extraction, pose reconstruction

Dynamic Image Features (19/35)
- Pose estimation with dynamic selection of image features
- Extended Kalman Filter (EKF) based on approximate object models

EKF (20/35)
Extended Kalman Filter
- State-space model of the object motion: x_{k+1} = f(x_k) + w_k, with x the object pose and w the input disturbance
- Output equation: y_k = g(x_k) + v_k, with y the image feature parameters and v the measurement noise
- The Jacobian of the output equation plays the role of the image Jacobian
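One EKF cycle for this model can be sketched as follows. For brevity the motion model is reduced to a constant-pose random walk (f = identity), whereas the filter in the talk also tracks velocities and its output function projects the object model onto the image planes; `g`, `G`, and the covariances here are caller-supplied placeholders.

```python
import numpy as np

def ekf_step(x, P, y, g, G, Q, R_n):
    """One predict/update cycle of an EKF for pose tracking (sketch).
    x, P   : state estimate (object pose) and its covariance
    y      : measured image feature parameters
    g      : output function x -> predicted feature parameters
    G      : Jacobian of g at x (plays the role of the image Jacobian)
    Q, R_n : process and measurement noise covariances."""
    # Predict: constant-pose motion model driven by the input disturbance w.
    x_pred = x
    P_pred = P + Q
    # Update: correct the pose estimate with the feature innovation.
    Gx = G(x_pred)
    S = Gx @ P_pred @ Gx.T + R_n
    K = P_pred @ Gx.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - g(x_pred))
    P_new = (np.eye(len(x)) - K @ Gx) @ P_pred
    return x_new, P_new
```

The same skeleton carries over to the vision-and-force case later in the talk: only `g` and its Jacobian change when the robot is in contact.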

Pose Reconstruction (21/35)
- Dynamic mode (interposing parts)
- A single BSP tree is generated online, describing all the objects with respect to a common reference frame
- For each camera, the visit algorithm is applied to the BSP tree
- A Kalman filter is used for each object

Preselection Algorithm (22/35)
- The preselection algorithm estimates the positions of the projections of the object's visible corners on the image plane of each camera
[Figure: projected corners, with out-of-sight corners marked.]

Selection Algorithm (23/35)
- A selection algorithm based on suitable quality indexes is employed to find the (sub)optimal set of 6 to 8 feature points
- Quality indexes for the cost function:
  - Spatial distribution (avoids point concentration)
  - Angular distribution (avoids alignments)
  - Camera distribution (triangulation)
  - Anti-chattering (reduces estimation noise)
  - Extraction reliability (robustness)
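To give a concrete feel for one of these indexes, here is a greedy farthest-point sketch that optimizes the spatial-distribution criterion alone, picking feature points that avoid concentration. The full cost function in the talk combines all five indexes; this reduction and the function name are illustrative.

```python
import numpy as np

def select_features(points, k=6):
    """Greedily select k of N candidate image points so that they are
    spatially well spread (farthest-point heuristic on the
    spatial-distribution index only).
    points: (N, 2) array-like of candidate image coordinates."""
    points = np.asarray(points, dtype=float)
    # Seed with the point farthest from the centroid.
    chosen = [int(np.argmax(np.linalg.norm(points - points.mean(0), axis=1)))]
    while len(chosen) < k:
        # Distance of every candidate to its nearest already-chosen point.
        d = np.min(np.linalg.norm(points[:, None] - points[chosen], axis=2),
                   axis=1)
        d[chosen] = -np.inf               # never re-pick a chosen point
        chosen.append(int(np.argmax(d)))  # take the most isolated candidate
    return chosen
```

In the real selector, each candidate's score would be a weighted sum over all five indexes, recomputed as objects and cameras move, which is where the anti-chattering term matters.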

Features Extraction (24/35)
[Figure: image with extracted corners highlighted and out-of-sight corners labelled.]

Visual Servoing Architecture (25/35)
- Two COMAU SMART 3S robots, LEADER (programmed in PDL2) and FOLLOWER, driven by a COMAU C3G 9000 open controller (robot CPU, servo CPU, power amplifiers)
- VME bus with BIT 3 boards and user interface modules connecting the controller to the control PC (running RePLiCS)
- Vision PC (running VESPRO): AT bus, two MATROX GENESIS frame grabbers, two SONY XC 8500 CE cameras
- Control PC and vision PC linked via RS232

Experiments (26/35)
[Figure: experimental set-up with the LEADER and FOLLOWER robots, the object, and the left and right cameras.]

Extension to Grasping (27/35)
- Visually guided grasping of an object in an unstructured environment
- Visual servoing: tracking of the object motion, good reaction to uncertainties

Interaction Control (28/35)
- Interaction control using vision and force
- Vision provides global information on the surrounding environment, to be used for motion planning and obstacle avoidance
- Force allows adjusting the robot motion so that the local constraints imposed by the environment are satisfied

Force and Visual Control (29/35)
- Set-up at DLR: KUKA robot with a force sensor and a camera embedded in the gripper
- Integration of vision and force: visual feedback for gross motion, force feedback for fine motion

Control Strategy (30/35)
- Problem: control the interaction of a robot manipulator with a rigid object of known geometry but unknown position and orientation
- Solution:
  - When the robot is far from the object, position-based visual servoing is adopted; the relative pose of the robot with respect to the object is estimated recursively using only vision
  - When the robot is in contact with the object, any interaction control strategy can be adopted (impedance control, parallel force/position control); the relative pose is estimated recursively using vision, force, and joint position measurements
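The two cases above amount to a mode switch driven by the measured contact wrench. The 1-DOF sketch below makes that explicit: free-space motion uses a PBVS-style proportional law on the estimated relative pose, while contact uses an impedance-style correction that trades pose error against the measured force. The threshold and gains are hypothetical values, not from the slides.

```python
CONTACT_THRESHOLD = 0.5  # N; illustrative detection threshold

def control_step(f_ext, pose_err, lam=1.0, K_inv=1.0 / 500.0):
    """One cycle of the switching strategy (simplified 1-DOF sketch).
    f_ext    : measured contact force
    pose_err : estimated relative pose error to the object
    Returns the detected mode and the commanded velocity."""
    in_contact = abs(f_ext) > CONTACT_THRESHOLD
    if in_contact:
        # Impedance-style: offset the pose reference by the compliance
        # K_inv * f_ext so the contact force is accommodated, not fought.
        cmd = -lam * (pose_err - K_inv * f_ext)
    else:
        # Free space: pure position-based visual servoing.
        cmd = -lam * pose_err
    return ("contact" if in_contact else "free"), cmd
```

In the actual scheme the estimator also switches: in contact, the EKF output equation is augmented with force and joint position measurements, as the next slides describe.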

Visual Impedance (31/35)
- Position-based visual impedance control: the position-based visual servoing loop is augmented with an impedance block closed on force feedback

Pose Estimation Using Vision and Force (32/35)
- Modified EKF: the state-space model is unchanged
- Modified output equation (used when the robot is in contact with the object)
- Modified Jacobian

Experiments (33/35)
[Plot: pose estimation errors along X, comparing vision only against vision & force.]

Conclusion (34/35)
- Interaction control of robot manipulators: indirect and direct force control, position-based visual servoing, position-based visual impedance control
- Current and future work: interaction control in unstructured environments (presence of humans), integration of new sensors (proximity sensors, joint torque sensors, ...), extension to dynamic manipulation (nonprehensile manipulation, deformable objects)

(35/35)
Thank you very much indeed for your kind attention. Questions?
bruno.siciliano@unina.it