Force and Visual Control for Safe Human Robot Interaction
Bruno Siciliano
www.prisma.unina.it
PRISMA Team
Bruno Siciliano, Luigi Villani, Vincenzo Lippiello, Alberto Finzi, Silvia Rossi, Franco Cutugno, Fanny Ficuciello, Fabio Ruggiero, Agostino De Santis, Rafik Mebarki, Antoine Petit, Jun Nishiyama, Alejandro Donaire, Aykut Satıcı, Francesca Cordella, Mariacarla Staffa, Daniela D'Auria, Luca Buonocore, Jonathan Cacace, Diana Serra, Mahdi Momeni, Andrea Fontanelli, Valeria Federico, Jonathan van der Meer
Our Research Portfolio
- Aerial Robotics
- Assistive Robotics
- Cognitive Control of Robotic Systems
- Dual-Arm/Hand Manipulation
- Force Control
- Human Robot Interaction
- Inspection Robotics
- Lightweight Flexible Arms
- Mobile Dynamic Non-prehensile Manipulation
- Motion Generation and Biomechanical Analysis for Human Figures
- Redundant Manipulators
- Robotic Surgery Technology
- Sensory-Motor Synergies of Anthropomorphic Hands
- Service Robotics
- Visual Servoing
Funding of 8.5 M
Outline
- Physical human-robot interaction
- Force control
  - Impedance control
  - Force control
- Visual control
  - Position-based visual servoing
  - Pose estimation using vision
- Interaction control using vision and force
  - Position-based visual impedance control
  - Pose estimation using vision and force
- Experiments
Human Robot Interaction
- Traditional industrial robotics: segmented workspace for machines and humans
- Intelligent machines working in contact with humans
  - Haptic interfaces and teleoperators
  - Cooperative material handling
  - Power extenders
  - Rehabilitation and physical training
  - Entertainment
- Human robot interaction (HRI)
  - Cognitive issues [cHRI]: perception, awareness, mental models
  - Physical issues [pHRI]: safety, dependability
  - Ethical issues: motivations and criticism of HRI, acceptability
Physical Human Robot Interaction
- Risks for robots interacting with humans
  - Heavy moving parts and transported objects
  - Sensory data reliability
  - Level of autonomy / unpredictable behaviours
- Solutions for collaborative human robot operation
  - Design of non-conventional actuators (passive safety)
  - Interaction control (active safety)
  - Dependable algorithms for supervision and planning
  - Fault tolerance
  - Need for quantitative metrics
Force Control
- Motion control vs. force control
  - Advanced industrial robotics and service robotics demand control of the interaction between robot manipulator and humans
  - A purely motion-control strategy is bound to fail (task planning accuracy)
- Control of the contact force (compliant behaviour)
  - Use of a force/torque sensor (interfaced with the robot control unit)
Taxonomy
- Indirect vs. direct force control
  - Indirect force control: force control via motion control (without explicit closure of a force feedback loop)
  - Direct force control: force controlled to a desired value (with closure of a force feedback loop)
Indirect Force Control
- Impedance control with an inner motion control loop
- Force/torque measurements for a linear and decoupled impedance
- Compliant frame between the desired and the end-effector frame (disturbance rejection)
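The compliant-frame behaviour above can be sketched numerically: the measured contact force drives a mass-spring-damper equation whose solution displaces the reference frame fed to the inner motion loop. This is a minimal 1-DOF illustration with hypothetical gains, not the gains used on the COMAU set-up.

```python
def impedance_step(x, dx, f, M, D, K, dt):
    """One Euler step of the impedance equation M*ddx + D*dx + K*x = f.

    x, dx : displacement and velocity of the compliant frame
    f     : measured contact force (from the wrist force/torque sensor)
    Returns the updated (x, dx), used as a reference offset for the
    inner motion control loop.
    """
    ddx = (f - D * dx - K * x) / M
    dx = dx + ddx * dt
    x = x + dx * dt
    return x, dx

# Example: constant 10 N contact force, hypothetical 1-DOF gains
M, D, K = 1.0, 50.0, 500.0   # inertia, damping, stiffness (illustrative)
x, dx, dt = 0.0, 0.0, 0.001
for _ in range(5000):        # 5 s of simulated contact
    x, dx = impedance_step(x, dx, 10.0, M, D, K, dt)
print(round(x, 4))           # steady state approaches f/K = 0.02 m
```

Lowering K (higher compliance) or D (lower damping) reproduces the qualitative behaviours compared in the experiments that follow.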
Experiments
- Set-up
  - COMAU Smart-3 S robot
  - Open control architecture
  - ATI force/torque sensor
- 6-DOF spatial impedance, surface contact
  - Low compliance, high damping
  - High compliance, low damping
  - High compliance, high damping
Extension to Cooperating Robots
- Dual-arm set-up: 6-axis robot and 7-axis robot, peg-in-hole assembly
- 6-DOF impedance control of absolute motion and internal forces
  - Absolute and relative impedance
  - Absolute impedance, human-object interaction
Service Tasks
- Set-up @ ARTS Lab, SSSA Pisa
- DEXTER cable-actuated robot arm
- Compliance control
- Tasks of assisting disabled people
Visual Control
- Artificial intelligence vs. automatic control
- Vision capabilities bring robots nearer to human skills
- The information that can be extracted from images is very rich and can be used at different levels of a hierarchical architecture
  - Highest level (artificial intelligence): visual information used to understand the world and decide how to act (high data complexity, low bandwidth)
  - Lowest level (automatic control): visual measurements used directly in the control law, i.e. visual servoing (low data complexity, high bandwidth)
Taxonomy
- Direct visual servoing: the vision-based controller drives the robot directly, with visual feedback from the camera at the servo rate (100 Hz)
- Indirect visual servoing (dynamic look-and-move): an outer vision-based control loop (camera feedback at 20 Hz) closed around an inner motion control loop (joint feedback at 100 Hz)
Indirect Visual Servoing
[Diagram: the vision system estimates the pose of a 3D object; the estimate is sent over the bus to the control unit, which commands the robot motion]
Position-based Visual Servoing (PBVS)
- Position-based (3D) visual servoing scheme: the desired pose is compared with the pose estimated from camera images (features extraction, then pose estimation); a pose controller drives the robot and its inner motion control
- The scheme is calibration dependent
PBVS
- Pose control and pose estimation
- Calibrated cameras
- Target objects of known geometry
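A minimal sketch of the pose-control block in the PBVS scheme: a proportional law on the pose error, with the orientation error converted to axis-angle form. The gain and the error convention are illustrative, not taken from the slides.

```python
import numpy as np

def pbvs_velocity(t_err, R_err, lam=1.0):
    """Proportional PBVS law: commanded twist from the pose error.

    t_err : 3-vector translation error (desired minus estimated)
    R_err : 3x3 rotation error matrix
    Returns a 6-vector twist (v, w) sent to the inner motion controller.
    """
    # axis-angle extraction from the rotation error matrix
    angle = np.arccos(np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0))
    if abs(angle) < 1e-9:
        axis = np.zeros(3)
    else:
        axis = (1.0 / (2.0 * np.sin(angle))) * np.array([
            R_err[2, 1] - R_err[1, 2],
            R_err[0, 2] - R_err[2, 0],
            R_err[1, 0] - R_err[0, 1],
        ])
    v = lam * t_err          # linear velocity proportional to position error
    w = lam * angle * axis   # angular velocity along the error axis
    return np.concatenate([v, w])
```

Because the error lives in Cartesian space, the law depends directly on the quality of the pose estimate, which is why the scheme requires calibrated cameras and known object geometry.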
Pose Estimation
- Reconstruction of the object pose from the observed workspace
- Pipeline: image acquisition, selection and segmentation of image regions, image features extraction, pose reconstruction
Dynamic Image Features
- Pose estimation with dynamic image features selection
- Extended Kalman Filter (EKF) based on approximate object models
EKF
- State-space model of the object motion: state = object pose, input = disturbance
- Output equation: output = image features parameters, plus measurement noise
- Jacobian of the output equation ~ image Jacobian
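The slide's model maps onto the standard EKF predict/update cycle. A generic sketch follows, with the motion model, output equation, and Jacobians passed in as functions (in the pose-estimation case, H plays the role of the image Jacobian); names and interfaces are illustrative.

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of an Extended Kalman Filter.

    x, P : state estimate and covariance (e.g. object pose + velocity)
    z    : measurement (e.g. image features parameters)
    f, F : motion model and its Jacobian
    h, H : output equation and its Jacobian (~ image Jacobian)
    Q, R : input-disturbance and measurement-noise covariances
    """
    # predict: propagate state and covariance through the motion model
    Fx = F(x)
    x = f(x)
    P = Fx @ P @ Fx.T + Q
    # update: correct with the measurement innovation
    Hx = H(x)
    S = Hx @ P @ Hx.T + R
    K = P @ Hx.T @ np.linalg.inv(S)
    x = x + K @ (z - h(x))
    P = (np.eye(len(x)) - K @ Hx) @ P
    return x, P
```

With an approximate object model, the filter tolerates modelling error through Q, which is what makes the dynamic feature-selection scheme of the previous slide workable.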
Pose Reconstruction
- Dynamic mode (interposing parts)
- A single BSP tree is generated online, describing all the objects with respect to a common reference frame
- For each camera, the visit algorithm is applied to the BSP tree
- A Kalman filter is used for each object
Preselection Algorithm Force and Visual Control for Safe Human Robot Interaction 22/35 ~ ~ Out of sight The preselection algorithm estimates the position of the projections of the object visible corners on the image planes of each camera
Selection Algorithm
- A selection algorithm based on suitable quality indexes is employed to find the (sub)optimal set of (6 to 8) feature points
- Quality indexes for the cost function
  - Spatial distribution (avoids point concentration)
  - Angular distribution (avoids alignments)
  - Camera distribution (triangulation)
  - Anti-chattering (estimate noise reduction)
  - Extraction reliability (robustness)
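One way such an index-driven selection can be realized is a greedy search that grows the feature set by the point maximizing the combined cost. The sketch below uses only a spatial-distribution index for brevity; both the index and the greedy strategy are illustrative, not necessarily the algorithm on the slides.

```python
import itertools
import numpy as np

def spatial_spread(points):
    """Illustrative spatial-distribution index: mean pairwise distance
    of the candidate image points (higher = less concentrated)."""
    d = [np.linalg.norm(p - q) for p, q in itertools.combinations(points, 2)]
    return float(np.mean(d)) if d else 0.0

def select_features(candidates, k=6):
    """Greedy sketch of quality-index selection: add one point at a
    time, keeping the candidate that maximizes the cost function
    (here, spatial spread alone stands in for the full set of indexes)."""
    chosen = []
    remaining = list(range(len(candidates)))
    while len(chosen) < k and remaining:
        best = max(
            remaining,
            key=lambda i: spatial_spread(
                [candidates[j] for j in chosen] + [candidates[i]]),
        )
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

A full implementation would sum weighted terms for angular distribution, camera distribution, anti-chattering, and extraction reliability inside the same cost function.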
Features Extraction
[Figure: corners extracted on the image; some corners are out of sight]
Visual Servoing Architecture
- COMAU C3G 9000 open control unit: robot CPU, servo CPU, power amplifiers, VME bus, BIT 3 boards
- Robots: COMAU SMART 3-S LEADER (PDL2) and COMAU SMART 3-S FOLLOWER
- Control PC (RePLiCS) on the AT bus, connected via a BIT 3 board; RS-232 link to the Vision PC
- Vision PC (VESPRO) on the AT bus: two MATROX GENESIS boards, two SONY XC 8500 CE cameras
Experiments
[Figure: LEADER and FOLLOWER robots, object, left and right cameras]
Extension to Grasping
- Visually guided grasping
- Object in an unstructured environment
- Visual servoing
  - Tracking of object motion
  - Good reaction to uncertainties
Interaction Control
- Interaction control using vision and force
  - Vision provides global information on the surrounding environment, to be used for motion planning and obstacle avoidance
  - Force allows adjusting the robot motion so that local constraints imposed by the environment are satisfied
Force and Visual Control
- Set-up @ DLR: KUKA robot with force sensor and camera embedded in the gripper
- Integration of vision and force
  - Visual feedback in gross motion
  - Force feedback in fine motion
Control Strategy
- Problem: control the interaction of a robot manipulator with a rigid object of known geometry but unknown position and orientation
- Solution
  - When the robot is far from the object: position-based visual servoing is adopted; the relative pose of the robot with respect to the object is estimated recursively using only vision
  - When the robot is in contact with the object: any kind of interaction control strategy can be adopted (impedance control, parallel force/position control); the relative pose is estimated recursively using vision, force and joint position measurements
Visual Impedance
- Position-based visual impedance servoing
[Scheme: PBVS loop with an impedance block closed on force feedback]
Pose Estimation Using Vision and Force
- Modified EKF
  - State-space model unchanged
  - Modified output equation (when the robot is in contact with the object)
  - Modified Jacobian
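To illustrate how the output equation might change in contact, the sketch below stacks the visual measurement with a scalar contact constraint (a point of the end effector lying on a known plane), and extends the Jacobian with the corresponding row. The constraint form, the state layout, and all names are assumptions for illustration, not the equations from the slides.

```python
import numpy as np

def augmented_output(h_vision, H_vision, contact_normal, contact_offset):
    """Sketch of a modified EKF output equation: in contact, the visual
    measurement is stacked with the constraint n^T p - d = 0 for a
    point p on a known plane (illustrative choice of constraint).
    """
    def h(x):
        p = x[:3]  # assume the first 3 states are the contact-point position
        return np.concatenate([h_vision(x),
                               [contact_normal @ p - contact_offset]])

    def H(x):
        # append the constraint gradient as an extra Jacobian row
        row = np.zeros((1, len(x)))
        row[0, :3] = contact_normal
        return np.vstack([H_vision(x), row])

    return h, H
```

Feeding the augmented (h, H) to the same EKF cycle lets the force-derived constraint correct pose components that are poorly observed by vision alone.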
Experiments
[Plot: pose estimation error components, vision only vs. vision & force]
Conclusion
- Interaction control of robot manipulators
  - Indirect and direct force control
  - Position-based visual servoing
  - Position-based visual impedance control
- Current and future work
  - Interaction control in unstructured environments (presence of humans)
  - Integration of new sensors (proximity sensors, joint torque sensors, etc.)
  - Extension to dynamic manipulation (non-prehensile, deformable objects)
Thank you very much indeed for your kind attention
Questions?
bruno.siciliano@unina.it