Visual Servoing Methodology for Selective Tree Pruning by Human-Robot Collaborative System


Ref: C0287

Avital Bechar, Victor Bloch, Roee Finkelshtain, Sivan Levi, Aharon Hoffman, Haim Egozi and Ze'ev Schmilovitch, Institute of Agricultural Engineering, ARO, Volcani Center, Israel

Abstract

Orchard pruning is a labor-intensive task that accounts for more than 25% of labor costs. The main objectives of this task are to increase exposure to sunlight, control the tree shape and remove unsuitable branches. In most orchards this task is conducted once a year, and up to 20% of the branches are removed selectively. Robots are perceptive machines that can be programmed to perform a variety of agricultural tasks, such as cultivating, transplanting, spraying, and selective harvesting. Agricultural robots have the potential to enhance the quality of fresh produce, lower production costs and reduce the drudgery of manual labor. However, in agriculture the environment is highly unstructured. The terrain, vegetation, landscape, visibility, illumination and other atmospheric conditions are not well defined; they continuously vary, have inherent uncertainty, and generate unpredictable and dynamic situations. Consequently, autonomous robots in real-world, dynamic and unstructured environments still yield inadequate results, and the promise of automatic and efficient autonomous operations has fallen short of expectations in such environments. Introducing a human operator into the system can help improve performance and simplify the robotic system.

In this work, we developed a visual servoing methodology and a human-robot collaborative system for selective tree pruning. The system consists of a Motoman manipulator, a color camera, a single-beam laser distance sensor, an HMI and a cutting tool based on a circular saw developed for this task. The cutting tool, the camera and the laser sensor are mounted on the manipulator's end-effector, aligned parallel to each other. Experiments were conducted to examine the performance of the system under different conditions, human-robot collaboration methods and trajectory types. A cutting tool was designed for pruning branches with diameters of up to 26 mm at a 45° cutting angle. The saw diameter was determined to be 115 mm, with a standard shaft diameter of 41 mm. An interface connecting the cutting tool to the robot end-effector was designed in order to minimize the total dimensions of the tool and increase the robot's dexterity. An actual average cycle time of 9.2 s was achieved when the human operator's actions and the robot's movements were performed simultaneously. The results also revealed that the average time required to determine the location and orientation of the cut was 2.51 s.

Keywords: Tree pruning, vision, laser, human-robot collaboration, cutting tool

1 Introduction

Orchard pruning is a labor-intensive task that accounts for more than 25% of labor costs. The main objectives of this task are to increase exposure to sunlight, control the tree shape and remove unsuitable branches. In most orchards this task is conducted once a year, and up to 20% of the branches are removed selectively. Robots are perceptive machines that can be programmed to perform a variety of agricultural tasks, such as cultivating, transplanting, spraying, and selective harvesting. Agricultural robots have the potential to enhance the quality of fresh produce, lower production costs and reduce the drudgery of manual labor. However, in agriculture the environment is highly unstructured. The terrain, vegetation, landscape, visibility, illumination and other atmospheric conditions are not well defined; they continuously vary, have inherent uncertainty, and generate unpredictable and dynamic situations. Consequently, autonomous robots in real-world, dynamic and unstructured environments still yield inadequate results, and the promise of automatic and efficient autonomous operations has fallen short of expectations in such environments. Introducing a human operator into the system can help improve performance and simplify the robotic system.

2 Materials and methods

The system developed consists of a Motoman manipulator, a color camera, a single-beam laser distance sensor, an HMI and a cutting tool based on a circular saw developed for this task. The cutting tool, the camera and the laser sensor are mounted on the manipulator's end-effector, aligned parallel to each other. An experiment was conducted to examine the performance of the system under different conditions, human-robot collaboration methods and trajectory types.

2.1 Cutting tool

A cutting tool was developed for pruning branches with diameters of up to 26 mm at a 45° cutting angle. The maximum cutting diameter was determined based on a measurement of 238 nectarine branches in the field using a caliper. A histogram of the branch diameters was generated in order to examine the branch diameter distribution (Figure 1).

Figure 1: Branch diameter distribution (number of branches vs. branch diameter, 3-36 mm).

The maximum branch diameter was 36 mm. In order to cut most of the branches while keeping the tool dimensions minimal, it was determined that the tool would cut branches with diameters of up to 26 mm, which corresponds to 98% of the branches. Based on this and the following equation, the saw diameter was determined to be 115 mm, with a standard shaft diameter of 41 mm.
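The cut-off choice and the saw sizing lend themselves to a short numerical sketch. The snippet below is illustrative only, not the authors' code: the diameter samples are synthetic placeholders, and the sizing relation (the exposed blade depth, (D_saw - D_shaft)/2, must cover the 45° cut path through the branch, d/cos 45°) is an assumed reconstruction of the equation referenced above; only the 26 mm, 98%, 41 mm and 115 mm figures come from the text.

```python
"""Branch-diameter analysis and saw sizing sketch (illustrative, not the paper's code)."""
import math
import numpy as np

# Placeholder for the 238 field-measured nectarine branch diameters (mm).
diameters_mm = np.random.default_rng(0).gamma(shape=4.0, scale=3.0, size=238)

# Cut-off diameter covering ~98% of branches (the paper settles on 26 mm).
d_cut = np.percentile(diameters_mm, 98)

def min_saw_diameter(d_branch_mm: float, shaft_mm: float = 41.0,
                     cut_angle_deg: float = 45.0) -> float:
    """Assumed sizing relation: exposed blade depth >= branch cut-path length."""
    cut_path = d_branch_mm / math.cos(math.radians(cut_angle_deg))
    return shaft_mm + 2.0 * cut_path

print(f"98th-percentile branch diameter: {d_cut:.1f} mm")
print(f"Min saw diameter for a 26 mm branch: {min_saw_diameter(26.0):.1f} mm")
# Under this assumed relation, d = 26 mm and a 41 mm shaft give ~114.5 mm,
# which is consistent with the 115 mm saw chosen in the paper.
```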

The cutting tool prototype was tested manually in the field (Figure 2) and then mounted on the end-effector of a Motoman 5L manipulator in the Agricultural Robotics Lab (ARL) at the Volcani Center. An interface connecting the cutting tool to the robot end-effector was designed in order to minimize the total dimensions of the tool and increase the robot's dexterity. The tool orientation can be changed when it is required to change the tool's cross-section signature while maintaining the cutting contact point on the end-effector axis. A drawing of the system with the end-effector, interface and cutting tool is shown in Figure 3.

Figure 2: Cutting tool tested in the field and mounted on the Motoman manipulator.

Figure 3: A drawing of the cutting tool system.

2.2 Human-Robot Collaborative System

A human-robot collaborative system for selective tree pruning was developed. The system consists of a Motoman manipulator, a color camera, a single-beam laser distance sensor, an HMI, a computer and a circular-saw cutting tool prototype. The cutting tool, camera and laser sensor are mounted on the manipulator's end-effector, aligned parallel to each other (Figure 4). The system works in two phases. In the first phase, the camera transfers a 2D image of the tree to a human operator, who in turn marks on a display the branches to be removed. In the second phase, the system works autonomously: the laser sensor measures the branch distance and the system calculates a trajectory to the cutting point. Once this trajectory has been calculated, the robotic arm performs the corresponding moves and cuts the branch at the prescribed location.
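As an illustration of this two-phase workflow, a minimal Python sketch follows. It is a simplification under stated assumptions, not the system's actual software: the hardware interfaces (camera, hmi, laser, robot, planner) and all function names are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class CutTarget:
    pixel: tuple          # operator's click (x, y) on the 2D image
    distance_m: float = None
    pose: tuple = None    # cutting pose of the tool (position and orientation)

def phase_1_mark_branches(camera, hmi):
    """Phase 1 (human): show a 2D image, collect the marked cut locations."""
    image = camera.grab()                      # 2D color image of the tree
    clicks = hmi.collect_marks(image)          # operator marks branches to remove
    return [CutTarget(pixel=c) for c in clicks]

def phase_2_prune(targets, laser, robot, planner):
    """Phase 2 (autonomous): range each target, plan a trajectory, cut."""
    for t in targets:
        robot.point_sensor_at(t.pixel)         # align the laser with the marked pixel
        t.distance_m = laser.measure()         # single-beam distance to the branch
        t.pose = planner.cutting_pose(t.pixel, t.distance_m)
        trajectory = planner.plan(robot.current_pose(), t.pose)
        robot.execute(trajectory)              # move the end-effector to the branch
        robot.cut()                            # run the circular saw
        robot.return_to_start()
```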

2.3 Experiments

Two experiments were conducted. In the first, two types of motion planning were investigated: i) a linear motion between the tool's initial location and the cutting point in global Cartesian coordinates, and ii) motion in robot joint space (a schematic sketch of the two interpolation schemes is given below, after Figure 5). In the second experiment, two types of human-robot collaboration methods were examined: a) the human subject marks two points in the picture received from the end-effector camera, the first point marking the location of the cut on the branch and the second point used to calculate the orientation of the cutting tool when pruning the branch; and b) the human subject marks a single point in the picture received from the end-effector camera to denote the location of the cut on the branch, and a computer vision algorithm extracts the orientation of the branch and calculates the desired orientation of the cutting tool. The experimental apparatus is shown in Figure 5.

Figure 4: End-effector system.

Figure 5: The experimental apparatus.
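To make the distinction between the two trajectory types concrete, here is a minimal sketch under simplifying assumptions. It is not the Motoman controller's planner: poses are reduced to position or joint vectors, and both schemes use naive linear interpolation.

```python
import numpy as np

def cartesian_line(p_start, p_goal, steps=50):
    """Trajectory type i): interpolate the tool position along a straight Cartesian
    line, so the end-effector stays on the operator's line of sight to the branch,
    which is naturally free of obstacles."""
    p_start, p_goal = np.asarray(p_start, float), np.asarray(p_goal, float)
    return [p_start + s * (p_goal - p_start) for s in np.linspace(0.0, 1.0, steps)]

def joint_space_line(q_start, q_goal, steps=50):
    """Trajectory type ii): interpolate directly between joint configurations (as
    obtained from the manipulator's inverse kinematics). Each joint moves at its own
    constant rate, so the tool path is generally curved, but the motion tends to be
    faster."""
    q_start, q_goal = np.asarray(q_start, float), np.asarray(q_goal, float)
    return [q_start + s * (q_goal - q_start) for s in np.linspace(0.0, 1.0, steps)]

# Example: a 6-axis joint-space move from a scan posture to a cutting posture.
q0 = np.zeros(6)
q1 = np.array([0.4, -0.3, 0.2, 0.0, 0.5, -0.1])   # illustrative joint targets (rad)
path = joint_space_line(q0, q1, steps=10)
```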

3 Results and Discussion

3.1 Experiment 1

Figure 6 shows the mean time of the different movement stages for the linear motion and the robot joint-space motion. The robot trajectory consists of 1) a movement to the scan location, 2) performing the scan, 3) movement to the branch and performing the cut, and 4) return to the initial position. In addition, the time for the human to mark the selected location on the branch is shown, denoted as 'cut sign'. Two cycle times are presented: the cycle time including the human actions, and the cycle time of the robot movement alone. Since the human action and the robot movement can be performed simultaneously, the actual cycle time will be similar to the robot movement cycle time. In all movement stages the times were shorter in robot joint space than in the linear movement. The average robot movement cycle time was 9.2 s for the joint-space movement, 43% shorter than for the linear movement (16.1 s). The advantage of the linear movement is that the chance of encountering obstacles is lower, since the end-effector moves along the line of sight marked by the human operator, which by its nature is obstacle-free. Nevertheless, the differences in the trajectories between the two movement methods were minimal.

Figure 6: The times for the different movement stages in the linear movement and in robot joint space.

3.2 Experiment 2

In this experiment, the response time of the human operators was measured for the two collaboration methods: the '1 click method', in which the human operator marked only the location of the cut on the branch and the orientation of the cut was determined by a vision algorithm, and the '2 clicks method', in which the human operator marked two points on the branch to retrieve the location and orientation of the cut. The time for the first mark (click 1) was similar in both methods, 2.51 s and 2.76 s for the '1 click method' and '2 clicks method' respectively, with no significant difference. The second mark (click 2) was significantly shorter (1.56 s) than the first mark in the '2 clicks method'. For all human subjects, the total time to retrieve the location and orientation of the cut was shorter in the '1 click method' than in the '2 clicks method' by approximately 40% (on average 2.51 s in the '1 click method' versus 4.31 s in the '2 clicks method'). Although there was no difference in the accuracy of the cut location between the two methods, the orientation in the '2 clicks method' was more accurate than in the '1 click method'.

Figure 7: Times to determine the location and orientation of the cut in the two methods.
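The '1 click method' relies on a computer vision algorithm to recover the branch orientation around the marked point. The following is a minimal sketch of one plausible approach, principal-component analysis over segmented branch pixels near the click; it is not the algorithm actually used in the system, and the function and parameter names are hypothetical.

```python
import numpy as np

def branch_orientation_deg(branch_mask, click_xy, window=40):
    """Estimate the branch axis angle (degrees) near the operator's click.

    branch_mask: binary image (H x W), nonzero where branch pixels were segmented.
    click_xy:    (x, y) pixel the operator marked as the cut location.
    window:      half-size of the square neighborhood analyzed around the click.
    """
    x0, y0 = click_xy
    h, w = branch_mask.shape
    patch = branch_mask[max(0, y0 - window):min(h, y0 + window),
                        max(0, x0 - window):min(w, x0 + window)]
    ys, xs = np.nonzero(patch)
    if len(xs) < 2:
        raise ValueError("No branch pixels found near the click")
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)
    # Principal direction of the local branch pixels approximates the branch axis;
    # the cutting tool would then be oriented perpendicular to this axis.
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    axis = vt[0]
    return float(np.degrees(np.arctan2(axis[1], axis[0])))
```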

4 Summary and Conclusions

Orchard pruning is a labor-intensive task that accounts for more than 25% of labor costs. The main objectives of this task are to increase exposure to sunlight, control the tree shape and remove unsuitable branches. In most orchards this task is conducted once a year, and up to 20% of the branches are removed selectively.

A human-robot collaborative system for selective tree pruning was developed. The system consists of a Motoman manipulator, a color camera, a single-beam laser distance sensor, an HMI and a cutting tool based on a circular saw developed for this task. The cutting tool, the camera and the laser sensor are mounted on the manipulator's end-effector, aligned parallel to each other. An experiment was conducted to examine the performance of the system under different conditions, human-robot collaboration methods and trajectory types. A cutting tool was designed for pruning branches with diameters of up to 26 mm at a 45° cutting angle. The saw diameter was determined to be 115 mm, with a standard shaft diameter of 41 mm. An interface connecting the cutting tool to the robot end-effector was designed in order to minimize the total dimensions of the tool and increase the robot's dexterity.

The system was examined in two experiments evaluating the performance of two types of motion planning and two types of human-robot collaboration methods. An actual average cycle time of 9.2 s was achieved when the human operator's actions and the robot's movements were performed simultaneously. The results also revealed that the average time required to determine the location and orientation of the cut was 2.51 s in the '1 click method'. Since the 9.2 s robot cycle is roughly 3.7 times the 2.51 s operator marking time, this finding implies that, in an efficient environment and with an efficient working method, one human operator could supervise three to four tree-pruning robots and increase the total production rate. Although the current cycle time is acceptable, reducing it further is feasible; future work will focus on optimizing the scanning stages and on developing a multi-target (multiple branches) procedure instead of handling one target (branch) at a time.