Robotic Harvesting of Rosa Damascena Using Stereoscopic Machine Vision


World Applied Sciences Journal 12 (2): 231-237, 2011
ISSN 1818-4952
IDOSI Publications, 2011

Armin Kohan 1, Ali Mohammad Borghaee 1, Mehran Yazdi 2, Saeid Minaei 3 and Mohammad Javad Sheykhdavudi 4

1 Department of Agricultural Machinery Eng., College of Agriculture, Science and Research Branch, Islamic Azad University (IAU), Tehran, Iran
2 Department of Communications and Electronics Eng., School of Electrical and Computer Eng., College of Eng., Shiraz University, Shiraz, Iran
3 Department of Agricultural Machinery Eng., College of Agriculture, Tarbiat Modares Univ., Tehran, Iran
4 Department of Agricultural Machinery Eng., College of Agriculture, Chamran Univ., Ahvaz, Iran

Abstract: In this paper, we propose a system for the automated harvest of Rosa Damascena with the aid of computer vision techniques. The three-dimensional positions of the flowers are obtained by the stereo vision technique. A four-DOF manipulator is used to harvest the flowers, and an end-effector is designed to cut them. To analyze the stereoscopic error, a factorial experiment in the form of a completely randomized design with two factors was conducted. The first factor was the distance between the cameras, at the three levels of 5, 10 and 20 cm; the second factor was the distance between the camera and the flower, at the four levels of 50, 75, 100 and 125 cm. The analysis was done using Duncan's Multiple Range Test at the 1% level. We conclude that increasing the distance between the cameras reduces the stereoscopic error, while increasing the distance between the cameras and the flowers increases it. Finally, the manipulator and the vision system were evaluated together. The best results were obtained when the relative distance between the cameras was 100 mm; in this case, 82.22 percent of the flowers were successfully harvested.
Key words: Rosa Damascena harvester, Stereo vision, Robotic harvesting, Image processing

INTRODUCTION

Rosa Damascena, more commonly known as the Damask rose, is a member of the family Rosaceae. It is used in various industries such as pharmaceuticals, cosmetics, perfumery and food. Harvesting runs from April to May, and flowers are picked from where the stem connects to the sepal. Picking is done by human labor under the blazing sun and is extremely labor-intensive. Because of such tedious and repetitive tasks, research on the automation of agricultural and horticultural crop production began many years ago.

[1] developed a watermelon-harvesting robot. To pick a fruit, the relative positions of the fruit and the manipulator were detected using the stereo image method. [2] presented an image processing system for guiding an orange-picking robot. [3] applied machine vision and image analysis algorithms to mushroom location and seizing, and showed the possibility of using a robot to harvest mushrooms for the fresh market.

[4] developed a system for the automated harvest of Gerbera jamesonii pedicels with the help of image analysis techniques. Images of plants were taken with a stereo camera system, which consisted of two CCD cameras with near-infrared filters. The plant was placed on a rotatable desk and images were taken at eight different positions. Three-dimensional models of the pedicels were created by a triangulation process.

[5] developed an image processing method to detect green citrus fruit on individual trees. A hyperspectral camera covering 369-1042 nm was employed to acquire hyperspectral images of green fruits of three different citrus varieties, and by means of these images green citrus fruits were detected.

Corresponding Author: Armin Kohan, Department of Agricultural Machinery Eng., College of Agriculture, Science and Research Branch, Islamic Azad University (IAU), Tehran, Iran. Tel: +98 61262 37924, Fax: +98 61262 37925, E-mail: arminkohan@gmail.com.

Van Henten et al. [6] developed an autonomous robot for harvesting cucumbers in greenhouses. Two camera vision systems were employed for creating 3D images of the fruit and the environment. A control scheme was used to generate collision-free motions for the manipulator during harvesting. The manipulator had seven degrees of freedom. The proposed computer vision system was able to detect more than 95% of the cucumbers in a greenhouse.

Tanigaki et al. [7] manufactured a cherry-harvesting robot for trial purposes. The main parts of the robot were a manipulator with 4 degrees of freedom, a 3-D vision sensor, an end effector, a computer and a moving device. The 3-D vision sensor was equipped with red and infrared laser diodes, and both laser beams scanned objects simultaneously. By processing the images obtained from the 3-D vision sensor, the location of the fruits and obstacles was recognized and the trajectory of the end effector was determined. Fruits were picked by the end effector and collisions with obstacles were efficiently avoided.

Noordam et al. [8] evaluated computer vision methods such as stereo imaging, laser triangulation, Röntgen imaging and reverse volumetric intersection to determine which method is the most practicable for locating the stem and tracking it down to the cutting position. They concluded that reverse volumetric intersection is the best method for locating the stem cutting position in terms of robustness and cost.

In the present work, a machine vision system for harvesting Rosa Damascena flowers using the stereoscopic technique is designed. The harvester only harvests the flower for industrial use, so the position of the stem is not important.

MATERIALS AND METHODS

Vision System: The vision system includes two CCD cameras forming a stereo vision system; the cameras can be moved relative to each other along the horizontal axis. A portable computer is used to run the implemented programs for camera calibration, image rectification, object recognition and matching of the stereo images.

Camera Calibration: Calibration is an important pre-step for correctly matching the stereo images and for precisely computing depth in the stereo vision system. It is the process of estimating the intrinsic and extrinsic parameters of the camera so as to minimize the discrepancy between the observed image features and their theoretical positions in the camera pinhole model. Camera calibration in the present study was done based on [9, 10].

Rectification: For a given point in one image, we have to search for its correspondent in the other image along an epipolar line [11]. Generally, epipolar lines are not aligned with the coordinate axes and are not parallel. Such searches are time consuming, since we must compare pixels on skew lines in image space. These algorithms can be simplified and made more efficient if the epipolar lines are axis-aligned and parallel, so that epipolar lines in the original images map to horizontally aligned lines in the transformed images. This can be realized by applying a 2D projective transform to each image, a process known as image rectification [12]. Calibrating the stereo rig leads to a simpler rectification technique. Figure 1 shows original stereo images after calibration and rectification. The rectification process was done based on [13].

Object Recognition: Since the flower's color differs from that of the foliage, the best space for detecting the flowers was the HSI (Hue-Saturation-Intensity) color space. Because the H component carries the color information of the image (Figure 2b), the flowers can be marked in the image by thresholding this component. After thresholding, a binary image is obtained. Applying the opening operator [14] once and removing the noise yields more distinct regions in the image (Figure 2c). In this stage, all the existing regions are labeled, and the erosion operator followed by the dilation operator [14] is applied several times. Whenever one region separates into two new regions, the previous region is replaced by these two regions in the binary image. In this way, flowers that are partially overlapped are detected and distinguished. Figures 2d, 2e and 2f show an example of this case and how we process it. Finally, the coordinates of the region centers in both stereo images are extracted and used later to determine the 3D positions of the flowers.

Matching: Once an image has been segmented, the shapes of the objects and epipolar constraints are used to identify the best matches between the stereo images. If there is more than one point on an epipolar line of the two images, then objects located on the same epipolar line in the two images are superposed onto each other at the centroid, and the size of the union is calculated by identifying the pixels contained in both images. If there is a minimum percentage of pixel overlap, the two objects are identified as a stereo pair. An ordering constraint is used as well: points in the right image must have the same order in the left image. After matching, the 3D coordinates of the flowers are obtained by stereo triangulation [15].
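The matching and triangulation steps above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the focal length `f_px` and baseline `B_mm` are assumed example values, and `overlap_fraction` stands in for the centroid-superposition overlap test described in the text.

```python
import numpy as np

def overlap_fraction(a, b):
    """Superpose two binary region masks at their centroids and return
    |intersection| / |union| of their pixel sets (the overlap test used
    to decide whether two candidate regions form a stereo pair)."""
    def centered(mask):
        ys, xs = np.nonzero(mask)
        cy, cx = int(round(ys.mean())), int(round(xs.mean()))
        return set(zip(ys - cy, xs - cx))
    pa, pb = centered(a), centered(b)
    return len(pa & pb) / len(pa | pb)

def triangulate(xl, xr, y, f_px=800.0, B_mm=100.0):
    """3D point (X, Y, Z) in mm from matched centroids (xl, y) and (xr, y)
    in a rectified pair, via the standard pinhole/disparity relations."""
    d = xl - xr                  # disparity in pixels; larger d => closer flower
    Z = f_px * B_mm / d          # depth along the optical axis, Z = f*B/d
    return xl * Z / f_px, y * Z / f_px, Z

# Two candidate regions on the same image row: a square flower blob in the
# left image and its shifted copy in the right image overlap perfectly.
left = np.zeros((8, 8), dtype=bool);  left[2:5, 2:5] = True
right = np.zeros((8, 8), dtype=bool); right[3:6, 4:7] = True
```

With these assumed values, a disparity of 80 px at f_px = 800 and B_mm = 100 gives a depth of 1000 mm, as Z = f·B/d requires.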

Fig. 1: Calibration and rectification of a test image: a) and b) original left and right images; c) and d) calibrated left and right images; e) and f) rectified left and right images, respectively.

Fig. 2: a) a test flower image; b) H component of the HSI image; c) image b after applying the threshold and the opening operator once; d) image c after applying consecutive erosion and dilation operators several times; e) image c after applying 8 consecutive erosion and dilation operators; f) final image after considering all found regions.
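The split-by-erosion idea behind Figures 2c-2f can be demonstrated on a toy binary mask. This is a from-scratch sketch, not the paper's MATLAB implementation: a one-pixel bridge joins two square "flowers", so labeling finds one region before erosion and two after a single erosion step, at which point the original region would be replaced by the two new ones.

```python
import numpy as np

def erode(mask):
    """One step of 4-neighbour binary erosion, done with array shifts so the
    sketch needs no image-processing library."""
    p = np.pad(mask, 1)
    return (p[1:-1, 1:-1] & p[:-2, 1:-1] & p[2:, 1:-1]
            & p[1:-1, :-2] & p[1:-1, 2:])

def count_regions(mask):
    """Count 4-connected foreground regions with an explicit-stack flood
    fill (a stand-in for the labeling step of [14])."""
    seen = np.zeros(mask.shape, dtype=bool)
    regions = 0
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        regions += 1
        stack = [(i, j)]
        seen[i, j] = True
        while stack:
            y, x = stack.pop()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                v, u = y + dy, x + dx
                if (0 <= v < mask.shape[0] and 0 <= u < mask.shape[1]
                        and mask[v, u] and not seen[v, u]):
                    seen[v, u] = True
                    stack.append((v, u))
    return regions

# Two square "flowers" joined by a one-pixel bridge, like the partially
# overlapped blossoms in Figure 2c.
m = np.zeros((6, 11), dtype=bool)
m[1:5, 1:5] = True      # left flower
m[1:5, 6:10] = True     # right flower
m[2, 5] = True          # thin overlap joining them
```

In the paper the erosion/dilation pair is applied repeatedly; one erosion suffices here only because the toy bridge is a single pixel wide.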

Fig. 3: The manipulator

Fig. 4: Images of the end effector

Fig. 5: Block diagram of the harvester (Camera 1 and Camera 2 each supply a primary image for rectification and feature extraction; a laptop computer performs matching and range calculation and, with feedback, drives the manipulator control circuit powered by a 12 V lead-acid battery)

Vision System Evaluation: For a better analysis of the stereoscopic depth measurement error, a factorial experiment in the form of a completely randomized design with 21 replications was conducted. The experiment had two factors: the first was the distance between the cameras, at the three levels of 50, 100 and 200 mm; the second was the distance between the camera and the flower, at the four levels of 500, 750, 1000 and 1250 mm. The output of the experiments, after analysis of variance (Table 1), was analyzed using Duncan's Multiple Range Test at the 1% level.
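The design of this experiment tracks the first-order stereo error model: differentiating Z = f·B/d with respect to disparity gives |ΔZ| ≈ Z²·Δd/(f·B), so depth error should grow roughly quadratically with depth Z and shrink with baseline B. The sketch below illustrates that trend; the focal length `f_px` and one-pixel disparity error `delta_d_px` are illustrative assumptions, not measured values from the paper.

```python
def depth_error_mm(Z_mm, B_mm, f_px=800.0, delta_d_px=1.0):
    """First-order stereo depth error |dZ| ~ Z^2 * delta_d / (f * B).
    f_px and delta_d_px are assumed example values, not from the paper."""
    return Z_mm ** 2 * delta_d_px / (f_px * B_mm)

# Predicted trend over the experiment's treatment grid: keys are baselines
# (mm), values are errors at depths 500, 750, 1000, 1250 mm; the error
# falls as B grows and rises as Z grows.
trend = {B: [round(depth_error_mm(Z, B), 2) for Z in (500, 750, 1000, 1250)]
         for B in (50, 100, 200)}
```

With these assumptions the model predicts, for example, a 12.5 mm error at Z = 1000 mm and B = 100 mm, the same order of magnitude as the measured means reported in the results.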

Robotic Harvesting: A four-degree-of-freedom manipulator for picking the flowers was designed and mounted (Figure 3). The four DC motors that guide the manipulator to the desired position were each equipped with a potentiometer. A special clipper for picking the flowers was designed and mounted (Figure 4). The clipper has a curved form and two appendages covered with soft rubber to hold the flower after picking; it is driven by a DC motor. After being picked, the flowers are placed at the collecting site by the manipulator. The control commands required to carry out the harvesting operation were supplied to the manipulator control circuit via the serial port of the computer. Figure 5 shows the block diagram of the harvester.

RESULTS AND DISCUSSION

For the purpose of evaluating the image processing algorithms, 30 stereo images containing 181 flowers in total were tested; 98% of the flowers present in the images were detected correctly. Certain flowers were not detectable because a major part of them was covered by the foliage.

As Table 1 shows, the analysis of variance and the F-test indicate that the effects of depth variation and of the cameras' relative distance on the depth measurement error by stereoscopy are significant. Analysis by Duncan's Multiple Range Test (Tables 2, 3 and 4) shows that, in general, increasing the distance between the cameras reduces the stereoscopic error, while increasing the distance between the stereo cameras and the flowers increases it.

The errors along the Y and X axes are shown in Tables 5 and 6, respectively, for different values of Z. These results show that the error and its variance increase either when the target object moves away from the cameras or when the distance between the two cameras increases.

The results of the matching process show that increasing the distance between the cameras raises the risk of occlusion between objects, and matching becomes more difficult since the two images from the two cameras are less similar. Consequently, the error in matching the images increases. The percentages of these errors in the different treatments are shown in Table 7. Moving the cameras away from the shrub increased the number of flowers in the captured image, and consequently the matching error increased because the flowers became more similar.

Finally, the performance of the harvester was evaluated by varying the distance between the cameras. In each variation, 180 flowers were considered for harvesting; the obtained results are shown in Table 8. During harvesting, the distance between the stereo rig and the shrub was kept at 500 to 750 mm. The best results were obtained when the distance between the cameras was 100 mm. According to Tables 2 and 7, although in this case the number of matching errors increased, the positioning error was reduced; and according to Table 7, the matching error in this case was less than in the case where the distance between the cameras was 200 mm.

Table 1: Analysis of variance of the error of depth measurement by stereoscopy.

  Source                     DF    SS        MS        F
  Treatment                  11    65452.13  5950.19   374.94*
  Depth                      3     31222.35  10407.45  655.80*
  Cameras relative distance  2     25304.22  12652.11  797.24*
  Interaction                6     8925.556  1487.593  93.73*
  Error                      240   3808.76   15.87
  Total                      251   69260.89

  * denotes that the F-test is significant at the 1% level.

Table 2: Effect of depth variation on the mean depth measurement error (all means are significantly different at the 1% level according to Duncan's Multiple Range Test).

  Depth (mm)                          500   750   1000  1250
  Depth measurement mean error (mm)   6.43  13.1  24.4  35.63

Table 3: Effect of the distance between the two cameras on the mean depth measurement error (all means are significantly different at the 1% level according to Duncan's Multiple Range Test; camera distances are listed in the order of the treatment levels).

  Cameras relative distance (mm)      50    100   200
  Depth measurement mean error (mm)   33.5  16.5  9.67

Table 4: Effect of the camera distance and the object depth on the mean depth measurement error by stereoscopy (means sharing the same letter are not significantly different at the 1% level).

  Depth (mm)                           500     750     1000    1250
  Cameras distance 50 mm: error (mm)   10.14b  22.10c  41      60.76
  Cameras distance 100 mm: error (mm)  5.57a   10.48b  20.81c  29.14
  Cameras distance 200 mm: error (mm)  3.57a   6.71a   11.38b  17.00

Table 5: Errors in the direction of the Y axis (grouped by cameras relative distance of 50, 100 and 200 mm).

  Depth Z (mm)          500   750   1000  1250 | 500   750   1000  1250 | 500   750   1000  1250
  Average error (mm)    2.38  4.48  9.81  12.67| 2.05  3.19  5.9   8.14 | 0.86  2.1   3.62  5.1
  Error variance        1.45  2.36  3.26  5.43 | 0.95  1.66  4.59  6.23 | 0.53  1.69  3.85  4.09

Table 6: Errors in the direction of the X axis (grouped by cameras relative distance of 50, 100 and 200 mm).

  Depth Z (mm)          500   750   1000  1250 | 500   750   1000  1250 | 500   750   1000  1250
  Average error (mm)    2.71  4.57  9.57  13.43| 2.14  3.29  6.05  9.14 | 1.62  2.57  4.52  6.81
  Error variance        1.61  2.26  3.56  4.96 | 1.43  2.01  3.05  4.23 | 1.25  1.76  2.86  3.66

Table 7: Errors during the matching process.

  Cameras relative distance (mm)  I=50                       I=100                      I=200
  Depth Z (mm)                    500   750   1000  1250     500   750    1000   1250   500    750    1000   1250
  Percent of unmatched objects    6.25  7.77  9.14  10.03    8.62  11.86  22.75  26.46  15.94  21.49  33.33  39.59

Table 8: Percentage of success and failure in the harvest (180 flowers per treatment; columns correspond to the cameras relative distances of 50, 100 and 200 mm).

  Cameras relative distance (mm)   50     100    200
  Harvesting failures (number)     43     32     46
  Failure rate (%)                 23.88  17.77  25.55
  Successfully harvested (number)  137    148    134
  Success rate (%)                 76.11  82.22  74.44

REFERENCES

1. Umeda, M., S. Kubota and M. Iida, 1999. Development of "STORK", a watermelon-harvesting robot. Artificial Life and Robotics, 3: 143-147.
2. Plebe, A. and G. Grasso, 2001. Localization of spherical fruits for robotic harvesting. Machine Vision and Applications, 13(2): 70-79.
3. Reed, J.N., S.J. Miles, J. Butler, M. Baldwin and R. Noble, 2001. Automatic mushroom harvester development. J. Agricultural Engineering Res., 78(1): 15-23.
4. Rath, T. and M. Kawollek, 2009. Robotic harvesting of Gerbera jamesonii based on detection and three-dimensional modeling of cut flower pedicels. Computers and Electronics in Agriculture, 66: 85-92.
5. Okamoto, H. and W.S. Lee, 2009. Green citrus detection using hyperspectral imaging. Computers and Electronics in Agriculture, 66: 201-208.
6. Van Henten, E.J., J. Hemming, B.A.J. Van Tuijl, J.G. Kornet, J. Meuleman, J. Bontsema and E.A. Van Os, 2002. An autonomous robot for harvesting cucumbers in greenhouses. Autonomous Robots, 13(3): 241-258.
7. Tanigaki, K., T. Fujiura, A. Akase and J. Imagawa, 2008. Cherry-harvesting robot. Computers and Electronics in Agriculture, 63: 65-72.
8. Noordam, J.C., J. Hemming, C. Van Heerde, F. Goldbach, R. van Soest and E. Wekking, 2005. Automated rose cutting in greenhouses with 3D vision and robotics: analysis of 3D vision techniques for stem detection. Acta Horticulturae, 691: 885-892.

9. Heikkilä, J. and O. Silvén, 1997. A four-step camera calibration procedure with implicit image correction. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp: 1106-1112.
10. Zhang, Z., 1999. Flexible camera calibration by viewing a plane from unknown orientations. In Proceedings of the IEEE International Conference on Computer Vision, 1: 666-673.
11. Zhang, Z., 1998. Determining the epipolar geometry and its uncertainty: A review. International Journal of Computer Vision, 27(2): 161-195.
12. Loop, C. and Z. Zhang, 1999. Computing rectifying homographies for stereo vision. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1: 125-131.
13. Fusiello, A., E. Trucco and A. Verri, 2000. A compact algorithm for rectification of stereo pairs. Machine Vision and Applications, 12: 16-22.
14. Gonzalez, R.C., R.E. Woods and S.L. Eddins, 2004. Digital Image Processing Using MATLAB. New Jersey: Pearson Education Inc.
15. Forsyth, D.A. and J. Ponce, 2006. Computer Vision: A Modern Approach. Prentice Hall of India.