Camera-based selective weed control application module ("Precision Spraying App") for the autonomous field robot platform BoniRob




Ref: C0598

Camera-based selective weed control application module ("Precision Spraying App") for the autonomous field robot platform BoniRob

Christian Scholz, Maik Kohlbrecher and Arno Ruckelshausen, University of Applied Sciences Osnabrueck / Competence Center of Applied Agricultural Engineering COALA, Sedanstraße 26, DE-49076 Osnabrueck
Daniel Kinski and Daniel Mentrup, iotec GmbH, Albert-Einstein-Straße 1, DE-49076 Osnabrueck

Abstract

For a sustainable agriculture, limited resources, environmental impacts from chemicals (such as fertilizers or herbicides) and soil compaction caused by multiple passes of heavy agricultural machinery are challenges for innovative technologies. This paper addresses a new technology whose major goal is the reduction of herbicides; further advantages, such as reduced soil compaction by the carrier vehicle and reduced spraying of crop plants, can be achieved as well. The newly developed Precision Spraying App is an application module integrated in the multipurpose autonomous field robot platform BoniRob. The control between the robotic platform and the application module is similar to a tractor-implement management solution. Due to the size of the robot, a working width of about 1.5 m is achieved; economic benefits can be obtained by operating several robots in parallel ("swarm").

The precision spraying module consists of a nozzle holder with eight nozzles placed at a typical height of about 30 cm above ground level, adapted to first test applications in agricultural row cultures such as maize. Due to the orientation of the nozzles (adapted to the row structures) and their height, spraying of the crop plants is strongly reduced and focused on the weeds. The nozzles can be controlled individually. For plant detection, two low-cost color consumer cameras (640 x 170 pixels) are used for image acquisition under daylight conditions (without artificial illumination or specific shading). Image processing is based on the open source library OpenCV, which allows quasi-real-time behavior. The algorithms detect plants which, due to the field of view of the cameras within the row cultures, can be assigned as weeds with a high probability. The color filtering is performed in the HSV color space; from the subsequent segmentation the objects are identified, and the corresponding position information is used for selective nozzle control (e.g. opening of the valve), taking into account the driving conditions.

The communication between the field robot and the application module is based on the open software framework Robot Operating System (ROS), in particular via the Ethernet interface. Via this interface the module receives information about the instantaneous driving conditions, and thus the synchronization of the weed control action can be achieved. The application module has been tested in the laboratory as well as in first field trials, both on a test carrier vehicle and integrated into BoniRob. The technical functionality and robustness of the hardware and the algorithms have been evaluated for different sunlight conditions. Moreover, plant detection probabilities above 95% have been obtained in the first tests, and the timing of the image processing algorithms relative to the valve activation has been checked.
To summarize, a technical weed control solution based on an application module for a field robot platform, image processing and a flexible sprayer technology has been developed, and first field tests

have been successfully performed. In the next step the focus will shift from the technological development to the evaluation of agronomic parameters for different crop cultures.

Keywords: Selective weed control, field robot, Robot Operating System ROS, image processing

1 Introduction

For the reduction of environmental impacts related to weed control, decreasing the use of herbicides or increasing the share of mechanical weed control processes are topics of high relevance. In order to realize such options, interdisciplinary cooperation is indispensable: knowledge about crops, soil, agricultural machinery, sensors, electronics and computer engineering has to be fused to reach practical solutions with respect to economic as well as ecological aspects. Technology pushes such innovative agricultural options as an important supporting tool for the complex outdoor processes.

Present agricultural processes and technologies can be improved. One option, as an example, is more precise spraying that takes the local weed population into account. Sensor-based online measurements, optionally combined with offline information, allow site-specific crop protection by taking into account heterogeneities within the field (Oerke et al. 2010). However, technology can go even further and allows research investigations of radically new technologies for agriculture with high potential for combining economic and ecological benefits. One option is smaller agricultural field robots (Griepentrog et al. 2010).

Together with partners from research and industry, the authors have thus developed an autonomous field robot platform for investigating potential applications. The platform BoniRob is a multipurpose platform into which different application modules can be integrated (Bangert et al. 2013). At the moment there are six application modules in the development and application stage, developed by external research groups or within the BoniRob research groups. The name BoniRob was selected because the very first application of the platform is field-based crop phenotyping, for which the German word is "Bonitur". Figure 1 shows the basic idea of the multipurpose platform: the application modules have to fulfill the restrictions of the robot with respect to weight, size, speed, energy and data communication, and can be exchanged within a few minutes. In this work the authors focus on a Precision Spraying App for chemical weed control (Figure 1, top right). Imaging technologies and individually controlled spray nozzles are combined for weed control of individual weed plants.

Figure 1: Multipurpose BoniRob with examples of different Apps (Bangert et al. 2013)

Figure 2: BoniRob with Precision Spraying App during a field test

2 Materials and methods

2.1 Field robot platform BoniRob

For this work the multipurpose field robot platform BoniRob was used. This field robot has an empty space within its body, which can be used as the base for multiple BoniRob application modules called "Apps" (Figure 1). The combination of BoniRob and Apps can be compared with a tractor-implement combination. Various Apps with different purposes have been developed (Bangert et al. 2013).

2.2 Precision Spraying App

2.2.1 Hardware

This application module ("App") is a camera-based solution for selective weed control in crop rows. It can be integrated into BoniRob by using the defined mechanical, electrical and logical interfaces of the robot. The components of the Precision Spraying App are shown in Figure 3, left. The App frame is constructed from aluminum profiles. For plant detection under daylight conditions, two cameras (type: Microsoft LifeCam Studio) are mounted, each in the center between two crop rows (Figure 3, left). The cameras are attached to a housing in order to obtain water and dust resistance and are connected via a USB 2.0 interface to the industrial PC unit. For image processing and control of the App, an industrial PC unit with an Ethernet I/O coupler is integrated within a control box.

Figure 3: Precision Spraying App (left), operating range of the system (right)

Eight flat-jet nozzles (type XRC8002, opening angle 80°) are mounted on a holder (Figure 3, left) to apply the herbicide to the weeds. The herbicide is conveyed from a tank by an electric pump, and magnetic valves (24 V DC) control the nozzles individually. For this first test application, the weed detection is concentrated on the area between the crop rows. Figure 3, right, shows the schematic layout of the operating range of the system. Number one indicates the area between the crop rows. The red centerline splits this working area into two parts, one for each spraying nozzle (center left and center right). The red left and right lines (number two) mark the areas for the side spraying nozzles directed into the crop rows. The operating range of four spraying nozzles is approximately 0.75 m, which results in an entire working width of about 1.5 m with eight nozzles. To increase the working width, additional nozzle holders are needed. Due to the orientation of the nozzles and their height above ground level (adapted to the row structures), spraying of crop plants is strongly reduced and focused on the weeds. The heights of the cameras and of the nozzle holder above ground level are adjustable for a variable adaptation to field conditions.

2.2.2 Data management

The modular architecture of the application module is shown in Figure 4. The two cameras are connected via USB 2.0 to an industrial PC unit (www.beckhoff.com).

Figure 4: System architecture of the App

This unit consists of a PC and a modular input/output (I/O) unit. The I/O unit handles the incoming data and transfers it via an Ethernet network to the PC. The PC runs a software application with a plant detection algorithm based on image processing. For this image processing, the open source computer vision library OpenCV (www.opencv.org) is used. The algorithms detect plants which, due to the field of view of the cameras within the crop rows, can be assigned as weeds with a high probability. The color filtering of the algorithm is realized in the HSV color space; the objects within the image are identified and marked with a red border. The corresponding position information is used for individual nozzle control, e.g. opening of the magnetic valve. The trigger timing depends on the speed of the field robot.

2.2.3 Robot Operating System (ROS)

The field robot platform BoniRob and the Precision Spraying App use the open software framework Robot Operating System (ROS) for control and communication (Quigley et al. 2009). Via the defined interface, status codes (e.g. "start plant detection", "stop plant detection") are exchanged. This communication is essential for autonomous plant detection with BoniRob and the Precision Spraying App.
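The paper does not reproduce the detection code itself; the following Python sketch only illustrates how the HSV color filtering, segmentation and nozzle-zone assignment described in Section 2.2.2 could be realized with OpenCV. The threshold values, the minimum object area, the four-zone split and the function name detect_weeds are illustrative assumptions and not the authors' implementation.

```python
# Minimal sketch of HSV-based plant detection with nozzle-zone mapping.
# Thresholds, zone count and sizes are assumptions chosen for illustration.
import cv2
import numpy as np

HSV_LOWER = np.array([35, 60, 40])    # assumed "green" range (OpenCV H: 0-179)
HSV_UPPER = np.array([85, 255, 255])
MIN_AREA = 200                        # ignore tiny segments (noise), in pixels
NUM_ZONES = 4                         # nozzle zones per camera (cf. Figure 3, right)

def detect_weeds(frame_bgr):
    """Return the frame with detected plants marked by a red border and the
    set of nozzle zones containing at least one detected plant."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LOWER, HSV_UPPER)            # color filtering
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,             # remove speckle
                            np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)   # segmentation
    zone_width = frame_bgr.shape[1] / NUM_ZONES
    active_zones = set()
    for c in contours:
        if cv2.contourArea(c) < MIN_AREA:
            continue
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame_bgr, (x, y), (x + w, y + h), (0, 0, 255), 2)
        zone = int((x + w / 2) // zone_width)                 # object center -> nozzle zone
        active_zones.add(min(zone, NUM_ZONES - 1))
    return frame_bgr, active_zones

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                                 # USB camera between the rows
    ok, frame = cap.read()
    if ok:
        marked, zones = detect_weeds(frame)
        print("Nozzle zones to trigger:", sorted(zones))
```

In the module described above, such zone indices would then be forwarded to the Ethernet I/O coupler, which switches the corresponding magnetic valves.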

2.3 Laboratory tests

Laboratory tests were performed at two different locations. The first test was an indoor test to validate the image processing algorithm under defined conditions. A 2000 W halogen spotlight was used in order to create consistent illumination without shadows within the container. The test was performed on a conveyor belt in two parts. In the first part a static test was performed.

Figure 5: Measurement setup for indoor (left) and outdoor (right) tests

The second part was a dynamic test at a conveyor belt speed of approximately 4.5 km/h in order to validate the tripping characteristics of the nozzles in combination with the image processing algorithm. As the measuring object, a container with soil and plants was placed centrally below the camera (Figure 5, left). This setup simulated the area between two crop rows. Figure 5, left, shows the measurement setup for the indoor test. The red lines within the image indicate the areas of the nozzles according to the schematic layout at the top.

The second test was an outdoor test to validate the image processing algorithm under outdoor conditions. As in the indoor tests, a container with soil and plants was placed centrally below the camera. Figure 5, right, shows the measurement setup for the static outdoor test. Within the container a shadow was created in order to obtain a realistic simulation of field conditions (e.g. shadows from plants, the field robot, etc.). The illumination was measured during the test at three different locations using a commercial light meter (type: LX-107). The measured illumination values are shown at the top of Figure 5, right.

2.4 Field tests

A first application test was performed with the field robot platform BoniRob and the Precision Spraying App in the field. For this test BoniRob was controlled manually with a control panel. The plant detection was performed automatically using the image processing algorithm described above. For these field tests one location (Germany, 52°19'59.0"N 7°58'20.6"E) with six field plots (2.5 m x 15 m) was used (Figure 6). The selective weed control was performed over a working width of approximately 1.5 m in every plot. This working width is shown in Figure 6 as the red-bordered areas. The total herbicide Roundup (Glufosinate) was applied at a rate of 4 liters per hectare for this field test. The blue-bordered areas in Figure 6 were not treated with the herbicide.
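The dynamic conveyor-belt test in Section 2.3 checks that the valve triggering stays synchronized with the image processing at a known travel speed. As a rough illustration of this speed-dependent timing, the following sketch computes the delay between plant detection and valve opening; the camera-to-nozzle offset of 0.40 m and the spray duration are values assumed only for this example, since the paper does not state them. In the integrated system the current speed would be received from BoniRob via the ROS/Ethernet interface of Section 2.2.3.

```python
# Sketch of speed-dependent valve triggering (assumed geometry and timing).
import time

CAMERA_TO_NOZZLE_M = 0.40   # assumed offset between camera view and nozzle position
SPRAY_DURATION_S = 0.20     # assumed valve opening time per detected plant

def trigger_delay(speed_kmh):
    """Time between plant detection and valve opening for a given travel speed."""
    speed_ms = speed_kmh / 3.6
    return CAMERA_TO_NOZZLE_M / speed_ms

def spray_zone(open_valve, close_valve, speed_kmh):
    """Wait until the detected plant reaches the nozzle, then open the valve briefly."""
    time.sleep(trigger_delay(speed_kmh))
    open_valve()
    time.sleep(SPRAY_DURATION_S)
    close_valve()

# At the conveyor-belt speed of the dynamic test (about 4.5 km/h), the assumed
# 0.40 m offset corresponds to a delay of roughly 0.32 s.
print(round(trigger_delay(4.5), 2))  # -> 0.32
```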

Figure 6: Field for the application test with plots; red bordered: working width for selective weed control, blue bordered: untreated areas within the plots

3 Results and Discussion

3.1 Laboratory tests

Figure 7, left, shows the resulting image processing within the indoor test on the conveyor belt. All plants inside the container are detected and marked with a red border. The red vertical lines identify the areas for the spraying nozzles according to Figure 3, right.

Figure 7: Resulting image processing of the laboratory test (left) and the outdoor test with shadow (right)

The tripping characteristics of the nozzles were validated with three different plants inside the container on the conveyor belt at a speed of approximately 4.5 km/h. The container passed the module 100 times and the tripping was observed. The nozzles triggered 300 times, which corresponds to an approximately 100% plant detection rate at a speed of 4.5 km/h.

Figure 7, right, shows the resulting image processing of the outdoor test. All plants inside the container are detected and marked with a red border. Thus, no negative effects of the different light conditions (shaded and sunlit areas) on the detection and marking of plants with the image processing algorithm were observed.

3.2 Field tests

Figure 8 shows a zoomed-in view of field plot 1 according to Figure 6. Both images show the working width as the red-bordered areas and the untreated areas as the blue-bordered areas. Figure 8, left, shows the field plot before the weed control. Figure 8, right, shows the same field plot one week after weed control with the total herbicide Roundup. Within the working width, marked as the red-bordered area, no plants remain due to the herbicide (Figure 8, right). All plants were detected and sprayed with the herbicide. The achievable reduction of herbicide use was not the focus of this paper and needs to be systematically investigated in further studies.

Figure 8: Field plot; red bordered: working width, blue bordered: untreated area; before weed control (left) and one week after weed control (right)

4 Conclusions

In this paper an application module ("App") for chemical weed control was described and validated in laboratory and field tests. The first results show the potential of the field robot platform BoniRob with the Precision Spraying App for chemical weed control in the field. The modular system architecture allows further extensions, e.g. an RTK-GNSS receiver to log the spraying positions in the field, in line with further studies (Scholz et al. 2014). The next step will be further field tests in order to validate the possible savings of herbicides due to the selective weed control in comparison with comprehensive weed control.

5 Acknowledgements

This work is part of the EU project SmartBot and is funded by EU INTERREG (regions EDR and Euregio).

6 References

Bangert, W., Kielhorn, A., Rahe, F., Albert, A., Biber, P., Grzonka, S., Haug, S., Michaels, A., Mentrup, D., Hänsel, M., Kinski, D., Möller, K., Ruckelshausen, A., Scholz, C., Sellmann, F., Strothmann, W., Trautz, D. (2013). Field-Robot-Based Agriculture: "RemoteFarming.1" and "BoniRob". 71st conference LAND.TECHNIK AgEng 2013, pp. 439-446.

Griepentrog, H.-W., Ruckelshausen, A., Jörgensen, R. N., Lund, I. (2010). Autonomous systems for plant protection. In: Precision Crop Protection - The Challenge and Use of Heterogeneity (Editors Oerke, E.-C., Gerhards, R., Menz, G., Sikora, R. A.), Heidelberg, Springer-Verlag (Chapter 20).

Oerke, E.-C., Gerhards, R., Menz, G., Sikora, R. A. (2010). Precision Crop Protection - The Challenge and Use of Heterogeneity. Heidelberg, Springer-Verlag.

Quigley, M., Conley, K., Gerkey, B., Faust, J., Foote, T., Leibs, J., Wheeler, R., Ng, A. Y. (2009). ROS: an open-source Robot Operating System. ICRA Workshop on Open Source Software.

Scholz, C., Göttinger, M., Hinck, S., Möller, K., Ruckelshausen, A. (2014). Automatic soil penetrometer measurements and GIS-based documentation with the autonomous field robot platform BoniRob. To be published at the 12th International Conference on Precision Agriculture ICPA 2014.