Eye and Hand Coordination: The Future of Robotics

December 2015
Author: Advantech
E-mail: eainfo@advantech.com
www.advantech.com

Continuing advances in robotics have clearly made robots one of the most important tools in the manufacturing toolbox. Giving robots eyes makes them much more productive and able to perform more complex tasks. This paper will discuss how robot vision can make robot assembly smarter, faster, more reliable, and of higher quality. Robot vision and increased computer processing power are the keys to robots that are less expensive, easier to operate, and easier to program. Without vision, a robot must be programmed to move to carefully measured coordinates and repeat the motion. With vision as a feedback mechanism, it is now possible to do random picking with near 100% accuracy.

Vision Guided Robotics: A Short Primer

Vision Guided Robotics (VGR) is the use of vision systems, cameras, sensors, and feedback loops to control a robot's movements and programming. Instead of the robot "blindly" performing its tasks, a vision guided robot has been given eyes to see with, which provide real-time feedback to the robot's controller. Where you locate those eyes is important. Industrial robots are not really anthropomorphic: except for Rethink Robotics' flagship product, the famous Baxter, they consist of a traversing mechanism, usually mounted to the floor, and an arm that is programmed to do the work. The two types of VGR robots are Eye and Arm, and Eye in Arm.

Figure 1: Eye and Arm, and Eye in Arm

Machine vision for guidance of robots provides the robot with information on where the component to be processed or moved is located in space. This makes it possible to automate the processing of components without having to position each component precisely, or use a jig to hold it in a specific location. This reduces cycle time and cost substantially.
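In practice, "vision as a feedback mechanism" amounts to a short control loop: acquire an image, locate the part, convert its pose into robot coordinates, and command the move. The sketch below illustrates that loop in Python; the camera, detector, and robot objects and their methods are hypothetical stand-ins, not any specific vendor's API.

    # A minimal vision guided picking loop (illustrative sketch only; the
    # camera, detector, and robot interfaces are hypothetical stand-ins).

    def pick_loop(camera, detector, robot, camera_to_base):
        """Repeatedly locate a part in the camera image and pick it up."""
        while True:
            image = camera.grab()                # acquire a frame
            part = detector.find_part(image)     # locate a part in pixel space
            if part is None:
                continue                         # nothing recognizable in view
            # Convert the detected pose from camera coordinates into the
            # robot's base frame; calibration supplies camera_to_base.
            pose_in_base = camera_to_base(part.pose)
            robot.move_to(pose_in_base)          # approach the part
            robot.close_gripper()                # pick it
            robot.move_to_home()                 # carry it to the drop-off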

Vision sensors and lighting units mounted on the robot or in its manipulator (gripper) can be subjected to intense stresses from movement, vibration, temperature, and impact. Sensors must be compact, easy to maintain, easy to replace without losing registration, and sturdy enough for the service. As digital sensors have replaced optical cameras in many applications, these sensors have become tougher and smarter. Vision sensors work by capturing the relevant optical features of the component (in visible light, infrared, or lidar) using the illumination and image acquisition technology best suited to the application. They must be fast, so that the vision processor can deliver precise, real-time determination of points, features, or bodies in the plane or in space using robust algorithms. They must handle coordinate transformations between the coordinate systems of the sensor, the robot, the object, and the cell, conveyor, or platen, and calibrate the entire system in real time, on the fly. The vision system must have real-time communications, via Ethernet, between the image processor and the robot controller. It must also be able to integrate additional quality procedures for measurement and inspection, including documentation. And the vision controller must be easy to program, with a simple user interface and good software design.

Eye and Arm

In many early vision guided robotic designs, the cameras, the processing computer, and the controller were large and cumbersome, and the additional weight was an unnecessary encumbrance to the robot arm itself. As vision system components have become smaller and less costly, with cameras originally designed for cell phones now having the resolution to be useful in robotic applications, it has become commonplace to mount the vision system directly to the arm of the robot. There are still applications where the camera must be mounted remotely: in the placement of hot castings, or in high temperature welding, the camera or vision sensor may not tolerate the exposure.

Eye in Arm

Typically, now, the design of choice is eye in arm, where the vision system is mounted directly to the robot arm. Especially in applications where stereoscopic 3D imaging is necessary, this makes the package (robot, vision system, and controller) small and compact, allowing use in cramped areas such as the undercarriage of automobiles on the assembly line, or inside tanks with small access ports for visual inspection. While the size and cost of vision sensors (which can include optical, infrared, laser, and lidar) have dropped dramatically, and they can now be operated over PoE (Power over Ethernet) and USB connections, the complexity of programming and analysis software has increased dramatically. A human does all of the computations necessary to pick up or put down an object instinctively. Imagine the complexity of the computations necessary to pick up a pen and write a sentence; now imagine having to program a robot to do the same thing. Using machine vision as a feedback circuit makes these computations simpler. Because the robot can see the locations of the objects in its field of vision, and it knows where its manipulator is, it can establish its coordinates and its axes of movement from those factors. This means that the object to be manipulated does not have to be exactly in the right place every time. The vision system provides enough feedback that the manipulator's controller can adjust the robot arm for any shift in the position of the object.
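The coordinate bookkeeping described above is a chain of rigid-body transforms. The sketch below shows, using numpy, how a point measured in the camera frame is mapped into the robot's base frame with a 4x4 homogeneous transform; the matrix values are invented for illustration and would come from hand-eye calibration in a real cell.

    import numpy as np

    # Homogeneous transform from the camera frame to the robot base frame.
    # In a real cell this matrix comes from hand-eye calibration; the
    # rotation and translation below are illustrative values only.
    T_base_cam = np.array([
        [0.0, -1.0, 0.0, 0.40],   # rotation: camera axes expressed in base axes
        [1.0,  0.0, 0.0, 0.10],
        [0.0,  0.0, 1.0, 0.75],   # camera mounted 0.75 m above the base origin
        [0.0,  0.0, 0.0, 1.00],
    ])

    # A part location reported by the vision system, in camera coordinates
    # (meters), written as a homogeneous point.
    p_cam = np.array([0.05, 0.12, 0.60, 1.0])

    # Map the measurement into the robot's frame so the controller can act on it.
    p_base = T_base_cam @ p_cam
    print(p_base[:3])  # the pick point in base coordinates: [0.28, 0.15, 1.35]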

Increasingly, it is becoming possible to teach this, rather than program it. Shape recognition, part orientation, and other characteristics can now be taught to an advanced robot controller, rather than having to program each position. Stereoscopic vision sensors, for example, work according to the same principle as the human eye, determining the 3D coordinates of a point in space using two images from sensors mounted stereoscopically (next to each other, in the same plane, with a small separation, and focused on the same spot). The difference between the two images is used to determine the locus of the part in three dimensions.

Figure 2: Stereoscopic vision

Lidar (laser radar) sensors are sometimes used instead of optical sensors; they use a time-of-flight differential to locate the part in space.
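Both principles reduce to short formulas: for a calibrated stereo pair, depth follows from the disparity between the two images as Z = fB/d, and for lidar, range follows from the round trip time of the light pulse as R = ct/2. A small numeric sketch in Python, with values invented for illustration:

    # Depth from stereo disparity, and range from lidar time of flight.
    # All numbers are illustrative, not taken from any particular sensor.

    FOCAL_LENGTH_PX = 1400.0   # focal length in pixels, from calibration
    BASELINE_M = 0.06          # separation between the two sensors, meters

    def stereo_depth(disparity_px: float) -> float:
        """Depth of a point seen in both images: Z = f * B / d."""
        return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def lidar_range(round_trip_s: float) -> float:
        """Range from the pulse's round trip time: R = c * t / 2."""
        return SPEED_OF_LIGHT * round_trip_s / 2.0

    print(stereo_depth(disparity_px=120.0))  # ~0.70 m away
    print(lidar_range(round_trip_s=5e-9))    # ~0.75 m away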

Early VGR Adopter Industries

Industries that adopted vision guided robotics early included automotive, packaging, food and beverage processing, and metal processing and machining. These are all areas where it is difficult to provide safety areas for non-guided robots, and where economics made it necessary for the use of robots to expand beyond very simple tasks. Early vision guided robots had 2D vision systems, which allowed them to control movement in one or two planes with no depth-of-field perception. Applications for these robots include controlling labeling and marking systems, riveting, filling barrels and other containers, loading and unloading presses and processing machines, handling latticed boxes and pallets, and continuous welding operations. But 3D robots can track the position of a moving object in space in all six degrees of freedom. 3D vision systems can control painting robots; perform delicate mounting tasks in three dimensions, such as mounting glass, roofs, and cockpits; monitor for spatial collisions for safety; control harvesting machines; and even load and unload baggage at airports, where the size, shape, weight, and color of the baggage are entirely random.

VGR and the Internet of Things

Because vision sensors and systems are now more and more built on digital sensors rather than optical cameras, they are increasingly functioning as part of the Industrial Internet of Things (Industry 4.0, or Smart Manufacturing). Thus vision data, as well as diagnostic data on the sensor, the robot itself, and its environment, can be delivered to the Internet of Things as the robot is working. For example, a vision sensor can report that the ambient temperature around it is increasing. It could report that it needs its aperture cleaned, or that something is in its field of view that should not be there. Conversely, data from the rest of the plant, or from other controllers and vision systems, can be downloaded over Ethernet to the robot vision system's controllers (quality parameters, new product specifications, diagnostic tests, and new programs) on the fly.

VGR, the Cloud and Asset Management

Data from a vision-guided robot can be uploaded directly to the Cloud. Throughput, the number of rejected parts and the vision reasons for rejection, diagnostics on both the vision system and the robot, temperature data, and other variables can all be sent directly to the Cloud. The plant asset management system can pull that data, add it to data from other VGRs, and determine the maintenance cycle that will provide the most uptime for the least expenditure.

Designing a VGR System

A vision guidance system for a robot has three basic functions. The first is calibration: conversion from the pixel coordinates of the vision sensors or cameras to real-world coordinates, and vice versa. The second is position and orientation recognition: the system must be able to tell the position and orientation of the objects in the application it is running. It must be able to tell the difference between a conveyor belt and an object on that conveyor. It must be able to recognize when the robot has picked or placed an object, and determine whether it has done so correctly. Third, it must communicate the results to and from the robot controller. In addition, the vision system must be able to communicate with the cell or area controller, and/or the Cloud, in real time. Companies like Advantech provide Ethernet- and USB-based vision systems and computers that enable these applications.
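For a camera looking at a flat work surface, the calibration function reduces to a plane-to-plane mapping. The sketch below uses OpenCV to estimate a homography from four known pixel/world point pairs and then converts new pixel detections into world coordinates; the point values are invented for illustration, and the inverse mapping (world to pixel) is simply the inverse of the same matrix.

    import cv2
    import numpy as np

    # Four reference points seen in the image (pixels) and their known
    # positions on the work surface (mm). The values are illustrative; a
    # real cell would measure them with a calibration target.
    pixel_pts = np.array([[100, 80], [520, 90], [510, 400], [110, 390]],
                         dtype=np.float32)
    world_pts = np.array([[0, 0], [300, 0], [300, 200], [0, 200]],
                         dtype=np.float32)

    # Homography mapping pixel coordinates to world coordinates on the plane.
    H, _ = cv2.findHomography(pixel_pts, world_pts)

    def pixel_to_world(u: float, v: float) -> tuple:
        """Convert one detected pixel location to world coordinates (mm)."""
        p = H @ np.array([u, v, 1.0])
        return (p[0] / p[2], p[1] / p[2])

    print(pixel_to_world(315, 240))  # a detection near the middle of the plate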

2D and 3D Vision Systems

Most advanced vision guided robots today use 3D stereoscopic sensing, whether in the optical wavelengths or the infrared, or with a lidar (laser radar) sensor. Some sensor assemblies have multiple sensors so that, for example, it becomes possible to control and monitor the application of a complex function like adhesive or sealant and, at the same time, inspect the adhesive placement for quality control.

Accuracy, Quality, Productivity

Vision guidance for robots was developed to produce a simpler, highly accurate way to maintain quality and increase productivity. Using a VGR, it is possible to simply dump out a box full of components at random, and the robot's vision system will sort them, pick the right components, orient them correctly, perform the required operation on them, and move to the next.
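Sorting and orienting randomly dumped parts begins with recognizing each part's position and angle in the image. Below is a minimal sketch of that step with OpenCV; the synthetic test frame and the threshold and area values are invented for illustration.

    import cv2
    import numpy as np

    # Stand-in for a camera frame: a blank image with one rotated part drawn in.
    frame = np.zeros((480, 640), dtype=np.uint8)
    box = cv2.boxPoints(((320, 240), (120, 40), 30.0))  # a part at 30 degrees
    cv2.fillPoly(frame, [box.astype(np.int32)], 255)

    # Segment bright parts from the dark background and outline each one.
    _, mask = cv2.threshold(frame, 128, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    for contour in contours:
        if cv2.contourArea(contour) < 500:
            continue  # ignore specks and noise
        # minAreaRect returns ((cx, cy), (w, h), angle): the part's center in
        # pixels and its rotation, enough to plan an oriented grasp.
        (cx, cy), (w, h), angle = cv2.minAreaRect(contour)
        print(f"part at ({cx:.0f}, {cy:.0f}) px, rotated {angle:.1f} degrees")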

Comparing Types of Vision Systems: Eye in Hand and Eye and Hand

Unlike a human being, whose eyes are located firmly in the head, the vision sensors for a VGR need not be located in any particular place. Vision sensors can be located away from the grabber (Eye and Hand) for any number of reasons, including the environment in which the grabber is working; for example, a sensor may be mounted remotely because of temperature, pressure, vibration, or some other hazard. When designing a very compact system, or one in which the grabber must penetrate into an object to perform its function, the vision sensors can be located directly on the grabber (Eye in Hand). Modern CCD, infrared, and lidar sensors are significantly smaller and more robust than the optical cameras of the past, and are easier than ever to design into Eye in Hand applications.

Design for Robustness, Design for Maintenance

Fast motion, jerky start-stop cycles, and repetitive impact are hallmarks of robot service. Selecting the correct vision sensor system is paramount when designing the system. But the system should be designed not just for robustness, but also for easy, economical maintenance. The sensor itself must be replaceable without re-calibration of the system, and it should be inexpensive enough that the decision to replace it is a maintenance issue, not an operations one.

The Future of Vision Guided Robotics

Vision guided robotics is not limited to single arm, pick and place robots. In the future we will see vision guided automated guided vehicles (AGVs) without preconfigured trackways carrying parts and inventory from one part of the plant to another. We will see robots that work in all six degrees of freedom because they are spatially aware, thanks to the vision capability of their controllers. We will see smart robots that are connected to the Cloud, and to the entire factory enterprise. We will see robots taking their place in the Industrial Internet of Things, and Industry 4.0.

One Size Doesn't Fit All

Since the applications and industries for vision systems are so varied and so widespread, there will not be a single optimal vision system. 2D vision systems will continue in use, because they are becoming extremely inexpensive and will work for a variety of operations, as we have discussed. 3D vision systems will become cheaper, more accurate, smaller, and faster, to the extent that they will be able to do many operations that robots have so far been limited from doing, including acting relatively autonomously. Placement of the vision system is also highly variable, and will continue to be so for some applications. Vision systems guiding robots in high temperatures, or in areas of high corrosivity or abrasion, may be located remotely from the manipulator arm, while others in more temperate areas of the plant may simply be built into the arm itself.

Robustness/Reliability

As vision systems have become digital, and significantly smaller, their robustness has increased. As CCD vision has replaced optics, systems have become less affected by vibration, motion, g-forces, and impact. Vision systems that piggyback on cell-phone technology are also more robust, and offer the reliability a cell phone must have, far better than the optical vision systems of the past. In addition, reliability is enhanced by connecting the device to the asset management system, either directly via Ethernet or via the Cloud and TCP/IP. When the device auto-reports its diagnostics on a continuous basis, it becomes possible to schedule preventive maintenance, as well as to collect information about device reliability that can be fed back to the design team of the vision system or the robot.

Price vs. Performance

The cost of vision systems has plunged with the advent of modern sensors and processing power. At the same time, the performance capabilities of the systems have dramatically increased. The vision sensor package is usually significantly smaller than in previous generations, and the robustness per unit price has gone up considerably. In general, the cost of using vision-guided robots has fallen, and their performance has increased dramatically.

Ease of Use and Programming

There are three cost centers in using VGRs. The first is the purchase of the robot and vision system; as we have seen, this is becoming more economical. The second is the installation of the system; since VGRs are safer than non-vision-guided systems, the area surrounding the robot may be smaller, and the safety cage can be less expensive and less intrusive. The third is the programming of the device for the application. In some cases this remains the most expensive part, although show and tell programming and learned movements using Artificial Intelligence-type software are becoming more prevalent, and will reduce that cost as time goes on.
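Show and tell programming replaces hand-coded coordinates with recorded demonstrations: an operator guides the arm through the task, the controller stores waypoints, and the program is a replay of them. A minimal sketch of that idea in Python; the robot interface implied here is a hypothetical stand-in for a real controller.

    # Teach-by-demonstration in miniature: record hand-guided waypoints,
    # then replay them as a program. The robot object passed to replay()
    # is a hypothetical stand-in for a real controller interface.

    class TaughtPath:
        def __init__(self):
            self.waypoints = []  # recorded poses, e.g. (x, y, z, rx, ry, rz)

        def record(self, pose):
            """Store the arm's current pose while the operator guides it."""
            self.waypoints.append(pose)

        def replay(self, robot):
            """Run the taught motion: visit each recorded pose in order."""
            for pose in self.waypoints:
                robot.move_to(pose)

    # During teaching, call record() each time the operator confirms a
    # position; in production, call replay() with the live controller.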

Vision guided robots are the future of industrial robotics. They will allow robots to be used in many different configurations and in many different applications, from very fine work in semiconductor fabrication and pharmaceuticals to heavy duty applications in metal fabrication and automotive assembly.