New trends in industrial robot controller user interfaces

Azin Aryania*, Balazs Daniel**, Trygve Thomessen*** and Gabor Sziebig*
* Narvik University College, Department of Industrial Engineering, Narvik, Norway
** Budapest University of Technology and Economics, Department of Mechatronics, Optics and Engineering Informatics, Budapest, Hungary
*** PPM AS, Trondheim, Norway
azin.aryania@gmail.com, daniel@mogi.bme.hu, trygve.thomessen@ppm.no, gabor.sziebig@hin.no

Abstract
The robot controller user interface is one of the most technically challenging issues for robot manufacturers, because a human is an actor in the interaction. Many developments have aimed at intuitive, easy interaction between the user and the robot controller, while the complexity of robot systems is continuously increasing. One way to obtain an efficient controller user interface is to let the user operate a flexible Graphical User Interface (GUI). The benefit of this approach is that users with different levels of proficiency can customize their own efficient interfaces, rather than facing a standard interface whose large number of components may confuse users and make them dependent on training. This paper first briefly describes trends in robot controller user interface design for industrial use cases, then discusses three intuitive user interfaces that are available or under development, and finally introduces FlexGUI, an efficient user interface. A case study briefly illustrates how this intuitive user interface can be adopted effectively by a user to execute a robot task.

I. INTRODUCTION
Intuitive user interfaces in human robot interaction (HRI), a field of robotics, are receiving increased research attention, with the aim of developing new techniques for transferring information from human to robot and designing effective tools that simplify robot control for the human [1-2]. The goal of this research is an intuitive system through which a human can interact and communicate with an industrial robot in a friendly way. When designing an intuitive user interface for an HRI system, the importance of the human subsystem, represented by practitioners in multidisciplinary teams, comes to the fore, with the aim of minimizing challenging human issues [3].

Previous studies have shown that the most popular approaches to system design are machine-centered and human-centered. For intuitive user interfaces in HRI, the machine-centered approach is unsuccessful, since it restricts the system to professional users, which is contrary to our goal of a flexible system for both professional and non-professional users; a complex airplane control system is a typical example of this approach. The opposite, human-centered approach is a multidisciplinary activity that improves efficiency and productivity along with human working conditions by combining human factors and techniques. Accordingly, the robot should be designed flexibly enough to adapt to the human by learning human characteristics and behaviors, while robot factors are minimized. It is quite difficult for robots to maintain a human model and execute interactions in a human-friendly way, due to the complexity and variability of users' characteristics and behavior [4-5]. This paper introduces an effective Graphical User Interface (GUI) system for existing NACHI robots, called FlexGUI.
FlexGUI is a novel solution in that it creates a flexible GUI system adaptable to the user's knowledge level, which is the origin of its name. A GUI is an interface on which the user clicks icons, windows and menus on a visual screen, using a pointing device such as a mouse, in order to interact with a machine. Although intuitive user interfaces in HRI differ from human-machine interaction (HMI), because they concern systems that have complex, dynamic control systems, exhibit autonomy and cognition, and operate in changing, real-world environments [6], we classify the general categories of interacting with a robot based on a broad survey of HMI. These categories can be used alone or in combination: 1. Acoustic (sound), 2. Optics (light), 3. Bionics, 4. Motion, 5. Tactile (touch) [7]. In this research we focus on the optics category: intuitive screens on the controller. Today, several research institutions are dedicated to the design of teach pendants and the systems running on them, as the teach pendant is the most common interface for programming and controlling an industrial robot.
Figure 1 shows the basic robot system configuration; an existing industrial robot generally comprises a teach pendant, a robot and a controller.

Fig. 1. Basic configuration of an industrial robot system [8].

The rest of this paper is organized as follows. Section II considers trends in intuitive user interfaces for industrial use cases and briefly describes present solutions. Section III introduces an effective Graphical User Interface (GUI) system called FlexGUI, with a focus on customizing the user interface, and closes with a table comparing all solutions. Section IV shows the simplicity of using FlexGUI in a case study, and finally Section V presents the conclusion.

II. TRENDS IN INTUITIVE USER INTERFACES AND PRESENT SOLUTIONS

The first interactions between humans and robots started in the 1940s. In the beginning, this interaction was primarily unidirectional: simple on-off controls or analogue joysticks for operating manipulator joints and remote vehicles. Over time, as robots have become more intelligent, communication between humans and robots has become less like using a passive hand tool and more like the relationship between two human beings [9]. The main goal of developing intuitive user interfaces in HRI is to bring robots closer to humans, allowing them to work not only for but also with humans. Although teach pendant functionality has progressed rapidly, the concept and overall look of the pendant remain the same today; for instance, the big red button is always placed at the top right side of the teach pendant. Figure 2 shows the evolution of teach pendant design: a modern teach pendant is placed in the middle, and outdated teach pendants are on both sides.

Fig. 2. Development of the robotic teach pendant [10].

From a psychological point of view, human robot interaction is based on productive compatibility between human beings and their artificial partners, and effectiveness in this interaction can be achieved through the friendly design and pleasing appearance of the robotic creatures, or through a multimodal interactive interface based on the open-loop principle that allows people to communicate with the robot on all levels [11-12]. The most challenging issue for industrial robotics is obtaining optimal and economically feasible solutions for flexible automation. To use robots optimally, robot manufacturers have had to make great efforts to adapt the robot control to the needs of the automation industry with respect to communication systems and user interfaces, while meeting basic requirements on cost efficiency, high reliability and high productivity [13]. Robots are normally delivered with a general user interface, designed with the same set of screens and buttons regardless of the task to solve. However, different processes require different operator interfaces to be truly efficient: a well-designed operator interface presents the appropriate information and functionality at the right time. In recent years, driven by the competitive environment in the automotive industry, many developments have been made and several applications and solutions have been proposed for customized intuitive user interfaces. In this section we present three of these developed and developing solutions.

1) ABB developed the Robot Application Builder (RAB) to give robot users the opportunity to customize their own interfaces.
A RAB application can be developed for either a PC or the FlexPendant. The FlexPendant, ABB's teach pendant, offers the flexibility to generate a customized graphical user interface to the robot controller in order to simplify operator interaction; see Figure 3. RAB is a Software Development Kit (SDK) installed on a PC, and it depends on Microsoft's development environment, Visual Studio. Custom operator interfaces created in Robot Application Builder are automatically loaded into the controller and appear on the touch-screen FlexPendant or a PC screen as active buttons, fields, text boxes and so on, wherever safe and efficient operator control is required [15].

Fig. 3. ABB FlexPendant [16].

2) KUKA introduces the smartPAD, shown in Figure 4. The new KUKA smartPAD demonstrates, on a large, high-resolution, anti-reflective touch screen, how robots can be operated intuitively. Operations are presented to the user in a transparent manner by means of intelligent, interactive dialogs. The user always has at
his/her disposal precisely the operator control elements that he/she actually needs at any given moment, with the aim of focusing attention on what is important, so that users can work more intuitively and thus more easily, quickly and efficiently [17].

Fig. 4. KUKA smartPAD [17].

3) Reis Robotics is developing the ReisPAD, a smart robot programming device shown in Figure 5. This intuitive user interface under development by Reis is basically inspired by the iPad concept, with similar graphics, functionality and a large screen, and with the possibility to customize the user screen using its software on a PC. With the innovative programming handset used in the ReisPAD, programmers work on one control panel without any mechanical or electrical operating elements. The ReisPAD offers a scratch-resistant multi-touch display, intuitive operation, web access, an integrated logbook function and a WLAN function [18].

Fig. 5. ReisPAD smart robot programming device [18].

Even though it is very difficult to predict the long-term development directions of future robot applications with very complex work entities, for instance in disassembly, spot welding, sorting or cleaning, robots will likely be supported by advanced vision systems, probably wireless and 3D, combined with color and texture [19].

III. FlexGUI: CUSTOMIZING THE USER INTERFACE

The research and practical work done to create FlexGUI required not only knowledge of robotics but also a combination of several other fields, such as psychology and computer science. FlexGUI is being developed by the Norwegian robot system integrator PPM AS in collaboration with the Japanese robot manufacturer NACHI Fujikoshi. The main intention was to bridge the gap between human and robot by adapting the robot to the level of the user, in order to provide a user-friendly system. FlexGUI is adapted both to the novice user's need for an easy-to-use interface and to the advanced user's need for advanced functionality, instead of the traditional way, in which the user has to be trained to adapt to the robot. This user interface has the advantage of offering custom-created screens for every industrial cell, and it gives the possibility to optimize existing screens. In this interface, all required robot parameters, monitoring tools and action buttons can easily be customized by the user according to the task and the user's priorities, so the user is not overwhelmed by unnecessary default settings and buttons on the screen of the robot teach pendant. FlexGUI runs as a separate process on the robot computer, providing an easy alternative interface alongside the native NACHI interface. Screens are created in the FlexGUI Toolbox, a development environment installed on a PC and operated with mouse and keyboard. While it is possible to make new screens in FlexGUI itself using drag and drop, the FlexGUI Toolbox is recommended for tasks with more complicated functionality. The Toolbox acts as a FlexGUI server, meaning that any FlexGUI instance can connect to it to receive created Screens and Fidgets. To describe the FlexGUI Toolbox in more detail, we define FlexGUI widgets, called Fidgets, as components of the user interface with well-defined functionality serving the purposes of the robot task. For higher flexibility, several built-in Fidgets are available in the FlexGUI Toolbox; they can be used individually or combined with the possibility of programming in JavaScript.
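To make the Fidget concept more concrete, the following JavaScript sketch shows how a screen component with one well-defined function might look. It is purely illustrative: the actual FlexGUI Fidget API is not documented in this paper, so every name below (makeButtonFidget, writeVariable, onPress) is hypothetical.

// Purely illustrative sketch; the real FlexGUI Fidget API is not
// documented in this paper, so all names below are hypothetical.
// A Fidget couples one screen component with one well-defined function.
function makeButtonFidget(label, controller, variable, value) {
  return {
    type: "button",
    label: label,
    // Called by the GUI runtime when the operator touches the button.
    onPress: function () {
      controller.writeVariable(variable, value); // hypothetical call
    },
  };
}

// Stand-in for the robot controller connection (also hypothetical).
var controller = {
  writeVariable: function (name, value) {
    console.log("set " + name + " = " + value);
  },
};

// Compose a minimal screen from individual Fidgets.
var screen = [
  makeButtonFidget("Start task", controller, "startTask", 1),
  makeButtonFidget("Stop task", controller, "startTask", 0),
];

// Simulate an operator touching the first button.
screen[0].onPress(); // prints: set startTask = 1

In this picture, a built-in Fidget would ship with its handler predefined, while the JavaScript option mentioned above would let an advanced user supply his/her own.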
Figure 6 shows the overall structure of the FlexGUI interaction: the controller is in the middle of the interaction, in other words, FlexGUI on the controller and the FlexGUI Toolbox on the PC interact through the controller. A privilege of using FlexGUI is having two ways of controlling the robot: by touching the screen on the teach pendant, or by safe remote control from a PC that can be used from anywhere, facilitating communication between human and robot while the robot is executing a task. The advantages of this design philosophy are remote control functionality, having all functionality integrated in one package, including the software required to customize screens, and language independence [8].

Fig. 6. FlexGUI installed on a NACHI robot controller and operating on the teach pendant [8].
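The transfer of Screens and Fidgets from the Toolbox to a connecting FlexGUI instance can be pictured with a minimal JavaScript sketch. It is purely illustrative: this paper does not describe the actual transfer format, port or field names, so the JSON screen description and everything else below are assumptions.

// Purely illustrative sketch of pushing a screen description from the
// FlexGUI Toolbox (acting as the server) to a connecting FlexGUI
// instance. The real format, port and field names are not documented
// in this paper; everything below is an assumption.
var net = require("net");

// A screen as the Toolbox might represent it: a named set of Fidgets.
var screenDefinition = {
  name: "DemoScreen",
  fidgets: [
    { type: "button", label: "Start task", variable: "startTask" },
    { type: "lamp", label: "Task running", variable: "running" },
  ],
};

// The Toolbox is the server; any FlexGUI instance can connect to it
// and receive the created Screens and Fidgets.
var server = net.createServer(function (socket) {
  socket.end(JSON.stringify(screenDefinition));
});
server.listen(9000); // the port number is an arbitrary example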
For review, Table 1 summarizes an overall comparison of the four presented intuitive user interfaces.

TABLE 1
Broad comparison between four developed and developing intuitive user interfaces for industrial robots

Capability                                      | ABB RAB | KUKA smartPAD | Reis ReisPAD | NACHI FlexGUI
Customized graphical user interface             |         |               |              |
WLAN function                                   |         |               |              |
Customizing screens without additional software |         |               |              |
Touch screen                                    |         |               |              |
Remote control functionality                    |         |               |              |
Usable for a specific robot only                |         |               |              |

IV. CASE STUDY WITH FlexGUI

FlexGUI's adaptability to the user's skill makes it easy and intuitive to use; even users with only basic knowledge of GUIs are able to work out how to use it, conveniently, the first time they see it. In one of the PPM AS laboratory experiments on FlexGUI testing and modelling, a non-professional apprentice who had received the FlexGUI package without any manual or instruction was asked to customize her own screen using the FlexGUI Toolbox, in order to pick and place three distinct objects between two places (two tables with specified coordinates), and to execute the task with FlexGUI installed on a NACHI FD robot. From a design point of view, a picture can tell more than a thousand words, so to create a more efficient screen the apprentice used pictures and built-in Fidgets, such as buttons. Figure 7 shows the apprentice controlling the robot to execute the task using her customized screen on the teach pendant.

Fig. 7. Apprentice operating the robot by means of FlexGUI at the PPM AS laboratory [8].

Figure 8 shows the customized screen built in the FlexGUI Toolbox, created according to the task requirements and the user's priorities. From a functionality point of view, this screen efficiently enables the user to: 1) identify an object simply by looking at its picture on the screen; 2) pick and place the required object by touching one button; 3) modify the speed level in three different phases of the task: Departure, Approach and Travelling; 4) monitor progress during task execution; and 5) confirm the current object position via feedback from indicator lamps. A minimal behavioral sketch of this screen follows Figure 8.

Fig. 8. Customized screen in the FlexGUI Toolbox for picking and placing three objects [8].
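The JavaScript sketch below models, purely for illustration, the behavior behind the screen in Figure 8. The real screen was composed graphically from Fidgets; all function names, positions and speed values here are hypothetical.

// Purely illustrative model of the pick-and-place screen in Figure 8.
// The real screen was built graphically from Fidgets; all names,
// positions and speed values here are hypothetical.

// Speed settings for the three task phases named in the text (percent).
var speeds = { Departure: 50, Approach: 20, Travelling: 100 };

// Stand-in for the robot connection.
function moveTo(position, speed) {
  console.log("moving to " + position + " at " + speed + "%");
}

// One-touch pick and place for a given object, mirroring the
// single-button operation described in the case study.
function pickAndPlace(objectName, pickPos, placePos) {
  moveTo(pickPos, speeds.Approach);      // approach the object
  console.log("gripping " + objectName);
  moveTo("via point", speeds.Departure); // depart with the object
  moveTo(placePos, speeds.Travelling);   // travel to the place position
  console.log("lamp on: " + objectName + " placed"); // indicator feedback
}

// One button per object on the screen would trigger this handler.
pickAndPlace("object A", "table 1, slot A", "table 2, slot A");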
V. CONCLUSION

For efficient interaction with robots, four systems based on customizable GUIs have been introduced. All of the above-mentioned systems were developed for one purpose: user-friendly interaction between robot and human. These systems offer a flexible, intuitive interface for interacting with the robot, rather than a conventional, hard-wired interface. FlexGUI is considered more deeply in this paper, and a case study is presented to demonstrate the functionality of this intuitive user interface in a practical environment. FlexGUI differs radically from earlier solutions by offering safe remote control from a PC that can be used from anywhere. In tackling the challenge of efficient interaction between the physical robot and the user, the field of Cognitive Infocommunications (CogInfoCom) can be applied. The primary goal of Cognitive Infocommunications is to "provide a complete view of how brain processes can be merged with infocommunications devices so that the cognitive capabilities of the human brain may not only be efficiently extended through these devices, irrespective of geographical distance, but may also be efficiently matched with the capabilities of any artificially cognitive system" [20]. The GUI solutions presented here aim to explore the use of the theoretical foundations of CogInfoCom combined with intuitive interfaces. Intuitive user interfaces in HRI are still in their infancy and have great potential. Going further in this development direction will require much more expertise than merely operating the system, and there are many other important areas, such as remote robot supervision, wireless communication and multi-robot control, for the future use of industrial robots.

ACKNOWLEDGEMENT

This work has been supported by the BANOROB project, funded by the Royal Norwegian Ministry of Foreign Affairs, and by the Program for Regional R&D and Innovation (VRI), funded by the Norwegian Government.

REFERENCES

[1] Brenna D. Argall, Aude G. Billard. (2010) A survey of tactile human robot interactions. Robotics and Autonomous Systems, Vol. 5, pp. 1159-1176.
[2] Vincze D., Kovács Sz., Gácsi M., Korondi P., Miklósi Á., Baranyi P. (2012) A novel application of the 3D VirCA environment: Modelling a standard ethological test of dog-human interactions. Acta Polytechnica Hungarica, Vol. 9, No. 1, pp. 107-120.
[3] Jean Scholtz. (2003) Theory and evaluation of human robot interaction. Proceedings of the 36th Hawaii International Conference on System Sciences, pp. 1-10.
[4] Z. Zenn Bien, Hyong-Euk Lee. (2007) Effective learning system techniques for human robot interaction in service environment. Knowledge-Based Systems, Vol. 20, pp. 439-456.
[5] Ernst Kesseler, Ed G. Knapen. (2006) Towards human-centred design: Two case studies. The Journal of Systems and Software, Vol. 79, pp. 301-313.
[6] Terrence Fong, Charles Thorpe, Charles Baur. (2001) Collaboration, dialogue, and human-robot interaction. 10th International Symposium of Robotics Research, Lorne, Victoria, Australia, November 2001, pp. 1-10.
[7] James Cannan, Huosheng Hu. (2010) Human-machine interaction (HMI): A survey. Technical Report CES-508, School of Computer Science & Electronic Engineering, University of Essex, pp. 1-16.
[8] PPM AS, online: www.ppm.no, accessed 2012.10.12.
[9] Sheridan, T. (1997) Eight ultimate challenges of human-robot communication. IEEE International Workshop on Robot and Human Communication, pp. 9-14.
[10] Tribute to 50 Years of Industrial Robotics at Automate 2011.
[11] Alexander V. Libin, Elena V. Libin. (2004) Person robot interactions from the robopsychologists' point of view: The robotic psychology and robotherapy approach. Proceedings of the IEEE, Vol. 92, pp. 1-15.
[12] Korondi P., Solvang B., Baranyi P. (2009) Cognitive robotics and telemanipulation. In Proceedings of the 15th International Conference on Electrical Drives and Power Electronics (EDPE 2009), Paper TPL-001.
[13] Torgny Brogårdh. (2007) Present and future robot control development: An industrial perspective. Annual Reviews in Control, Vol. 31, pp. 69-79.
[14] Korondi P., Baranyi P., Hashimoto H., Solvang B. (2010) 3D-based internet communication and control. In Computational Intelligence in Engineering, Springer-Verlag, pp. 47-60.
[15] ABB products and services, online, accessed 2012.10.12: http://www.abb.com/global/abbzh/abbzh251.nsf!opendatabase&db=/global/seitp/seitp327.nsf&v=9aac910011&e=us&m=9F2&c=4A08ADF3B8BD7976C12570FC00348B3F
[16] ABB in Deutschland, 14.7.2009, "Die neue Version des Programmierhandgerätes bietet noch mehr Bedienkomfort" (the new version of the programming pendant offers even more operating comfort), online, accessed 2012.10.12: http://www02.abb.com/global/deabb/deabb200.nsf!opendatabase&db=/global/deabb/deabb204.nsf&v=21b2&e=ge&url=/global/seitp/seitp202.nsf/0/68fdfc9fb8bdee4fc12575f30049036C!OpenDocument
[17] KUKA products, controllers, online, accessed 2012.10.12: http://www.kukarobotics.com/en/products/controllers/smartpaD/start.htm
[18] Reis products, robot control, online, accessed 2012.10.12: http://www.reisrobotics.de/produkte/robot-control
[19] Suppa, M., Hirzinger, G. (2005) Ein Hand-Auge-System zur multisensoriellen Rekonstruktion von 3D-Modellen in der Robotik (an eye-in-hand system for multisensor surface reconstruction of 3-D models in robotics). Automatisierungstechnik, Vol. 53, No. 7, pp. 322-331.
[20] Péter Baranyi, Ádám Csapó. (2010) Cognitive infocommunications: CogInfoCom. In Computational Intelligence and Informatics (CINTI), 11th IEEE International Symposium, pp. 141-146.