Designing Arms and Hands for the Humanoid Robot ROMAN



J. Hirth, K. Berns
Robotics Research Lab, Dept. of Computer Sciences, University of Kaiserslautern, Kaiserslautern, Germany
j_hirth, berns@informatik.uni-kl.de

K. Mianowski
Institute of Aircraft Engineering and Applied Mechanics, Warsaw University of Technology, Warsaw, Poland
kmianowski@meil.pw.edu.pl

Abstract
The interest in assistance and personal robots is constantly growing. Such robots therefore need new, sophisticated interaction abilities. Humanlike interaction appears to be an appropriate way to increase the quality of human-robot communication. Psychologists point out that most human-human interaction is conducted nonverbally. For that reason, researchers try to enable humanoid robots to produce nonverbal communication signals. This paper presents a compact, lightweight, and low-cost arm and hand design that allows humanoid robots to generate gestures as nonverbal interaction signals. Human ranges of motion and body dimensions have been investigated and used as guiding principles for the construction of the arms and hands.

Keywords: humanoid robots; arm and hand design

I. INTRODUCTION

Recent developments in robotics show a growing interest in assistance and personal robots. Application areas for these robots range from the care of elderly people, over nursing and household robots, to tour-guiding and museum-guide robots, as mentioned in [1]. These application areas lead to completely new requirements for the robot design compared to traditional service or industrial robots. Such robots need the ability to interact and behave socially. They will need some understanding of human behavior as well as some kind of empathy in order to behave in an appropriate way. One of the first steps in the development of such a robot is the mechanical realization of humanlike interaction abilities.

Psychologists have done a lot of research on human interaction. They suggest that more than 50% of human-human interaction is conducted nonverbally. In [2], for example, it is mentioned that 60% of communication is nonverbal; in [3] the figure even reaches 93%. Considering these findings, a humanlike interacting robot especially needs abilities for nonverbal interaction. Psychology also indicates that facial expressions and gestures are the main channels for generating nonverbal signals, cf. [4]. In [5] the facial action coding system is explained. This system provides information about all moveable parts of the human face involved in generating a certain expression. A similar system for gestures is described in [6]. In order to use these studies as a basis for the development of an interacting robot, a humanlike robot shape seems to be necessary. Several requirements for the robot design can thereby be derived from these psychological findings. The robot needs the abilities for:

- Facial expressions
- Head gestures, e.g. nodding
- Gestures realized by body postures
- Arm and hand gestures

Figure 1. The humanoid robot ROMAN of the University of Kaiserslautern.

Worldwide, researchers focus on the realization of humanoid robots. Most of them concentrate on skills like bipedal walking or object manipulation, e.g. in [7] or [8]. Therefore, not all requirements for the generation of nonverbal interaction are met in the construction of these robots. In the last few years there has also been a lot of research in the field of so-called robot companions. One of these companions, a miniaturized humanoid robot, is introduced in [9]. These companions are able to realize all kinds of gestures, but most of them lack the possibility of facial expressions.

One of the first humanoid robots fulfilling all requirements for the generation of nonverbal expressions is WE-4RII of the Takanishi Lab at Waseda University in Japan [10]. Although this robot is equipped with a rather technical face, it is able to present several emotional expressions. A robot providing a more humanlike face is presented in [11]; furthermore, this robot is able to realize expressive gestures. Although there is a lot of research going on in this area, it is still an open question how to design a compact, lightweight, and low-cost humanoid robot able to realize nonverbal expressions. The Robotics Research Lab of the University of Kaiserslautern is developing the robot ROMAN as a test platform for human-robot interaction. So far this robot is equipped with an expressive face and an upper body, cf. fig. 1. For that reason, the development of arms and hands meeting the criteria mentioned above was the goal of the research work presented here.

This paper is arranged in the following way: first the current state of the robot ROMAN is briefly described. Afterwards the design of the new arms and hands is discussed. In the next section the sensor and control system of ROMAN is introduced. Finally an outlook on future work and a summary of the paper are given.

II. CURRENT STATE OF THE ROBOT ROMAN

The robot ROMAN is designed as a test platform for human-robot interaction. Therefore it provides possibilities to realize facial expressions, expressive body postures, and head gestures. For the generation of facial expressions the robot's face provides 10 DOF (degrees of freedom). They are realized by small metal plates that can be moved by servo motors. These metal plates are connected to a silicone skin; that way, humanlike expressions can be generated. Furthermore the robot is equipped with 3 DOF eyes and a 1 DOF lower jaw. The whole construction of the head including the eyes is made of ABS plastic (lightweight), and model-making servo motors are used as actuators (low cost). In addition ROMAN is equipped with a 4 DOF neck, providing the possibility of generating head gestures such as nodding or shaking the head. Furthermore, in combination with the eyes, the important gesture of looking at the interaction partner can be realized. The upper body of ROMAN provides 3 additional DOF which can be used for the realization of body-posture expressions, e.g. withdrawal to signal fear or disgust. To reduce the weight of the neck and the upper body a construction including springs is used, see [12]. That way the required motor torque can be reduced and smaller motors can be attached to the robot. An overview of the current state of the robot ROMAN is depicted in fig. 1.
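For a quick cross-check of the numbers listed above, the following minimal Python snippet simply tallies the degrees of freedom named in this section (the grouping and variable names are ours, not from the paper; the arm and hand DOF added in Section III are not included):

```python
# Illustrative only: tally of the head/neck/upper-body DOF listed in Section II.
roman_dof = {
    "face (metal plates under silicone skin)": 10,
    "eyes": 3,
    "lower jaw": 1,
    "neck": 4,
    "upper body": 3,
}

total = sum(roman_dof.values())
print(f"DOF before adding arms and hands: {total}")  # -> 21
```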
III. DESIGN OF ARMS AND HANDS

For the design of the arms and hands several preconditions exist: besides approximately humanlike ranges of motion and size, they should also fulfill the secondary conditions of being lightweight and low-cost. For safety reasons the weight of the arm should not exceed 6 kg; otherwise the robot might harm its human interaction partners. In addition the robot should be able to fulfill simple manipulation tasks. Therefore, sensors to measure the force of the grip and the exact position of the different joints should be integrated.

Figure 2. Details of the compact linear actuator construction. This construction provides the possibility to add parallel springs to the actuator and, by doing so, to reduce the required motor torque.

A. Arm Design

To reduce the weight and the cost of the arm, compact linear actuators were designed for realizing the arm movements. Combining these linear actuators with parallel springs makes it possible to reduce the motor size (see the detailed drawing of the actuator in fig. 2); that way, smaller motors can handle the same payload as stronger ones. These actuators are used for rotating and lifting the upper arm, for lifting the forearm, and for the movement of the wrist. For the rotation around the longitudinal axes of the upper arm and the forearm, the motors have been placed exactly on the longitudinal axis of the upper arm and the forearm, respectively. The first actuator is located horizontally inside the shoulder of the robot. The actuation system for the second DOF is realized similarly; the actuator is located nearly vertically between the joints, mounted by a bearing to the hip of the robot. The third joint is driven by a similar motor. Fig. 3 and 4 show the kinematics of the arm as well as the construction of the arm including all actuators.
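The effect of such a parallel spring can be illustrated with a small static calculation. The sketch below estimates the peak force the linear actuator must deliver when lifting an arm segment, with and without a preloaded parallel spring; the mass, lever arm, and spring parameters are illustrative assumptions, not ROMAN's actual values.

```python
import math

# Illustrative sketch (not ROMAN's actual parameters): a linear actuator lifts
# an arm segment about a revolute joint. A preloaded parallel spring supplies
# part of the force, so a smaller motor can deliver the rest.
m = 3.0        # moved mass in kg (assumed)
g = 9.81       # gravity, m/s^2
l_com = 0.20   # distance joint -> centre of mass in m (assumed)
r = 0.04       # lever arm of the linear actuator about the joint in m (assumed)
k = 2000.0     # spring stiffness in N/m (assumed)
x0 = 0.05      # spring preload compression in m (assumed)

def actuator_force(theta_deg, with_spring=True):
    """Static force the actuator must provide at elevation angle theta."""
    theta = math.radians(theta_deg)
    gravity_torque = m * g * l_com * math.cos(theta)  # Nm about the joint
    f_required = gravity_torque / r                   # N along the actuator
    if with_spring:
        # very rough model: spring force taken as constant near the working point
        f_required -= k * x0
    return f_required

for label, spring in (("without spring", False), ("with spring", True)):
    peak = max(actuator_force(a, spring) for a in range(0, 91, 5))
    print(f"peak actuator force {label}: {peak:.0f} N")
```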

To control the movements of the arm it is important to obtain precise information about the current rotation angles of the different joints. Because of the linear actuators attached to the arm and their passive movement capabilities (resulting from their mounting by bearings), it is hard to calculate the joint angles from the encoders attached to the motor axes alone; the different motor controllers would have to communicate with each other and consider the positions of the other actuators when controlling the corresponding motor. To avoid this, the arm has been equipped with absolute position encoders located in the joint axes. That way the motors can be controlled without communication between the different controllers. In fig. 3 the developed arm including the position encoders is shown.

Figure 3. The procedure of creating the functional prototype of the humanlike arm: (a) kinematics of the developed arm, (b) 3D model of ROMAN's arm.

Figure 4. The realized arm including all motors and sensors. The 2 additional actuators are used for the realization of the shoulder movement; they will be located in ROMAN's upper body.

Using these construction principles the whole arm (including the motors located in the robot's shoulder and next to the hip) without the hand weighs about 3.75 kg. Furthermore it is able to lift a payload of up to 8 kg. All technical details concerning the arm are listed in table I.

TABLE I. TECHNICAL PARAMETERS OF THE DEVELOPED ARM COMPARED TO THE HUMAN ARM. THE INFORMATION ON THE HUMAN ARM RELIES ON [13] AND [14].

                                                Robot        Human
Upper arm (length)                              32 cm        26-38 cm
Forearm (length)                                27 cm        25-27 cm
Upper arm (flexion/hyperextension)              110°/0°      180°/60°
Upper arm (abduction/hyperadduction)            110°/0°      180°/75°
Upper arm rotation (internal/external)          80°/80°      90°/90°
Upper arm (horiz. flexion/horiz. extension)     90°/0°       135°/45°
Forearm (flexion/extension)                     25°/90°      90°/90°
Forearm (pro-/supination)                       160°         230°
Wrist (flexion/extension)                       80°          160°
Wrist (adduction/abduction)                     40°/40°      20°/40°
Weight (including all motors)                   3.75 kg      3-3.5 kg

The large difference for hyperextension, hyperadduction, and horizontal extension of the upper arm results from the fact that humans realize these movements not only with the joints of the arm but also with the shoulder and the collarbone.

B. Hand Design

To design the hand, the kinematics of the human hand has been investigated. Since the human hand is very complex, a simplified kinematic diagram with 19 DOF plus 3 DOF for the wrist has been introduced in [15], see fig. 5a. Unfortunately, this version is still very complex, and it is hard to realize such a hand in a compact way including all the electronics. After some additional investigations the further simplified version in fig. 5b has been derived. This kinematic version provides 15 DOF. To realize a compact hand including all the actuators, the basic functions of the fingers have been analyzed. Real fingers are driven by elastic tendons, and the basic function of the hand is to grasp 3D objects. Therefore, the decision was to transmit motion from small motors with gears located inside the palm through a kinematic coupling between the successive phalanges of each finger. Because of this coupling, the kinematics shown in fig. 5b can be realized using only 6 motors: 2 motors for the thumb and 4 motors for the other fingers.

Figure 5. Scheme of how to create a human-like robot hand on the basis of the theory of cybernetics: (a) Morecki's kinematics of the hand [15], (b) simplified kinematics of the human hand, (c) 3D model of ROMAN's hand.
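To make the idea of the kinematic coupling concrete, the following sketch shows one way a fixed coupling between a finger's motor and its phalanges could be expressed in software. The joint names and coupling ratios are hypothetical placeholders; the paper does not give the actual transmission ratios.

```python
# Hypothetical sketch of a fixed kinematic coupling: one motor angle per finger
# drives its phalanx joints with constant ratios, so a small number of motors
# can command many coupled joints. Ratios below are illustrative assumptions.
COUPLING = {
    # motor name: ratios (proximal, middle, distal) per motor degree
    "thumb_flex":   (1.0, 0.8, 0.0),
    "thumb_oppose": (1.0, 0.0, 0.0),
    "index":        (1.0, 0.9, 0.6),
    "middle":       (1.0, 0.9, 0.6),
    "ring":         (1.0, 0.9, 0.6),
    "little":       (1.0, 0.9, 0.6),
}

def coupled_joint_angles(motor_angles_deg):
    """Map the 6 motor angles to the coupled phalanx angles (degrees)."""
    joints = {}
    for motor, angle in motor_angles_deg.items():
        for segment, ratio in zip(("proximal", "middle", "distal"), COUPLING[motor]):
            joints[f"{motor}_{segment}"] = ratio * angle
    return joints

print(coupled_joint_angles({"index": 30.0, "thumb_flex": 20.0}))
```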

To obtain tactile and force feedback, the hand is equipped with different sensors. For simple tactile feedback, switches are located in the fingertips. For more sophisticated force feedback, absolute position encoders have been attached to the joints connecting the fingers with the palm. By attaching serial springs to the rods moving the fingers, the current force of the grip can be calculated, see (1):

F = κ (f(pos_t) - f(pos_a))                (1)

Here F describes the force and κ the spring constant. The function f(x) converts the measured position of the joint to the corresponding displacement of the rod. The target position of the motor (the servo motor always adjusts to the target position) is represented by pos_t and the actual position (measured by the encoder) by pos_a; the difference of these positions describes the compression of the spring. Fig. 6 provides a detailed drawing of the developed finger, depicting the spring and the rods.

Figure 6. Details of the finger construction, including the serial spring for measuring the current force of the grip.
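As a minimal sketch of how equation (1) could be evaluated on the controller, assuming a linear joint-to-rod conversion f and an illustrative spring constant (neither is specified in the paper):

```python
import math

# Sketch of the grip-force estimate from equation (1). KAPPA and the
# joint-to-rod conversion f() are illustrative assumptions.
KAPPA = 1500.0   # serial spring constant in N/m (assumed)
LEVER = 0.008    # effective lever arm joint -> rod in m (assumed)

def f(joint_angle_rad):
    """Convert a joint position to the corresponding rod displacement (m)."""
    return LEVER * joint_angle_rad  # small-angle approximation (assumed)

def grip_force(pos_target_rad, pos_actual_rad):
    """Equation (1): F = kappa * (f(pos_t) - f(pos_a))."""
    return KAPPA * (f(pos_target_rad) - f(pos_actual_rad))

# Example: the servo is commanded to 60 deg but the finger, blocked by an
# object, only reaches 45 deg; the spring compression yields the grip force.
print(f"grip force: {grip_force(math.radians(60), math.radians(45)):.2f} N")
```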
Since the hand is built out of ABS plastic, a very lightweight construction has been realized. The weight of the complete hand construction (including motors and sensors) is 375 g. The size of the hand is almost comparable to human hands. All technical parameters of the developed hand compared to the human hand are listed in table II. In fig. 7 the developed hand including motors and sensors is shown.

TABLE II. TECHNICAL PARAMETERS OF THE DEVELOPED HAND COMPARED TO THE HUMAN HAND (ROM MEANS RANGE OF MOTION). THE VALUES FOR THE RANGE OF MOTION OF THE HUMAN HAND RELY ON [14]; THE SIZES WERE MEASURED BY HAND.

                                       Robot               Human
Thumb (length)                         6.8 cm              6.8 cm
Little finger (length)                 8.8 cm              8.8 cm
Finger (length)                        9.4 cm              9.4 cm
Hand (length/height/width)             10.5/9.5/2.5 cm     10.5/9.5/2.5 cm
ROM finger (1st and 2nd seg.)          40°                 100°
ROM finger (last seg.)                 40°                 70°
ROM thumb (each seg.)                  50°                 100°
ROM metacarpus                         90°                 100°
Weight                                 375 g               300-500 g

Figure 7. The realized hand including all motors and sensors. The mentioned elastic rods are realized by applying serial springs.

IV. SENSOR AND CONTROL ARCHITECTURE

A design goal of the ROMAN project is the integration of all electronic components inside the robot's body. Therefore a compact design is necessary. The sensor system of ROMAN consists of 2 stereo vision camera systems, 6 microphones, and 21 motor encoders. In addition, 19 absolute position encoders and 10 switches have been added. The actuator system of the robot consists of 47 different motors, including electric motors and servo motors. The motors are controlled using a DSP (Motorola 56F803) connected to a CPLD (Altera EPM 7128). One DSP-CPLD unit controls up to two electric motors and up to eight servo motors. Using these control units in combination with the precise encoders, an accurate positioning system can be realized. The DSP-CPLD units are connected via CAN bus to an embedded Linux PC.

The 3 DOF eyes include one stereo camera system as well as the motor units for the eye movement (3 servo motors); these and the 11 servo motors for the movements of the face are located in the robot's head. They are controlled by 2 DSPs also located in the head. These DSPs are additionally used to control the 4 motors responsible for the movement of the robot's neck. The stereo camera system located in the eyes and the second system attached to the upper body are connected to the graphics card of an embedded PC. For sound localization 6 microphones are mounted on the upper body; they are connected to the sound card of the embedded PC. The 3 motors for moving the upper body are controlled by 2 DSPs attached to the upper body. The motors and sensors of the arm-hand construction are controlled via 8 DSPs, 4 per side. One DSP of each side is located in the robot's upper body; the remaining DSPs are mounted on the arms themselves. Fig. 8 provides an overview of the hardware system including all necessary connections to sensors and actuators.
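The paper does not describe the CAN protocol between the embedded Linux PC and the DSP-CPLD units. Purely as an illustration of such an architecture, a joint setpoint could be sent from the PC roughly as follows, using the python-can package; the arbitration ID, payload layout, and scaling below are hypothetical assumptions, not ROMAN's actual message format.

```python
import can  # pip install python-can

# Hypothetical sketch of sending a joint-position setpoint from the embedded
# Linux PC to one DSP-CPLD motor-control unit over CAN.
NODE_ID_LEFT_SHOULDER = 0x10   # assumed CAN ID of one DSP-CPLD unit
ANGLE_SCALE = 100              # assumed: angle transmitted in 0.01 deg steps

def send_setpoint(bus, joint_index, angle_deg):
    """Pack a (joint, angle) setpoint into a CAN frame and send it."""
    raw = int(angle_deg * ANGLE_SCALE)
    data = [joint_index & 0xFF,
            (raw >> 8) & 0xFF,
            raw & 0xFF]
    msg = can.Message(arbitration_id=NODE_ID_LEFT_SHOULDER,
                      data=data, is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    bus = can.interface.Bus(channel="can0", interface="socketcan")
    send_setpoint(bus, joint_index=1, angle_deg=45.0)  # e.g. lift the upper arm
```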

Figure 8. Mechatronics design of the robot ROMAN.

V. CONCLUSION AND FUTURE WORK

This paper presented the newly designed arms and hands for the humanoid robot ROMAN. They have been developed under the conditions of being compact (all electronics should be included), lightweight, and low-cost. Furthermore they should provide the capability to realize humanlike gestures and to handle simple manipulation tasks. Although the achieved ranges of motion differ from the human ranges of motion (cf. tables I and II), the first tests were quite promising and at least some basic gestures can be realized. In the next version, however, the ranges of motion should be increased. Additionally the arms should be improved by adding elasticity to reduce the risk of dangerous collisions, both for human interaction partners and for the robot itself. Finally, psychological experiments should be conducted to determine the quality of the gestures realized with the arms and hands.

ACKNOWLEDGMENT

We gratefully acknowledge the Klaus Tschira Stiftung and the Carl-Zeiss-Stiftung for supporting our research.

REFERENCES

[1] A. Tapus, M. Mataric, and B. Scassellati, "The grand challenges in socially assistive robotics," Robotics and Automation Magazine, vol. 14, no. 1, pp. 35-42, 2007.
[2] R. Birdwhistell, Kinesics and Context: Essays on Body Motion Communication. Philadelphia, USA: University of Pennsylvania Press, 1970.
[3] A. Mehrabian and M. Wiener, "Decoding of inconsistent communications," Journal of Personality and Social Psychology, vol. 6, pp. 109-114, 1967.
[4] I. Engelberg, Working in Groups: Communication Principles and Strategies. Allyn & Bacon, 2006.
[5] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System. A Human Face, 2002.
[6] D. McNeill, Hand and Mind. The University of Chicago Press, 1992.
[7] S. Kagami, M. Mochimaru, Y. Ehara, N. Miyata, K. Nishiwaki, T. Kanade, and H. Inoue, "Measurement and comparison of human and humanoid walking," in Proceedings of the IEEE International Symposium on Computational Intelligence in Robotics and Automation, vol. 2, 2003, pp. 918-922.
[8] T. Asfour, K. Regenstein, P. Azad, J. Schröder, A. Bierbaum, N. Vahrenkamp, and R. Dillmann, "ARMAR-III: An integrated humanoid platform for sensory-motor control," in Proceedings of the IEEE-RAS International Conference on Humanoid Robots (Humanoids), Genoa, Italy, December 2006.
[9] T. Ishida, Y. Kuroki, and J. Yamaguchi, "Mechanical system of a small biped entertainment robot," in Proceedings of the 2004 IEEE-RAS International Conference on Humanoid Robots (Humanoids), Los Angeles, California, USA, November 10-12, 2004, pp. 235-252.
[10] M. Zecca, S. Roccella, M. Carrozza, G. Cappiello, et al., "On the development of the emotion expression humanoid robot WE-4RII with RCH-1," in Proceedings of the 2004 IEEE-RAS International Conference on Humanoid Robots (Humanoids), Los Angeles, California, USA, November 10-12, 2004, pp. 852-862.
[11] T. Spexard, M. Hanheide, and G. Sagerer, "Human-oriented interaction with an anthropomorphic robot," IEEE Transactions on Robotics, Special Issue on Human-Robot Interaction, vol. 23, no. 5, December 2007, pp. 852-862.
[12] K. Mianowski, N. Schmitz, and K. Berns, "Mechanics of the humanoid robot ROMAN," in Sixth International Workshop on Robot Motion and Control (RoMoCo), Bukowy Dworek, Poland, June 11-13, 2007, pp. 341-348.
[13] J. Hamill and K. M. Knutzen, Biomechanical Basis of Human Movement, 2nd ed. Lippincott Williams & Wilkins, 2003.
[14] K. Tittel, Beschreibende und funktionelle Anatomie des Menschen. Urban & Fischer, 2003.
[15] A. Morecki, J. Ekiel, and K. Fidelius, Cybernetic Systems of Limb Movements in Man, Animals, and Robots. Warsaw: Polish Scientific Publishers; Chichester: Ellis Horwood Limited, 198.