Human-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database




Seungsu Kim, ChangHwan Kim and Jong Hyeon Park
School of Mechanical Engineering, Hanyang University, Seoul, 133-791, Korea.
Intelligent Robotics Research Center, Korea Institute of Science and Technology, Seoul, 13-65, Korea.
Email: nicekim@hanyang.ac.kr; ckim@kist.re.kr; jongpark@hanyang.ac.kr

Abstract

When communicating and interacting with a human through motions or gestures, a humanoid robot needs not only to look like a human but also to behave like one, to avoid confusion in the communication and interaction. Among human-like behaviors, the arm motions of a humanoid robot are essential for communicating with people through motion. In this work, a mathematical representation for characterizing human arm motions is first proposed. Human arm motions are characterized by the elbow elevation angle, which is determined from the position and orientation of the human hand. The representation is obtained mathematically using an approximation tool, the Response Surface Method (RSM). A method to generate human-like arm motions in real time using the proposed representation is then presented. The method was evaluated by asking the humanoid robot to move its arms from one point to another, including rotation of the hand. An example motion was performed on the KIST humanoid robot, MAHRU.

I. INTRODUCTION

A few humanoid robots have been developed and shown to the public in the last decades, aiming to provide people with useful services. Most interactions between a humanoid robot and a human happen through voice and behavior. Such behaviors need to look human-like; otherwise people may misunderstand their meaning. It is natural to require that the behavior of a humanoid robot be comfortable to and predictable by a human. Human-like arm motions are therefore a basic requirement for humanoid robots to act like humans.
Some studies have generated human-like motion by imitating a human motion as closely as possible. The human motion is measured with a motion capture system and then adapted to a humanoid robot or an animation character. When an optical motion capture system is used, the human motion is captured as time trajectories of markers attached to the human body. This approach has been developed by several researchers. Kim et al. [1] proposed a method to convert captured marker data of a human arm into motions feasible for a humanoid robot using an optimization scheme. The position and orientation of the hand and the orientation of the upper arm of a human were imitated by a humanoid robot under bounded joint-motor capacities. However, this method was not able to generate new human-like motions. Pollard et al. [2] developed a method to adapt captured human motions to a humanoid robot consisting of only an upper body. The captured upper-body motions of an actor were optimized by minimizing the posture differences between the humanoid robot and the actor, while limits on joint position and velocity were considered. Nakaoka et al. [3] explored a procedure to let a humanoid robot (HRP-1S) imitate a Japanese folk dance captured by a motion capture system. Symbolic representations of primitive motions were presented. The time trajectories of joint positions were first generated to imitate the primitive motions and then modified to satisfy the mechanical constraints of the humanoid robot. In particular, for dynamic stability, the waist trajectory was modified to be consistent with the desired ZMP trajectory. The imitation of the Japanese folk dance was performed in the dynamics simulator OpenHRP and realized on the real humanoid robot HRP-1S as well. These methods all imitate given human motions.
Such methods have difficulty generating a variety of new human-like motions from a human motion capture database, since they adapt only the given captured motions. Another approach, generating human-like arm motions from a mathematical representation of human arm motion, was taken by Asfour et al. [4], using the representation of [5] and [6]. In those papers, four parameters were defined and expressed in terms of the wrist positions of a human in a spherical coordinate system at the shoulder. However, the representation approximated arm movements, so the method developed in [4] produced errors in the position and orientation of the humanoid hand. In addition, the four parameters used in that work may not have physical meanings. For a humanoid robot not only to imitate human motions but also to perform human-like motions whenever needed using a motion database, a new method is required. In this paper, a method for extracting the movement characteristics of human arms from a motion capture database is presented.

The characteristics are described in terms of the elbow elevation angle. This angle is determined by the position of the wrist and the angle between the palm and the ground. Using this representation of the natural human elbow elevation angle, a human-like motion is generated.

II. ELBOW ELEVATION ANGLE: CHARACTERIZING A HUMAN ARM MOTION

To obtain the natural elbow postures of a human, a kinematic analysis was performed as seen in Fig. 3. During the experiments, the actor relaxed his arm and moved without planned arm postures. A number of wrist positions and palm directions were examined within the rule given to the actor. For the experiments, the volume reachable by the human arm was divided by 6 planes vertical to the ground, as equally as possible. The actor then drew 5 circles of different diameters, each during 5 seconds. The same experiment was repeated 3 times with varying palm directions.

Fig. 1. The definition of elbow elevation angle for a human arm

Fig. 2. Motion capture system and actor

In this section, the process of characterizing the movement of a human arm in the motion capture database is described. The motion database was constructed using a commercially available optical motion capture system, as seen in Fig. 2. The human model in Fig. 3 was built with the software provided with the motion capture system. In daily life, hand motions from one point to another with varying orientation occur all the time, for example when pointing with a hand, moving a hand to grasp an object on a table or in the air, or talking with hand gestures. A human arm posture may be described in terms of the wrist position, the hand orientation, the elbow posture relative to the body, and more. From the captured arm motion database, it was observed that the elbow posture is determined mainly by the wrist position and the direction of the vector normal to the palm, called the palm direction.
In other words, the posture of an arm at a certain instant can be described in terms of the wrist position, the palm direction, and the elbow posture; moreover, the elbow posture can be expressed by the wrist position and the palm direction. The wrist position is obtained from the markers on the human arm, first with respect to the global Cartesian coordinate frame on the ground and then converted to the reference frame attached at the shoulder. The elbow posture is defined by the angle between a plane vertical to the ground (see the red dashed triangle in Fig. 1) and the plane defined by the three markers at the shoulder, elbow, and wrist (see the blue dashed triangle in Fig. 1). This angle between the two planes is called the elbow elevation angle throughout this paper. Using this angle, human arm motions are characterized, since the angle is represented in terms of the wrist position and the palm direction, the key factors for natural human arm postures. The elbow elevation angle is defined as zero when the blue dashed plane in Fig. 1 is parallel to the vertical plane (the red dashed plane in the figure).

Fig. 3. Kinematic analysis for a variety of human arm postures

The human arm motions were captured using the Hawk Digital System, commercially available from Motion Analysis Inc., as seen in Fig. 2. 29 markers were attached to the upper body of the actor and 8 cameras were used. The capturing rate was 12 frames per second. The time trajectories of the markers representing human motions were stored. Using these marker trajectories, the wrist positions were obtained with respect to the reference frame at the shoulder, and the palm direction was calculated at each frame as well.

III. EQUATION OF ELBOW ELEVATION ANGLE

From the kinematic analysis in the foregoing section, it was observed that the arm posture can be characterized by the elbow elevation angle, which is represented in terms of the wrist position and the palm direction.
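The two-plane construction above can be sketched numerically. The following is a minimal sketch, not the authors' implementation; the choice of the vertical axis as [0, 0, 1] and the sign convention of the cross products are assumptions:

```python
import numpy as np

def elbow_elevation_angle(shoulder, elbow, wrist):
    """Angle (in degrees) between the shoulder-elbow-wrist plane and the
    vertical plane containing the shoulder-wrist line (cf. Fig. 1)."""
    s, e, w = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    sw = w - s                                        # shoulder-to-wrist vector
    n_arm = np.cross(e - s, sw)                       # normal of the arm plane
    n_vert = np.cross(np.array([0.0, 0.0, 1.0]), sw)  # normal of the vertical plane
    c = np.dot(n_arm, n_vert) / (np.linalg.norm(n_arm) * np.linalg.norm(n_vert))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```

An elbow marker lying in the vertical plane through the shoulder-wrist line yields an angle of zero, matching the zero definition above; an elbow displaced sideways out of that plane yields a positive angle.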
In this section, the representation of the elbow elevation angle is obtained using the Response Surface Methodology (RSM) given in [7].

A. Response Surface Methodology

The Response Surface Methodology (RSM) in [7] is a technique for representing the relationship between controllable input variables and a response. In the methodology, a response function is defined to approximate experimental results. A brief description follows:

The response of an experiment is approximated using a response function as

y(x) = ŷ(x) + e   (1)

where y denotes the given response of the experiment, ŷ is the unknown response function approximating y, e is the error between the response and the response function, and x is a vector of controllable input variables. The response function approximates the response using shape functions as

ŷ(x) = Σ_{i=1}^{N_b} b_i ξ_i(x)   (2)

where N_b is the number of terms of the response function. The ξ_i for i = 1, ..., N_b are called shape functions (or basis functions by some researchers). The unknown coefficients of the shape functions, b_i for i = 1, ..., N_b, are determined by curve-fitting the experimental results. When multiple responses are given, the corresponding errors follow from Eq. (1) and Eq. (2) as

e_j = y_j − ŷ(x_j) = y_j − Σ_{i=1}^{N_b} b_i ξ_i(x_j)   for j = 1, ..., N   (3)

where N is the number of responses (or experiments); y_j and e_j are the value of the j-th response and the corresponding error, respectively; and x_j is the input variable vector corresponding to the j-th response. Equation (3) can be rewritten in vector form as

e = y − Xb   (4)

where the matrix X has dimension N × N_b, with the values ξ_i(x_j) as its elements. The unknown coefficient vector b is then determined by minimizing the root mean square (RMS) of e,

e_RMS = √( (1/N) Σ_{i=1}^{N} e_i² ) = √( (1/N) eᵀe ).   (5)

Note that minimizing e_RMS is equivalent to minimizing eᵀe. Using the optimality conditions, the vector b is obtained as

b = (XᵀX)⁻¹ Xᵀ y   (6)

so that the response function is achieved. It should be noted that this process is the least squares method.

B. Normalization of input variables

In the solution process of the previous section, it is worthwhile to normalize the input variables separately, since large differences in the magnitudes of the variables may exist. This normalization may help reduce the approximation error.
Moreover, since the size of the humanoid differs from that of the human, normalization makes it easy to apply the human database to the humanoid. As mentioned in Sec. II, the characteristics of human arm motions can be represented using the wrist position and the palm angle. The wrist positions are obtained in the spherical coordinate system at the shoulder using the trajectories of the marker at the wrist. The palm direction denotes the direction of the vector normal to the palm as defined in Sec. II; the angle between this direction and the ground is used as one of the input variables. These representation parameters are normalized to a dimensionless range of magnitude 2 as

−1 ≤ r̄ ≤ 1;   −1 ≤ ᾱ ≤ 1;   −1 ≤ β̄ ≤ 1;   −1 ≤ θ̄ ≤ 1

where r is the distance from the shoulder to the wrist; ᾱ and β̄ are the angles of the spherical coordinate system at the shoulder as seen in Fig. 7; and θ is the angle between the palm direction and the ground.

C. Characteristic equation for elbow elevation angle

For the shape functions, a second-order polynomial is widely used in response surface methodology. The parameters defined in the previous section are used to represent the response function for the elbow elevation angle as

γ̂ = b_0 + b_1 x_1 + b_2 x_2 + b_3 x_3 + b_4 x_4 + b_5 x_1 x_2 + ... + b_13 x_3² + b_14 x_4²   (7)-(8)

[x_1 x_2 x_3 x_4] = [r̄ ᾱ β̄ θ̄]   (9)

where γ̂ is the normalized response function for the elbow elevation angle and the input variable vector x is given by Eq. (9). The unknown coefficient vector b for the shape functions above is then obtained using Eq. (6) and the results of the kinematic analysis of the human arm in Sec. II. Once the response function for the elevation angle of a human is completed, the most natural elbow elevation angle of a humanoid robot is determined by the wrist position and palm direction of the humanoid robot, and the motions generated using this response function should look like those of a human.
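Sections III-A to III-C can be sketched end to end: build the quadratic basis of Eqs. (7)-(9), fit the coefficients by Eq. (6), and evaluate the fitted angle. This is a minimal illustration on synthetic data, not the paper's fit; the ordering of the cross terms b_5 ... b_10 is an assumption, since only the first and last terms of the polynomial survive in the text above.

```python
import numpy as np

def quadratic_basis(x):
    """Shape functions for the second-order polynomial of Eqs. (7)-(8):
    one constant, 4 linear, 6 cross and 4 squared terms (b_0 ... b_14)."""
    r, a, b, t = x                      # normalized r, alpha, beta, theta
    return np.array([
        1.0,                            # b_0
        r, a, b, t,                     # b_1 ... b_4
        r*a, r*b, r*t, a*b, a*t, b*t,   # b_5 ... b_10 (cross terms, order assumed)
        r*r, a*a, b*b, t*t,             # b_11 ... b_14 (squared terms)
    ])

def fit_coefficients(X, y):
    """b = (X^T X)^{-1} X^T y, Eq. (6), via the normal equations."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Synthetic responses from a known polynomial, to check the round trip.
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(200, 4))   # normalized inputs in [-1, 1]
b_true = rng.uniform(-1.0, 1.0, size=15)
X = np.array([quadratic_basis(x) for x in samples])
y = X @ b_true
b_fit = fit_coefficients(X, y)                    # recovers b_true exactly (no noise)
```

With the coefficients in hand, `quadratic_basis(x) @ b_fit` evaluates the normalized elevation angle γ̂ at any normalized input, which is the real-time use described below.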
Figure 4 shows the effects of the input parameters on the elbow elevation angle.

IV. INVERSE KINEMATICS

In the previous sections, the elbow elevation angle of a human was obtained using RSM and the motion capture database. In this section, the inverse kinematics problem of generating a human-like arm motion, using the elbow elevation angle together with a typical inverse kinematics solution procedure from robotics, is solved for the joint positions. As a testbed, the humanoid robot MAHRU in Fig. 5, developed by the Korea Institute of Science and Technology (KIST) with 6 degrees of freedom in each arm, was used. To solve the inverse kinematics problem, 6 holonomic constraints are needed. The desired posture is specified by a wrist position in the

shoulder-centered spherical coordinate system and a palm direction angle. The wrist stoop angle can also be given as an input, but in this paper it was set to zero. To generate a human-like posture, the human arm characteristic equation is used; therefore, six constraints are set.

Fig. 4. Elbow elevation angles of a human with respect to the four parameters r, α, β, and θ (panels: γ vs. θ at r = 1.7, β = 0; γ vs. r at β = 0, θ = 0; γ vs. β at r = 1.7, α = 0; γ vs. α at r = 1.7, θ = 0)

Our approach to solving the inverse kinematics is derived from a geometric analysis of the problem. Once the elbow elevation angle is obtained as in the previous section, the remaining joint angles θ_0 to θ_4 are computed through the procedure in this section. First, the joint angle θ_3 depends only on the distance r, as seen in Fig. 7:

θ_3 = π − cos⁻¹( (L_u² + L_l² − r²) / (2 L_u L_l) )   (10)

The joint angles θ_0 and θ_1 depend on the vector E°, the elbow position when α, β and γ are set to zero at the given wrist position and palm direction. The plane built from E° and the vector from the shoulder to the wrist lies in the x-z plane of the shoulder-centered coordinate frame:

E° = [ (r² + L_u² − L_l²)/(2r),   L_u sin( cos⁻¹( (r² + L_u² − L_l²)/(2 r L_u) ) ),   0 ]ᵀ   (11)

E can then be calculated from the elbow elevation angle γ̂ of Eq. (8) and the wrist position.

Fig. 5. The KIST humanoid robot, MAHRU

Figure 6 shows the home position of the left arm. In this posture the arm stretches down toward the ground and the palm faces the hip.
E = R_x(γ) R_y(β) R_z(α) E°   (12)

θ_1 = sin⁻¹( E_y / L_u )   (13)

θ_0 = atan2( E_x / (L_u cos θ_1),  E_z / (L_u cos θ_1) )   (14)

where E_x, E_y and E_z are the components of E.
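Equations (10)-(14) can be sketched as code. This is an illustrative reconstruction, not the authors' implementation; the zero component of E° and the rotation order follow the equations above, and the clipping of arccos arguments is added only to guard floating-point rounding:

```python
import numpy as np

def Rx(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def arm_angles(r, alpha, beta, gamma, Lu, Ll):
    """theta_0, theta_1, theta_3 from Eqs. (10)-(14), for upper-arm length Lu,
    forearm length Ll, wrist distance r and elbow elevation angle gamma."""
    theta3 = np.pi - np.arccos(
        np.clip((Lu**2 + Ll**2 - r**2) / (2 * Lu * Ll), -1.0, 1.0))       # Eq. (10)
    k = (r**2 + Lu**2 - Ll**2) / (2 * r)
    E0 = np.array([k, Lu * np.sin(np.arccos(np.clip(k / Lu, -1.0, 1.0))),
                   0.0])                                                   # Eq. (11)
    E = Rx(gamma) @ Ry(beta) @ Rz(alpha) @ E0                              # Eq. (12)
    theta1 = np.arcsin(E[1] / Lu)                                          # Eq. (13)
    theta0 = np.arctan2(E[0] / (Lu * np.cos(theta1)),
                        E[2] / (Lu * np.cos(theta1)))                      # Eq. (14)
    return theta0, theta1, theta3
```

As a sanity check, a fully stretched arm (r = Lu + Ll) gives θ_3 = 0, i.e. a straight elbow, consistent with Eq. (10).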

Fig. 6. Home position of the left arm

Fig. 7. Parameters for human arm posture

Fig. 8. Coordinates of the left arm

The wrist position is expressed as

⁰A₁ ¹A₂ ²A₃ ³A₄ ⁴W = W   (15)

where ⁱAⱼ is the homogeneous transformation matrix from the i-th reference frame to the j-th reference frame and ⁴W is the wrist position vector in the 4th frame, ⁴W = [0, 0, L_l]ᵀ. The wrist position W is given, and θ_0, θ_1 and θ_3 are known from the equations above; therefore, θ_2 can be obtained as

s_2 = ( W_z + (L_u + L_l c_3) s_1 ) / ( L_l c_1 s_3 )   (16)

c_2 = ( W_y + c_0 ( c_1 (L_u + L_l c_3) + L_l s_1 s_3 ) ) / ( L_l s_0 s_3 )   (17)

θ_2 = atan2(s_2, c_2)   (18)

where c_i denotes cos(θ_i) and s_i denotes sin(θ_i). In this paper, the wrist stoop angle θ_5 was set to zero. To find θ_4, the angle between the two vectors below is used:

N_c = E × EW,   N_v = EW × [0, 0, 1]ᵀ

θ_diff = cos⁻¹( (N_c · N_v) / (|N_c| |N_v|) )

where N_v is the normal vector of the plane formed by the vector from the elbow to the wrist and the vertical direction from the ground, and N_c is the normal vector of the plane containing the origin at the shoulder, the wrist position, and the elbow position under the given input variables. Then

θ_4 = θ − θ_diff   (19)

V. AN EXAMPLE

In the foregoing sections, the equation of the elbow elevation angle was implemented. Using this equation, a natural human-like posture can be obtained; moreover, an inverse kinematics solution for the KIST humanoid MAHRU can be obtained for any reachable wrist position and palm direction. To evaluate the equation and the inverse kinematics solution, the humanoid robot was required to follow desired trajectories of the wrist position and palm direction. The wrist trajectory is given by a sine wave in the y-z plane of the Cartesian coordinate system at the shoulder, at a distance of 0.44 m in the x direction.
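The plane-normal construction for θ_4 above can be sketched as follows. This is a minimal sketch; the vertical reference [0, 0, 1] and the subtraction in Eq. (19) follow the reconstruction above and should be treated as assumptions:

```python
import numpy as np

def theta4(E, W, theta_palm):
    """theta_4 = theta - theta_diff (Eq. 19): rotate the wrist so the palm
    reaches the commanded angle theta_palm relative to the ground."""
    E, W = np.asarray(E, dtype=float), np.asarray(W, dtype=float)
    EW = W - E                                     # elbow-to-wrist vector
    Nc = np.cross(E, EW)                           # normal of shoulder/elbow/wrist plane
    Nv = np.cross(EW, np.array([0.0, 0.0, 1.0]))   # normal of vertical plane through EW
    c = np.dot(Nc, Nv) / (np.linalg.norm(Nc) * np.linalg.norm(Nv))
    theta_diff = np.arccos(np.clip(c, -1.0, 1.0))
    return theta_palm - theta_diff
```

When the arm plane already coincides with the vertical plane through the forearm, θ_diff vanishes and the wrist roll equals the commanded palm angle.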
The desired trajectory of the palm direction was generated from the tangential vectors of the sine-wave wrist trajectory at each time frame. Using those desired trajectories, the desired joint-angle trajectories producing human-like arm motions were calculated. These desired joint trajectories were executed on the KIST humanoid robot, MAHRU. The experiment was performed using a PC running real-time Linux (RTAI) and a DSP control board at each joint motor. The Linux PC sent the desired joint angle and

desired joint velocity to each DSP board over the CAN protocol every 5 ms. The real-time Linux (RTAI) system guaranteed the 5 ms sampling time. Each DSP board controlled its motor to track the desired joint values with a PD controller.

Fig. 9. Comparison of the human arm motion and the human-like arm motion by MAHRU using the developed method

Figure 9 shows snapshots of the experimental result. The left and right wrist positions are symmetric, and the palm directions in the first, third and last scenes of the figure are the same with respect to the Cartesian coordinate systems at each shoulder. Note that the resulting arm postures of the humanoid robot in these scenes are not symmetric: one elbow is lifted more than the other, as a human would do.

VI. CONCLUSION

A mathematical representation for characterizing human arm motions has been proposed. A motion capture database was used for the representation, which was implemented and evaluated successfully on the KIST humanoid robot, MAHRU. The developed method is simple to implement and generates a human-like posture for an arbitrary arm configuration; it can be used to generate arm motions in real time. In addition, the generated motion followed the desired wrist positions exactly, since the elbow elevation angle does not affect the wrist position. Furthermore, the method may be used where the humanoid robot is required to move the wrist or hand from one point to another, such as an arm approaching an object in visual servoing. The method may not satisfy a desired hand orientation fully, since the elbow elevation angle uses only one of the three angles of the desired orientation, the one relative to the palm direction. To satisfy the desired hand orientation fully, more degrees of freedom would be needed in the humanoid robot.
In addition, arm motion generation considering dynamics and the self-collision problem remains for future work.

REFERENCES

[1] C. Kim, D. Kim, and Y. Oh, "Solving an inverse kinematics problem for a humanoid robot's imitation of human motions using optimization," in Proc. of Int. Conf. on Informatics in Control, Automation and Robotics, 2005, pp. 85-92.
[2] N. S. Pollard, J. K. Hodgins, M. J. Riley, and C. G. Atkeson, "Adapting human motion for the control of a humanoid robot," in Proc. of IEEE Int. Conf. on Robotics and Automation, 2002, vol. 2, pp. 1390-1397.
[3] S. Nakaoka, A. Nakazawa, K. Yokoi, H. Hirukawa, and K. Ikeuchi, "Generating whole body motions for a biped humanoid robot from captured human dances," in Proc. of IEEE Int. Conf. on Robotics and Automation, 2003, pp. 3905-3910.
[4] T. Asfour and R. Dillmann, "Human-like motion of a humanoid robot arm based on a closed-form solution of the inverse kinematics problem," in Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2003, vol. 2, pp. 1407-1412.
[5] J. F. Soechting and M. Flanders, "Errors in pointing are due to approximations in sensorimotor transformations," Journal of Neurophysiology, vol. 62, pp. 595-608, 1989.
[6] J. F. Soechting and M. Flanders, "Sensorimotor representations for pointing to targets in three-dimensional space," Journal of Neurophysiology, vol. 62, pp. 582-594, 1989.
[7] R. T. Haftka, Experimental Optimum Engineering Design Course Notes, Department of Aerospace Engineering, Mechanics and Engineering Science, University of Florida, Gainesville, FL, 2000.