3D Arm Motion Tracking for Home-based Rehabilitation




Chapter 13
3D Arm Motion Tracking for Home-based Rehabilitation
Y. Tao and H. Hu

13.1 Introduction

This paper presents a real-time hybrid solution to articulated 3D arm motion tracking for home-based rehabilitation by combining visual and inertial sensors. The Extended Kalman Filter (EKF) is used to fuse the different data modalities from the two sensors and exploit their complementary characteristics. Due to the non-linear property of arm motion tracking, upper limb geometry information and the pin-hole camera model are used to improve the tracking performance. The experimental results show the real-time performance and reasonable accuracy of our system.

Recent developments in visual tracking tend to track 3D arm motion using a single camera (Goncalves et al., 1995). Most visual tracking algorithms for arm motion are however computationally expensive and have low tracking accuracy. They normally need manual initialisation, and their motion model or structure model is difficult to generalise. In contrast, inertial tracking has attracted much attention recently in human motion analysis (Bachmann et al., 1999). Most of this work focused on using only accelerometers (or gyros) attached to a human body to detect and analyse human motion. Zhou and Hu (Zhou and Hu, 2005) used both acceleration and rate of turn to track 3D human arm motion in real time. But inertial tracking suffers from the drift problem. Integrating visual and inertial sensors into a motion tracking system proves to be of particular value in achieving robust and applicable data (Foxlin et al., 2004). But existing work has mainly been done on tracking the pose of a rigid object. Our interest in this paper is to integrate visual and inertial sensors for tracking the 3D motion of articulated objects, e.g. human upper limbs. The purpose is to develop a 3D motion tracking model for home-based rehabilitation projects, which should be cheap, accurate and real-time.
Traditionally, stroke patients rely on the help of physiotherapists or well-trained carers to diagnose their rehabilitation activities during physiotherapy. We propose to develop an intelligent system to support stroke patients in doing rehabilitation at home, so that the burden on hospitals and physiotherapists can be reduced. The motion tracking method proposed in this paper will serve as one of the major models for the home-based intelligent system.

The rest of the paper is organised as follows: Section 13.2 presents the system overview. The integration of vision and inertial sensors for tracking arm movements is presented in Section 13.3. Our experimental results on the performance test of the proposed hybrid motion tracking method are shown in Section 13.4. Finally, conclusions and future work are presented in Section 13.5.

13.2 System Overview

Inspired by Johansson's psychology experiments (Johansson, 1973) using moving light displays (MLDs), we focus on tracking three arm joints: shoulder, elbow and wrist. It is assumed in our method that the shoulder joint is fixed during the motion and its position is known a priori. This assumption has been widely used in many arm motion tracking systems (Goncalves et al., 1995), and is based on the fact that the shoulder joint moves little during arm motion. We further assume that the lengths of the forearm L2 and upper arm L1 are known a priori. The tracking task is now to track the elbow and wrist joints in 3D. The system configuration and related coordinate systems for arm motion tracking using the proposed hybrid method are shown in Figure 13.1, which includes three coordinate frames: a camera frame (C), an inertial sensor frame (B) and a world reference frame (W).

Figure 13.1 System configuration (shoulder, elbow and wrist joints with arm lengths L1 and L2; the poses R, T relate the camera, inertial sensor and world frames)

13.3 Data Fusion and Accuracy Improvement

13.3.1 System State Vector

Normally the arm pose can be represented by a state vector with six degrees of freedom, i.e. the three coordinates of the elbow joint position PE = (PEx, PEy, PEz) and the three coordinates of the wrist joint position PW = (PWx, PWy, PWz). The size of the state vector can be reduced from six to three, as represented by Equation (13.1):

    X(t) = PW(t) = (PWx, PWy, PWz)    (13.1)

The main idea is to convert the elbow and wrist joint tracking problem into position tracking of the wrist joint only. The elbow joint is then inferred from the wrist joint tracking and the arm geometry information. The detailed procedure is as follows: (1) the elbow joint in the local wrist frame is constant and can be represented as PE_B = (0, L2, 0); (2) given the pose R, T of the wrist joint with respect to W, the elbow joint position can be represented in W as PE = R * PE_B + T.

It is clear that the orientation and position R, T of the wrist joint with respect to the world frame are the variables to be estimated. The MT9 inertial sensor can measure the orientation R directly, and our previous work (Tao et al., 2005) shows it is quite accurate. We use the unit quaternion q = (qw, q1, q2, q3) to represent the orientation information R in this paper. The remaining unknown variable is T, which has the same meaning as PW and needs to be estimated.

13.3.2 Sensor Fusion

We assume a constant acceleration motion model for wrist position tracking. Inertial data is used as the input signal of the system's dynamic model:

    X(t+1) = [ P(t+1) ] = [ P(t) + v(t)*dt + (1/2)*a(t)*dt^2 ]
             [ v(t+1) ]   [ v(t) + a(t)*dt                   ]    (13.2)

where P is used to represent PW and v represents its velocity for simplicity, and a(t) is the acceleration of the moving object represented in the world frame.
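The state prediction of Equation (13.2) and the inference of the elbow from the wrist pose can be sketched in a few lines (a minimal illustration in Python; the function and variable names here are ours, not from the chapter):

```python
import numpy as np

def predict_state(P, v, a, dt):
    """Constant-acceleration prediction, Equation (13.2): advance the
    wrist position P and velocity v by one step of length dt using the
    world-frame acceleration a obtained from the inertial sensor."""
    P_next = P + v * dt + 0.5 * a * dt**2
    v_next = v + a * dt
    return P_next, v_next

def elbow_from_wrist(R, T, L2):
    """Infer the elbow joint from the wrist pose (R, T): the elbow is
    fixed at (0, L2, 0) in the wrist-local frame, where L2 is the
    forearm length, so PE = R * PE_B + T."""
    PE_local = np.array([0.0, L2, 0.0])
    return R @ PE_local + T
```

This makes explicit why the state can shrink from six to three dimensions: once the inertial sensor supplies R, the elbow is a deterministic function of the wrist position.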
It is converted from the inertial output a_B = (ax, ay, az) by the equation a(t) = q(t) ⊗ a_B ⊗ q(t)' − g, where ⊗ denotes quaternion multiplication, q(t)' is the conjugate of q(t), g is the gravity vector, and q(t) is the unit quaternion representing the orientation of the inertial sensor frame B relative to the world reference frame W.
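The body-to-world acceleration conversion above can be written out directly from the quaternion product (a sketch assuming the Hamilton (w, x, y, z) convention and gravity along +z; names are ours):

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of two quaternions stored as (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def body_to_world_accel(q, a_body, g=np.array([0.0, 0.0, 9.81])):
    """a(t) = q ⊗ a_B ⊗ q' − g: rotate the accelerometer reading into
    the world frame via the unit quaternion q, then remove gravity."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])   # conjugate of a unit quaternion
    a_quat = np.concatenate([[0.0], a_body])          # embed the vector as (0, a)
    rotated = quat_mul(quat_mul(q, a_quat), q_conj)
    return rotated[1:] - g
```

With the sensor at rest and level, the accelerometer reads gravity only, so the converted world-frame acceleration is zero, as the motion model requires.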

Figure 13.2 shows a colour marker attached to the MT9 sensor and used for visual tracking of the sensor position. The centre of the colour patch is regarded as the origin of the inertial local frame, and it is also the 2D image projection of the wrist joint. The reason for using colour is to achieve real-time performance and robustness to occlusion.

Figure 13.2 Colour tracking marker

The visual measurements Z = (Ix, Iy) of the colour marker are related to the state vector of the system by the measurement equation (13.3):

    Z(t) = (Ix, Iy),  [Ix, Iy, 1]^T = (1/PCz) * KI * PC,  PC = RC * PW + TC    (13.3)

where KI is the camera intrinsic parameter matrix and is calibrated offline; RC, TC is the pose of the world reference frame W represented in the camera frame C; PC = (PCx, PCy, PCz) is the wrist joint represented in the camera frame; and PW is the state vector to be estimated.

The sensor fusion is implemented with the EKF, which operates in a predictor-corrector mode, as shown in Figure 13.3. Note that X(t) is the state estimate of the system, X̂ is the predicted state vector, and Z(t) is the visual measurement at time t. It is assumed in our experiments that the accelerometer measurement noise is Gaussian and isotropic with a standard deviation of 0.1 m/s², and the visual measurement noise is also assumed to be Gaussian and isotropic with a standard deviation of 2 pixels.
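The measurement equation (13.3) is a standard pin-hole projection and can be sketched as follows (an illustration with names of our choosing; the intrinsic values in the usage below are made up for the example):

```python
import numpy as np

def project_wrist(P_W, K, R_C, T_C):
    """Measurement equation (13.3): project the world-frame wrist
    position P_W into the image, given the intrinsic matrix K and the
    pose (R_C, T_C) of the world frame in the camera frame."""
    P_C = R_C @ P_W + T_C        # wrist position in the camera frame
    u = K @ (P_C / P_C[2])       # perspective division, then intrinsics
    return u[:2]                 # measured pixel (Ix, Iy)
```

For example, with K = [[500, 0, 320], [0, 500, 240], [0, 0, 1]] and the world frame coinciding with the camera frame, a wrist at (0, 0, 2) projects to the principal point (320, 240). The EKF linearises this function around the predicted state during the correction step.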

Figure 13.3 Sensor fusion framework (the inertial tracker supplies a_B(t) and q(t) for coordinate transformation and prediction; the visual tracker supplies Z(t) = (Ix, Iy) for correction of the state X(t) = (P, v))

13.3.3 Accuracy Improvement

The EKF fusion method gives reasonable results, but the accuracy is not sufficient when the inputs are noisy. Therefore, an optimisation method is adopted here to improve the tracking accuracy:

    f(X) = (||PS − PE(X)|| − L1)² + (Ix − projx(X))² + (Iy − projy(X))²    (13.4)

where the first term means the elbow joint lies on a sphere surface whose centre is located at the shoulder joint PS and whose radius is the upper arm length L1, and the second and third terms restrict the wrist joint to lie on the back-projected line through the camera projection centre and the 2D image position from colour tracking. Now, the task is to find the X(t) that minimises (13.5):

    X(t) = argmin { f(X(t)) }    (13.5)

This is a non-linear least squares problem. The well-known Levenberg-Marquardt (LM) optimisation algorithm is employed here to find the system's state vector X(t) at time t that minimises the cost function. We combine the sensor fusion method of the last section with the LM algorithm to achieve good tracking accuracy. The procedure is as follows: fuse the data from the inertial and visual sensors using the EKF algorithm; then use the result of the EKF as the initial value in (13.4) and find the minimum X(t) that satisfies (13.5).

13.4 Experimental Results

In this experiment, the camera is fixed and the inertial sensor is attached to the wrist joint. As shown in Figure 13.4, the subject performs a circular motion with a radius of 10 cm.
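This two-stage scheme (EKF estimate as the starting point, then LM refinement of the cost (13.4)) can be sketched using SciPy's Levenberg-Marquardt solver; all names and the choice of SciPy are ours, not the chapter's:

```python
import numpy as np
from scipy.optimize import least_squares

def refine_wrist(X0, R, L1, L2, P_S, I_xy, K, R_C, T_C):
    """Refine the EKF wrist estimate X0 by minimising the cost of
    Equation (13.4): the elbow must lie on a sphere of radius L1
    (upper arm length) around the shoulder P_S, and the wrist must
    reproject onto the measured pixel (Ix, Iy)."""
    def residuals(X):
        PE = R @ np.array([0.0, L2, 0.0]) + X       # elbow from the wrist pose
        r_sphere = np.linalg.norm(P_S - PE) - L1    # shoulder-sphere constraint
        P_C = R_C @ X + T_C                         # wrist in the camera frame
        u = K @ (P_C / P_C[2])                      # pin-hole reprojection
        return np.array([r_sphere, I_xy[0] - u[0], I_xy[1] - u[1]])
    return least_squares(residuals, X0, method="lm").x
```

The reprojection residuals fix the wrist to the back-projected ray (two constraints), and the sphere residual fixes its depth along that ray, so three residuals determine the three unknowns.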

Figure 13.4 A circular motion

Figure 13.5 and Figure 13.6 show the wrist joint position results from the fusion method and from the fusion plus optimisation method respectively. More specifically, Figure 13.5 shows the results from the EKF estimator, which can successfully capture arm motion by fusing inertial data with visual data. The fusion results do not suffer from the drift problem. However, they are still too noisy to be used for rehabilitation applications. After introducing the arm geometry and system configuration constraints, the fusion results can be improved. Figure 13.6 illustrates the results of the fusion plus optimisation method. The tracking results match the ground truth quite well, and the tracking accuracy is within ±5 cm for all three coordinates.

Figure 13.5 Fusion results for circular motion of the wrist joint ((a) x, (b) y and (c) z coordinates)

Figure 13.6 Fusion+LM results for circular motion of the wrist joint ((a) x, (b) y and (c) z coordinates)

13.5 Conclusions and Future Work

This paper presents a new arm motion tracking system based on the integration of vision and inertial sensors. Our method does not impose any special requirements on the environment, the clothes that the subject wears, or the arm motion. Initial experimental results demonstrate that the proposed method can track arm movements accurately in 3D, and it has great potential to be used for home-based rehabilitation. In the future, we will further test and refine our tracking system by comparing its results with those from a commercial marker-based system, CODA or Qualisys, which offers more accurate ground truth.

13.6 References

Bachmann ER, Duman I, Usta UY, McGhee RB, Yun XP, Zyda MJ (1999) Orientation tracking for humans and robots using inertial sensors. In: Proceedings of the International Symposium on Computational Intelligence in Robotics and Automation (CIRA 99), Monterey, CA, USA, pp 187-194

Foxlin E, Altshuler Y, Naimark L, Harrington M (2004) FlightTracker: a novel optical/inertial tracker for cockpit enhanced vision. In: Proceedings of the IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR 2004), Washington, DC, USA

Goncalves L, Bernardo ED, Ursella E, Perona P (1995) Monocular tracking of the human arm in 3D. In: Proceedings of the International Conference on Computer Vision (ICCV 95), MIT, Cambridge, MA, USA, pp 764-770

Johansson G (1973) Visual perception of biological motion and a model for its analysis. Perception and Psychophysics 14(2): 201-211

Tao Y, Hu H, Zhou H (2005) Integration of vision and inertial sensors for home-based rehabilitation. In: Proceedings of the 2nd Workshop on Integration of Vision and Inertial Sensors (InerVis 2005), Barcelona, Spain

Zhou H, Hu H (2005) Inertial motion tracking of human arm movements in home-based rehabilitation. In: Proceedings of the IEEE International Conference on Mechatronics and Automation, Ontario, Canada