MODEL BASED VISUAL RELATIVE MOTION ESTIMATION AND CONTROL OF A SPACECRAFT UTILIZING COMPUTER GRAPHICS




Fuyuto Terui
Japan Aerospace Exploration Agency
7-44-1 Jindaiji-Higashimachi, Chofu-shi, Tokyo 182-8522, JAPAN
TEL: 81-422-40-3164, E-mail: terui.fuyuto@jaxa.jp

Abstract

An algorithm is developed for estimating the motion (relative attitude and relative position) of large pieces of space debris, such as a failed satellite. The algorithm is designed to be used by a debris removal space robot which would perform six degree-of-freedom motion control (controlling its position and attitude simultaneously). The information required as feedback signals for such a controller is the relative position, velocity, attitude and angular velocity, and these are expected to be measured or estimated from image data. The algorithm combines stereo vision, 3D model matching using the ICP (Iterative Closest Point) algorithm, and an extended Kalman Filter to increase the reliability of the estimates. To evaluate the algorithm, a simulator was prepared to reproduce the on-orbit optical environment in terrestrial experiments, and the motion of a miniature satellite model was estimated using images obtained from the simulator. In addition, a six DOF (Degrees Of Freedom) manoeuvre simulation using the developed image-based motion estimation algorithm was successfully demonstrated for proximity flight around a failed satellite rendered in CG (Computer Graphics).

1 INTRODUCTION

As the number of satellites continues to increase, space debris is becoming an increasingly serious problem for near-earth space activities, and effective measures to mitigate it are important. Satellite end-of-life de-orbiting and orbital lifetime reduction will be effective in reducing the amount of debris by reducing the probability of collisions, but this approach cannot be applied to inert satellites and debris.
On the other hand, a debris removal space robot comprising a spacecraft ("chaser") that actively removes space debris and retrieves malfunctioning satellites ("targets") is considered a complementary approach [6]. The concept of such a removal space robot is shown in Fig. 1. After rendezvous with the target and approach to approximately 50 m using data from ground observations, satellite navigation positioning and radar, the chaser maintains a constant distance from the target measured using images taken by onboard cameras. During this station-keeping phase, the chaser captures images of the target to allow remote visual inspection and measures its motion by image processing both onboard and on the ground. Since the target is non-cooperative, there is no communication, no special markings or retro-reflectors to assist

Figure 1: On-orbit operation of a debris removal space robot requiring measurement using image data; (top) station keeping and motion measurement; (middle) fly-around and final approach; (bottom) capture of the target

image processing, and no handles for capturing the target. The next phase is the fly-around and final approach. The chaser manoeuvres towards the target to within the range of a capturing device such as a manipulator arm in order to capture a designated part of the target. During the fly-around and final approach phase, the chaser must control its position and attitude simultaneously. Such six degree-of-freedom control becomes more difficult if the target is changing attitude, for example by nutation [7]. The information required as feedback signals for such a controller is the relative position, velocity, attitude and angular velocity, and these are expected to be measured or estimated from image data onboard. This paper deals with the technical elements, motion estimation and six degree-of-freedom control, expected to be necessary for the development of such a debris removal space robot. A terrestrial experiment facility called the On-orbit Visual Environment Simulator is used in this development and is described below. Stereo vision is applied to measure the three-dimensional shape of a miniature satellite model using images obtained from this simulator. After that, model-based 3D model matching between groups of point data (the ICP (Iterative Closest Point) algorithm) is applied to estimate the relative attitude and position of the target with respect to the chaser. The output from ICP is then used as the input to an extended Kalman Filter to obtain more reliable and accurate estimates of the relative attitude and position. Since the extended Kalman Filter uses a model of the translational and attitude motion of both bodies (target and chaser) in space, its estimates are expected to remain largely unaffected even in the case of data loss from ICP.
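The data-loss robustness claimed above can be illustrated with a much smaller example than the paper's 6-DOF extended Kalman Filter: a toy 1-D constant-velocity Kalman filter in which the position "measurement" (standing in for the ICP output) is periodically lost. The noise covariances and gains below are assumed values for illustration only.

```python
import numpy as np

# Toy 1-D constant-velocity Kalman filter: when the ICP-like position
# measurement drops out, the motion model propagates the estimate.
dt = 5.0                                    # image period [s], as in the paper
F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity transition
Hm = np.array([[1.0, 0.0]])                 # measurement: position only
Qn = 1e-4 * np.eye(2)                       # process noise (assumed)
Rn = np.array([[1e-2]])                     # measurement noise (assumed)

x, P = np.zeros(2), np.eye(2)               # state estimate and covariance
true_pos, vel = 0.0, 0.1
for k in range(20):
    true_pos += vel * dt
    x, P = F @ x, F @ P @ F.T + Qn                     # predict
    if k % 4 != 3:                                     # every 4th measurement lost
        z = np.array([true_pos])                       # noiseless for clarity
        K = P @ Hm.T @ np.linalg.inv(Hm @ P @ Hm.T + Rn)
        x = x + K @ (z - Hm @ x)                       # update
        P = (np.eye(2) - K @ Hm) @ P

# Despite the gaps, the estimate stays close to the truth.
print(abs(x[0] - true_pos) < 0.05, abs(x[1] - vel) < 0.01)
```

Even though the final step's measurement is "lost", the learned velocity lets the prediction bridge the gap, which is exactly the role the extended Kalman Filter plays for the ICP output in this paper.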
To verify the utility of the proposed motion estimation algorithm and the 6 DOF control algorithm, a six DOF (Degrees Of Freedom) manoeuvre simulation, with the developed motion estimation applied to images from CG (Computer Graphics), was successfully demonstrated for proximity flight around a failed satellite.

2 MOTION MEASUREMENT USING IMAGES

2.1 On-orbit visual environment simulator

The on-orbit visual environment has two characteristics that make image processing difficult:

- intense, highly-directional (collimated) sunlight
- no diffusing background other than the earth

Figure 2: On-orbit Visual Environment Simulator

The first characteristic gives very high image contrast, with the earth's albedo the only diffuse light source. Since the target is non-cooperative, it is impossible to hope for markings on its surface specifically designed to assist image processing. Satellites are wrapped in Multi-Layer Insulation (MLI) materials for thermal protection, such as second-surface aluminized Kapton (gold, specular), beta cloth (white, matte) and carbon-polyester coated Kapton (black, matte). Among these, aluminized Kapton is the most commonly used, and it gives the target the following optical characteristics:

- specular and wrinkled surface
- smooth edges

Because of this, optical features such as texture change according to the direction of lighting and view. Considering that it is not easy to mimic such images by computer graphics with sufficient fidelity for developing image processing algorithms, a visual simulator was prepared in order to reproduce the characteristics of the on-orbit visual environment for preliminary terrestrial experiments. Fig. 2 shows the configuration of the visual simulator at JAXA. This facility is a 1/10-scale model of the actual on-orbit configuration and generates simulated images as taken by a chaser on-orbit. A light source simulating the sun illuminates a miniature satellite target, and an earth albedo reflector adds diffuse light to the environment. Actuators change the direction of the light source to simulate the change of sunlight direction caused by orbital motion. The attitude of the miniature satellite target is altered using a three-axis gimbal mechanism, and the position and attitude of the chaser stereo camera set are changed using a linear motion stage and a gimbal mechanism to simulate relative motion. The monochrome stereo camera (640 × 480 pixels), with parallel lines-of-sight and a 40 mm baseline distance, is located 1870 mm from the center of the model.
The satellite model comprises a cube measuring approximately 300 mm × 250 mm × 200 mm with a 400 mm × 200 mm solar paddle and a radar antenna. The surface of the model is wrapped in a specular and wrinkled sheet simulating MLI.
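For a rectified parallel-camera pair like the one above, depth follows from disparity as Z = f·B/d, where f is the focal length in pixels, B the baseline and d the disparity. The paper does not state the focal length, so the value below is an assumption; the baseline (40 mm) and range (1870 mm) are the simulator's.

```python
# Stereo depth from disparity for a rectified, parallel camera pair.
# Z = f * B / d; f in pixels, B and Z in mm, d in pixels.

def depth_from_disparity(d_pixels, f_pixels, baseline_mm):
    """Depth of a matched point from its disparity."""
    return f_pixels * baseline_mm / d_pixels

def disparity_from_depth(z_mm, f_pixels, baseline_mm):
    """Disparity expected for a point at depth z_mm."""
    return f_pixels * baseline_mm / z_mm

f = 800.0   # ASSUMED focal length [pixels] for the 640x480 sensor
B = 40.0    # baseline [mm], from the simulator description
d = disparity_from_depth(1870.0, f, B)
print(f"expected disparity at 1870 mm: {d:.1f} px")   # ~17.1 px
```

Such small disparities at the working range are one reason the stereo measurements described next are noisy and sparse.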

Figure 3: (top-left) Motion estimation using the On-orbit Visual Environment Simulator; (top-right) motion estimation utilizing CG; (bottom-left) 6-DOF manoeuvre simulation with motion estimation utilizing CG; (bottom-right) Hardware-in-the-Loop simulation for 6-DOF manoeuvre

2.2 Motion estimation algorithm

Various researchers have investigated image processing techniques for measuring the motion of targets in the space environment. Their methods vary depending upon their research objectives and assumptions ([4], [5], [1]). For this paper, we sought a strategy that would be able to use 3D shape information obtained from ordinary image data and that would not be affected by data loss from shadows, occlusion or specular reflection [8]. Fig. 4 shows our strategy for motion measurement, applied to the experiment of Fig. 3 (top-left). The area-based stereo matching algorithm gives 3D position data for the viewable area of the satellite model's surface from a pair of cameras, but since only a limited area of the model is in view at any one instant, and the appearance of the area changes according to the model's attitude and the direction of illumination, the number of measured points obtained tends to be limited. The stereo matching algorithm also gives many incorrectly measured points due to matching failures, further reducing the number of accurately measured points obtained. However, it is assumed that the geometry of the satellite can be obtained from design data, so a three-dimensional shape model of the target may be constructed a priori and model matching can be applied between this model and the 3D measurement points obtained from stereo matching, allowing the relative position and relative attitude to be estimated. A matching algorithm called the ICP

Figure 4: A motion estimation strategy using images

(Iterative Closest Point) algorithm is used for this purpose. Using the measured relative position and attitude as input, the extended Kalman Filter outputs estimates of these values with reduced noise and increased reliability. The extended Kalman Filter outputs the predicted relative position and attitude as well as the estimated ones. These are used for pre-alignment and front-surface generation, which provides the model 3D points in the ICP process, resulting in good matching results.

2.3 Stereo vision

The area-based stereo vision algorithm uses multiple images taken from different viewpoints and generates a 3D disparity map which contains displacement information between the cameras and points on the object. The final output of the algorithm is the 3D shape of the part of the object in view, reconstructed from the disparity map. In the area-based stereo method, a small window (e.g. 17 × 17 pixels) around each pixel in the image is matched using texture, i.e. by minimizing an image similarity function such as the SAD (Sum of Absolute Differences) of pixel intensities, defined as follows:

Q_SAD(I, J, d(I, J)) = Σ_{p=-N_1}^{N_1} Σ_{q=-N_2}^{N_2} | F_1(I+p, J+q, d) - G_0(I+p, J+q) |   (1)

F_1(I, J, d) = G_1(I+d, J)   (2)

where the size of the matching window is (2N_1 + 1) × (2N_2 + 1). G_0(I, J) is the intensity of a pixel in the reference camera image (left camera) and G_1(I, J) is the intensity of the corresponding pixel in the right camera image. d(I, J) is the disparity, which is the positional difference of the corresponding point from the reference point, and this should be optimized to minimize Q_SAD(I, J, d) for each unit pixel. F_1(I, J, d) in eq. (2) is the pixel intensity of a point (I+d, J) in G_1 which is a candidate for the corresponding point to G_0(I, J). It should be noted that since the pair of cameras used are aligned side-by-side with parallel lines-of-sight, the epipolar line for finding the matching window in the right camera image

Figure 5: Measured 3D points by stereo vision

will be parallel to the horizontal image axis. Therefore, d appears only in the horizontal image coordinate of G_1 in eq. (2). The optimized disparities d*(I, J),

d*(I, J) = argmin_d Q_SAD(I, J, d(I, J))   (3)

are then obtained for all pixels in G_0, and make up the disparity map. To pre-process images for stereo matching, the LoG (Laplacian of Gaussian) filter, which is a spatial band-pass filter, is used for extracting and enhancing specific features in the image. Fig. 5 shows an example of the result of applying the stereo vision algorithm to a satellite model with specular reflection. The attitude of the miniature satellite is fixed and the stereo camera is located 1870 mm from the center of the model. The satellite model comprises a cube measuring approximately 300 mm × 250 mm × 200 mm with a 400 mm × 200 mm solar paddle and a 320 mm × 150 mm radar antenna. A pair of digital CCD cameras with a resolution of 640 × 480 pixels is used. The left pictures in Fig. 5 show the images taken by each camera after compensation of lens distortion. It is quite difficult to notice the difference between them at first glance, but differences in the horizontal position between corresponding parts (disparity) can be observed. The right of the figure shows the measured 3D points. The original 3D points have many spurious noise-like outliers sprinkled around the satellite shape. Some of these are from the albedo reflector, the background curtain behind the model and the 3-axis gimbal mechanism beneath the model. The dots from the reflector and the curtain were removed using 3D position thresholding. The dots from the gimbal mechanism were suppressed by wrapping it in matte black cloth. Other noisy spurious points appear as if they were sprayed from the position of the camera, and these are thought to be points where the stereo vision algorithm failed to find a matching window using texture for determining disparity.
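The SAD window matching of eqs. (1)-(3) can be sketched directly. The brute-force search below is a minimal illustration, not the paper's implementation (which also applies the LoG prefilter, omitted here), and the window size and disparity range are placeholder values.

```python
import numpy as np

def sad_disparity(G0, G1, N1=3, N2=3, d_max=8):
    """Dense disparity map minimising the SAD cost of eq. (1) along the
    horizontal epipolar line (rectified, parallel cameras).
    G0: reference (left) image, G1: right image; the matching window is
    (2*N1+1) x (2*N2+1) pixels."""
    H, W = G0.shape
    disp = np.zeros((H, W), dtype=int)
    for i in range(N1, H - N1):
        for j in range(N2, W - N2):
            ref = G0[i-N1:i+N1+1, j-N2:j+N2+1].astype(float)
            best_cost, best_d = np.inf, 0
            for d in range(0, min(d_max, W - N2 - j - 1) + 1):
                cand = G1[i-N1:i+N1+1, j+d-N2:j+d+N2+1].astype(float)
                cost = np.abs(cand - ref).sum()        # Q_SAD, eq. (1)
                if cost < best_cost:                   # d* = argmin, eq. (3)
                    best_cost, best_d = cost, d
            disp[i, j] = best_d
    return disp

# Synthetic check: the "right" image is the left one shifted 4 pixels,
# so interior pixels should recover a disparity of 4.
rng = np.random.default_rng(0)
G0 = rng.integers(0, 256, size=(20, 40))
G1 = np.roll(G0, 4, axis=1)
disp = sad_disparity(G0, G1)
print(disp[10, 10:20])   # → [4 4 4 4 4 4 4 4 4 4]
```

On real low-texture or specular regions the minimum in eq. (3) is shallow or ambiguous, which is exactly how the mismatched "sprayed" points described above arise.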
These spurious points are eliminated by curvature masking, left-right consistency checking and median filtering. It can be seen that 3D point positions are obtained for only a limited part of the model.

2.4 ICP algorithm

Fig. 6 shows the computational flow of the ICP algorithm [3]. The superscript (m) denotes the iteration number. The algorithm handles two point sets: a measured data point set P and a model data point set X^(m). It first establishes correspondences between the data sets by finding the closest point y_i^(m) in X^(m) for each point p_i in P (the 3rd box). Next, it finds the rotational transformation matrix R^(m) and translational transformation matrix T^(m) for

Figure 6: ICP algorithm

Y^(m) which minimizes the cost function J shown in Fig. 6 (the 4th box). This cost function is the summation of the distances between corresponding points p_i and y_i^(m). The algorithm in Horn (Ref. [2]) gives a closed-form solution for this problem via an eigenvalue problem, and finally the estimated relative position x̂, ŷ, ẑ, which gives T, and the estimated quaternion q̂, which gives the optimal R, are calculated directly. X^(m) and Y^(m) are then transformed using R^(m) and T^(m) (the 5th box). This process is repeated until the convergence criterion E < E_T is achieved or the iteration count exceeds the threshold m_T (the 6th and 7th boxes).

2.5 Extended Kalman Filter and front surface model

The simple and intuitive ICP algorithm has a number of advantages. Since it is an algorithm for point sets, it is independent of shape representation and does not require any local feature extraction. It can also handle a reasonable amount of noise, such as mismatched dots from stereo vision. The main disadvantage of the ICP algorithm is that correct registration (matching) is not guaranteed. Depending on the initial relative position and attitude between the two point sets, the result of matching could fall into a local minimum. In order to prevent this, pre-alignment of the model data set relative to the measured data set using some measure is required. The information used for the pre-alignment is the predicted measurement given by the extended Kalman Filter. The state and measurement of the extended Kalman Filter for position (X_p, Y_p) and attitude (X_a, Y_a) are as follows:

X_p = [ r_TC^H ; ṙ_TC^H ],   Y_p = [ r_CT^C ; Δr_CT^C ]   (4)

Figure 7: ICP using a time series of images (0, 20, 40, 60, 80, 100 sec) [(red) measured points; (blue) model 3D points after ICP]

X_a = [ q_I^T ; ω_IT^T ],   Y_a = [ q_C^T ; Δq_C^T ]   (5)

where r_TC^H represents the relative position vector from target to chaser expressed in the Hill frame {H} (see Fig. 9) and r_CT^C represents the relative position vector from chaser to target expressed in the chaser-fixed frame {C}. q_I^T represents the quaternion from the inertial frame {I} to the target-fixed frame {T} and q_C^T represents the quaternion from the chaser-fixed frame {C} to the target-fixed frame {T}. Δ denotes the difference from the previous time step. Using the predicted measurements Y_p(k+1/k) and Y_a(k+1/k), it is possible to know the portion of the 3D model data point set X which could be seen from the binocular camera and which corresponds to the measured data point set P. This viewable part of the 3D model data point set is called the Front Surface model and is used as the redefined X for ICP.
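The ICP loop of Fig. 6, including the pre-alignment role just described, can be sketched as follows. The paper computes the optimal rotation with Horn's quaternion eigenvalue solution [2]; the sketch below uses the SVD (Kabsch) closed form instead, which reaches the same optimum, and all tolerances are placeholder values.

```python
import numpy as np

def best_rigid_transform(P, Y):
    """Closed-form R, T minimising sum_i ||R p_i + T - y_i||^2 (the 4th
    box of Fig. 6), via SVD rather than Horn's quaternion method."""
    cp, cy = P.mean(axis=0), Y.mean(axis=0)
    H = (P - cp).T @ (Y - cy)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                       # guard against reflections
    return R, cy - R @ cp

def icp(P, X, R0=None, T0=None, E_T=1e-10, m_T=50):
    """ICP loop of Fig. 6. P: measured points (n x 3), X: model points
    (e.g. the front-surface subset); (R0, T0): pre-alignment, as the
    extended Kalman Filter's predicted pose would provide."""
    R = np.eye(3) if R0 is None else R0
    T = np.zeros(3) if T0 is None else T0
    E_prev = np.inf
    for _ in range(m_T):
        Q = P @ R.T + T
        # correspondence: closest model point for each measured point
        idx = ((Q[:, None, :] - X[None, :, :]) ** 2).sum(-1).argmin(axis=1)
        R, T = best_rigid_transform(P, X[idx])
        E = np.linalg.norm(P @ R.T + T - X[idx], axis=1).sum()   # cost J
        if abs(E_prev - E) < E_T:            # convergence criterion
            break
        E_prev = E
    return R, T

# A "measured" cloud and a model cloud related by a known rigid motion.
rng = np.random.default_rng(1)
P = rng.normal(size=(200, 3))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([0.1, -0.2, 0.05])
X = P @ R_true.T + T_true
# Pre-alignment near the truth, as the EKF prediction would supply:
R, T = icp(P, X, R0=R_true, T0=T_true + 0.02)
print(np.allclose(P @ R.T + T, X, atol=1e-8))   # → True
```

Starting the same problem far from the truth can fall into a local minimum, which is why the pre-alignment from the extended Kalman Filter matters.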

Figure 8: Estimated relative attitude (quaternion)

2.6 Motion estimation experiment using terrestrial simulator

A quasi-real-time program running on a personal computer (Intel Pentium 4, 2.0 GHz, 500 MB RAM) was developed which repeatedly performs model satellite attitude motion simulation, gimbal mechanism drive, image capture, stereo processing and motion estimation by ICP and Kalman Filter. The attitude of the model satellite is simulated and is updated at 5 sec time steps. Since the stereo vision processing time is approx. 5 sec and attitude estimation takes a further 2-11 sec, the program was incapable of real-time processing, and so the motion of the gimbal mechanism for the next time step was deferred until the estimation calculation was completed at each iteration. Several tests with different attitude motions have been carried out and one of the results is shown in Figs. 8 and 7. Fig. 8 shows the actual and estimated relative quaternions. Fig. 7 shows the ICP matching result at each iteration. Measured points at t = 0, 5, 10, 15, 45, 50, 55, 85, 90, 95, 100 (sec) are classified reliable and the rest of the points are classified unreliable. The number of measured points at t = 0 (sec) is 1078 and the corresponding Front Surface Model has 947 points. Both numbers are the result of thinning to 1/30 to save processing time. With the benefit of the first two reliable measured points, it seems that the chosen strategy for motion estimation using a time series of images worked properly in this case.

3 Six degrees of freedom manoeuvre simulation with motion estimation utilizing Computer Graphics

After confirming the feasibility of the motion estimation algorithm using images taken from the terrestrial experiment (Fig. 3: top-left), the same algorithm was applied successfully to a software simulator using Computer Graphics (Fig. 3: top-right). Then a closed-loop simulation of a six degrees of freedom manoeuvre with motion estimation utilizing computer graphics (Fig. 3: bottom-left) was performed. Fig. 9 explains the conditions of the simulation. The

target is a failed satellite in Low Earth Orbit and the chaser is a free-flying space robot which rendezvouses with the target and flies in proximity to it utilizing image data as the measurement. {T} and {C} are coordinate frames fixed to the target and chaser respectively, and {H} is the Hill frame with the same origin as {T}. Both the position and attitude controllers are designed applying sliding mode control [7].

3.1 Position controller

The position control force obtained from sliding mode control is as follows:

F = -α_p m S_p / (‖S_p‖ + ε_p)   (6)

where α_p is a feedback gain, m is the mass of the chaser and ε_p is a small positive scalar for preventing chattering. The switching surface S_p is defined as

S_p = v_e^C + k_p r_e^C   (7)

where k_p is a constant defining the switching surface. r_e^C and v_e^C in eq. (7) are the relative position and velocity between the required point for position control and the center of the chaser, defined in eqs. (8) and (9) below, as well as shown in Fig. 9:

r_e^C = r̂_C^C - D̂_T^C r_req^T   (8)

v_e^C = v̂_C^C - D̂_T^C (ω̂_IT^T × r_req^T)   (9)

where D̂_T^C is the direction cosine matrix transforming from {T} to {C} and ω̂_IT^T is the attitude rate of the target. r̂_C^C, D̂_T^C, v̂_C^C and ω̂_IT^T are all estimated from image data.

3.2 Attitude controller

The sliding mode attitude control torque is

T = -α_a I S_a / (‖S_a‖ + ε_a)   (10)

where α_a is a feedback gain, I is the moment of inertia of the chaser and ε_a is a small positive scalar for preventing chattering. The switching surface S_a is defined as follows:

S_a = ω_e^C + k_a1 q_e,vector sgn(q_e,vector) + k_a2 q_T,vector^C(2,1) sgn(q_T,scalar^C)   (11)

where k_a1 is a constant defining the switching surface. q_e,vector in eq. (11) is a quaternion representing the error angle θ_LOS between the LOS vector of the onboard camera, r_LOS, and the relative position vector between target and chaser, r_C, as shown in Fig. 9. Since the major purpose of using q_e,vector as a feedback signal for attitude control is to minimize θ_LOS, there still remains an attitude error about the LOS axis.
In order to make it small, q_T,vector^C(2,1), which represents the attitude error between {T} and {C} about r_LOS, is also used as a feedback signal for defining the switching surface. The process for calculating q_e from r̂_C^C and r_LOS^C is shown in eqs. (12)-(15). r̂_C^C is given by the motion estimation using image data explained in the previous section.

q_e,vector = λ sin(θ_LOS / 2)   (12)

Figure 9: Target, chaser, frames, vectors

Figure 10: Time sequence of the proximity manoeuvre

q_e,scalar = cos(θ_LOS / 2)   (13)

λ = (r̂_C^C × r_LOS^C) / ‖r̂_C^C × r_LOS^C‖   (14)

θ_LOS = arccos( (r̂_C^C · r_LOS^C) / (‖r̂_C^C‖ ‖r_LOS^C‖) )   (15)

ω_e^C in eq. (11) is defined below. ω_C^C is obtained from gyros on the chaser. As in the case of the position control, D̂_T^C and ω̂_IT^T are estimated from motion estimation utilizing image data.

ω_e^C = ω_C^C - D̂_T^C ω̂_IT^T   (16)

3.3 Numerical simulation

The time sequence of the proximity manoeuvre is shown in Fig. 10 and Table 1. It is a proximity manoeuvre after rendezvous to a distance of approx. 18 [m]. After the station-keeping phase, the chaser moves up by 10 [m] and then moves back to the original position. During this manoeuvre the chaser controls its position so that its own center of mass coincides with the desired position in the target-fixed frame {T}, and controls its attitude so that the LOS of its onboard camera points to the mass center of the target. It is assumed that the target is not undergoing attitude motion. Images of the target were generated by CG (Computer Graphics) and processed by the motion estimation algorithm (Stereo vision + ICP +

Table 1: Time table of the proximity manoeuvre

Table 2: Specifications of target (left) and chaser (right)

extended Kalman Filter). It is assumed that the direction of sunlight is along the Y_T axis so that it is suitable for image capturing. Table 2 shows the specifications of the target and chaser. The chaser is designed as a micro-satellite with a mass of 3.59 [kg]; it has three wheels for attitude control and six thrusters for position control. Figs. 11-14 show the results of the simulation. It can be seen that the position of the chaser is controlled with an error of approx. ±3 [m] in each direction (see Figs. 11 and 12). As is seen from Fig. 12, it is speculated that the error is mainly due to the estimated relative position error in r_e caused by attitude measurement and control error in D̂_T^C. Fig. 13 shows the position of the center of the target in the FOV (Field Of View) of the chaser onboard camera. It is controlled by the attitude controller to remain within the area of the FOV where motion estimation from images is possible. Fig. 14 shows some examples of the CG images from the left camera of the binocular vision system and the results of the motion estimation algorithm. Blue points in the lower row show measured points from stereo vision and red points show model 3D points after matching by the ICP (Iterative Closest Point) algorithm. The column at 5 [sec] shows the initial state, the column at 450 [sec] shows an intermediate position moving from [0, 18, 0] [m] to [0, 18, 10] [m], the column at 700 [sec] is at [0, 18, 10] [m], the column at 750 [sec] is at an intermediate position going back to the initial position from [0, 18, 10] [m], and the column at 1000 [sec] is at the initial position. The image is captured every 5 [sec] and processed for motion estimation. It can be seen that the red points were matched to the blue points successfully, generating relative position and attitude.
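The two control laws of Sections 3.1 and 3.2 can be sketched numerically. This is an illustrative sketch, not the paper's implementation: the gains are placeholders (the paper does not list them), and a negative-feedback sign convention is assumed for eq. (6).

```python
import numpy as np

def position_control_force(r_hat, v_hat, D_CT, w_target, r_req,
                           m=3.59, alpha_p=0.1, k_p=0.05, eps_p=1e-3):
    """Sliding-mode position control force of eqs. (6)-(9).
    Gains alpha_p, k_p, eps_p are ASSUMED placeholder values."""
    r_e = r_hat - D_CT @ r_req                        # eq. (8)
    v_e = v_hat - D_CT @ np.cross(w_target, r_req)    # eq. (9)
    S_p = v_e + k_p * r_e                             # eq. (7)
    return -alpha_p * m * S_p / (np.linalg.norm(S_p) + eps_p)   # eq. (6)

def los_error_quaternion(r_hat, r_los):
    """LOS pointing-error quaternion of eqs. (12)-(15): the rotation
    between the camera line-of-sight and the estimated target direction."""
    lam = np.cross(r_hat, r_los)
    lam = lam / np.linalg.norm(lam)                   # eq. (14)
    c = np.dot(r_hat, r_los) / (np.linalg.norm(r_hat) * np.linalg.norm(r_los))
    theta = np.arccos(np.clip(c, -1.0, 1.0))          # eq. (15)
    return np.append(lam * np.sin(theta / 2.0),       # eq. (12), vector part
                     np.cos(theta / 2.0))             # eq. (13), scalar part

# Target estimated along +z while the boresight is along +y:
# a 90-degree pointing error about the +x axis.
q = los_error_quaternion(np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0]))
print(np.round(q, 4))   # → [0.7071 0. 0. 0.7071]
```

When the chaser sits exactly at the required point with matched velocity, S_p is zero and the commanded force vanishes, as expected of the switching-surface formulation.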

Figure 11: 3D position of chaser

Figure 12: 3D position of chaser: X_T, Y_T, Z_T

Figure 13: Position of target in FOV

Figure 14: CG + measured model points

4 Concluding remarks and future work

Motion estimation of a large space debris object using image data was performed by applying stereo vision, the ICP (Iterative Closest Point) algorithm using a measured data point set and a model data point set, and an extended Kalman Filter. Three-axis attitude motion and position were estimated in a terrestrial experiment using the On-orbit Visual Environment Simulator. A six DOF (Degrees Of Freedom) manoeuvre simulation with motion estimation based on CG (Computer Graphics) images, applying the developed algorithm, was successfully demonstrated for proximity flight around a failed satellite. A more challenging six DOF manoeuvre simulation, such as the chaser following a nutating satellite, would be the next goal of this research. In addition, a Hardware-in-the-Loop simulation replacing the CG images with images taken in the On-orbit Visual Environment Simulator (Fig. 3: bottom-right) is now under preparation.

References

[1] A. Cropp and P. Palmer. Pose estimation and relative orbit determination of a nearby target microsatellite using passive imagery. 5th Cranfield Conference on Dynamics and Control of Systems and Structures in Space 2002, pages 389-395, 2002.

[2] B. K. P. Horn. Closed-form solution of absolute orientation using unit quaternions. Journal of the Optical Society of America A, 4(4):629-642, 1987.

[3] P. J. Besl and N. D. McKay. A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2):239-256, 1992.

[4] M. D. Lichter and S. Dubowsky. Estimation of state, shape, and inertial parameters of space objects from sequences of range images. Proc. SPIE Vol. 5267: Intelligent Robots and Computer Vision XXI: Algorithms, Techniques, and Active Vision, D. P. Casasent, ed., pages 194-205, 2003.

[5] P. Jasiobedzki, M. Abraham and M. Umasuthan. Robust 3D vision for autonomous space robotic operation.
Proceedings of the 6th International Symposium on Artificial Intelligence and Robotics & Automation in Space: i-SAIRAS, June 18-22, 2001.

[6] S. Nishida. On-orbit servicing and assembly: Japanese perspective and more. Proceedings of the 24th International Symposium on Space Technology and Science, Miyazaki, JAPAN, 2004.

[7] F. Terui. Position and attitude control of a spacecraft by sliding mode control. Proceedings of the American Control Conference, pages 217-221, 1998.

[8] F. Terui. Relative motion estimation and control to a failed satellite by machine vision. Space Technology, 27:90-96, 2007.