Sensor Fusion of a CCD Camera and an Acceleration-Gyro Sensor for the Recovery of Three-Dimensional Shape and Scale

Toshiharu Mukai, Bio-Mimetic Control Research Center, The Institute of Physical and Chemical Research (RIKEN), Nagoya, Japan
Noboru Ohnishi, Dept. of Information Engineering, Faculty of Engineering, Nagoya University, Nagoya, Japan

Abstract

Shape recovery methods from an image sequence have been studied by many researchers. Theoretically, these methods are exact, but they are sensitive to noise, so that in many practical situations satisfactory results cannot be obtained. In addition, the scale of the recovered object cannot be obtained, because of the projection property of images. To solve these problems, we propose a shape recovery method based on a sensor fusion technique. The method uses an acceleration-gyro sensor attached to a CCD camera for compensating images.

Keywords: Recovery from images, Sensor fusion, Gyro sensor, Three-dimensional model

1 Introduction

Image changes produced by a moving camera are an important source of information on the observer's motion and the structure of the environment. These changes are represented by velocities on the image screen, called optical flow, or by point correspondences between two or more images. The recovery of three-dimensional structure and motion from an image sequence is one of the most important issues in computer vision. It can be used in many fields such as three-dimensional object modeling, tracking, passive navigation, and robot vision. Recovery methods from an image sequence have been proposed by many researchers (for example, [1, 2, 3, 4]). Theoretically, these methods are exact, but they are very sensitive to noise, so that in many practical situations satisfactory results cannot be obtained. Recently, the factorization method developed by Tomasi and Kanade has attracted researchers' attention [5].
This method was proposed for orthographic projection [5] and then extended to approximations of perspective projection [6]. It is reported that good results have been obtained in practical situations with this method when the approximation of the camera model is suitable. However, when the assumed camera approximation is not suitable for the situation, or when the amount of camera motion through the image sequence is small, the results are still not satisfactory.

Another limitation comes from a property of images. Under perspective projection or orthographic projection, which are widely used as camera models, a slowly moving small object near the camera produces exactly the same image sequence as a fast-moving large object far from the camera. This means that the scale of the object and of the camera motion (velocity or displacement) cannot be recovered.

In order to solve these problems, we propose the use of an acceleration-gyro sensor attached to a CCD camera. We selected this sensor because it does not require any setup in the environment, so that the sensing system can be carried anywhere. In the following sections, we propose a method for the recovery of object shape and scale from the output of the CCD camera and the acceleration-gyro sensor. Experimental results are also shown.

2 Sensor Fusion for Obtaining Good-Quality Information

One of the causes of the difficulty in shape recovery is the fact that discriminating a small rotation from a small translation, as shown in Figure 1, is difficult when the object's width along the optical axis is small relative to the distance between the camera and the object, because the two motions produce similar image changes.

Figure 1: Small rotation and small translation have a similar effect on the image screen. (Situation 1: translation; Situation 2: rotation.)

When we study animals, we find that many control their eye motion so as to obtain better visual information. For example, the vestibulo-ocular reflex makes it possible for us to obtain stabilized images on the retina. This reflex rotates our eyeballs so as to cancel rapid head motions, using information on the head's rotation obtained from the three semicircular canals [7]. It is also reported that, when flying, insects control the direction of their bodies so as to obtain visual information without rotation [8].

In our system, we do not control the video camera to remove rotation, in order to build a compact, inexpensive system that is free from mechanical problems. Instead, we process the image sequence on a computer to remove rotation, using output from the gyro sensor obtained simultaneously with the image sequence. Conceptually, we design a virtual camera, as shown in Figure 2. This virtual sensor receives input from the video camera and the gyro sensor and outputs an image sequence without rotation.

Figure 2: The virtual camera outputs an image sequence without rotation.

3 Overview of Our System

3.1 Setup

Our system consists of two sensors, a CCD camera and an acceleration-gyro sensor, and a computer for processing. The purpose of the system is the recovery of an object's three-dimensional structure, including its scale, from the sensor output. We assume the following situation: the rigid object is fixed in the environment and the sensor system moves around it.
The object has feature points which can be tracked through an image sequence, and its structure is determined by the three-dimensional feature point positions. The acceleration-gyro sensor (GU-3011 by Data Tec), mounted on the CCD camera as shown in Figure 3, is used to compensate the CCD camera images. It consists of 3 vibration gyroscopes and 3 acceleration sensors in a cube with sides 36 mm long, and it outputs 3-axis acceleration, 3-axis angular velocity, and 3-axis rotation angle at 60 Hz. The rotation angle is obtained by integrating the angular velocity, so it drifts, even though it can be corrected to some extent by using gravity as a reference. In the present paper, we use the acceleration and angular velocity outputs.

Figure 3: A photograph of the CCD camera and acceleration-gyro sensor.

3.2 Four Stages in Our Method

Our shape and scale recovery method using the CCD camera and acceleration-gyro sensor consists of four stages. In the first stage, optical flow or point correspondences through the image sequence are obtained. This is usually achieved by tracking feature points in the image sequence. Many methods have been studied for this purpose, but they are not discussed here. In the second stage, we use a shape-from-image-sequence method, modified for use with
the acceleration-gyro sensor. This stage recovers the shape of the object and the camera velocity and angular velocity at a point in time. It should be noted that the scale factor relating the camera velocity and the recovered point positions cannot be obtained in this stage. The third stage is the integration of the recovered parameters at different time points, obtained in the previous stage. Three effects can be expected from the integration. First, when recovered points exist in more than one structure at different time points, more accurate positions can be obtained by taking their average. Second, if there are points which exist in only some of the recovered structures, they are appended to the rest of the points; that is, points occluded in some parts of the image sequence can be recovered by integration if they are viewed in other parts of the sequence. Finally, the transition of the camera motion is obtained. In the fourth stage, we obtain the scale relating the recovered positions and the camera velocity. Theoretically, the camera velocity can be obtained by integrating the acceleration output from the acceleration-gyro sensor. In practice, however, the acceleration output includes noise, so that the velocity obtained from the acceleration-gyro sensor drifts. By using both the acceleration from the acceleration-gyro sensor and the velocity with an unknown scale factor obtained from the image sequence, the scale is obtained.

3.3 Three Coordinate Systems

Vector elements depend on the coordinate system to which the vector is related. For representing vector elements, we use three kinds of coordinate systems. The first is fixed in the world and is constant over time. We call this the base world coordinate system. Recovered structures and camera motion parameters at different time points are integrated in this coordinate system. When a vector x is represented in this coordinate system, it is denoted as x^B.
The second is the camera coordinate system, which is attached to and moves with the CCD camera. When a vector x is represented in this coordinate system, it is denoted as x^C. The last coordinate system is also fixed in the world, but its position changes according to the referred time; it is used to represent the sensor output obtained at that time. This coordinate system is positioned so as to coincide with the camera coordinate system at the moment the sensor output is obtained, and it therefore depends on time. We term it the temporary world coordinate system. When a vector x is represented in this coordinate system, it is denoted as x^W.

An example of the application of the coordinate systems is as follows. When the camera velocity is v^B, we have v^C = 0, because the camera coordinate system moves with the camera. The relation between the base world coordinates and the temporary world coordinates is v^W = R v^B, where R is the rotation from the base world coordinates to the temporary world coordinates. Vector coordinates do not depend on the position of the origin of the related coordinate system, so they are transformed by rotation only. The output of the acceleration-gyro sensor is based on the temporary world coordinate system; that is, the acceleration a^W and the angular velocity ω^W are obtained from the sensor.

4 The Recovery of Object Shape and Camera Motion from an Image Sequence

The recovery of object shape and camera motion from an image sequence at a point in time has been studied by many researchers. We modify one of these methods in order to use the acceleration-gyro sensor output for compensating images, and then use it in our system. In this section, we briefly introduce the method previously proposed by us; the details are reported in [4]. Assume that we observe a point on the object at times t and t + δt. We denote the unit vector from the camera center to the point as q^W and the camera translation as δu^W, as shown in Figure 4.
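As a small illustration of the coordinate conventions above, a free vector moves between the base world and temporary world coordinates by rotation alone. The following numpy sketch uses an assumed rotation and velocity purely for illustration:

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis (an illustrative choice of R)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Camera velocity expressed in the base world coordinate system.
v_B = np.array([0.10, 0.00, 0.05])

# R: rotation from the base world coordinates to the temporary world
# coordinates (here a 30-degree turn about the vertical axis).
R = rot_z(np.deg2rad(30.0))

# Free vectors transform by rotation only; origins play no role.
v_W = R @ v_B

# R is orthonormal, so the inverse transform is simply the transpose.
v_B_back = R.T @ v_W
```

Because only a rotation is involved, the norm of the velocity is the same in every coordinate system; only the components change.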
Then we obtain

  (q^W(t + δt) × q^W(t)) · δu^W = 0,   (1)

because q^W(t), q^W(t + δt), and δu^W lie in the same plane. Taking δt → 0 and using the relation (which can be easily proved)

  \dot{q}^W = \dot{q}^C + ω^W × q^C,   (2)

we obtain

  ((\dot{q}^C + ω^W × q^C) × q^C) · v^W = 0.   (3)

By arranging the above equation for n (≥ 8) points, we obtain

  G θ = 0,   (4)
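As a sanity check of the constraint (4), one can build a single row of G (with θ and G as defined in equations (5) and (6)) from synthetic, noise-free data and verify that it annihilates θ. This is only a sketch with assumed velocity values, not the authors' implementation:

```python
import numpy as np

v = np.array([0.2, -0.1, 0.05])     # assumed camera velocity v^W
w = np.array([0.02, 0.01, -0.03])   # assumed angular velocity w^W

# For a rigid, static scene the point position p in camera coordinates
# evolves as p_dot = -v - w x p; q = p/|p| is the observed unit direction.
p = np.array([0.3, -0.2, 1.5])
p_dot = -v - np.cross(w, p)
r = np.linalg.norm(p)
q = p / r
q_dot = (p_dot - q * (q @ p_dot)) / r   # time derivative of the unit vector

# theta as in eq. (5): [v | diagonal terms w_j v_j | symmetric cross terms].
theta = np.array([v[0], v[1], v[2],
                  w[0]*v[0], w[1]*v[1], w[2]*v[2],
                  w[0]*v[1] + w[1]*v[0],
                  w[1]*v[2] + w[2]*v[1],
                  w[2]*v[0] + w[0]*v[2]])

# One row of G as in eq. (6), built from observed values only.
cx = np.cross(q_dot, q)
row = np.array([cx[0], cx[1], cx[2],
                q[0]**2 - 1.0, q[1]**2 - 1.0, q[2]**2 - 1.0,
                q[0]*q[1], q[1]*q[2], q[2]*q[0]])

residual = row @ theta   # vanishes for noise-free data
```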
where

  θ = [v^W_1 | v^W_2 | v^W_3 | ω^W_1 v^W_1 | ω^W_2 v^W_2 | ω^W_3 v^W_3 | ω^W_1 v^W_2 + ω^W_2 v^W_1 | ω^W_2 v^W_3 + ω^W_3 v^W_2 | ω^W_3 v^W_1 + ω^W_1 v^W_3]^T   (5)

and G is an n × 9 matrix composed of observed values only. The ith row of G is

  [\dot{q}^C_{i,2} q^C_{i,3} − \dot{q}^C_{i,3} q^C_{i,2} | \dot{q}^C_{i,3} q^C_{i,1} − \dot{q}^C_{i,1} q^C_{i,3} | \dot{q}^C_{i,1} q^C_{i,2} − \dot{q}^C_{i,2} q^C_{i,1} | (q^C_{i,1})² − 1 | (q^C_{i,2})² − 1 | (q^C_{i,3})² − 1 | q^C_{i,1} q^C_{i,2} | q^C_{i,2} q^C_{i,3} | q^C_{i,3} q^C_{i,1}],   (6)

where the dot denotes the time derivative and q^C_{i,j} is the jth element of q^C_i, the unit vector from the camera center to the ith point. By finding a nontrivial θ ≠ 0 from (4), the camera velocity v^W up to scale and the angular velocity ω^W are obtained. We select the unit vector \hat{v}^W_s as the recovered camera velocity and denote the recovered camera angular velocity as \hat{ω}^W. The positions of the observed points are also recovered as

  x^W_i = −s \hat{v}^W_s · (\dot{q}^C_i + \hat{ω}^W × q^C_i) / ‖\dot{q}^C_i + \hat{ω}^W × q^C_i‖² · q^C_i,   (7)

where s is the unknown scale factor, and v^W = s \hat{v}^W_s if noise is absent.

Figure 4: Relationship of camera positions before and after an infinitesimal time lapse (observed point, camera center, camera translation δu^W, and the unit vectors q^W(t) and q^W(t + δt)).

5 Using the Acceleration-Gyro Sensor for Compensating an Image Sequence

In this section, we describe a modification of the shape and motion recovery method described in the previous section. From the acceleration-gyro sensor, ω^W is obtained. By substituting it into (2), we obtain \dot{q}^W, which can be considered the output of the virtual camera in Figure 2. The virtual camera output when observing m points (m ≥ 2) yields

  H v^W = 0,   (8)

where H is an m × 3 matrix whose ith row is

  \dot{q}^W_i × q^C_i.   (9)

This equation is obtained from (3) and can be determined from observed values only. By finding a nontrivial v^W ≠ 0 from it, we obtain the velocity up to scale. This velocity is expected to be better than the previous one because the number of degrees of freedom in the equation is smaller.
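The modified second stage can be sketched end to end on synthetic, noise-free data: derotate the measured flow with the gyro output as in (2), extract the velocity direction from H in (8)–(9) via the SVD, and recover point positions with (7). The velocities and points below are assumed values; this is a sketch of the technique, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
v_true = np.array([0.2, -0.1, 0.05])    # ground-truth camera velocity
w_gyro = np.array([0.02, 0.01, -0.03])  # angular velocity from the gyro sensor

# Synthesize noise-free unit directions q and camera-frame flow q_dot
# for m = 10 points of a rigid, static scene (p_dot = -v - w x p).
pts = rng.uniform(-0.5, 0.5, (10, 3)) + np.array([0.0, 0.0, 2.0])
q_list, u_list = [], []
for p in pts:
    p_dot = -v_true - np.cross(w_gyro, p)
    r = np.linalg.norm(p)
    qc = p / r
    qd_c = (p_dot - qc * (qc @ p_dot)) / r
    q_list.append(qc)
    u_list.append(qd_c + np.cross(w_gyro, qc))  # eq. (2): derotated flow
q_arr, u_arr = np.array(q_list), np.array(u_list)

# Eq. (8)-(9): rows of H are q_dot^W x q^C; v lies in the null space,
# so take the right singular vector of the smallest singular value.
H = np.cross(u_arr, q_arr)
_, _, Vt = np.linalg.svd(H)
v_hat = Vt[-1]
if v_hat @ v_true < 0:      # the null direction is defined only up to sign
    v_hat = -v_hat

# Eq. (7): point positions up to the scale s; here s is set from the
# ground truth only so the result can be compared with the input points.
s = np.linalg.norm(v_true)
x_rec = np.array([-s * (v_hat @ u) / (u @ u) * qc
                  for qc, u in zip(q_arr, u_arr)])
```

With noise-free data H has rank 2 and its null direction is exactly the velocity direction; with real, noisy data the smallest singular vector is the least-squares surrogate, as discussed next in the text.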
In practice, m is usually much larger than 3, and H is disturbed by noise, so that the matrix has full rank 3 and no exact null vector exists. Hence the equation is ill-conditioned for obtaining v^W ≠ 0. The SVD (Singular Value Decomposition) is suitable for solving it. By the SVD, H is decomposed as

  H = U Σ V^T = [u_1 | u_2 | u_3] diag{σ_1, σ_2, σ_3} [v_1 | v_2 | v_3]^T,   (10)

where U and V are orthonormal matrices and σ_1 ≥ σ_2 ≥ σ_3 ≥ 0. Then v_3 is adopted as \hat{v}^W_s. Point positions can then be determined by (7).

6 Integration of Recovered Structures and Motion Parameters at Different Time Points

The integration of recovered structures at different time points is expected to improve accuracy, recover occluded points, and clarify the transition of the camera motion. However, the object structures and camera motion parameters at different time points are obtained with respect to different temporary world coordinate systems. The scales are also different, because they cannot be determined by the method of the previous stage. Hence we cannot simply integrate the recovered structures. This problem can be solved as follows. The object shapes are the same even though the coordinate systems are different. Therefore, we can determine the (relative) scaling, rotation, and translation transformations which make the transformed structures overlap each other. We take the structure at the first time
point as the base structure and find the transformation to this structure. This means that the temporary world coordinate system at the first time point is used as the base world coordinate system. We denote the recovered position of point i with respect to the temporary world coordinate system at time k as x^k_i; the superscript W is dropped in this section for conciseness. The transformed point position x'^k_i obtained from x^k_i is defined as

  x'^k_i = s_k R_k x^k_i + t_k,   (11)

where s_k, R_k, and t_k are the scaling, rotation, and translation from the structure at time k to the base structure. In order to obtain s_k, R_k, and t_k, we minimize

  E^k_1(s_k, R_k, t_k) = (1/2) Σ_i ‖x̃_i − x'^k_i‖²,   (12)

where x̃_i is the position of point i in the base structure. In practice, the translation t_k is obtained from ∂E^k_1/∂t_k = 0 as

  t_k = g̃ − s_k R_k g_k,   (13)

where g̃ and g_k are the centroids of the x̃_i and the x^k_i, respectively. Hence we minimize

  E^k_1(s_k, R_k) = (1/2) Σ_i ‖x̃_i − g̃ − s_k R_k (x^k_i − g_k)‖².   (14)

In our implementation, we used the conjugate gradient method to minimize this function numerically.

Using the s_k, R_k, and t_k obtained above, we can obtain a better object structure and camera motion as follows.

Object structure: Taking the average of the structures transformed using the scaling, translation, and rotation, accuracy is improved. If a corresponding point does not yet exist in the integrated structure, it is appended to the integrated structure.

Camera velocity: Transforming using only the scaling, v^W(t) is recovered. Transforming using the scaling and rotation, v^B(t) is recovered.

Camera angular velocity: From the recovery at a point in time, ω^W(t) is recovered, so, transforming using the rotation, ω^B(t) is recovered.

Camera position: In the recovery at a point in time, the camera center is assumed to be at the origin of the temporary world coordinate system.
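The minimization of (14) also admits a closed-form alternative to the conjugate gradient method: an SVD-based (Kabsch/Umeyama-style) solution for the scaling, rotation, and translation. The sketch below is that substitute technique, not the paper's implementation, checked by recovering a known similarity transform from synthetic point sets:

```python
import numpy as np

def similarity_align(src, dst):
    """Closed-form s, R, t minimizing sum ||dst_i - (s R src_i + t)||^2."""
    g_src, g_dst = src.mean(axis=0), dst.mean(axis=0)
    a, b = src - g_src, dst - g_dst            # centered point sets
    C = b.T @ a                                # 3x3 correlation matrix
    U, S, Vt = np.linalg.svd(C)
    d = np.sign(np.linalg.det(U @ Vt))         # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt                             # optimal rotation
    s = (S * np.diag(D)).sum() / (a * a).sum() # optimal scale
    t = g_dst - s * R @ g_src                  # optimal translation, cf. eq. (13)
    return s, R, t

# Recover a known transform from synthetic structures.
rng = np.random.default_rng(2)
x = rng.normal(size=(8, 3))
ang = np.deg2rad(20.0)
c, si = np.cos(ang), np.sin(ang)
R_true = np.array([[c, -si, 0.0], [si, c, 0.0], [0.0, 0.0, 1.0]])
s_true, t_true = 1.7, np.array([0.3, -0.2, 0.5])
y = s_true * x @ R_true.T + t_true
s_est, R_est, t_est = similarity_align(x, y)
```

Because correspondences between the structures are known (the tracked feature points), the closed form and the iterative minimization target the same optimum; the iterative route generalizes more easily if robust weighting is added.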
Using the scaling, translation, and rotation information, the transition of the camera center position and direction in the base world coordinate system is obtained.

7 Determining the Scale of Structure and Velocity

In this section, we denote the camera velocity obtained from the image sequence as v^W_I(t), that obtained from the acceleration-gyro sensor as v^W_G(t), and the true camera velocity as v^W_T(t). Then, if noise is absent,

  v^W_T(t) = s v^W_I(t),   (15)

where s is the unknown scale factor. Therefore, if the relation between v^W_T(t) and v^W_G(t) is known, we can determine s from the above equation. Theoretically, v^W_G(t) is obtained by integrating the acceleration a^W(t) from the acceleration-gyro sensor, if the initial value is known. It is formulated as

  v^W_G(t) = R(t) { v^B(t_0) + ∫_{t_0}^{t} [ R^{−1}(τ) a^W(τ) − g^B ] dτ },   (16)

where v^B(t_0) is the initial velocity and R(t) is the rotation from the base world coordinates to the temporary world coordinates; R(t) can be obtained from the acceleration-gyro sensor output ω^W(t) if the initial value R(t_0) is known. In practice, however, the acceleration-gyro sensor output includes noise, so that v^W_G(t) drifts. Hence the following relation holds:

  v^W_T(t) = v^W_G(t) + b(t),   (17)

where b(t) represents the effect of the drift and of the unknown initial value. The change of b(t) comes from the drift, so we can assume that its change over a short time is small. By discretizing the above equations, we obtain

  s v^{W,k}_I = v^{W,k}_G + b_k,   (18)

where v^{W,k}_I is v^W_I at time k (k = 0, 1, ..., K), and so on, and the changes of b_k along k are small. We minimize the following function to obtain s:

  E_2(s, b_k) = (1/2) Σ_{k=0}^{K} ‖s v^{W,k}_I − (v^{W,k}_G + b_k)‖² + (λ/2) Σ_{k=0}^{K} ‖2 b_k − b_{k−1} − b_{k+1}‖²,   (19)
where b_{−1} = b_{K+1} = 0 and λ is some positive value for weighting.

8 Experiments

8.1 Experimental Environment

We used a cube of known size, shown in Figure 5, to examine the recovery errors. The cube has sides 20 cm long. The acceleration-gyro sensor output is obtained at 60 Hz via a serial connection. The CCD camera has a lens with a focal length of 8 mm, and its output ( pixels) is captured at 15 frames/sec, synchronously with the acceleration-gyro sensor output. The CCD camera's intrinsic parameters were obtained in a preliminary calibration. The CCD camera was moved by hand, so only the rough trajectory of the camera motion is known.

Figure 5: Photograph of the object with sides 20 cm long.

8.2 Experiment 1: Simple and Short Camera Motion

In experiment 1, the CCD camera was moved over a short period, almost in a straight line, as shown in Figure 6. The lengths given in the figure are rough estimates, as explained above.

Figure 6: Camera motion in experiment 1 (the CCD camera moves along a trajectory about 65 cm long, roughly 1.5 m from the object with sides 20 cm long).

We obtained 27 frames (in 1.68 sec) during the motion. The interval between the two images used for obtaining optical flow is determined automatically by a certain method, which we do not describe here for lack of space. When the acceleration-gyro sensor is not used, the optical flow needed for obtaining results must be large; in this case, only two structures were recovered, and the average error was 9.4 cm. When the acceleration-gyro sensor output was used, 13 structures were recovered, and the average error was 3.1 cm. The accuracy was thus much improved by using the acceleration-gyro sensor. In Figure 7, the errors of each recovered structure and of the integrated structures, when the acceleration-gyro sensor output was used, are plotted. The error of the integration result at index i is the result of integrating structures 0 to i. The plot shows that the integration of recovered structures at different time points improves the accuracy.
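Each recovered structure's error above is measured as an RMS over the distances between corresponding recovered and actual points (after aligning the coordinate systems). A minimal sketch of that metric, with illustrative values:

```python
import numpy as np

def rms_error(recovered, actual):
    """RMS of the distances between corresponding recovered/actual points."""
    d = np.linalg.norm(recovered - actual, axis=1)
    return np.sqrt(np.mean(d ** 2))

# Four points, each displaced by exactly distance 1 in 3D.
actual = np.zeros((4, 3))
recovered = np.full((4, 3), 1.0 / np.sqrt(3.0))
err = rms_error(recovered, actual)   # -> 1.0
```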
To examine the accuracy of the recovery, we found the rotation, translation, and scaling (where needed) from the recovered structure to the actual structure, because the recovered structures are related to a coordinate system different from that of the actual structure. We adopted the RMS (root mean square) of the distances between actual and recovered points as the recovery error.

Figure 7: Errors of recovered structures using the acceleration-gyro sensor in experiment 1 (error [cm] of each structure and of the integrated structure, versus the number of recovered structures).

In Figure 8, the relation between ‖v^W_I‖ and ‖v^W_G‖ is plotted. The plotted points lie almost on a single line through the origin, because the motion finished in a short period, so that the drift of v^W_G was small. To determine s, (19) with λ = 1 was used. The results are shown in Table 1, where the point numbers specifying the sides are shown in Figure 5.

Figure 8: Velocities obtained from the image sequence and the acceleration-gyro sensor output in experiment 1 (‖v^W_G‖ [m/s] versus ‖v^W_I‖).

Table 1: Recovered length and actual length in experiment 1 (columns: side, recovered [cm], actual [cm]).

8.3 Experiment 2: Complex Camera Motion

A more complex motion over a longer period was adopted in experiment 2. In this experiment, the CCD camera moved around the object, as shown in Figure 9. We obtained 94 frames (in 5.64 sec) during the motion.

Figure 9: Camera motion in experiment 2 (the camera moves around the object with sides 20 cm long, at a distance of about 1 m).

When the acceleration-gyro sensor was not used, 6 structures were recovered, as shown in Figure 10. In this case, the average error before the integration of the recovered structures was 3.2 cm.

Figure 10: Errors of recovered positions without the acceleration-gyro sensor in experiment 2.

In Figure 11, the results when the acceleration-gyro sensor output was used are plotted. The integration of the recovered structures at different time points again improves the accuracy. In this case, the average error before integration was 3.3 cm. This is slightly worse than in the case without the acceleration-gyro sensor, but a larger number of structures was recovered, so that the integrated structure was better than the result without the acceleration-gyro sensor. However, in this experiment, where the camera motion is complex, we could not obtain a reliable scale. A part of the results is shown in Table 2. We need more study to improve the accuracy.
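Equation (19), used above with λ = 1, is linear in the unknowns s and b_k, so it can be solved as a single least-squares problem. The sketch below (with the hypothetical helper recover_scale and synthetic data with a constant drift and true scale 2.5) shows one way to stack that system; it is an assumed construction based on (18)–(19), not the authors' code:

```python
import numpy as np

def recover_scale(v_img, v_gyro, lam=1.0):
    """Least-squares fit of eq. (18)-(19): unknowns are s and the drifts b_k."""
    K1 = v_img.shape[0]                  # K+1 velocity samples (3-vectors)
    n = 1 + 3 * K1                       # unknown vector z = [s, b_0, ..., b_K]
    A = np.zeros((6 * K1, n))
    c = np.zeros(6 * K1)
    for k in range(K1):                  # data rows: s v_I,k - b_k = v_G,k
        r = 3 * k
        A[r:r+3, 0] = v_img[k]
        A[r:r+3, 1+3*k:4+3*k] = -np.eye(3)
        c[r:r+3] = v_gyro[k]
    w = np.sqrt(lam)                     # smoothness rows: 2 b_k - b_{k-1} - b_{k+1},
    for k in range(K1):                  # with b_{-1} = b_{K+1} = 0
        r = 3 * (K1 + k)
        A[r:r+3, 1+3*k:4+3*k] = 2.0 * w * np.eye(3)
        if k > 0:
            A[r:r+3, 1+3*(k-1):4+3*(k-1)] = -w * np.eye(3)
        if k < K1 - 1:
            A[r:r+3, 1+3*(k+1):4+3*(k+1)] = -w * np.eye(3)
    z, *_ = np.linalg.lstsq(A, c, rcond=None)
    return z[0], z[1:].reshape(K1, 3)

# Synthetic check: true scale 2.5, small constant drift.
rng = np.random.default_rng(3)
v_true = rng.normal(size=(6, 3))
b_true = np.array([0.10, -0.05, 0.02])
v_img = v_true / 2.5                 # image-based velocity (scale unknown)
v_gyro = v_true - b_true             # gyro-based velocity with drift
s_est, b_est = recover_scale(v_img, v_gyro, lam=1e-3)
```

A small λ keeps the smoothness prior from biasing s when the drift is genuinely nonzero at the boundaries; a large λ forces the b_k toward the b_{−1} = b_{K+1} = 0 boundary conditions.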
The recovered structure obtained using the acceleration-gyro sensor output, projected to new screen positions, is shown in Figure 12. It is displayed using a wire frame or texture mapping; in the wire-frame image, points are connected by lines in order to show the structure clearly.

Table 2: Recovered length and actual length in experiment 2 (columns: side, recovered [cm], actual [cm]).

Figure 11: Errors of recovered positions using the acceleration-gyro sensor in experiment 2 (error [cm] of each structure and of the integrated structure, versus the number of recovered structures).

Figure 12: Recovered structure using the acceleration-gyro sensor in experiment 2.

9 Conclusion

We have proposed a method for shape and scale recovery using a CCD camera and an acceleration-gyro sensor. We modified the method we proposed previously in order to use both the CCD camera and the acceleration-gyro sensor. In the experiments, the improvement of the recovered structure was verified. However, the recovered scales are not reliable when the camera motion is complex. As a next step, we will try to improve the accuracy of our method; in particular, improvement of the scale recovery is necessary.

References

[1] R. Y. Tsai and T. S. Huang. Uniqueness and Estimation of Three-Dimensional Motion Parameters of Rigid Objects with Curved Surfaces. IEEE Trans. Pattern Anal. Machine Intell., 6(1):13-27, 1984.
[2] T. S. Huang and A. N. Netravali. Motion and Structure from Feature Correspondences: A Review. Proc. of the IEEE, 82(2):252-268, 1994.
[3] J. K. Aggarwal and C. H. Chien. 3-D Structure from 2-D Images. In Advances in Machine Vision (J. L. C. Sanz, Ed.), 64-121.
[4] T. Mukai and N. Ohnishi. Motion and Structure from Perspectively Projected Optical Flow by Solving Linear Simultaneous Equations. Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'97), 740-745, 1997.
[5] C. Tomasi and T. Kanade. Shape and Motion from Image Streams under Orthography: a Factorization Method. International Journal of Computer Vision, 9(2):137-154, 1992.
[6] C. J. Poelman and T. Kanade. A Paraperspective Factorization Method for Shape and Motion Recovery. IEEE Trans. Pattern Anal. Machine Intell., 19(3):206-218, 1997.
[7] O. Coenen and T. J. Sejnowski. A Dynamical Model of Context Dependencies for the Vestibulo-Ocular Reflex. In Advances in Neural Information Processing Systems 8, MIT Press, 1996.
[8] N. Franceschini, J. M. Pichon, and C. Blanes. From insect vision to robot vision. Phil. Trans. Roy. Soc. B, 337:283-294, 1992.
More informationDetermining the Acceleration Due to Gravity
Chabot College Physics Lab Scott Hildreth Determining the Acceleration Due to Gravity Introduction In this experiment, you ll determine the acceleration due to earth s gravitational force with three different
More informationThe Basics of FEA Procedure
CHAPTER 2 The Basics of FEA Procedure 2.1 Introduction This chapter discusses the spring element, especially for the purpose of introducing various concepts involved in use of the FEA technique. A spring
More informationA method of generating free-route walk-through animation using vehicle-borne video image
A method of generating free-route walk-through animation using vehicle-borne video image Jun KUMAGAI* Ryosuke SHIBASAKI* *Graduate School of Frontier Sciences, Shibasaki lab. University of Tokyo 4-6-1
More informationRotation Matrices and Homogeneous Transformations
Rotation Matrices and Homogeneous Transformations A coordinate frame in an n-dimensional space is defined by n mutually orthogonal unit vectors. In particular, for a two-dimensional (2D) space, i.e., n
More informationRelating Vanishing Points to Catadioptric Camera Calibration
Relating Vanishing Points to Catadioptric Camera Calibration Wenting Duan* a, Hui Zhang b, Nigel M. Allinson a a Laboratory of Vision Engineering, University of Lincoln, Brayford Pool, Lincoln, U.K. LN6
More informationMechanics lecture 7 Moment of a force, torque, equilibrium of a body
G.1 EE1.el3 (EEE1023): Electronics III Mechanics lecture 7 Moment of a force, torque, equilibrium of a body Dr Philip Jackson http://www.ee.surrey.ac.uk/teaching/courses/ee1.el3/ G.2 Moments, torque and
More informationAutomatic Calibration of an In-vehicle Gaze Tracking System Using Driver s Typical Gaze Behavior
Automatic Calibration of an In-vehicle Gaze Tracking System Using Driver s Typical Gaze Behavior Kenji Yamashiro, Daisuke Deguchi, Tomokazu Takahashi,2, Ichiro Ide, Hiroshi Murase, Kazunori Higuchi 3,
More informationComponent Ordering in Independent Component Analysis Based on Data Power
Component Ordering in Independent Component Analysis Based on Data Power Anne Hendrikse Raymond Veldhuis University of Twente University of Twente Fac. EEMCS, Signals and Systems Group Fac. EEMCS, Signals
More informationVibrations can have an adverse effect on the accuracy of the end effector of a
EGR 315 Design Project - 1 - Executive Summary Vibrations can have an adverse effect on the accuracy of the end effector of a multiple-link robot. The ability of the machine to move to precise points scattered
More informationLow-resolution Character Recognition by Video-based Super-resolution
2009 10th International Conference on Document Analysis and Recognition Low-resolution Character Recognition by Video-based Super-resolution Ataru Ohkura 1, Daisuke Deguchi 1, Tomokazu Takahashi 2, Ichiro
More informationSpatial location in 360 of reference points over an object by using stereo vision
EDUCATION Revista Mexicana de Física E 59 (2013) 23 27 JANUARY JUNE 2013 Spatial location in 360 of reference points over an object by using stereo vision V. H. Flores a, A. Martínez a, J. A. Rayas a,
More information1 2 3 1 1 2 x = + x 2 + x 4 1 0 1
(d) If the vector b is the sum of the four columns of A, write down the complete solution to Ax = b. 1 2 3 1 1 2 x = + x 2 + x 4 1 0 0 1 0 1 2. (11 points) This problem finds the curve y = C + D 2 t which
More informationBasler. Line Scan Cameras
Basler Line Scan Cameras High-quality line scan technology meets a cost-effective GigE interface Real color support in a compact housing size Shading correction compensates for difficult lighting conditions
More informationEpipolar Geometry. Readings: See Sections 10.1 and 15.6 of Forsyth and Ponce. Right Image. Left Image. e(p ) Epipolar Lines. e(q ) q R.
Epipolar Geometry We consider two perspective images of a scene as taken from a stereo pair of cameras (or equivalently, assume the scene is rigid and imaged with a single camera from two different locations).
More informationLab 7: Rotational Motion
Lab 7: Rotational Motion Equipment: DataStudio, rotary motion sensor mounted on 80 cm rod and heavy duty bench clamp (PASCO ME-9472), string with loop at one end and small white bead at the other end (125
More informationComputational Optical Imaging - Optique Numerique. -- Deconvolution --
Computational Optical Imaging - Optique Numerique -- Deconvolution -- Winter 2014 Ivo Ihrke Deconvolution Ivo Ihrke Outline Deconvolution Theory example 1D deconvolution Fourier method Algebraic method
More informationOrthogonal Projections
Orthogonal Projections and Reflections (with exercises) by D. Klain Version.. Corrections and comments are welcome! Orthogonal Projections Let X,..., X k be a family of linearly independent (column) vectors
More informationDATA ACQUISITION FROM IN VITRO TESTING OF AN OCCLUDING MEDICAL DEVICE
DATA ACQUISITION FROM IN VITRO TESTING OF AN OCCLUDING MEDICAL DEVICE Florentina ENE 1, Carine GACHON 2, Nicolae IONESCU 3 ABSTRACT: This paper presents a technique for in vitro testing of an occluding
More informationClassifying Manipulation Primitives from Visual Data
Classifying Manipulation Primitives from Visual Data Sandy Huang and Dylan Hadfield-Menell Abstract One approach to learning from demonstrations in robotics is to make use of a classifier to predict if
More informationObject tracking & Motion detection in video sequences
Introduction Object tracking & Motion detection in video sequences Recomended link: http://cmp.felk.cvut.cz/~hlavac/teachpresen/17compvision3d/41imagemotion.pdf 1 2 DYNAMIC SCENE ANALYSIS The input to
More informationVEHICLE TRACKING USING ACOUSTIC AND VIDEO SENSORS
VEHICLE TRACKING USING ACOUSTIC AND VIDEO SENSORS Aswin C Sankaranayanan, Qinfen Zheng, Rama Chellappa University of Maryland College Park, MD - 277 {aswch, qinfen, rama}@cfar.umd.edu Volkan Cevher, James
More informationInner Product Spaces
Math 571 Inner Product Spaces 1. Preliminaries An inner product space is a vector space V along with a function, called an inner product which associates each pair of vectors u, v with a scalar u, v, and
More informationIntelligent Submersible Manipulator-Robot, Design, Modeling, Simulation and Motion Optimization for Maritime Robotic Research
20th International Congress on Modelling and Simulation, Adelaide, Australia, 1 6 December 2013 www.mssanz.org.au/modsim2013 Intelligent Submersible Manipulator-Robot, Design, Modeling, Simulation and
More informationRAY OPTICS II 7.1 INTRODUCTION
7 RAY OPTICS II 7.1 INTRODUCTION This chapter presents a discussion of more complicated issues in ray optics that builds on and extends the ideas presented in the last chapter (which you must read first!)
More informationDevelopment of The Next Generation Document Reader -Eye Scanner-
Development of The Next Generation Document Reader -Eye Scanner- Toshiyuki Amano *a, Tsutomu Abe b, Tetsuo Iyoda b, Osamu Nishikawa b and Yukio Sato a a Dept. of Electrical and Computer Engineering, Nagoya
More informationGeometry of Vectors. 1 Cartesian Coordinates. Carlo Tomasi
Geometry of Vectors Carlo Tomasi This note explores the geometric meaning of norm, inner product, orthogonality, and projection for vectors. For vectors in three-dimensional space, we also examine the
More informationNational Technical University of Athens, 9 Iroon Polytechniou str, GR-15773 Athens, Greece. Email: ndroso@image.ntua.gr
An Optical Camera Tracking System for Virtual Sets Applications Athanasios Drosopoulos, Yiannis Xirouhakis and Anastasios Delopoulos Computer Science Div., Dept. of Electrical and Computer Eng., National
More informationFeature Tracking and Optical Flow
02/09/12 Feature Tracking and Optical Flow Computer Vision CS 543 / ECE 549 University of Illinois Derek Hoiem Many slides adapted from Lana Lazebnik, Silvio Saverse, who in turn adapted slides from Steve
More informationROBUST VEHICLE TRACKING IN VIDEO IMAGES BEING TAKEN FROM A HELICOPTER
ROBUST VEHICLE TRACKING IN VIDEO IMAGES BEING TAKEN FROM A HELICOPTER Fatemeh Karimi Nejadasl, Ben G.H. Gorte, and Serge P. Hoogendoorn Institute of Earth Observation and Space System, Delft University
More informationDev eloping a General Postprocessor for Multi-Axis CNC Milling Centers
57 Dev eloping a General Postprocessor for Multi-Axis CNC Milling Centers Mihir Adivarekar 1 and Frank Liou 1 1 Missouri University of Science and Technology, liou@mst.edu ABSTRACT Most of the current
More informationSolution Guide III-C. 3D Vision. Building Vision for Business. MVTec Software GmbH
Solution Guide III-C 3D Vision MVTec Software GmbH Building Vision for Business Machine vision in 3D world coordinates, Version 10.0.4 All rights reserved. No part of this publication may be reproduced,
More informationSOLID MECHANICS TUTORIAL MECHANISMS KINEMATICS - VELOCITY AND ACCELERATION DIAGRAMS
SOLID MECHANICS TUTORIAL MECHANISMS KINEMATICS - VELOCITY AND ACCELERATION DIAGRAMS This work covers elements of the syllabus for the Engineering Council exams C105 Mechanical and Structural Engineering
More informationHuman-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database
Human-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database Seungsu Kim, ChangHwan Kim and Jong Hyeon Park School of Mechanical Engineering Hanyang University, Seoul, 133-791, Korea.
More informationTHE problem of visual servoing guiding a robot using
582 IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION, VOL. 13, NO. 4, AUGUST 1997 A Modular System for Robust Positioning Using Feedback from Stereo Vision Gregory D. Hager, Member, IEEE Abstract This paper
More informationAn Energy-Based Vehicle Tracking System using Principal Component Analysis and Unsupervised ART Network
Proceedings of the 8th WSEAS Int. Conf. on ARTIFICIAL INTELLIGENCE, KNOWLEDGE ENGINEERING & DATA BASES (AIKED '9) ISSN: 179-519 435 ISBN: 978-96-474-51-2 An Energy-Based Vehicle Tracking System using Principal
More informationSolving Simultaneous Equations and Matrices
Solving Simultaneous Equations and Matrices The following represents a systematic investigation for the steps used to solve two simultaneous linear equations in two unknowns. The motivation for considering
More informationDENSITY MEASURING SYSTEMS
SHIBAYAMA SCIENTIFIC CO., LTD. DENSITY MEASURING SYSTEMS Density Gradient Tube Method Direct Reading Type A / Type B Comply with the standards below: JIS K-0061-1992, K-7112 1980 ASTM D1505 Type A /Left
More informationGeometric Optics Converging Lenses and Mirrors Physics Lab IV
Objective Geometric Optics Converging Lenses and Mirrors Physics Lab IV In this set of lab exercises, the basic properties geometric optics concerning converging lenses and mirrors will be explored. The
More informationRotation: Moment of Inertia and Torque
Rotation: Moment of Inertia and Torque Every time we push a door open or tighten a bolt using a wrench, we apply a force that results in a rotational motion about a fixed axis. Through experience we learn
More informationSimple. Intelligent. The SIMATIC VS 100 Series. simatic MACHINE VISION. www.siemens.com/machine-vision
Simple. Intelligent. The SIMATIC VS 100 Series. simatic MACHINE VISION www.siemens.com/machine-vision simatic Intelligence that pays off In answer to the problem of steadily increasing clock-pulse rates
More informationEffective Use of Android Sensors Based on Visualization of Sensor Information
, pp.299-308 http://dx.doi.org/10.14257/ijmue.2015.10.9.31 Effective Use of Android Sensors Based on Visualization of Sensor Information Young Jae Lee Faculty of Smartmedia, Jeonju University, 303 Cheonjam-ro,
More informationAN ASSESSMENT OF MSC.NASTRAN RESIDUAL VECTOR METHODOLOGY FOR DYNAMIC LOADS ANALYSIS
AN ASSESSMENT OF MSC.NASTRAN RESIDUAL VECTOR METHODOLOGY FOR DYNAMIC LOADS ANALYSIS Christopher C. Flanigan Mark C. Stabb, Ph.D., P.E. Incorporated San Diego, California USA 18th International Modal Analysis
More informationAccurate and robust image superresolution by neural processing of local image representations
Accurate and robust image superresolution by neural processing of local image representations Carlos Miravet 1,2 and Francisco B. Rodríguez 1 1 Grupo de Neurocomputación Biológica (GNB), Escuela Politécnica
More informationIndustrial Robotics. Training Objective
Training Objective After watching the program and reviewing this printed material, the viewer will learn the basics of industrial robot technology and how robots are used in a variety of manufacturing
More informationComparison of the Response of a Simple Structure to Single Axis and Multiple Axis Random Vibration Inputs
Comparison of the Response of a Simple Structure to Single Axis and Multiple Axis Random Vibration Inputs Dan Gregory Sandia National Laboratories Albuquerque NM 87185 (505) 844-9743 Fernando Bitsie Sandia
More informationDICOM Correction Item
Correction Number DICOM Correction Item CP-626 Log Summary: Type of Modification Clarification Rationale for Correction Name of Standard PS 3.3 2004 + Sup 83 The description of pixel spacing related attributes
More informationHow To Use Trackeye
Product information Image Systems AB Main office: Ågatan 40, SE-582 22 Linköping Phone +46 13 200 100, fax +46 13 200 150 info@imagesystems.se, Introduction TrackEye is the world leading system for motion
More informationDevelopment of Optical Wave Microphone Measuring Sound Waves with No Diaphragm
Progress In Electromagnetics Research Symposium Proceedings, Taipei, March 5 8, 3 359 Development of Optical Wave Microphone Measuring Sound Waves with No Diaphragm Yoshito Sonoda, Takashi Samatsu, and
More informationThe Image Deblurring Problem
page 1 Chapter 1 The Image Deblurring Problem You cannot depend on your eyes when your imagination is out of focus. Mark Twain When we use a camera, we want the recorded image to be a faithful representation
More informationOBJECT TRACKING USING LOG-POLAR TRANSFORMATION
OBJECT TRACKING USING LOG-POLAR TRANSFORMATION A Thesis Submitted to the Gradual Faculty of the Louisiana State University and Agricultural and Mechanical College in partial fulfillment of the requirements
More informationTWO-DIMENSIONAL TRANSFORMATION
CHAPTER 2 TWO-DIMENSIONAL TRANSFORMATION 2.1 Introduction As stated earlier, Computer Aided Design consists of three components, namely, Design (Geometric Modeling), Analysis (FEA, etc), and Visualization
More informationChapter 10 Rotational Motion. Copyright 2009 Pearson Education, Inc.
Chapter 10 Rotational Motion Angular Quantities Units of Chapter 10 Vector Nature of Angular Quantities Constant Angular Acceleration Torque Rotational Dynamics; Torque and Rotational Inertia Solving Problems
More informationLectures notes on orthogonal matrices (with exercises) 92.222 - Linear Algebra II - Spring 2004 by D. Klain
Lectures notes on orthogonal matrices (with exercises) 92.222 - Linear Algebra II - Spring 2004 by D. Klain 1. Orthogonal matrices and orthonormal sets An n n real-valued matrix A is said to be an orthogonal
More informationSimple Harmonic Motion
Simple Harmonic Motion 1 Object To determine the period of motion of objects that are executing simple harmonic motion and to check the theoretical prediction of such periods. 2 Apparatus Assorted weights
More informationReal-time Visual Tracker by Stream Processing
Real-time Visual Tracker by Stream Processing Simultaneous and Fast 3D Tracking of Multiple Faces in Video Sequences by Using a Particle Filter Oscar Mateo Lozano & Kuzahiro Otsuka presented by Piotr Rudol
More informationSO(3) Camillo J. Taylor and David J. Kriegman. Yale University. Technical Report No. 9405
Minimization on the Lie Group SO(3) and Related Manifolds amillo J. Taylor and David J. Kriegman Yale University Technical Report No. 945 April, 994 Minimization on the Lie Group SO(3) and Related Manifolds
More informationLinear Algebra Notes for Marsden and Tromba Vector Calculus
Linear Algebra Notes for Marsden and Tromba Vector Calculus n-dimensional Euclidean Space and Matrices Definition of n space As was learned in Math b, a point in Euclidean three space can be thought of
More informationISOMETRIES OF R n KEITH CONRAD
ISOMETRIES OF R n KEITH CONRAD 1. Introduction An isometry of R n is a function h: R n R n that preserves the distance between vectors: h(v) h(w) = v w for all v and w in R n, where (x 1,..., x n ) = x
More informationHow To Fuse A Point Cloud With A Laser And Image Data From A Pointcloud
REAL TIME 3D FUSION OF IMAGERY AND MOBILE LIDAR Paul Mrstik, Vice President Technology Kresimir Kusevic, R&D Engineer Terrapoint Inc. 140-1 Antares Dr. Ottawa, Ontario K2E 8C4 Canada paul.mrstik@terrapoint.com
More informationCenter of Gravity. We touched on this briefly in chapter 7! x 2
Center of Gravity We touched on this briefly in chapter 7! x 1 x 2 cm m 1 m 2 This was for what is known as discrete objects. Discrete refers to the fact that the two objects separated and individual.
More informationAffine Object Representations for Calibration-Free Augmented Reality
Affine Object Representations for Calibration-Free Augmented Reality Kiriakos N. Kutulakos kyros@cs.rochester.edu James Vallino vallino@cs.rochester.edu Computer Science Department University of Rochester
More informationBasic numerical skills: EQUATIONS AND HOW TO SOLVE THEM. x + 5 = 7 2 + 5-2 = 7-2 5 + (2-2) = 7-2 5 = 5. x + 5-5 = 7-5. x + 0 = 20.
Basic numerical skills: EQUATIONS AND HOW TO SOLVE THEM 1. Introduction (really easy) An equation represents the equivalence between two quantities. The two sides of the equation are in balance, and solving
More informationMagnetic Field of a Circular Coil Lab 12
HB 11-26-07 Magnetic Field of a Circular Coil Lab 12 1 Magnetic Field of a Circular Coil Lab 12 Equipment- coil apparatus, BK Precision 2120B oscilloscope, Fluke multimeter, Wavetek FG3C function generator,
More informationSPINDLE ERROR MOVEMENTS MEASUREMENT ALGORITHM AND A NEW METHOD OF RESULTS ANALYSIS 1. INTRODUCTION
Journal of Machine Engineering, Vol. 15, No.1, 2015 machine tool accuracy, metrology, spindle error motions Krzysztof JEMIELNIAK 1* Jaroslaw CHRZANOWSKI 1 SPINDLE ERROR MOVEMENTS MEASUREMENT ALGORITHM
More informationSynthetic Sensing: Proximity / Distance Sensors
Synthetic Sensing: Proximity / Distance Sensors MediaRobotics Lab, February 2010 Proximity detection is dependent on the object of interest. One size does not fit all For non-contact distance measurement,
More informationNMR and IR spectra & vibrational analysis
Lab 5: NMR and IR spectra & vibrational analysis A brief theoretical background 1 Some of the available chemical quantum methods for calculating NMR chemical shifts are based on the Hartree-Fock self-consistent
More informationActive Vibration Isolation of an Unbalanced Machine Spindle
UCRL-CONF-206108 Active Vibration Isolation of an Unbalanced Machine Spindle D. J. Hopkins, P. Geraghty August 18, 2004 American Society of Precision Engineering Annual Conference Orlando, FL, United States
More informationSimulation of Trajectories and Comparison of Joint Variables for Robotic Manipulator Using Multibody Dynamics (MBD)
Simulation of Trajectories and Comparison of Joint Variables for Robotic Manipulator Using Multibody Dynamics (MBD) Jatin Dave Assistant Professor Nirma University Mechanical Engineering Department, Institute
More information