Eye Gaze Tracking Under Natural Head Movements


Zhiwei Zhu and Qiang Ji
Department of ECE, Rensselaer Polytechnic Institute, Troy, NY 12180
{zhuz,jiq}@rpi.edu

Abstract

Most available remote eye gaze trackers based on the Pupil Center Corneal Reflection (PCCR) technique have two characteristics that prevent them from being widely used as an important computer input device for human-computer interaction. First, they must often be calibrated repeatedly for each individual; second, they have low tolerance for head movements and require the user to hold the head uncomfortably still. In this paper, we propose a novel solution for the classical PCCR technique that simplifies the calibration procedure and allows free head movement. The core of our method is an analytically derived head mapping function that compensates for head movement. Specifically, the head mapping function automatically maps the eye movement measurement obtained at an arbitrary head position to a reference head position, so that the gaze can be estimated from the mapped measurement with respect to the reference head position. Furthermore, our method reduces calibration to a single session for each individual. The proposed method significantly improves the usability of eye gaze tracking technology, a major step toward the eye tracker being accepted as a natural computer input device.

1 Introduction

Eye gaze is defined as the line of sight of a person; it represents the person's focus of attention. Eye gaze tracking has been an active research topic for many decades because of its potential uses in applications such as human-computer interaction, eye disease diagnosis, and human behavior study. Earlier eye gaze trackers were fairly intrusive in that they required physical contact with the user, such as placing a reflective white dot directly onto the eye [1] or attaching a number of electrodes around the eye [2].
Besides being intrusive, most of these technologies also require the viewer's head to remain motionless during eye tracking. With the rapid technological advancement of both video cameras and microcomputers, eye gaze tracking based on digital video analysis of eye movements has been widely explored. Since it does not require anything attached to the user, video technology is the most promising direction for building a non-intrusive eye gaze tracker. Based on the eye images captured by video cameras, various techniques [3], [4], [5], [6], [7], [8], [9] have been proposed for eye gaze estimation. Yet most of them suffer from two problems: the need for calibration in every user session and severe restrictions on head motion [10]. Among them, the PCCR technique is the most commonly used approach to video-based eye gaze tracking. The angle of the visual axis (or the location of the fixation point on the display surface) is calculated by tracking the relative position of the pupil center and a speck of light reflected from the cornea, technically known as the glint, as shown in Figure 1. The accuracy of the system can be further enhanced by illuminating the eyes with low-level infrared light, which produces the bright pupil effect shown in Figure 1 and makes the video image easier to process.

Fig. 1. Eye image with corneal reflection (or glint).

Several systems [11], [12], [3], [4], [10], [8] have been built on the PCCR technique. Most of them show that if the user keeps his head fixed, or the head is restrained with a chin rest or bite bar, very high eye gaze tracking accuracy can be achieved. Specifically, the average error can be less than 1 degree of visual angle, which corresponds to less than 10 mm on the computer screen when the subject sits about 550 mm from it.
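As a quick sanity check of those figures, the on-screen error corresponding to an angular gaze error is simply viewing distance times the tangent of the angle. The short sketch below is ours, for illustration only; the function name is made up:

```python
import math

def gaze_error_mm(angle_deg, distance_mm):
    """On-screen error (mm) corresponding to an angular gaze error
    at a given viewing distance, by small-angle trigonometry."""
    return distance_mm * math.tan(math.radians(angle_deg))

# 1 degree of visual angle at a 550 mm viewing distance:
print(round(gaze_error_mm(1.0, 550.0), 1))  # 9.6, i.e. under 10 mm
```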
But as the head moves away from the original position where the user performed the gaze calibration, the accuracy of these eye gaze tracking systems drops dramatically. Detailed data [10] have been reported on how the calibration mapping function decays as the head moves away from its original position. The current remedy is to perform a new calibration whenever the head moves. This, however, is practically impossible, since a person's head moves constantly. Therefore, limited head movement is one of the most serious issues in current remote eye tracking systems. Jacob reports a similar fact in [12] and tried to solve the problem by letting the user make local manual re-calibrations, which places a considerable burden on the user. It is obvious that most existing gaze tracking systems based on the PCCR technique share two common drawbacks: first, the user must perform certain experiments to calibrate the user-dependent parameters before using the gaze tracking system; second, the user must keep his head uncomfortably still, with no significant head movement allowed. In this paper, a solution is proposed to handle these two issues. First, before using the system, a calibration procedure

is performed once for each user to obtain a gaze mapping function. After this calibration, the user never needs to calibrate the system again. Second, completely unlike previous eye gaze trackers, the user can move his head freely in front of the camera, which makes the communication between the computer and the user more natural. Therefore, our gaze tracking technique allows a more robust, accurate, comfortable and useful system to be built.

2 Classical PCCR Technique

The PCCR-based technique consists of two major components: pupil-glint vector extraction and gaze mapping function acquisition.

1. Pupil-glint Vector Extraction. Gaze estimation starts with the pupil-glint vector extraction. After grabbing the eye image from the camera, computer vision techniques [9], [13] are used to extract the pupil center and the glint center robustly and accurately. The pupil center and the glint center are connected to form a 2D pupil-glint vector v, as shown in Figure 1.

2. Specific Gaze Mapping Function Acquisition. After obtaining the pupil-glint vectors, a calibration procedure is used to acquire a specific gaze mapping function that maps the extracted pupil-glint vector to the user's fixation point on the screen for the current head position. The extracted pupil-glint vector v is represented as (v_x, v_y) and the screen gaze point S is represented by (x_gaze, y_gaze) in the screen coordinate system. The specific gaze mapping function S = f(v) can be modelled by the following nonlinear equations [8]:

    x_gaze = a_0 + a_1 v_x + a_2 v_y + a_3 v_x v_y
    y_gaze = b_0 + b_1 v_x + b_2 v_y + b_3 v_y^2                  (1)

The coefficients a_0, a_1, a_2, a_3 and b_0, b_1, b_2, b_3 are estimated from a set of pairs of pupil-glint vectors and the corresponding screen gaze points. These pairs are collected in a calibration procedure: the user is required to visually follow a shining dot as it is displayed at several predefined locations on the computer screen, while keeping his head as still as possible.
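Because equation (1) is linear in the unknown coefficients, estimating them is an ordinary linear least-squares problem. The following is a minimal sketch of that fit (ours, not the authors' code); the function names and calling convention are illustrative, and NumPy is assumed:

```python
import numpy as np

def fit_gaze_mapping(v, s):
    """Fit the polynomial gaze mapping of equation (1) by least squares.

    v : (N, 2) array of calibration pupil-glint vectors (v_x, v_y)
    s : (N, 2) array of corresponding screen gaze points (x_gaze, y_gaze)
    Returns coefficient vectors a (for x_gaze) and b (for y_gaze).
    """
    vx, vy = v[:, 0], v[:, 1]
    # Design matrices follow equation (1): the x mapping uses the cross
    # term v_x*v_y, the y mapping uses the quadratic term v_y**2.
    Ax = np.column_stack([np.ones_like(vx), vx, vy, vx * vy])
    Ay = np.column_stack([np.ones_like(vx), vx, vy, vy ** 2])
    a, *_ = np.linalg.lstsq(Ax, s[:, 0], rcond=None)
    b, *_ = np.linalg.lstsq(Ay, s[:, 1], rcond=None)
    return a, b

def apply_gaze_mapping(a, b, vx, vy):
    """Evaluate equation (1) for one pupil-glint vector."""
    x_gaze = a[0] + a[1] * vx + a[2] * vy + a[3] * vx * vy
    y_gaze = b[0] + b[1] * vx + b[2] * vy + b[3] * vy ** 2
    return x_gaze, y_gaze
```

With more calibration points than the four coefficients per axis, the two systems are overdetermined and the least-squares fit averages out measurement noise in the extracted vectors.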
If the user does not move his head significantly after gaze calibration, the specific gaze mapping function can accurately estimate the user's gaze point on the screen from the extracted pupil-glint vector. But when the user moves his head away from the position where the calibration was performed, the specific gaze mapping function fails to estimate the gaze point accurately, because the head movement changes the pupil-glint vector. In the following, the effects of head movement on the pupil-glint vector are illustrated.

2.1 Head Motion Effects on the Pupil-glint Vector

Figure 2 shows the ray diagram of pupil-glint vector generation in the image when an eye is located at two different 3D positions, P1 and P2, in front of the camera due to head movement. For simplicity, the eye is represented by a cornea, the cornea is modelled as a convex mirror, and the IR light source used to generate the glint is located at O; these conventions apply to all subsequent figures. Assume that the origin of the camera is located at O, p1 and p2 are the pupil centers, and g1 and g2 are the glint centers generated in the image. Further, at both positions, the user is looking at the same point on the computer screen. According to the light ray diagram shown in Figure 2, the generated pupil-glint vectors g1p1 and g2p2 will be significantly different in the images, as shown in Figure 3. Two factors are responsible for this difference: first, the eyes are at different positions in front of the camera; second, in order to look at the same screen point, eyes at different positions rotate differently.

Fig. 2. Pupil and glint image formations when eyes are located at different positions while gazing at the same screen point (side view).

Fig. 3.
The pupil-glint vectors generated in the eye images when the eye is located at P1 and P2 in Figure 2.

The eye moves as the head moves. Therefore, when the user gazes at a fixed point on the screen while moving his head in front of the camera, a set of significantly different pupil-glint vectors is generated in the image. If uncorrected, feeding them into the specific gaze mapping function obtained at the reference eye position produces inaccurate gaze points. Therefore, the head movement effects on these pupil-glint vectors must be eliminated in order to use the specific gaze mapping function to estimate the screen gaze points accurately. In the following, a technique is proposed to eliminate the head movement effects on these pupil-glint vectors before inputting them into the specific gaze mapping function. With this technique, screen gaze points can be estimated accurately whether or not the user has moved his head.

3 Dynamic Head Compensation Model

3.1 Approach Overview

The first step of our technique is to find a specific gaze mapping function f_O1 between the pupil-glint vector v_1 and the screen coordinate S at a reference 3D eye position P1. This is usually achieved via a gaze calibration procedure using equations (1). The function f_O1 can be expressed as follows:

    S = f_O1(v_1)                  (2)

Assume that when the eye moves to a new position P2 as the head moves, a pupil-glint vector v_2 is created in the image while the user is looking at the same screen point. When P2 is significantly different from P1, v_2 cannot be used as the input of the gaze mapping function f_O1 to estimate the screen gaze point, because the head movement has changed the pupil-glint vector. If the changes of the pupil-glint vector v_2 caused by the head movement can be eliminated, a corrected pupil-glint vector v_2' is obtained. Ideally, this corrected vector v_2' is the pupil-glint vector v_1 that the eye at the reference position generates when gazing at the same screen point. Therefore, the problem is equivalent to finding a head mapping function g between two different pupil-glint vectors at two different head positions while still gazing at the same screen point. This mapping function g can be written as follows:

    v_2' = g(v_2, P1, P2)                  (3)

where v_2' is the equivalent measurement of v_1 with respect to the initial reference head position. Therefore, the screen gaze point can be estimated accurately from the pupil-glint vector v_2' via the specific gaze mapping function f_O1 as follows:

    S = f_O1(g(v_2, P1, P2)) = F(v_2, P2)                  (4)

where the function F can be called a generalized gaze mapping function, or a computational dynamic head compensation model. It provides the gaze mapping function for a new eye position dynamically. With the proposed technique, whenever the head moves, a gaze mapping function at the new 3D eye position is estimated automatically; therefore, the issue of head movement is solved through g.

3.2 Head Movement Compensation

In this section, we show how to find the head mapping function g. Figure 4 shows the process of pupil-glint vector formation in the image for an eye in front of the camera. When the eye is located at two different positions P1 and P2 while still gazing at the same screen point, two different pupil-glint vectors g1p1 and g2p2 are generated in the image. Further, as shown in Figure 4, a plane A parallel to the image plane through the point P1 intersects the ray from O through g1 at G1, and another parallel plane B through the point P2 intersects the ray from O through g2 at G2. Therefore, g1p1 is the projection of the vector G1P1 and g2p2 is the projection of the vector G2P2 in the image plane. Because plane A, plane B and the image plane are parallel, the vectors g1p1, g2p2, G1P1 and G2P2 can be represented as 2D vectors in the X-Y plane of the camera coordinate system.

Fig. 4. Pupil and glint image formation when the eye is located at different positions in front of the camera.

Image Projection of the Pupil-glint Vector. Assume that in the camera coordinate system, the 3D pupil centers P1 and P2 are represented as (x_1, y_1, z_1) and (x_2, y_2, z_2), the glint centers g1 and g2 are represented as (x_g1, y_g1, f) and (x_g2, y_g2, f), where f is the focal length of the camera, and the screen gaze point S is represented by (x_s, y_s, z_s). Via the pinhole camera model, the image projections of the pupil-glint vectors can be expressed as follows:

    g1p1 = (f / z_1) G1P1                  (5)
    g2p2 = (f / z_2) G2P2                  (6)

Assume that the pupil-glint vectors g1p1 and g2p2 are represented as (v_x1, v_y1) and (v_x2, v_y2) respectively, and the vectors G1P1 and G2P2 are represented as (V_x1, V_y1) and (V_x2, V_y2) respectively.
Therefore, the following equations can be derived by combining equations (5) and (6). (Footnotes: G1 and G2 are not the actual virtual images of the IR light source.)

    v_x1 = (V_x1 / V_x2) (z_2 / z_1) v_x2                  (7)
    v_y1 = (V_y1 / V_y2) (z_2 / z_1) v_y2                  (8)

The above two equations describe how the pupil-glint vector changes as the head moves in front of the camera. They also show that each component of the pupil-glint vector can be transformed individually. Therefore, equation (7) for the X component of the pupil-glint vector is derived first.

First Case: the cornea center and the pupil center lie on the camera's X-Z plane. Figure 5 shows the ray diagram of pupil-glint vector formation when the cornea center and the pupil center of an eye happen to lie on the X-Z plane of the camera coordinate system. In this case both the generated pupil-glint vectors p1g1 and p2g2 and the vectors P1G1 and P2G2 can be represented as one-dimensional vectors; specifically, p1g1 = v_x1, p2g2 = v_x2, P1G1 = V_x1 and P2G2 = V_x2.

Fig. 5. Pupil and glint image formation when the eye is located at different positions in front of the camera (top-down view).

According to Figure 5, the vectors G1P1 and G2P2 can be represented as follows:

    G1P1 = G1O1 + O1P1                  (9)
    G2P2 = G2O2 + O2P2                  (10)

For simplicity, let r_1 denote the length of O1P1 and r_2 the length of O2P2, where O1 and O2 are the cornea centers; let α1 and α2 denote the angles between the lines from the cornea centers to the gaze point and the camera's Z axis, and β1 and β2 the angles between the rays Og1 and Og2 and the Z axis. According to the geometry shown in Figure 5, the vectors G1P1 and G2P2 can be further written as follows:

    G1P1 = r_1 sin(α1) − r_1 cos(α1) tan(β1)                  (11)
    G2P2 = r_2 sin(α2) − r_2 cos(α2) tan(β2)                  (12)

As shown in Figure 5, the lines G1P1 and G2P2 are parallel to the X axis of the camera. Therefore, tan(β1) and tan(β2) can be obtained from the right triangles g1 o_c O and g2 o_c O individually:

    tan(β1) = o_c g1 / f                  (13)
    tan(β2) = o_c g2 / f                  (14)

In the above equations, g1 and g2 are the glints in the image, and o_c is the principal point of the camera.
For simplicity, we write x_g1 for o_c g1 and x_g2 for o_c g2. Therefore, after detecting the glints in the image, tan(β1) and tan(β2) can be obtained accurately. Further, sin(α1), cos(α1), sin(α2) and cos(α2) can be read directly from the geometry of Figure 5, using the in-plane distances ℓ_1 and ℓ_2 from the cornea centers to the gaze point. Therefore, equations (11) and (12) become:

    V_x1 = r_1 [(x_s − x_1) f − (z_s − z_1) x_g1] / (f ℓ_1)                  (15)
    V_x2 = r_2 [(x_s − x_2) f − (z_s − z_2) x_g2] / (f ℓ_2)                  (16)

Second Case: the cornea center and the pupil center do not lie on the camera's X-Z plane. When the cornea center and the pupil center do not lie on the camera's X-Z plane, the ray diagram of Figure 5 is obtained by projecting the ray diagram of Figure 4 into the X-Z plane along the Y axis of the camera's coordinate system. As shown in Figure 6, the pupil center P_c1, the cornea center O_c1 and the screen gaze point all project into the X-Z plane. Because O_c1 P_c1 represents the distance r between the pupil center and the cornea center, which does not change as the eyeball rotates, its in-plane projection r_1 can be derived as follows:

    r_1 = r ℓ_1 / √(ℓ_1² + (y_1 − y_s)²)                  (17)

Fig. 6. Projection into the camera's X-Z plane.

Therefore, when the eye moves to a new location, r_2 can be represented as follows:

    r_2 = r ℓ_2 / √(ℓ_2² + (y_2 − y_s)²)                  (18)

After substituting the formulations of r_1 and r_2 into equations (15) and (16), we obtain V_x1 / V_x2 as follows:

    V_x1 / V_x2 = d [(x_s − x_1) f − (z_s − z_1) x_g1] / [(x_s − x_2) f − (z_s − z_2) x_g2]                  (19)

where d is:

    d = √( ((x_2 − x_s)² + (y_2 − y_s)² + (z_2 − z_s)²) / ((x_1 − x_s)² + (y_1 − y_s)² + (z_1 − z_s)²) )

As a result, equations (7) and (8) can finally be written as follows:

    v_x1 = d [(x_s − x_1) f − (z_s − z_1) x_g1] / [(x_s − x_2) f − (z_s − z_2) x_g2] (z_2 / z_1) v_x2                  (20)

    v_y1 = d [(y_s − y_1) f − (z_s − z_1) y_g1] / [(y_s − y_2) f − (z_s − z_2) y_g2] (z_2 / z_1) v_y2                  (21)

The above equations constitute the head mapping function g between the pupil-glint vectors of the eyes at different positions in front of the camera while gazing at the same screen point.

3.3 Iterative Algorithm for Gaze Estimation

The derived equations (20) and (21) of the head mapping function cannot be evaluated unless the gaze point S = (x_s, y_s, z_s) on the screen is known. However, the gaze point is exactly what needs to be estimated. As a result, the gaze point is also a variable of the head mapping function g, which can be further expressed as follows:

    v_2' = g(v_2, P1, P2, S)                  (22)

Assume that a specific gaze mapping function f_P1 is known via the calibration procedure described in Section 2. After integrating the head mapping function g into it via equation (4), the generalized gaze mapping function F can be rewritten as follows:

    S = F(v_2, P2, S)                  (23)

Given the pupil-glint vector v_2 extracted from the eye image and the new location P2 that the eye has moved to, equation (23) becomes a recursive function. An iterative solution is proposed to solve it.
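To make the computation concrete, here is a minimal Python sketch of the head mapping function g of equations (20) and (21) and of the fixed-point iteration that solves equation (23). It is an illustration under stated assumptions, not the authors' implementation: the function names, the argument conventions (glint image coordinates measured relative to the principal point, focal length in pixels) and the simple convergence test are ours.

```python
import math

def head_mapping(v2, p1, p2, g1, g2, s, f):
    """Equations (20)-(21): map the pupil-glint vector v2, measured with
    the eye at p2, to its equivalent at the reference eye position p1.

    v2     : (vx2, vy2) pupil-glint vector at the new position (pixels)
    p1, p2 : reference and current 3D pupil centers (x, y, z), camera frame
    g1, g2 : glint image coordinates relative to the principal point (pixels)
    s      : 3D screen gaze point (xs, ys, zs) in the camera frame
    f      : camera focal length in pixels
    """
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    xs, ys, zs = s
    # d of equation (19): ratio of the eye-to-gaze-point distances.
    d = math.dist(p2, s) / math.dist(p1, s)
    scale = d * z2 / z1
    vx1 = scale * ((xs - x1) * f - (zs - z1) * g1[0]) \
                / ((xs - x2) * f - (zs - z2) * g2[0]) * v2[0]
    vy1 = scale * ((ys - y1) * f - (zs - z1) * g1[1]) \
                / ((ys - y2) * f - (zs - z2) * g2[1]) * v2[1]
    return vx1, vy1

def estimate_gaze(v2, correct, gaze_map, s0, tol=0.5, max_iter=20):
    """Section 3.3: solve the recursive equation (23) by fixed-point iteration.

    correct  : callable (v2, s) -> corrected vector, i.e. g with the two
               eye positions already bound in
    gaze_map : calibrated mapping taking a corrected vector to a screen
               point (x, y)
    s0       : initial gaze-point guess, e.g. the screen center
    """
    s = s0
    for _ in range(max_iter):
        s_new = gaze_map(correct(v2, s))
        if math.dist(s_new, s) < tol:  # gaze point stopped moving
            return s_new
        s = s_new
    return s
```

A useful sanity check of the sketch: when the eye has not moved (p2 equal to p1 and g2 equal to g1), d and both bracketed factors cancel and the mapping reduces to the identity.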
First, the screen center S_0 is chosen as the initial gaze point; a corrected pupil-glint vector v_2' is then obtained from the detected pupil-glint vector v_2 via the head mapping function g. By inputting the corrected vector v_2' into the specific gaze mapping function f_P1, a new screen gaze point S is estimated, which is in turn used to compute a new corrected pupil-glint vector v_2'. The loop continues until the estimated screen gaze point no longer changes. Usually the whole iteration converges in fewer than 5 loops, which is very fast. With the proposed method, the gaze point can be estimated accurately.

4 Experiment Results

4.1 System Setup

Two cameras are mounted under the monitor screen. An IR illuminator is mounted at the center of the lens of one camera to produce the corneal glint in the eye image. The pupil-glint vectors extracted from the eye images captured by this camera are used for gaze estimation. Since our algorithm needs the 3D eye position, another camera is mounted close to the one with the IR illuminator. The two cameras form a stereo vision system, so the 3D eye position can be obtained accurately. One requirement of our eye tracker is knowing the 3D representation of the monitor screen plane in the coordinate system of the camera with the IR illuminator. This is accomplished via the calibration method proposed in [14]. After this calibration, the configuration between the camera and the monitor remains fixed during eye tracking.

4.2 Head Compensation Model Validation

Equations (20) and (21) of the head mapping function g are validated by the following experiment. A screen point S_c = (132.75, , ) is chosen as the gaze point. The user gazed at this point from twenty different locations in front of the camera; at each location, the pupil-glint vector and the 3D pupil center were collected.
The 3D pupil centers and the pupil-glint vectors of the first two samples are shown in Table I, with the first serving as the reference position. The second column gives the original pupil-glint vectors, and the third column the pupil-glint vectors transformed by the head

mapping function. The difference between the transformed pupil-glint vector of the second sample and the reference pupil-glint vector at the reference position is defined as the transformation error. Figure 7 illustrates the transformation errors for all twenty samples. The average transformation error is only around 1 pixel, which shows the validity of the proposed head compensation model.

TABLE I. PUPIL-GLINT VECTOR COMPARISON AT DIFFERENT EYE LOCATIONS

    3D pupil position (mm)    2D pupil-glint vector (pixel)    Transformed pupil-glint vector (pixel)
    (5.25, 15.56, )           (9.65, )                         (9.65, )
    (-8.13, 32.29, )          (7.17, )                         (8.75, )

Fig. 7. The pupil-glint vector transformation errors (X and Y components, in pixels, over the twenty locations).

4.3 Gaze Estimation Accuracy

When the user moves away from the camera, the eye becomes smaller in the image. Due to the increased pixel measurement error caused by the lower image resolution, the gaze accuracy of the eye tracker decreases as the user moves away from the camera. In this experiment, the effect of the distance to the camera on the gaze accuracy of our system is analyzed. A user performed the gaze calibration while sitting around 330 mm from the camera. After the calibration, the user was positioned at four different distances from the camera, as listed in Table II. At each location, the user was asked to follow a moving object displayed at 12 predefined positions across the screen. Table II lists the computed gaze estimation accuracy at these four locations. It shows that as the user moves away from the camera, the gaze resolution decreases. Yet within the space allowed for head movement, approximately (width x height x depth) mm at 450 mm from the camera, the average horizontal angular accuracy is around 1.3 degrees and the average vertical angular accuracy is around 1.7 degrees, which is acceptable for most human-computer interaction applications.
In addition, this volume of allowed head movement is large enough for a user to sit comfortably in front of the camera and communicate with the computer naturally.

TABLE II. GAZE ESTIMATION ACCURACY

    Distance to the camera (mm)    Horizontal accuracy (degrees)    Vertical accuracy (degrees)

5 Conclusion

In this paper, a solution is proposed to improve the classical PCCR eye gaze tracking technique. While retaining the high accuracy and stability of the classical PCCR technique, we make it work under natural head movements. Therefore, the user does not need to keep his head uncomfortably still, as most eye trackers require; instead, he can move his head freely while using our eye tracking system. At the same time, calibration is reduced to a single session for a new user. The calibration result is also long-lasting: if the user repositions his head, or comes back to use the system another time, he does not need to calibrate again. Both achievements are made possible by the proposed head mapping function, which automatically accommodates head movement.

References

[1] S. Milekic, "The more you look the more you get: intention-based interface using gaze-tracking," in D. Bearman and J. Trant (eds.), Museums and the Web 2002: Selected Papers from an International Conference, Archives and Museum Informatics.
[2] K. Hyoki, M. Shigeta, N. Tsuno, Y. Kawamuro, and T. Kinoshita, "Quantitative electro-oculography and electroencephalography as indices of alertness," Electroencephalography and Clinical Neurophysiology, 1998.
[3] Y. Ebisawa, M. Ohtani, and A. Sugioka, "Proposal of a zoom and focus control method using an ultrasonic distance-meter for video-based eye-gaze detection under free-hand condition," in Proceedings of the 18th Annual International Conference of the IEEE Engineering in Medicine and Biology Society.
[4] C. H. Morimoto, D. Koons, A. Amir, and M.
Flickner, "Frame-rate pupil detector and gaze tracker," in IEEE ICCV'99 Frame-Rate Workshop, 1999.
[5] A. T. Duchowski, Eye Tracking Methodology: Theory and Practice, Springer-Verlag.
[6] R. J. K. Jacob and K. S. Karn, "Eye tracking in human-computer interaction and usability research: Ready to deliver the promises," 2003, Oxford, Elsevier Science.
[7] D. H. Yoo, "Non-contact eye gaze estimation system using robust feature," CVIU, Special Issue on Eye Detection and Tracking.
[8] LC Technologies Inc., The Eyegaze Development System.
[9] Z. Zhu and Q. Ji, "Eye and gaze tracking for interactive graphic display," Machine Vision and Applications, vol. 15, no. 3.
[10] C. H. Morimoto and M. R. M. Mimica, "Eye gaze tracking techniques for interactive applications," CVIU, Special Issue on Eye Detection and Tracking.
[11] T. E. Hutchinson, K. P. White Jr., K. C. Reichert, and L. A. Frey, "Human-computer interaction using eye-gaze input," IEEE Transactions on Systems, Man, and Cybernetics, 1989.
[12] R. J. K. Jacob, "Eye-movement-based human-computer interaction techniques: Towards non-command interfaces," 1993, Ablex Publishing Corporation, Norwood, NJ.
[13] Z. Zhu and Q. Ji, "Robust real-time eye detection and tracking under variable lighting conditions and various face orientations," CVIU, Special Issue on Eye Detection and Tracking.
[14] S.-W. Shih and J. Liu, "A novel approach to 3-D gaze tracking using stereo cameras," IEEE Transactions on Systems, Man, and Cybernetics, Part B, no. 1, 2004.


More information

Tobii 50 Series. Product Description. Tobii 1750 Eye Tracker Tobii 2150 Eye Tracker Tobii x50 Eye Tracker

Tobii 50 Series. Product Description. Tobii 1750 Eye Tracker Tobii 2150 Eye Tracker Tobii x50 Eye Tracker Product Description Tobii 50 Series Tobii 1750 Eye Tracker Tobii 2150 Eye Tracker Tobii x50 Eye Tracker Web: www.tobii.com E-mail: sales@tobii.com Phone: +46 (0)8 663 69 90 Copyright Tobii Technology AB,

More information

Optics. Kepler's telescope and Galileo's telescope. f 1. f 2. LD Physics Leaflets P5.1.4.2. Geometrical optics Optical instruments

Optics. Kepler's telescope and Galileo's telescope. f 1. f 2. LD Physics Leaflets P5.1.4.2. Geometrical optics Optical instruments Optics Geometrical optics Optical instruments LD Physics Lealets P5.1.4.2 Kepler's telescope and Galileo's telescope Objects o the experiment g Veriying that the length o a telescope is given by the sum

More information

8. Hardware Acceleration and Coprocessing

8. Hardware Acceleration and Coprocessing July 2011 ED51006-1.2 8. Hardware Acceleration and ED51006-1.2 This chapter discusses how you can use hardware accelerators and coprocessing to create more eicient, higher throughput designs in OPC Builder.

More information

Endoscope Optics. Chapter 8. 8.1 Introduction

Endoscope Optics. Chapter 8. 8.1 Introduction Chapter 8 Endoscope Optics Endoscopes are used to observe otherwise inaccessible areas within the human body either noninvasively or minimally invasively. Endoscopes have unparalleled ability to visualize

More information

Power functions: f(x) = x n, n is a natural number The graphs of some power functions are given below. n- even n- odd

Power functions: f(x) = x n, n is a natural number The graphs of some power functions are given below. n- even n- odd 5.1 Polynomial Functions A polynomial unctions is a unction o the orm = a n n + a n-1 n-1 + + a 1 + a 0 Eample: = 3 3 + 5 - The domain o a polynomial unction is the set o all real numbers. The -intercepts

More information

Why focus on assessment now?

Why focus on assessment now? Assessment in th Responding to your questions. Assessment is an integral part o teaching and learning I this sounds amiliar to you it s probably because it is one o the most requently quoted lines rom

More information

A MPCP-Based Centralized Rate Control Method for Mobile Stations in FiWi Access Networks

A MPCP-Based Centralized Rate Control Method for Mobile Stations in FiWi Access Networks A MPCP-Based Centralized Rate Control Method or Mobile Stations in FiWi Access Networks 215 IEEE. Personal use o this material is permitted. Permission rom IEEE must be obtained or all other uses, in any

More information

Is the trailing-stop strategy always good for stock trading?

Is the trailing-stop strategy always good for stock trading? Is the trailing-stop strategy always good or stock trading? Zhe George Zhang, Yu Benjamin Fu December 27, 2011 Abstract This paper characterizes the trailing-stop strategy or stock trading and provides

More information

PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY

PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY PHOTOGRAMMETRIC TECHNIQUES FOR MEASUREMENTS IN WOODWORKING INDUSTRY V. Knyaz a, *, Yu. Visilter, S. Zheltov a State Research Institute for Aviation System (GosNIIAS), 7, Victorenko str., Moscow, Russia

More information

CREATING MENTOR NETWORKS IN THE OSCE REGION: A Practical Roadmap. Organization for Security and Co-operation in Europe

CREATING MENTOR NETWORKS IN THE OSCE REGION: A Practical Roadmap. Organization for Security and Co-operation in Europe CREATING MENTOR NETWORKS IN THE OSCE REGION: A Practical Roadmap. Organization or Security and Co-operation in Europe table o contents Project Partners: The Organization or Security and Co-operation in

More information

A NOVEL ALGORITHM WITH IM-LSI INDEX FOR INCREMENTAL MAINTENANCE OF MATERIALIZED VIEW

A NOVEL ALGORITHM WITH IM-LSI INDEX FOR INCREMENTAL MAINTENANCE OF MATERIALIZED VIEW A NOVEL ALGORITHM WITH IM-LSI INDEX FOR INCREMENTAL MAINTENANCE OF MATERIALIZED VIEW 1 Dr.T.Nalini,, 2 Dr. A.Kumaravel, 3 Dr.K.Rangarajan Dept o CSE, Bharath University Email-id: nalinicha2002@yahoo.co.in

More information

Chapter 7: Acceleration and Gravity

Chapter 7: Acceleration and Gravity Chapter 7: Acceleration and Gravity 7.1 The Principle o Equivalence We saw in the special theory o relativity that the laws o physics must be the same in all inertial reerence systems. But what is so special

More information

Lecture 17. Image formation Ray tracing Calculation. Lenses Convex Concave. Mirrors Convex Concave. Optical instruments

Lecture 17. Image formation Ray tracing Calculation. Lenses Convex Concave. Mirrors Convex Concave. Optical instruments Lecture 17. Image formation Ray tracing Calculation Lenses Convex Concave Mirrors Convex Concave Optical instruments Image formation Laws of refraction and reflection can be used to explain how lenses

More information

Specifying Calibration Standards and Kits for Agilent Vector Network Analyzers. Application Note 1287-11

Specifying Calibration Standards and Kits for Agilent Vector Network Analyzers. Application Note 1287-11 Speciying Calibration Standards and Kits or Agilent Vector Network Analyzers Application Note 1287-11 Table o Contents Introduction... 3 Measurement errors... 3 Measurement calibration... 3 Calibration

More information

Maintenance Scheduling Optimization for 30kt Heavy Haul Combined Train in Daqin Railway

Maintenance Scheduling Optimization for 30kt Heavy Haul Combined Train in Daqin Railway 5th International Conerence on Civil Engineering and Transportation (ICCET 2015) Maintenance Scheduling Optimization or 30kt Heavy Haul Combined Train in Daqin Railway Yuan Lin1, a, Leishan Zhou1, b and

More information

Work, Energy & Power. AP Physics B

Work, Energy & Power. AP Physics B ork, Energy & Power AP Physics B There are many dierent TYPES o Energy. Energy is expressed in JOULES (J) 4.19 J = 1 calorie Energy can be expressed more speciically by using the term ORK() ork = The Scalar

More information

Geometric Optics Converging Lenses and Mirrors Physics Lab IV

Geometric Optics Converging Lenses and Mirrors Physics Lab IV Objective Geometric Optics Converging Lenses and Mirrors Physics Lab IV In this set of lab exercises, the basic properties geometric optics concerning converging lenses and mirrors will be explored. The

More information

Chapter 23. The Reflection of Light: Mirrors

Chapter 23. The Reflection of Light: Mirrors Chapter 23 The Reflection of Light: Mirrors Wave Fronts and Rays Defining wave fronts and rays. Consider a sound wave since it is easier to visualize. Shown is a hemispherical view of a sound wave emitted

More information

Pattern recognition using multilayer neural-genetic algorithm

Pattern recognition using multilayer neural-genetic algorithm Neurocomputing 51 (2003) 237 247 www.elsevier.com/locate/neucom Pattern recognition using multilayer neural-genetic algorithm Yas Abbas Alsultanny, Musbah M. Aqel Computer Science Department, College o

More information

INTRODUCTION TO RENDERING TECHNIQUES

INTRODUCTION TO RENDERING TECHNIQUES INTRODUCTION TO RENDERING TECHNIQUES 22 Mar. 212 Yanir Kleiman What is 3D Graphics? Why 3D? Draw one frame at a time Model only once X 24 frames per second Color / texture only once 15, frames for a feature

More information

Do-It-Yourself Eye Tracker: Impact of the Viewing Angle on the Eye Tracking Accuracy

Do-It-Yourself Eye Tracker: Impact of the Viewing Angle on the Eye Tracking Accuracy Do-It-Yourself Eye Tracker: Impact of the Viewing Angle on the Eye Tracking Accuracy Michal Kowalik Supervised by: Radoslaw Mantiuk Faculty of Computer Science West Pomeranian University of Technology

More information

THE MODELING AND CALCULATION OF SOUND RADIATION FROM FACILITIES WITH GAS FLOWED PIPES INTRODUCTION

THE MODELING AND CALCULATION OF SOUND RADIATION FROM FACILITIES WITH GAS FLOWED PIPES INTRODUCTION THE MODELING AND CALCULATION OF SOUND ADIATION FOM FACILITIES WITH GAS FLOWED PIPES INTODUCTION Analysis o the emission caused by industrial acilities like chemical plants, reineries or other production

More information

Risk Arbitrage Performance for Stock Swap Offers with Collars

Risk Arbitrage Performance for Stock Swap Offers with Collars Risk Arbitrage Perormance or Stock Swap Oers with Collars Ben Branch Isenberg School o Management University o Massachusetts at Amherst, MA 01003 Phone: 413-5455690 Email: branchb@som.umass.edu Jia Wang

More information

Chapter 36 - Lenses. A PowerPoint Presentation by Paul E. Tippens, Professor of Physics Southern Polytechnic State University

Chapter 36 - Lenses. A PowerPoint Presentation by Paul E. Tippens, Professor of Physics Southern Polytechnic State University Chapter 36 - Lenses A PowerPoint Presentation by Paul E. Tippens, Professor of Physics Southern Polytechnic State University 2007 Objectives: After completing this module, you should be able to: Determine

More information

Solving Newton s Second Law Problems

Solving Newton s Second Law Problems Solving ewton s Second Law Problems Michael Fowler, Phys 142E Lec 8 Feb 5, 2009 Zero Acceleration Problems: Forces Add to Zero he Law is F ma : the acceleration o a given body is given by the net orce

More information

Globally Optimal Inlier Set Maximization With Unknown Rotation and Focal Length

Globally Optimal Inlier Set Maximization With Unknown Rotation and Focal Length Globally Optimal Inlier Set Maximization With Unknown Rotation and Focal Length Jean-Charles Bazin 1, Yongduek Seo, Richard Hartley 3, Marc Polleeys 1 1 Department o Computer Science, ETH Zurich, Switzerland

More information

Cabri Geometry Application User Guide

Cabri Geometry Application User Guide Cabri Geometry Application User Guide Preview of Geometry... 2 Learning the Basics... 3 Managing File Operations... 12 Setting Application Preferences... 14 Selecting and Moving Objects... 17 Deleting

More information

Eye Tracking Instructions

Eye Tracking Instructions Eye Tracking Instructions [1] Check to make sure that the eye tracker is properly connected and plugged in. Plug in the eye tracker power adaptor (the green light should be on. Make sure that the yellow

More information

Self-Calibrated Structured Light 3D Scanner Using Color Edge Pattern

Self-Calibrated Structured Light 3D Scanner Using Color Edge Pattern Self-Calibrated Structured Light 3D Scanner Using Color Edge Pattern Samuel Kosolapov Department of Electrical Engineering Braude Academic College of Engineering Karmiel 21982, Israel e-mail: ksamuel@braude.ac.il

More information

International Journal of Pure and Applied Sciences and Technology

International Journal of Pure and Applied Sciences and Technology Int. J. Pure Appl. Sci. Technol., 18(1) (213), pp. 43-53 International Journal o Pure and Applied Sciences and Technology ISS 2229-617 Available online at www.iopaasat.in Research Paper Eect o Volatility

More information

How To Use Eye Tracking With A Dual Eye Tracking System In A Collaborative Collaborative Eye Tracking (Duet)

How To Use Eye Tracking With A Dual Eye Tracking System In A Collaborative Collaborative Eye Tracking (Duet) Framework for colocated synchronous dual eye tracking Craig Hennessey Department of Electrical and Computer Engineering University of British Columbia Mirametrix Research craigah@ece.ubc.ca Abstract Dual

More information

Manufacturing and Fractional Cell Formation Using the Modified Binary Digit Grouping Algorithm

Manufacturing and Fractional Cell Formation Using the Modified Binary Digit Grouping Algorithm Proceedings o the 2014 International Conerence on Industrial Engineering and Operations Management Bali, Indonesia, January 7 9, 2014 Manuacturing and Fractional Cell Formation Using the Modiied Binary

More information

2. Getting Started with the Graphical User Interface

2. Getting Started with the Graphical User Interface May 2011 NII52017-11.0.0 2. Getting Started with the Graphical User Interace NII52017-11.0.0 The Nios II Sotware Build Tools (SBT) or Eclipse is a set o plugins based on the Eclipse ramework and the Eclipse

More information

Projection Center Calibration for a Co-located Projector Camera System

Projection Center Calibration for a Co-located Projector Camera System Projection Center Calibration for a Co-located Camera System Toshiyuki Amano Department of Computer and Communication Science Faculty of Systems Engineering, Wakayama University Sakaedani 930, Wakayama,

More information

Calibration and Uncertainties of Pipe Roughness Height

Calibration and Uncertainties of Pipe Roughness Height 9 th IWA/IAHR Conerence on Urban Drainage Modelling Calibration and Uncertainties o Pipe Roughness Height Kailin Yang, Yongxin Guo, Xinlei Guo,Hui Fu and Tao Wang China Institute o Water Resources and

More information

Real-Time Reservoir Model Updating Using Ensemble Kalman Filter Xian-Huan Wen, SPE and Wen. H. Chen, SPE, ChevronTexaco Energy Technology Company

Real-Time Reservoir Model Updating Using Ensemble Kalman Filter Xian-Huan Wen, SPE and Wen. H. Chen, SPE, ChevronTexaco Energy Technology Company SPE 92991 Real-Time Reservoir Model Updating Using Ensemble Kalman Filter Xian-Huan Wen, SPE and Wen. H. Chen, SPE, ChevronTexaco Energy Technology Company Copyright 2004, Society o Petroleum Engineers

More information

Time-Memory Trade-Offs: False Alarm Detection Using Checkpoints

Time-Memory Trade-Offs: False Alarm Detection Using Checkpoints Time-Memory Trade-Os: False Alarm Detection Using Checkpoints Gildas Avoine 1, Pascal Junod 2, and Philippe Oechslin 1,3 1 EPFL, Lausanne, Switzerland 2 Nagravision SA (Kudelski Group), Switzerland 3 Objecti

More information

Template-based Eye and Mouth Detection for 3D Video Conferencing

Template-based Eye and Mouth Detection for 3D Video Conferencing Template-based Eye and Mouth Detection for 3D Video Conferencing Jürgen Rurainsky and Peter Eisert Fraunhofer Institute for Telecommunications - Heinrich-Hertz-Institute, Image Processing Department, Einsteinufer

More information

7.2. Focusing devices: Unit 7.2. context. Lenses and curved mirrors. Lenses. The language of optics

7.2. Focusing devices: Unit 7.2. context. Lenses and curved mirrors. Lenses. The language of optics context 7.2 Unit 7.2 ocusing devices: Lenses and curved mirrors Light rays often need to be controlled and ed to produce s in optical instruments such as microscopes, cameras and binoculars, and to change

More information

6. Friction, Experiment and Theory

6. Friction, Experiment and Theory 6. Friction, Experiment and Theory The lab thi wee invetigate the rictional orce and the phyical interpretation o the coeicient o riction. We will mae ue o the concept o the orce o gravity, the normal

More information

TWO-DIMENSIONAL TRANSFORMATION

TWO-DIMENSIONAL TRANSFORMATION CHAPTER 2 TWO-DIMENSIONAL TRANSFORMATION 2.1 Introduction As stated earlier, Computer Aided Design consists of three components, namely, Design (Geometric Modeling), Analysis (FEA, etc), and Visualization

More information

A Proposal for Estimating the Order Level for Slow Moving Spare Parts Subject to Obsolescence

A Proposal for Estimating the Order Level for Slow Moving Spare Parts Subject to Obsolescence usiness, 2010, 2, 232-237 doi:104236/ib201023029 Published Online September 2010 (http://wwwscirporg/journal/ib) A Proposal or Estimating the Order Level or Slow Moving Spare Parts Subject to Obsolescence

More information

Wii Remote Calibration Using the Sensor Bar

Wii Remote Calibration Using the Sensor Bar Wii Remote Calibration Using the Sensor Bar Alparslan Yildiz Abdullah Akay Yusuf Sinan Akgul GIT Vision Lab - http://vision.gyte.edu.tr Gebze Institute of Technology Kocaeli, Turkey {yildiz, akay, akgul}@bilmuh.gyte.edu.tr

More information

THE CONTROL OF A ROBOT END-EFFECTOR USING PHOTOGRAMMETRY

THE CONTROL OF A ROBOT END-EFFECTOR USING PHOTOGRAMMETRY THE CONTROL OF A ROBOT END-EFFECTOR USING PHOTOGRAMMETRY Dr. T. Clarke & Dr. X. Wang Optical Metrology Centre, City University, Northampton Square, London, EC1V 0HB, UK t.a.clarke@city.ac.uk, x.wang@city.ac.uk

More information

The Complex Step Method Applied to Waterflooding Optimization

The Complex Step Method Applied to Waterflooding Optimization JOSE L. CARBALLAL JR. and BERNARDO HOROWITZ. THE COMPLEX STEP METHOD APPLIED TO WATERFLOODING 45 The Complex Step Method Applied to Waterlooding Optimization José L. Carballal Jr. 1, Bernardo Horowitz

More information

Time-Optimal Online Trajectory Generator for Robotic Manipulators

Time-Optimal Online Trajectory Generator for Robotic Manipulators Time-Optimal Online Trajectory Generator or Robotic Manipulators Francisco Ramos, Mohanarajah Gajamohan, Nico Huebel and Raaello D Andrea Abstract This work presents a trajectory generator or time-optimal

More information

Mirrors and Lenses. Clicker Questions. Question P1.03

Mirrors and Lenses. Clicker Questions. Question P1.03 3 Mirrors and Lenses Clicker Questions Question P.03 Descrition: Reasoning with geometric otics and ray-tracing. Question An object is located on the otical axis and a distance o 8 cm rom a thin converging

More information

Vehicle Tracking System Robust to Changes in Environmental Conditions

Vehicle Tracking System Robust to Changes in Environmental Conditions INORMATION & COMMUNICATIONS Vehicle Tracking System Robust to Changes in Environmental Conditions Yasuo OGIUCHI*, Masakatsu HIGASHIKUBO, Kenji NISHIDA and Takio KURITA Driving Safety Support Systems (DSSS)

More information

SIMPLIFIED CBA CONCEPT AND EXPRESS CHOICE METHOD FOR INTEGRATED NETWORK MANAGEMENT SYSTEM

SIMPLIFIED CBA CONCEPT AND EXPRESS CHOICE METHOD FOR INTEGRATED NETWORK MANAGEMENT SYSTEM SIMPLIFIED CBA CONCEPT AND EXPRESS CHOICE METHOD FOR INTEGRATED NETWORK MANAGEMENT SYSTEM Mohammad Al Rawajbeh 1, Vladimir Sayenko 2 and Mohammad I. Muhairat 3 1 Department o Computer Networks, Al-Zaytoonah

More information

Integrated airline scheduling

Integrated airline scheduling Computers & Operations Research 36 (2009) 176 195 www.elsevier.com/locate/cor Integrated airline scheduling Nikolaos Papadakos a,b, a Department o Computing, Imperial College London, 180 Queen s Gate,

More information

Under 20 of rotation the accuracy of data collected by 2D software is maintained?

Under 20 of rotation the accuracy of data collected by 2D software is maintained? Under 20 of rotation the accuracy of data collected by 2D software is maintained? Introduction This study was undertaken in order to validate the Quintic two-dimensional software. Unlike threedimensional

More information

Geometrical Optics - Grade 11

Geometrical Optics - Grade 11 OpenStax-CNX module: m32832 1 Geometrical Optics - Grade 11 Rory Adams Free High School Science Texts Project Mark Horner Heather Williams This work is produced by OpenStax-CNX and licensed under the Creative

More information

Cardio -Vascular Image Contrast Improvement via Hardware Design

Cardio -Vascular Image Contrast Improvement via Hardware Design Cardio -Vascular Image Contrast Improvement via Hardware Design Aimé Lay-Ekuakille 1, Giuseppe Vendramin 1, Ivonne Sgura, Amerigo Trotta 3 1 Dipartimento d Ingegneria dell Innovazione, University o Salento,

More information

2D Shape Transformation Using 3D Blending

2D Shape Transformation Using 3D Blending D Shape Transormation Using 3D Blending G. Pasko, A. Pasko,, M. Ikeda, and T. Kunii, IT Institute, Kanazawa Institute o Technology, Tokyo, Japan E-mail: {gpasko, ikeda}@iti.kanazawa-it.ac.jp Hosei University,

More information

The Eyegaze Edge : How does it work? What do you need to know?

The Eyegaze Edge : How does it work? What do you need to know? The Eyegaze Edge : How does it work? What do you need to know? Eye tracking overview The Eyegaze Edge is an eye-controlled communication and control system. The Edge uses the pupilcenter/corneal-reflection

More information

HANDS-FREE PC CONTROL CONTROLLING OF MOUSE CURSOR USING EYE MOVEMENT

HANDS-FREE PC CONTROL CONTROLLING OF MOUSE CURSOR USING EYE MOVEMENT International Journal of Scientific and Research Publications, Volume 2, Issue 4, April 2012 1 HANDS-FREE PC CONTROL CONTROLLING OF MOUSE CURSOR USING EYE MOVEMENT Akhil Gupta, Akash Rathi, Dr. Y. Radhika

More information

CALIBRATION OF A ROBUST 2 DOF PATH MONITORING TOOL FOR INDUSTRIAL ROBOTS AND MACHINE TOOLS BASED ON PARALLEL KINEMATICS

CALIBRATION OF A ROBUST 2 DOF PATH MONITORING TOOL FOR INDUSTRIAL ROBOTS AND MACHINE TOOLS BASED ON PARALLEL KINEMATICS CALIBRATION OF A ROBUST 2 DOF PATH MONITORING TOOL FOR INDUSTRIAL ROBOTS AND MACHINE TOOLS BASED ON PARALLEL KINEMATICS E. Batzies 1, M. Kreutzer 1, D. Leucht 2, V. Welker 2, O. Zirn 1 1 Mechatronics Research

More information

A performance analysis of EtherCAT and PROFINET IRT

A performance analysis of EtherCAT and PROFINET IRT A perormance analysis o EtherCAT and PROFINET IRT Conerence paper by Gunnar Prytz ABB AS Corporate Research Center Bergerveien 12 NO-1396 Billingstad, Norway Copyright 2008 IEEE. Reprinted rom the proceedings

More information

SUPERIOR EYE TRACKING TECHNOLOGY. Totally Free Head Motion Unmatched Accuracy State-Of-The-Art Analysis Software. www.eyegaze.com

SUPERIOR EYE TRACKING TECHNOLOGY. Totally Free Head Motion Unmatched Accuracy State-Of-The-Art Analysis Software. www.eyegaze.com SUPERIOR EYE TRACKING TECHNOLOGY Totally Free Head Motion Unmatched Accuracy State-Of-The-Art Analysis Software www.eyegaze.com LC TECHNOLOGIES EYEGAZE EDGE SYSTEMS LC Technologies harnesses the power

More information

Problem Set 5 Work and Kinetic Energy Solutions

Problem Set 5 Work and Kinetic Energy Solutions MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department o Physics Physics 8.1 Fall 1 Problem Set 5 Work and Kinetic Energy Solutions Problem 1: Work Done by Forces a) Two people push in opposite directions on

More information

Discrimination of Gaze Directions Using Low-Level Eye Image Features

Discrimination of Gaze Directions Using Low-Level Eye Image Features Discrimination of Gaze Directions Using Low-Level Eye Image Features Yanxia Zhang Lancaster University yazhang@lancasteracuk Andreas Bulling University of Cambridge & Lancaster University andreasbulling@acmorg

More information

Rotation: Moment of Inertia and Torque

Rotation: Moment of Inertia and Torque Rotation: Moment of Inertia and Torque Every time we push a door open or tighten a bolt using a wrench, we apply a force that results in a rotational motion about a fixed axis. Through experience we learn

More information

Lecture Notes for Chapter 34: Images

Lecture Notes for Chapter 34: Images Lecture Notes for hapter 4: Images Disclaimer: These notes are not meant to replace the textbook. Please report any inaccuracies to the professor.. Spherical Reflecting Surfaces Bad News: This subject

More information

Product Information. QUADRA-CHEK 3000 Evaluation Electronics For Metrological Applications

Product Information. QUADRA-CHEK 3000 Evaluation Electronics For Metrological Applications Product Information QUADRA-CHEK 3000 Evaluation Electronics For Metrological Applications April 2016 QUADRA-CHEK 3000 The evaluation electronics for intuitive 2-D measurement The QUADRA-CHEK 3000 evaluation

More information

Labor Demand. 1. The Derivation of the Labor Demand Curve in the Short Run:

Labor Demand. 1. The Derivation of the Labor Demand Curve in the Short Run: CON 361: Labor conomics 1. The Derivation o the Curve in the Short Run: We ill no complete our discussion o the components o a labor market by considering a irm s choice o labor demand, beore e consider

More information

Thin Lenses Drawing Ray Diagrams

Thin Lenses Drawing Ray Diagrams Drawing Ray Diagrams Fig. 1a Fig. 1b In this activity we explore how light refracts as it passes through a thin lens. Eyeglasses have been in use since the 13 th century. In 1610 Galileo used two lenses

More information

Idiosyncratic repeatability of calibration errors during eye tracker calibration

Idiosyncratic repeatability of calibration errors during eye tracker calibration Idiosyncratic repeatability of calibration errors during eye tracker calibration Katarzyna Har eżlak and Pawel Kasprowski and Mateusz Stasch Institute of Informatics Silesian University of Technology Gliwice,

More information

Space Perception and Binocular Vision

Space Perception and Binocular Vision Space Perception and Binocular Vision Space Perception Monocular Cues to Three-Dimensional Space Binocular Vision and Stereopsis Combining Depth Cues 9/30/2008 1 Introduction to Space Perception Realism:

More information

Intuitive Navigation in an Enormous Virtual Environment

Intuitive Navigation in an Enormous Virtual Environment / International Conference on Artificial Reality and Tele-Existence 98 Intuitive Navigation in an Enormous Virtual Environment Yoshifumi Kitamura Shinji Fukatsu Toshihiro Masaki Fumio Kishino Graduate

More information

Highway Asset Management

Highway Asset Management Highway Asset Management By Chao Yang A Doctoral Thesis Submitted in partial ulilment o the requirements or the award o Doctor o Philosophy o the University o Nottingham Abstract The aim o this thesis

More information

Virtual Fitting by Single-shot Body Shape Estimation

Virtual Fitting by Single-shot Body Shape Estimation Virtual Fitting by Single-shot Body Shape Estimation Masahiro Sekine* 1 Kaoru Sugita 1 Frank Perbet 2 Björn Stenger 2 Masashi Nishiyama 1 1 Corporate Research & Development Center, Toshiba Corporation,

More information

Quality assurance of solar thermal systems with the ISFH- Input/Output-Procedure

Quality assurance of solar thermal systems with the ISFH- Input/Output-Procedure Quality assurance o solar thermal systems with the ISFH- Input/Output-Procedure Peter Paerisch * and Klaus Vanoli Institut uer Solarenergieorschung Hameln (ISFH), Am Ohrberg 1, 31860 Emmerthal, Germany

More information

Understanding astigmatism Spring 2003

Understanding astigmatism Spring 2003 MAS450/854 Understanding astigmatism Spring 2003 March 9th 2003 Introduction Spherical lens with no astigmatism Crossed cylindrical lenses with astigmatism Horizontal focus Vertical focus Plane of sharpest

More information

Experiment 3 Lenses and Images

Experiment 3 Lenses and Images Experiment 3 Lenses and Images Who shall teach thee, unless it be thine own eyes? Euripides (480?-406? BC) OBJECTIVES To examine the nature and location of images formed by es. THEORY Lenses are frequently

More information

How To Fix Out Of Focus And Blur Images With A Dynamic Template Matching Algorithm

How To Fix Out Of Focus And Blur Images With A Dynamic Template Matching Algorithm IJSTE - International Journal of Science Technology & Engineering Volume 1 Issue 10 April 2015 ISSN (online): 2349-784X Image Estimation Algorithm for Out of Focus and Blur Images to Retrieve the Barcode

More information

A LabVIEW Based Experimental Platform for Ultrasonic Range Measurements

A LabVIEW Based Experimental Platform for Ultrasonic Range Measurements DSP Journal, Volume 6, Issue, February, 007 A LabVIEW Based Experimental Platorm or Ultrasonic Range Measurements Abdallah Hammad, Ashra Haez, Mohamed arek Elewa Benha University - Faculty o Engineering

More information

Rainfall generator for the Meuse basin

Rainfall generator for the Meuse basin KNMI publication; 196 - IV Rainall generator or the Meuse basin Description o 20 000-year simulations R. Leander and T.A. Buishand De Bilt, 2008 KNMI publication = KNMI publicatie; 196 - IV De Bilt, 2008

More information

An Energy-Based Vehicle Tracking System using Principal Component Analysis and Unsupervised ART Network

An Energy-Based Vehicle Tracking System using Principal Component Analysis and Unsupervised ART Network Proceedings of the 8th WSEAS Int. Conf. on ARTIFICIAL INTELLIGENCE, KNOWLEDGE ENGINEERING & DATA BASES (AIKED '9) ISSN: 179-519 435 ISBN: 978-96-474-51-2 An Energy-Based Vehicle Tracking System using Principal

More information

PRODUCT SHEET. info@biopac.com support@biopac.com www.biopac.com

PRODUCT SHEET. info@biopac.com support@biopac.com www.biopac.com EYE TRACKING SYSTEMS BIOPAC offers an array of monocular and binocular eye tracking systems that are easily integrated with stimulus presentations, VR environments and other media. Systems Monocular Part

More information