
Higher Degree-of-Freedom Bimanual User Interfaces for 3-D Computer Graphics

Michael Cline
Dept. of Computer Science, University of British Columbia
Vancouver, British Columbia, Canada V6T 1Z4
cline@cs.ubc.ca

ABSTRACT
We compare unimanual and bimanual versions of an interface that use 6 degree-of-freedom elastic rate-control devices for object and camera control in a 3D object docking task. An experiment is presented which compares the performance of these two interfaces as well as the status-quo mouse interface. We find that the performance benefits of the bimanual technique over the unimanual technique are larger for a more complex task which requires more epistemic actions. We also find that for a trained user, the bimanual interface can usually outperform both the unimanual 6 DOF interface and the unimanual mouse interface.

Keywords
Bimanual input, 3D interfaces, 6 DOF input devices, camera control, interaction techniques, virtual environments

1 INTRODUCTION
Although virtual reality (VR) has nearly two decades of research behind it, human interface technologies for interacting with 3-dimensional virtual worlds are still in the experimental stage. The mouse has clearly established itself as the user interface of choice for the majority of applications that require interaction in a 2D graphical environment. However, it is not yet clear which interface is best for tasks in a 3D VR environment, which typically require more degrees of freedom (DOF) than a mouse can provide.

People often use both hands cooperatively to perform tasks that would be cumbersome with just one hand. Research has shown that it is possible to take advantage of our natural two-handedness by designing bimanual (two-handed) user interfaces that offer improved performance and ease of use over one-handed devices. Navigating through virtual environments and manipulating 3-D virtual objects are tasks that might benefit from a bimanual user interface. Several studies have shown that in a compound task, a bimanual interface can be superior to a unimanual interface.

Balakrishnan and Kurtenbach [2] note that little formal evaluation of bimanual 3D interfaces has been carried out. Instead, most of the work in bimanual input up to this point has focused on tasks in 2D, using mice or tablets as input devices. Balakrishnan and Kurtenbach explored bimanual camera control and object manipulation for 3D graphics, comparing a two-mouse interface to a one-mouse interface. Although participants in their experiment showed a strong preference for the two-handed interaction technique, their study showed little hard evidence of temporal benefits of one method over the other.

This paper builds on the work of Balakrishnan and Kurtenbach by exploring bimanual camera control and object manipulation using higher degree-of-freedom input devices. We believe that the two-mouse interface used in their experiment may have been limited in its performance because of the awkward mapping that they were forced to use when applying a low DOF input device to a high DOF task. We therefore believe that by using higher DOF input devices, we are more likely to realize the benefits of a bimanual interface in a 3D task. We present the results of an experiment in which a bimanual interface using two 6 DOF controls is compared with a unimanual interface with one 6 DOF control.

2 BACKGROUND
Much of the work on bimanual user interfaces [6, 5, 2, 8, 9, 1, 7] is driven by Guiard's [4] theoretical work on the Kinematic Chain (KC) model of skilled bimanual action.
The component of Guiard's theory that is most relevant to bimanual user interfaces is his principle of right-to-left spatial reference, which says that (for right-handers) the right hand follows a frame of reference defined by the left hand. Balakrishnan and Hinckley [1] have shown experimental evidence that the reference principle applies even in situations where the spatial separation between the hands does not directly correspond to the position of the hands as sensed by the input device. In other words, even if the hands work in two separate kinesthetic frames, each with its own independent origin, the principle of right-to-left spatial reference still applies.

Hinckley, Pausch, and Proffitt [7] experimented with a bimanual user interface on an object-alignment task in a 3D environment. They concluded that the use of both hands forms a hand-relative-to-hand frame of reference that can help the user gain a better sense of the space they are working in.

Balakrishnan and Kurtenbach [2] conducted two experiments comparing a one-mouse and a two-mouse user interface for target selection and docking tasks in a 3D environment. In the object selection experiment, the bimanual interface was about 20 percent faster than the unimanual interface. The results from their object docking experiment showed no significant difference in average task completion time between the bimanual and unimanual interfaces. In their conclusions, they suggest that the performance of the bimanual interface may have suffered because the interaction style deviated from Guiard's KC model: the interface encouraged a parallel, symmetric style of interaction.

Studies [2, 9] have pointed out that two-handed interfaces are useful not only for pragmatic actions (actions directed towards a task's goal), but also for epistemic actions (actions which facilitate cognition). For example, in the 3D docking task experiment mentioned in [2], it is noted that when using the two-handed interaction technique, users tended to move the camera around more in order to enhance their perception of the 3-D scene.

3 CONTEXT AND GOALS OF THIS PROJECT
Little research has been done that uses 6 DOF input devices in a bimanual user interface. Instead, much of the research on bimanual interfaces has concentrated on tasks with low degrees of freedom, such as a 2D connect-the-dots task [1, 8] or a task where the user positions and scales a 2D object [3]. Some notable exceptions are [5, 10].

In Balakrishnan and Kurtenbach's [2] comparison of the one-mouse and two-mouse interfaces, we believe that the performance of their bimanual interface was limited by the awkward mapping they were forced to use because they applied a low DOF input device to a high DOF task. Combined camera and single-object manipulation can use up to 12 DOF (not including camera zoom). Since the docking task in their experiment used a sphere, rotation was irrelevant, reducing the object manipulation to 3 DOF. The camera movement was restricted to inward-pointing views on a sphere, requiring only 2 DOF. Thus the combined task was still a 5 DOF task, one more degree of freedom than their interface provided. To deal with this, they restricted object translation to a plane parallel to the X-Y plane of the camera. This restriction is what hurt the performance of the interface, because it required the hands to work together in a complex way, breaking the separation between the tasks of the two hands. For example, in order to move the object along the Z dimension, the user would have to first rotate the camera by 90 degrees so that the old z-axis became the new x- or y-axis. The interface did not comply with Guiard's KC model very well.
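To make the degree-of-freedom accounting above explicit (the 4 DOF figure for the two-mouse interface is our arithmetic rather than a number stated in [2]):

\[
\underbrace{3}_{\text{object (sphere: translation only)}} \;+\; \underbrace{2}_{\text{camera (inward views on a sphere)}} \;=\; 5\ \text{DOF},
\qquad
\underbrace{2 + 2}_{\text{two mice}} \;=\; 4\ \text{DOF}.
\]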
We hypothesize that if the above experiment were duplicated using 6 DOF controls instead of mice, the differences between the performance of the unimanual and bimanual methods would be more apparent. The questions this study attempts to answer, along with the experimental hypotheses, are:

1. Can a bimanual interface that uses two 6 DOF controllers provide advantages over a unimanual 6 DOF interface? HYPOTHESIS: The bimanual interface will always perform better than the unimanual interface.

2. How do the bimanual and unimanual 6 DOF interfaces compare with a unimanual mouse interface? HYPOTHESIS: Even the novice users of our system were "expert mouse users", so we suspect that for novice users the mouse may be faster than the 6 DOF interfaces. However, for an expert user, we suspect that the 6 DOF interfaces will perform better than the mouse interface.

3. How does the complexity of the task affect the relative performance of the unimanual and bimanual approaches? HYPOTHESIS: A more complex task will require more epistemic actions. Since the bimanual interface allows the user to make epistemic actions more easily, we suspect that the relative advantages of bimanual over unimanual will be greater in more complex tasks.

4 EXPERIMENTAL PROCEDURE
The experimental task was a docking task (see Fig. 1), similar to that of [2]. The user's goal is to move a docking object inside a docking station. After a trial is completed, the docking station and object are randomly repositioned.
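To make the trial protocol concrete, here is a minimal sketch of a single cube-phase trial. This is not the authors' code: the names (is_docked, run_trial, interface.update), the axis-aligned docking station, and the scene representation are our assumptions, and the completion test anticipates the criterion given later in this section (all eight corners of the docking object inside a docking station about 20 percent larger than the object).

```python
# Hypothetical sketch of one cube-phase docking trial; not the authors' implementation.
# Poses are (3-vector position, 3x3 rotation matrix) pairs updated each frame by the
# interface under test; the docking station is assumed axis-aligned.
import time
import numpy as np

# the 8 corners of a unit cube (half-width 1), one per row
CORNERS = np.array([[sx, sy, sz] for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)],
                   dtype=float)

def is_docked(obj_pos, obj_rot, obj_half, station_pos, station_rot, station_half):
    """True when all 8 corners of the docking object lie inside the docking station."""
    corners_world = obj_pos + (obj_rot @ (obj_half * CORNERS).T).T        # corners in world space
    corners_local = (station_rot.T @ (corners_world - station_pos).T).T   # corners in the station frame
    return bool(np.all(np.abs(corners_local) <= station_half))

def run_trial(interface, obj_half=1.0, station_half=1.2, workspace=5.0):
    """One trial: reposition station and object at random, then time until the object is docked."""
    rng = np.random.default_rng()
    station_pos = rng.uniform(-workspace, workspace, size=3)
    station_rot = np.eye(3)                        # assumption: axis-aligned docking station
    obj_pos = rng.uniform(-workspace, workspace, size=3)
    obj_rot = np.eye(3)
    start = time.time()
    while not is_docked(obj_pos, obj_rot, obj_half, station_pos, station_rot, station_half):
        obj_pos, obj_rot = interface.update(obj_pos, obj_rot)   # user moves the object each frame
    return time.time() - start                     # trial completion time in seconds
```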

Figure 1: Object docking task

The 6 DOF interface used to conduct the experiments was an elastic rate-control device (see Fig. 2), similar to Zhai's EGG interface [11]. The user controls this device by moving (translating or rotating) a tennis ball suspended on elastic bands. The position and orientation of the ball are sensed using a Polhemus Fastrak. This kind of interface has a number of qualities that are desirable for the docking task. One reason why rate control is beneficial is that it can eliminate the need for clutching. Also, the integral transformation in rate control acts as a low-pass filter that removes high-frequency noise (jittery hands), producing a smoother trajectory [12].

In this experiment, three interfaces were compared against each other:
- a two-handed interface with two 6 DOF controls, where the left hand controls the camera view and the right hand controls the position of the docking object;
- a one-handed 6 DOF interface, where object control mode and camera control mode were alternated by pressing a mode switch held in the left hand;
- a one-handed mouse interface, where the user drags on the docking object to move it, or drags on any other point to change the azimuth and elevation of the camera view (this is the same as the unimanual interface used in [2]).

In order to determine how the complexity of the task affects the relative performance of the interfaces, the docking task consisted of two phases. In the first phase, the task was to dock a sphere inside a cube, so rotation of the object was irrelevant. The second phase required the user to place a smaller cube within a larger cube, requiring the user to properly orient the docking object. The task was finished when all 8 corners of the docking object were within the docking station. In both phases, the docking station was slightly larger (about 20 percent) than the docking object, so the user had a certain margin of positional and rotational error.

Six novice users participated in the experiment. Each of the six participants performed 10 sphere trials and 10 cube trials on all three of the interfaces. To cancel out any learning effect, each participant used the interfaces in a different order. To gather data comparing the interfaces for an experienced user, the author performed 100 sphere trials and 100 cube trials for each of the three interfaces. The author had roughly 5 hours of cumulative experience with each of the interfaces.

Figure 2: The elastic rate control device used in the experiment

The EGG interface's self-centering nature facilitates rate control, because the user can simply let go of the control to cause the controlled object to stop moving. The EGG especially lends itself to camera control, because being able to quickly stabilize the camera is desirable. The elastic rate control is also a good choice for object control in the docking task because the user needs to be able to reliably keep the docking object in one place while they move the camera around to examine the scene. With an isomorphic (one-to-one) control-display mapping, the user would be required to hold their hand steadily in one spot.

We attempted to select the mappings for the 6 DOF rate-control devices so that they would allow the user to complete the task in a minimum amount of time. For the translational motions of the docking object, we used a non-linear mapping:

$v_x = (k c_x)^2, \quad v_y = (k c_y)^2, \quad v_z = (k c_z)^2$

where $v_x$, $v_y$, and $v_z$ are the velocities of the docking object in the x, y, and z directions, $c_x$, $c_y$, and $c_z$ are the coordinates of the control relative to the elastic centre of the device, and $k$ is a constant. This mapping allowed the docking object to have a wide range of velocities, yet still allowed fine control of the object. Rotational motion of the control was mapped linearly onto rotational velocity of the docking object.

The camera control was simplified by the choice of rotational mapping. Rather than having the camera rotate in place, camera rotation was performed by rotating the camera about the centre of the virtual world. Thus, if no translational motions were applied to the camera, it would always point towards the centre of the world, so that the docking object and docking station were within view. This is similar to the way camera control was done with the mouse in [2], where the camera was restricted to inward-looking views from a sphere of possible camera positions.
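The mappings above can be sketched as follows. This is not the authors' implementation: the gain constants are placeholders, the camera's translational mapping is assumed to be linear (the paper does not specify it), and since the published formula $v_i = (k c_i)^2$, read literally, discards the sign of the displacement, the sketch assumes direction is preserved via $v_i = \mathrm{sign}(c_i)(k c_i)^2$.

```python
# Hypothetical sketch of the 6 DOF rate-control mappings described above; not the authors' code.
# `c` is the control's translational displacement from the elastic centre;
# `r` is its rotational displacement, expressed as a rotation vector (axis * angle, radians).
import numpy as np

def rodrigues(rotvec):
    """Rotation matrix for a rotation vector (Rodrigues' formula)."""
    angle = np.linalg.norm(rotvec)
    if angle < 1e-12:
        return np.eye(3)
    x, y, z = rotvec / angle
    K = np.array([[0, -z, y], [z, 0, -x], [-y, x, 0]])   # cross-product matrix of the unit axis
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def object_rates(c, r, k=4.0, k_rot=1.5):
    """Right hand: non-linear translational rate and linear rotational rate for the docking object."""
    v = np.sign(c) * (k * c) ** 2      # v_i = sign(c_i) * (k c_i)^2  (sign preservation is assumed)
    omega = k_rot * r                  # rotation mapped linearly onto angular velocity
    return v, omega

def step_object(obj_pos, obj_rot, c, r, dt, k=4.0, k_rot=1.5):
    """Integrate the object's rate commands over one frame of length dt."""
    v, omega = object_rates(c, r, k, k_rot)
    return obj_pos + v * dt, rodrigues(omega * dt) @ obj_rot

def step_camera(cam_pos, c, r, dt, k_cam=1.5):
    """Left hand: rotate the camera about the world centre; translations are assumed linear."""
    cam_pos = rodrigues(k_cam * r * dt) @ cam_pos   # orbit the camera position about the origin
    cam_pos = cam_pos + c * dt                      # assumed linear translational mapping
    look_at = np.zeros(3)                           # camera aimed at the centre of the world
    return cam_pos, look_at
```

The squaring in `object_rates` is what gives the behaviour described in the text: small deflections from the elastic centre produce very low velocities for fine positioning, while large deflections produce fast gross motion.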

5 EXPERIMENTAL RESULTS
Figure 3 shows the average trial completion times for novice users for both the sphere and the cube task. For the sphere task, the bimanual interface performed much worse than either the unimanual or mouse interface. However, for the cube task, the bimanual interface was 32 percent faster than the unimanual interface. This evidence supports hypotheses 2 and 3, but refutes hypothesis 1.

Figure 3: Average trial completion times for novice users (unimanual, bimanual, and mouse interfaces; sphere and cube tasks; times in seconds)

Figure 4 shows average trial completion times for the expert user. This data supports all three hypotheses: in both tasks, the bimanual interface performed better than the unimanual interface, and the 6 DOF interfaces performed better than the mouse in most cases. Task complexity seemed to have a large effect on the relative performance of the unimanual and bimanual interfaces: for the sphere task, the bimanual interface was 28 percent faster than the unimanual, whereas in the more complex cube task, the bimanual interface was 39 percent faster than the unimanual.

Figure 4: Average trial completion times for the expert user (unimanual, bimanual, and mouse interfaces; sphere and cube tasks; times in seconds)

Data was also collected that measured the translational and rotational inefficiency of the user's motions. Inefficiency is the ratio of the user's actual path length to the shortest possible path length [12]. For translational motions, the shortest path is a straight line from the starting position to the goal. For rotations, the shortest path is a rotation about a single axis from the start orientation to the goal orientation. The inefficiency data we collected was inconclusive: it was erratic, and did not consistently support or refute any of our hypotheses.
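As a sketch of how the inefficiency measures defined above could be computed from a recorded trajectory; the representation (sampled 3-vector positions and 3x3 orientation matrices) is an assumption on our part, and this is not the authors' analysis code.

```python
# Hypothetical sketch of translational and rotational inefficiency as defined above;
# `positions` is a list of 3-vectors and `rotations` a list of 3x3 rotation matrices
# sampled along the user's trajectory (start first, goal last).
import numpy as np

def rotation_angle(Ra, Rb):
    """Angle of the single-axis rotation taking orientation Ra to Rb."""
    cos_theta = (np.trace(Ra.T @ Rb) - 1.0) / 2.0
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def translational_inefficiency(positions):
    """Actual path length divided by the straight-line distance from start to goal."""
    path = sum(np.linalg.norm(b - a) for a, b in zip(positions[:-1], positions[1:]))
    shortest = np.linalg.norm(positions[-1] - positions[0])
    return path / shortest

def rotational_inefficiency(rotations):
    """Accumulated rotation along the path divided by the single-axis rotation from start to goal."""
    path = sum(rotation_angle(Ra, Rb) for Ra, Rb in zip(rotations[:-1], rotations[1:]))
    shortest = rotation_angle(rotations[0], rotations[-1])
    return path / shortest
```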
6 CONCLUSIONS
Our experiments have shown that using a bimanual 6 DOF interface can be beneficial compared with using a unimanual 6 DOF interface. Our data suggests that the bimanual interface was most beneficial when the task was complex and required many epistemic actions. For novice users, the unimanual mouse interface performed better than either of the 6 DOF interfaces. For the expert user, the mouse performed comparably to the 6 DOF interfaces on the sphere task (slightly better than unimanual, slightly worse than bimanual). However, on the cube task the 6 DOF interfaces performed much better than the mouse interface.

Subjectively, the expert user found the 6 DOF interfaces easier and more pleasant to use than the mouse, especially on the more complex tasks. However, novice users found the 6 DOF interfaces difficult to adjust to. In light of this, it seems that the bimanual 6 DOF interfaces can be beneficial, but their benefits are only fully realized for expert users on complex, compound 3D tasks. Examples of where this technology may be applicable are computer-aided design and tele-operation of robots with many degrees of freedom.

7 ACKNOWLEDGEMENTS
We would like to thank all the volunteers who participated in the experiment. We would also like to thank Dr. Sidney Fels for his helpful comments, and for supplying the Polhemus Fastrak and writing the drivers that made its use possible.

REFERENCES
[1] R. Balakrishnan and K. Hinckley. The role of kinesthetic reference frames in two-handed input performance. In Proceedings of UIST '99, pages 171-178, 1999.
[2] R. Balakrishnan and G. Kurtenbach. Exploring bimanual camera control and object manipulation in 3D graphics interfaces. In Proceedings of ACM CHI 99 Conference on Human Factors in Computing Systems, pages 56-63, 1999.
[3] W. Buxton. A study in two-handed input. In Proceedings of CHI '86, pages 321-326, 1986.
[4] Y. Guiard. Asymmetric division of labour in human skilled bimanual action. Journal of Motor Behaviour, 19:486-517, 1987.
[5] K. Hinckley, R. Pausch, D. Proffitt, and N. F. Kassell. Two-handed virtual manipulation. ACM Transactions on Computer-Human Interaction, 5(3):260-302, 1998.
[6] K. Hinckley, R. Pausch, D. Proffitt, J. Patten, and N. Kassell. Cooperative bimanual action. In Proceedings of ACM CHI 97 Conference on Human Factors in Computing Systems, pages 27-34, 1997.
[7] K. Hinckley, R. Pausch, and D. Proffitt. Attention and visual feedback: The bimanual frame of reference. In Proceedings of the 1997 Symposium on Interactive 3D Graphics, pages 121-126, 1997.
[8] P. Kabbash, W. Buxton, and A. Sellen. Two-handed input in a compound task. In B. Adelson, S. Dumais, and J. Olson, editors, Proceedings of the Conference on Human Factors in Computing Systems, pages 417-423. ACM Press, New York, NY, 1994.
[9] A. Leganchuk, S. Zhai, and W. Buxton. Manual and cognitive benefits of two-handed input: An experimental study. ACM Transactions on Computer-Human Interaction, 5(4):326-359, 1998.
[10] R. C. Zeleznik, A. S. Forsberg, and P. S. Strauss. Two pointer input for 3D interaction. In Proceedings of the Symposium on Interactive 3D Graphics, pages 115-120. ACM Press, New York, 1997.
[11] S. Zhai. Investigation of feel for 6DOF inputs: Isometric and elastic rate control for manipulation in 3D environments. In Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting, pages 323-327, 1993.
[12] S. Zhai and P. Milgram. Quantifying coordination in multiple DOF movement and its application to evaluating 6 DOF input devices. In Proceedings of ACM CHI 98 Conference on Human Factors in Computing Systems, pages 320-327, 1998.
