
AR PROJECT: A GENERIC SYSTEM FOR AUGMENTED REALITY APPLICATIONS

A THESIS SUBMITTED TO THE GRADUATE SCHOOL OF NATURAL AND APPLIED SCIENCES OF ATILIM UNIVERSITY

BY

HAKAN BOZLU

IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE IN THE DEPARTMENT OF COMPUTER ENGINEERING

SEPTEMBER 2005

Approval of the Graduate School of Natural and Applied Sciences

Prof. Dr. İbrahim Akman
Director

I certify that this thesis satisfies all the requirements as a thesis for the degree of Master of Science.

Prof. Dr. İbrahim Akman
Head of Department

This is to certify that we have read this thesis and that in our opinion it is fully adequate, in scope and quality, as a thesis for the degree of Master of Science.

Instructor Bülent Gürsel Emiroğlu
Co-Supervisor

Asst. Prof. Dr. Nergiz Ercil Çağıltay
Supervisor

Examining Committee Members

Asst. Prof. Dr. Çiğdem Turhan
Asst. Prof. Dr. Nergiz Ercil Çağıltay
Asst. Prof. Dr. Nevzat Sezer
Asst. Prof. Dr. Murat Erten
Instructor Bülent Gürsel Emiroğlu

ABSTRACT

AR PROJECT: A GENERIC SYSTEM FOR AUGMENTED REALITY APPLICATIONS

Bozlu, Hakan
M.S., Computer Engineering Department
Supervisor: Asst. Prof. Dr. Nergiz Ercil Çağıltay
Co-Supervisor: Instructor Bülent Gürsel Emiroğlu

September 2005, 126 pages

The aim of this thesis is to implement a generic system for augmented reality applications that enables the usage of augmented reality in various areas. The developed system expands the capabilities of ARToolKit, an open source, non-commercial library for developing Augmented Reality applications. Our system not only accepts 3D models as animation input but also supplies a graphical user interface that allows even non-technical users from any discipline to benefit from augmented reality. This interface accepts models and various parameters from the user, by means of which the user can tailor the augmented environment to his/her aim of usage. With these inputs, 3D objects are superimposed on the real world to build animation schemas based on real patterns and virtual objects.

Keywords: ARToolKit, Augmented Reality

ÖZ

AR PROJECT: EKLENMİŞ GERÇEKLİK UYGULAMALARI İÇİN GENEL BİR SİSTEM

Bozlu, Hakan
Yüksek Lisans, Bilgisayar Mühendisliği Bölümü
Tez Yöneticisi: Yrd. Doç. Dr. Nergiz Ercil Çağıltay
Ortak Tez Yöneticisi: Öğr. Gör. Bülent Gürsel Emiroğlu

Eylül 2005, 126 sayfa

Bu çalışmanın amacı, eklenmiş gerçekliğin değişik alanlarda kullanılmasını sağlamak üzere genel bir sistem oluşturmaktır. Geliştirilen sistem, ticari olmayan ve açık kaynak kodlu ARToolKit kütüphanesinin yeteneklerini genişletir. Sistemimiz sadece 3B modelleri animasyon girdileri olarak kabul etmekle kalmaz, aynı zamanda herhangi bir disiplinden teknik bilgisi olmayan kullanıcıların bile eklenmiş gerçeklikten faydalanabileceği bir grafik arayüzü sağlar. Bu arayüz, modelleri ve çeşitli parametreleri, kullanıcının eklenmiş gerçeklik ortamına kullanım amacına göre biçim verebileceği şekilde kullanıcıdan kabul eder. Bu girdilerle 3B nesneler, gerçek paternlere ve sanal nesnelere dayalı animasyon şemaları oluşturmak üzere gerçek dünyaya yerleştirilir.

Anahtar Kelimeler: ARToolKit, Eklenmiş gerçeklik

To my parents and friends for their precious support

ACKNOWLEDGMENTS

I express sincere appreciation to my supervisor, Asst. Prof. Dr. Nergiz Ercil Çağıltay, for her guidance and insight throughout the research. Thanks also go to my co-supervisor, Instructor Bülent Gürsel Emiroğlu.

TABLE OF CONTENTS

ABSTRACT
ÖZ
TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
CHAPTER 1 INTRODUCTION
CHAPTER 2 OVERVIEW OF AR
2.1 Characteristics of AR
2.1.1 Whatever is done must be in real time
2.1.2 Computer generated information must be aligned with the real world
2.1.3 An appropriate level of interaction is required
2.1.4 An appropriate level of realism is required
2.1.5 AR is not limited to adding computer generated graphics
2.2 An AR Example
2.3 History of AR
2.4 Difference between Augmented Reality and Virtual Reality (Advantages over Virtual Reality)
2.4.1 A Detailed Example: VR versus AR by means of Education
2.5 AR Applications
2.5.1 Medical
2.5.2 Manufacturing and Repair
2.5.3 Annotation and Visualization
2.5.4 Robot Path Planning
2.5.5 Entertainment
2.5.6 Military Purposes
2.5.7 Games
2.5.8 Other Possible Uses
2.6 Specific Hardware Used in AR
2.6.1 Head Mounted Displays
2.7 Future Roadmap of AR
2.7.1 A Prediction: Possible Usages of AR in Security Forces
2.8 Tools for AR Applications
2.9 Limitations of AR
2.9.1 Computational Problems
2.9.2 Problems with Mobility
2.9.3 Tracking Problems
2.9.4 Problems with Registering
2.9.5 User Interface (UI) Problems
2.9.6 Resistance of People
2.9.7 Other Problems
2.10 Summary
CHAPTER 3 THE SCOPE OF THE STUDY
3.1 Defining Requirements
3.2 Comparison with ARToolKit
3.3 Summary
CHAPTER 4 ENVIRONMENT
4.1 About ARToolKit
4.1.1 Working Method of ARToolKit
4.2 Preparing the Development Environment
4.3 Other Programs/Tools Used To Support This Project
4.4 Test Systems
4.4.1 Test System 1
4.4.2 Test System 2
4.5 Patterns
4.5.1 Creating a Pattern
4.5.2 Pattern Tests
4.6 Models
CHAPTER 5 DESIGN AND TECHNICAL DETAILS OF THE DEVELOPED SYSTEM
5.1 System Design
5.1.1 System-wide Design Decisions
5.1.2 System Architectural Design
5.1.3 Interface Design
5.2 System Development
5.3 System Testing
5.3.1 Stage 1 Testing
5.3.2 Stage 2 Testing
5.4 What Makes AR Project Different
CHAPTER 6 A CASE STUDY WITH THE DEVELOPED ENVIRONMENT: SOLAR SYSTEM
6.1 Running the AR Project Program
6.2 Loading Patterns
6.3 Loading Model Files
6.4 Defining Animations
6.5 Assigning Animations to Patterns
6.6 Results
6.7 Advanced Options Tab
CHAPTER 7 DISCUSSION AND CONCLUSION
7.1 Future Work
REFERENCES
APPENDICES
Appendix A ARToolKit Sample Code
Appendix B Class Diagrams of the Developed System
Appendix C Sample Object_Data File
Appendix D Sample Model Data File
Appendix E Sample Animation Info File
Appendix F Sample Scenario Data File
Appendix G Some of the Program Source Code
Appendix H Interview Questions for the Testers
Appendix I A Sample Interview Table
Appendix J Rating Scale for Testing Minimum Pattern Size (Scaled)

LIST OF TABLES

TABLE 1: COMPARISON OF AR PROJECT WITH ARTOOLKIT
TABLE 2: STEPS OF ARTOOLKIT [1]
TABLE 3: TEST SYSTEM 1
TABLE 4: TEST SYSTEM 2
TABLE 5: A SAMPLE FUNCTIONAL TESTING TABLE
TABLE 6: A SAMPLE GUI TESTING TABLE
TABLE 7: PROFILE OF THE TESTERS
TABLE 8: THE LIST OF ADDED ANIMATIONS

LIST OF FIGURES

FIGURE 1: MILGRAM'S REALITY-VIRTUALITY CONTINUUM [40]
FIGURE 2: AN AR APPLICATION EXAMPLE [37]
FIGURE 3: REAL WORLD AND RENDERED WORLD [25]
FIGURE 4: VIRTUAL FETUS INSIDE WOMB OF PREGNANT PATIENT [2]
FIGURE 5: PROTOTYPE LASER PRINTER MAINTENANCE APPLICATION [3]
FIGURE 6: MICROVISION'S NOMAD DISPLAY SYSTEM FOR TECHNICIANS [39]
FIGURE 7: A LABEL SUPERIMPOSED ON A STUDENT
FIGURE 8: A BASIC HEAD MOUNTED DISPLAY SYSTEM [37]
FIGURE 9: SCHEMATIC OF OPTICAL BASED AR [22]
FIGURE 10: CONCEPTS OF OPTICAL SEE-THROUGH AND VIDEO SEE-THROUGH DISPLAYS [3]
FIGURE 11: SONY'S GLASSTRON AUDIO/VIDEO HEADSET [36]
FIGURE 12: MICROVISION'S NOMAD DISPLAY SYSTEM FOR MILITARY APPLICATIONS [39]
FIGURE 13: A USUAL MOBILE AR SYSTEM [34]
FIGURE 14: HIRO PATTERN USED IN ARTOOLKIT EXAMPLES [1]
FIGURE 15: THE FIRST PATTERN CREATED DURING THIS PROJECT
FIGURE 16: MECHANISM DESIGNED FOR ARTOOLKIT TO RECOGNIZE PATTERNS
FIGURE 17: THE SOURCE USED FOR THE .3DS MODEL FILE
FIGURE 18: 3D OF OUR MODEL FILE
FIGURE 19: 3D MODEL ALIGNED WITH REAL WORLD
FIGURE 20: MAIN PAGE OF THE AR PROJECT
FIGURE 21: LOAD PATTERN
FIGURE 22: LOAD MODEL
FIGURE 23: DEFINE ANIMATION
FIGURE 24: ASSIGN ANIMATIONS TO PATTERNS
FIGURE 25: ADVANCED OPTIONS
FIGURE 26: VIDEO INTERFACE IDENTIFICATION AND DIAGRAM
FIGURE 27: SOLAR SYSTEM PATTERNS. A) CRESCENT, B) EARTH, C) STAR
FIGURE 28: ADDING A NEW PATTERN (PART OF THE SCREEN)
FIGURE 29: THE TEXTURE USED IN EARTH.3DS
FIGURE 30: THE EARTH.3DS MODEL FILE
FIGURE 31: THE PROPERTIES OF 3DS MODEL FILES SEEN IN LOAD MODEL SECTION (PART OF THE SCREEN)
FIGURE 32: ASSIGNING ANIMATIONS TO PATTERNS (PART OF THE SCREEN)
FIGURE 33: VIDEO INTERFACE IDENTIFICATION DIAGRAM WINDOW
FIGURE 34: THE SUN ROTATING AROUND ITSELF
FIGURE 35: EARTH AND THE MOON
FIGURE 36: THE EARTH AND THE SUN
FIGURE 37: THE CALIBRATIONS PART IN THE ADVANCED OPTIONS SECTION (PART OF THE SCREEN)
FIGURE 38: THE MANUAL FILE EDIT OPERATIONS PART IN THE ADVANCED OPTIONS SECTION (PART OF THE SCREEN)
FIGURE 39: SAMPLE OBJECT_DATA FILE
FIGURE 40: SAMPLE MODEL_DATA FILE
FIGURE 41: SAMPLE ANIM_DATA FILE (PART OF THE FILE)
FIGURE 42: COMPARISON OF ARTOOLKIT AND ARTAG PATTERNS [1], [33]

LIST OF ABBREVIATIONS

3D: Three Dimensional
AR: Augmented Reality
AV: Augmented Virtuality
GPS: Global Positioning System
GUI: Graphical User Interface
HMD: Head Mounted Display
HMS: Helmet-Mounted Sights
HUD: Head Up Display
IT: Information Technologies
MR: Mixed Reality
PC: Personal Computer
SDLC: Software Development Life Cycle
SG: Stereo Graphics
SV: Stereo Video
UI: User Interface
VE: Virtual Environment(s)
VR: Virtual Reality

Chapter 1
Introduction

The rapid pace of technological progress impresses even the most visionary among us. Computers and computer-related products are now everywhere; it is impossible to imagine a world without them. Terms with barely a few decades of history, like the Internet and Virtual Reality (VR), are now part of our lives. However, our hunger for progress is not satisfied; we are still looking for better ways to benefit from new technologies.

Augmented Reality (AR) is a promising area for inserting computer assets into our lives. The basic idea of AR is to superimpose sense enhancements over a real-world environment. It is a powerful approach, as it combines the advantages of both virtuality and reality. Until the end of the 1990s, AR remained in the demo phase because of inadequate technological resources. In recent years, governments of developed countries and the private sector have been recognizing the value of AR, as progress in hardware has provided wider facilities for constructing AR systems. AR is mostly used in (but not limited to) adding computer generated visual enhancements to real-life applications. It is clear that AR will enter our lives further and, in the near future, will become indispensable in our daily life.

The aim of this study is to emphasize the importance of AR and to develop a tool; our study not only extends the capabilities of the current state of the art, but also presents a user-friendly environment for non-technical persons, to increase the usage of AR.

This thesis is organized as follows: the second chapter gives the description and characteristics of AR, as well as its application areas, limitations and possible future roadmap. Chapter three defines the requirements and scope of our study and gives a brief explanation of our work. Chapter four gives information about the development environment, namely the tools and the technologies used. Chapter five is the core of this study; the design and testing processes of the tool are explained in that chapter. Chapter six explains the usage of the tool through a case study; the full capabilities and outcomes of the tool are explained in detail there. Chapter seven discusses the study, provides a conclusion and, finally, mentions possible future work.

Chapter 2
Overview of AR

AR is a variation of Virtual Environments (VE), or Virtual Reality as it is more commonly called [2]. Instead of having increasingly smaller and increasingly more difficult-to-use displays, information and interfaces should be superimposed directly onto the vision of users [23]. Figure 1 shows the Reality-Virtuality Continuum. The whole spectrum between pure reality and pure computer generated graphics is called Mixed Reality.

Figure 1: Milgram's Reality-Virtuality Continuum [40]

Although this classification can be subcategorized as AR and Augmented Virtuality [40], [5], [41], as seen in Figure 1, some authorities prefer a unified classification under the name AR [2], [7].

2.1 Characteristics of AR

Although it is helpful at a glance, Milgram's Reality-Virtuality Continuum is not sufficient to fully describe what AR is and what it is not [40]. Without defining the prerequisites, even science fiction movies using 2-D overlays, as in Jurassic Park, would seem like AR examples. The following sections describe some important characteristics of AR.

2.1.1 Whatever is done must be in real time

This is the most significant characteristic of AR. All computation and superimposing work must be done in real time. Even if all of the other constraints are met, a pre-rendered scenario cannot be considered an AR example [6].

2.1.2 Computer generated information must be aligned with the real world

This means that merely overlaying a layer of computer generated graphics onto a view of the real world, without aligning the two, does not qualify as AR [19]. A TV broadcast of a football match may be in real time and may also include a computer generated TV channel logo, but that does not make it AR.

2.1.3 An appropriate level of interaction is required

The real world and the computer generated graphics must be connected; somehow, a change in the real world must make at least a little difference in the virtual part, or vice versa [41].

2.1.4 An appropriate level of realism is required

The ultimate goal of AR is that the user cannot differentiate virtuality from reality. Although this goal is nearly impossible to achieve for now, an appropriate level of realism is required for the user.

2.1.5 AR is not limited to adding computer generated graphics

AR is not limited to the sense of sight. AR can potentially apply to all senses, including hearing, touch and smell [3], [13]. Moreover, it is not limited to adding: certain AR applications also require removing real objects from the perceived environment.

2.2 An AR Example

Figure 2 can be given as an example of AR. It illustrates a car's front window with information displayed on it. While the driver travels through the city, related information is reflected onto the window.

Figure 2: An AR Application Example [37]

The data changes in real time as the coordinates of the vehicle, the time, or the information filter mechanism change. Address information is aligned with the apartment buildings, traffic labels, and so on. All reflected data looks like street labels or car displays, so that the user can absorb the information and feel familiar with it. The information does not block the user's sight, so as to prevent accidents.

2.3 History of AR

The idea of incorporating the computer transparently into our daily lives is not new. More than a decade ago, the computer scientist Mark Weiser termed this "ubiquitous computing" [35], where there are small computers all around us, but not directly visible to or interesting for the user [11]. Wendy E. Mackay mentions a top-secret talk with N. Sheridon in 1991 about a new technology called electronic paper, a thin layer with tiny, pixel-sized oil drops that can be rearranged again and again to form meaningful text and images [21]. Although this technology and similar ones could not be realized at the time, they became the basis of later ones and form the fundamentals of AR concepts [20].

Wilhelm F. Bruns describes the later milestones in his paper as follows [12]: the vision of a room with action generated by ubiquitous computers (1991), the paradigmatic shift of computer augmented environments (1993), behavior construction kits on real objects (1993), and graspable user interfaces (1995). Since then, countless accomplishments have been made [3], [41].

2.4 Difference between Augmented Reality and Virtual Reality (Advantages over Virtual Reality)

The worlds created with VR sometimes fall short as user interfaces for real-life applications. Either these worlds are very simplistic, such as environments created for immersive entertainment and games, or the systems that can create more realistic environments carry million-dollar price tags, such as flight simulators [7], [14].

Figure 3: Real World and Rendered World [25]

AR, the less obtrusive cousin of VR, has a better chance of becoming a viable user interface for applications requiring the manipulation of complex three-dimensional information as a daily routine, because AR does not conceal any of the real-life information, as opposed to VR's fully rendered real-like environment, as seen in Figure 3 [29].

Perhaps most diametrically opposed to our vision is the notion of "virtual reality," which attempts to make a world inside the computer. Users not only use special goggles that project an artificial scene onto their eyes, but also wear gloves or even body suits that sense their motions and gestures so that they can move about and manipulate virtual objects (as in AR). Although it may have its purpose in allowing people to explore realms otherwise inaccessible, virtual reality is only a map, not a territory. It excludes desks, offices, other people not wearing goggles and body suits, weather, grass, trees and, in general, everything but the computer generated information: the infinite details of the environment. Virtual reality focuses on simulating the world rather than on invisibly enhancing the world that already exists [35].

In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with it. Therefore, AR supplements reality rather than completely replacing it [2]. It benefits from the advantages of both real and virtual environments. You do not lose the details of real life, which are nearly impossible to generate by computer in real time; in addition, you can enrich the environment with computer generated information and simulations.

Clearly, VR comprises a broad spectrum of ideas, and many of the technologies appear to be similar for AR and VR systems. For example, Head Mounted Displays (HMDs) may be used in both systems; fast real-time rendering processes are necessary to achieve sufficient performance; tracking of the user is required in both environments; and both systems need immersive environments [16]. However, the differences are quite significant. In order to have a better view of the difference between VR and AR, both technologies are analyzed below in the context of education.

2.4.1 A Detailed Example: VR versus AR by means of Education

Both VR and AR can show their strengths in education. As an example, conventional methods can be used to teach geometry, but only mixed and virtual realities are capable of letting students walk around the surfaces and see how the variables inter-relate [15]. AR systems are superior to VR systems here because the students actually see three-dimensional objects proportional to their environment [10].

Research showed that the most significant weakness of VR as an educational tool is that it is not successful at displaying text, especially where Greek letters, mathematical symbols, and super- and subscripts are involved, due to the fact that VR is a graphical environment rather than a textual one [15]. AR is not affected by this problem as VR is. For example, it has been shown that a workbench combines the powers of both virtual and real environments [13]. In conclusion, AR not only houses all the advantages of VR, but also carries them to a higher level.

2.5 AR Applications

In this section, some applications of AR in different areas, such as medicine, manufacturing, the military and entertainment, are discussed.

2.5.1 Medical

Doctors could use AR as a visualization and training aid for surgery. It may be possible to collect 3-D datasets of a patient in real time, using non-invasive sensors like Magnetic Resonance Imaging (MRI), Computed Tomography (CT) scans, or ultrasound imaging, to be rendered and combined in real time with a view of the real patient, giving the doctor "X-ray vision" inside the patient. This would be very useful during minimally invasive surgery, which reduces the trauma of an operation by using small incisions or no incisions at all [2]. At UNC Chapel Hill, a research group has conducted trial runs of scanning the womb of a pregnant woman with an ultrasound sensor, generating a 3-D representation of the fetus inside the womb and displaying that in a see-through HMD (Figure 4). The goal is to make a 3D stethoscope [2].

Figure 4: Virtual fetus inside the womb of a pregnant patient [2]

2.5.2 Manufacturing and Repair

Another category of AR applications is the assembly, maintenance, and repair of complex machinery. Instructions might be easier to understand if they were available as 3-D drawings superimposed upon the actual equipment. As shown in Figure 5, these superimposed 3-D drawings can be animated, making the directions even more explicit [2].

Figure 5: Prototype laser printer maintenance application [3]

In the past few years, such AR-aided manufacturing and repair systems have been realized; Figure 6 is a commercial example. Research showed that AR helps technicians be 10 percent faster compared to the current practice of using notebook computers [39].

Figure 6: Microvision's Nomad Display System for Technicians [39]

2.5.3 Annotation and Visualization

AR could be used to annotate objects and environments with public or private information. For example, a hand-held display could provide information about the contents of library shelves as the user walks around the library [43]. Researchers at Columbia demonstrated this with the notion of attaching windows from a standard user interface onto specific locations in the world, or to specific objects as reminders [8]. Figure 7 shows a label window superimposed upon a student. The student has a tracking device and the instructor wears an HMD. As the student moves around, the label follows his location, providing the instructor with a reminder of what s/he needs to talk to the student about.

Figure 7: A Label Superimposed on a Student

2.5.4 Robot Path Planning

Teleoperation of a robot is often a difficult problem, especially when the robot is far away and there are long delays in the communication link [2]. Under these circumstances, instead of controlling the robot directly, it may be preferable to control a virtual version of it. The user plans and specifies the robot's actions by manipulating the local virtual version in real time. Once the plan is tested and confirmed, the user tells the real robot to execute the specified plan [2].

2.5.5 Entertainment

At the SIGGRAPH '95 conference, several exhibitors showed "Virtual Sets" that merge real actors with virtual backgrounds, in real time and in 3-D. The actor stands in front of a large blue screen, while a computer-controlled motion camera records the scene. Since the camera's location is tracked and the actor's motions are scripted, it is possible to digitally composite the actor into a 3-D virtual background. The entertainment industry sees this as a way to reduce production costs: creating and storing sets virtually is potentially cheaper than constantly building new physical sets from scratch.

2.5.6 Military Purposes

Of all the possible benefits of AR, situation awareness systems seem the most promising [19]. For example, infrastructure and utility information, such as a target's position, can be reflected to the soldier's eye like X-ray vision, so that effective urban operations can be achieved [18]. For many years, military aircraft and helicopters have used Head-Up Displays (HUDs) and Helmet-Mounted Sights (HMS) to superimpose vector graphics upon the pilot's view of the real world. Besides providing basic navigation and flight information, these graphics are sometimes registered with targets in the environment, providing a way to aim the aircraft's weapons [3].

2.5.7 Games

Ever since Pong, the first graphical computer game, the game industry has been the locomotive of graphics technology development, and AR is no exception. Currently, the AR version of the famous game Quake, renamed ARQuake, is one of the best examples [22]. Since the game engine was designed for desktops, the software team encountered many limitations, but successful AR games have nonetheless been created.

2.5.8 Other Possible Uses

AR can be used for construction applications. It has several benefits in design and marketing, visualization during construction, and maintenance and renovation [5], [27]. AR is also a perfect environment for telecommunication. Kansei, adding emotions and sensitivity to communication, can be achieved using AR. This way, it could approach the ultimate form of communication: face-to-face [30].

2.6 Specific Hardware Used in AR

The success of any virtual/augmented reality experience depends on the user's sense of presence in the environment. There are different types of devices trying to accomplish that, and different devices best suit different interaction techniques.

2.6.1 Head Mounted Displays

Just as monitors allow us to see text and graphics generated by computers, head-mounted displays (HMDs), as seen in Figure 8, enable us to view graphics and text created by augmented-reality systems. As AR systems are only slowly becoming widespread, not many HMDs have been created specifically with AR in mind; most displays were created specifically for VR.

Figure 8: A basic Head Mounted Display System [37]

There are two basic types of HMDs: optical see-through, whose schematic is seen in Figure 9, and video see-through; both concepts are shown in Figure 10.

Figure 9: Schematic of optical based AR [22]

The main difference is as follows: optical see-through HMDs allow the user to see the real world directly, with computer generated information superimposed on it, while video see-through HMDs mix reality and virtuality and then project the result to the user. Video see-through displays block out the wearer's surrounding environment; on the inside of the display, the video image is played in real time and the graphics are superimposed on the video. This type of HMD causes more lag, meaning that there is a delay in image adjustment when the viewer moves his or her head.

Figure 10: Concepts of optical see-through and video see-through displays [3]

AR displays are still fairly primitive, but developers believe that they can create a display that resembles a pair of eyeglasses. Most companies that have made optical see-through displays have gone out of business. Sony, an electronics company, made a see-through display that some researchers use, called the Glasstron, seen in Figure 11. Unfortunately, Sony discontinued manufacturing this model, leaving the electronics market selling only leftover products at exorbitant prices.

Figure 11: Sony's Glasstron Audio/Video Headset [36]

Microvision's Virtual Retinal Display holds the most promise for an augmented-reality system, but the problem with the Microvision display is that it currently costs about $10,000 [41]. Figure 12 shows an example of this technology.

Figure 12: Microvision's Nomad Display System for Military Applications [39]

2.7 Future Roadmap of AR

Computer generated material can be broadcast, in what might be called Augmented TV. Customers with different requests can filter the broadcast as they wish, superimposing their own materials. For example, a customer driving abroad can filter AR to see:

- road signs in his own language,
- where to eat for his favorite cuisine,
- directions to his destination, etc.

A user may wish to add any information, highlight some of it, or even delete some real-life elements (considering safety issues, of course). A toy sketch of such a filter is given at the end of this section.

2.7.1 A Prediction: Possible Usages of AR in Security Forces

Security forces have a wide range of areas in which to use AR, as their job is done in real time, interactively, and with real people, meaning that VR cannot answer their demands. Some application areas can be predicted as follows.

Robocop-like applications, as described earlier, can be developed for security forces. Cabs or patrol cars can be tagged electronically, making them visible to security forces easily, silently and securely. If a problematic situation occurs (e.g., a kidnapping), the electronic tag will be activated and a vision accessible only to HMD users holding a valid encryption key will be generated, making the car easily traceable without endangering or alarming the crowd.

AR can also be a perfect tool for shooting exercises. With conventional methods, static targets are used, which never helps train for dynamic conditions and thus misses the main purpose of the exercise. Virtual reality may be offered as a solution, yet even if we turn a blind eye to the fact that it would be an extremely expensive alternative, we are still missing something: the feeling of reality, which we cannot risk losing for such a critical case. Considering all of this, and other than extremely expensive or inapplicable solutions [18], AR seems to be the only reasonable solution.
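To make the filtering idea above concrete, the following C++ fragment sketches such a per-user filter. It is purely illustrative: the Annotation type, its fields and the filterAnnotations function are hypothetical and do not correspond to any existing system.

    #include <string>
    #include <vector>

    // Hypothetical annotation carried in an "Augmented TV" broadcast.
    struct Annotation {
        std::string category;   // e.g. "road_sign", "restaurant", "direction"
        std::string language;   // language of the textual content
        double      distance;   // distance from the viewer, in meters
    };

    // Keep only the annotations the viewer asked for.
    std::vector<Annotation> filterAnnotations(
            const std::vector<Annotation>& all,
            const std::vector<std::string>& wantedCategories,
            const std::string& preferredLanguage,
            double maxDistance)
    {
        std::vector<Annotation> kept;
        for (const Annotation& a : all) {
            if (a.distance > maxDistance) continue;          // too far to matter
            if (a.language != preferredLanguage) continue;   // wrong language
            for (const std::string& c : wantedCategories)
                if (a.category == c) { kept.push_back(a); break; }
        }
        return kept;
    }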

2.8 Tools for AR Applications

There are some available tools for AR applications, like Studierstube and D'Fusion [29]. However, the most popular one is ARToolKit. The following features of ARToolKit make it more popular than the others:

- ARToolKit is an open source library; it can be extended for academic purposes.
- It is free for non-commercial purposes.
- It is widely used by the academic community.
- It overcomes registering problems with a simple pattern recognizing mechanism.
- It overcomes UI problems with the same solution.
- Registering and tracking problems are no issue for the same reason.
- It can be used with a simple webcam-computer setup, so the cost of installation is minimal.
- It does not need huge processing resources.
- It is compatible with HMDs [1].

On the other hand, ARToolKit comes with its own drawbacks: it is not designed for outdoor use, it needs square patterns in a 2D plane to work, it does not have any GUI or user-friendly facility, and it demands an excessive amount of programming to be developed further.

2.9 Limitations of AR

No matter what tool is used, AR still faces difficulties, mostly because the progress of the needed technologies is not yet adequate. These problems can be subcategorized as below.

2.9.1 Computational Problems

For AR applications, the reasonable image generation rate is 10 frames per second [28], which leaves a budget of only about 100 milliseconds per frame for capture, tracking and rendering. Rendering all the information and registering it in real time needs an excessive amount of computation, which is not feasible.

To maximize effectiveness, algorithms must be developed to provide some kind of information filtering, meaning identifying and prioritizing the information and displaying only what is needed at a given point in time [8]. For example, only the name of an object might be displayed normally, while extended information, like the usage of the object, is displayed when the user comes near it; alternatively, the texture quality or polygon count can be reduced.

2.9.2 Problems with Mobility

Outdoor AR systems are seriously confronted with mobility problems.

Figure 13: A usual Mobile AR system [34]

Figure 13 shows a mobile AR system [34]. A standard outdoor AR system includes a wearable computer, a battery, a GPS unit, an orientation tracker, a GPS antenna, a camera, a see-through display (HMD) and a wireless input device [42]. This equipment weighs almost 25 kilograms, and this cannot be reduced for now. AR systems need more compact and lightweight hardware.

2.9.3 Tracking Problems

Tracking accuracy is the most expensive problem for AR systems to overcome, and the success of any mixed reality experience depends on the user's sense of presence, which is bound to tracking. An ideal tracking system should generate position and orientation computations in real time without affecting the user's freedom of motion [17]. Tracking accuracy can change dynamically as a function of the user's location and other variables specific to the tracking technologies used. This is especially problematic for mobile AR systems, which ideally require extremely precise position tracking of the user's head [26]. In order for the computer to render the right part of the world for the user, information about the orientation of the user's head is also required [22]. Even small tracking errors can lead to significant registration errors, so the error threshold is very low [19].

There are many tracking techniques, including ultrasonic trackers, optical trackers, inertial trackers, magnetic trackers, kinematic GPS trackers and so on [9], [4]. There are also hybrid solutions to increase efficiency [26]. These techniques are either expensive, like satellite GPS systems, or less accurate, like WiFi network antennas [41]. In practice, the user's location and orientation are tracked by two different tracking mechanisms, adding application and computational cost to the system.

2.9.4 Problems with Registering

To register visual or audio information, the system must know exactly both the user's location and orientation; otherwise the resulting system will be misleading or, worse, unusable [26]. If the real world were static, the problem could be solved in advance by measuring and/or modeling the real world beforehand, but generally this is not the case: the real world is usually dynamic [5], [19]. Consequently, AR systems should be able to adapt to dynamic environments to be successful.

2.9.5 User Interface (UI) Problems

UI problems also arise for outdoor AR systems. As the user is not within reach of input devices like a keyboard and mouse, new input devices like gloves, gesture devices and speech techniques must be developed [24], [16], [18]. Not only do these input devices add computational load to the system, but they also need tracking.

2.9.6 Resistance of People

So far, we have dwelt upon the capabilities and benefits of AR, but there is a serious drawback that will make AR face the resistance of people. Getting people to use AR systems rests on one simple fact: it tags them electronically. Their exact locations, choices, destinations and private information (like the way they filter information) will be known in real time, 24 hours a day, seven days a week. This can be a major attack on people's anonymity, which they are already losing gradually, day by day. And they have a point: research showed that the more private information is distributed, the more it is used for illegal or unethical purposes.

2.9.7 Other Problems

AR may also cloud the reality. What happens when the links break down and the user still believes himself to be in the AR environment? Even if the electronic ink mentioned before had been successful, a broken link in the augmentation of physical paper creates confusion in a previously clear situation [21], [20].

AR is not widely known by the public, and there are few tools to introduce AR to non-technical persons. This prevents AR from becoming widespread.

2.10 Summary

The quality of AR systems is directly proportional to their cost, and the scale is not linear. The problems, especially the tracking and computational ones, still seem to need further technological development to be solved. Furthermore, these problems grow for outdoor systems, where mobility drawbacks are also added. Because of the cost and technological limitations, there are few examples of easy-to-use tools, which keeps the public unaware of AR.

Chapter 3
The Scope of the Study

This study aims to implement a generic, rule-based system for AR applications that enables the usage of AR in various areas. The system accepts various parameters and model formats from the user, by means of which the user can tailor the augmented environment according to the aim of usage. Our system is implemented on top of the ARToolKit project, a GNU-licensed AR tool which will be described in detail in the next chapter. To give a brief explanation, ARToolKit uses predetermined printed patterns to superimpose computer graphics onto the real world. However, ARToolKit has some limitations, as described in Section 2.9.

The current situation of using ARToolKit is as follows: if users intend to build an AR system, they must code and compile a program from scratch using the ARToolKit libraries to meet their requirements. In that sense, the user needs to have a technical background. Furthermore, even if the user is capable of programming, ARToolKit does not offer any choice of 3D models to depict in the augmented environment; it only uses OpenGL graphics in its examples. Accordingly, in order to increase the wide usage of AR, we need more user-friendly and functional tools. AR Project is an effort to provide an easy-to-use environment in which non-technical people can develop complex AR applications.

3.1 Defining Requirements

Considering the weaknesses of ARToolKit, we have designed an AR system to meet the possible requirements of an AR user. We defined the requirements as follows:

- The user does not need expert programming skills to use the system.
- The system must be generic and applicable to different situations.
- The system should respond to different situations with different reactions (different animations for different pattern combinations).
- The user should not be forced to stick with OpenGL graphics; s/he should be able to use more lifelike file formats like .3ds.

The resulting system allows models with the .3ds extension (developed with the 3DS Max software) to be used, offers a graphical user interface that adds another layer on top of ARToolKit, and accepts various parameters to design scenarios including different patterns and animations. The system is designed such that other 3D model types can easily be implemented within the structure of the program. The models superimposed in the augmented environment can be placed and moved by taking other patterns and models as references. Together with this property, and considering the scenario definition functionality, the rule-based structure of the system is assured. The developed tool is named AR Project.

3.2 Comparison with ARToolKit

Table 1 summarizes some features of both AR Project and ARToolKit.

Table 1: Comparison of AR Project with ARToolKit

In conclusion, AR Project enhances ARToolKit by adding new capabilities. Furthermore, it provides a GUI and GUI-related facilities to serve a user-friendly environment. It includes advanced algorithms which would cost an average programmer months of programming time. It provides a generic tool capable of adapting to different situations. It provides open source code, which will enable other developers to contribute.

3.3 Summary

As mentioned in Chapter 2, AR applications deal with an excessive number of technical limitations, due in general to tracking and calibration. Beating these limitations is a function of time and money, and is sometimes impossible. As the amount of work and expense needed increases, feasibility is drastically reduced. ARToolKit, a C language library, overcomes most of these limitations by using a simple pattern recognition mechanism. On the other hand, ARToolKit comes with its own drawbacks, which were discussed within this chapter in detail. AR Project overcomes some of these drawbacks, extends the capabilities of the ARToolKit libraries, and adds a user-friendly GUI, ensuring ease of use. The product is aimed at generic purposes, such as education and other specific fields, and uses a rule-based approach.

Chapter 4
Environment

In this chapter, the environment of the study is explained, with brief information about the tools and the technologies used.

4.1 About ARToolKit

ARToolKit is a C language software library that lets programmers easily develop AR applications. AR is the overlay of virtual computer graphics images on the real world, and has many potential applications in industrial and academic research [1]. One of the most difficult parts of developing an AR application is precisely calculating the user's viewpoint in real time so that the virtual images are exactly aligned with real world objects, as discussed in Chapter 2. ARToolKit uses computer vision techniques to calculate the real camera position and orientation relative to marked cards, allowing the programmer to overlay virtual objects onto these cards [1]. The fast and precise tracking provided by ARToolKit should enable the rapid development of many new and interesting AR applications.

ARToolKit is licensed under the GNU General Public License (GPL), meaning that it is free software for non-commercial applications; you can redistribute it and/or modify it under the terms of the GPL as published by the Free Software Foundation, either version 2 of the license or any later version.

4.1.1 Working Method of ARToolKit

The core of ARToolKit is a main loop. As seen in Table 2, ARToolKit works in six steps; the functions corresponding to steps 2 through 5 run repeatedly inside the main loop. First of all, the 3D video frame is captured and transformed into a 2D image.

Table 2: Steps of ARToolKit [1]

Then the algorithm searches the frame for black squares, which are called markers. After that, all markers are examined to see whether they match trained patterns. After the transformation calculations, all patterns are aligned with virtual objects. These steps repeat until the program is aborted. A sample ARToolKit code that implements these steps is shown in Appendix A.

4.2 Preparing the Development Environment

For AR Project, the following environment is used:

- Microsoft Visual Studio (either VS .NET 2003 or VS 6)
- ARToolKit libraries
- DSVideoLib libraries
- GLUT library
- DirectX 9.0 or later Software Development Kit (SDK)

ARToolKit demands a DirectX 9.0 compatible graphics card and a webcam to run. In order to set up the development environment, the following steps have been followed:

Step 1: Installing Microsoft Visual C++.
Step 2: Creating a folder which includes the ARToolKit files. This folder will be referred to below as ARToolKit.
Step 3: Unpacking the DSVideoLib libraries and copying the required DLL files into the appropriate folders (DSVideoLib.dll and DSVideoLibd.dll from ARToolKit\DSVideoLib\bin.vc70 into ARToolKit\bin).

Step 4: Running the register-filter .bat files.
Step 5: Installing the GLUT libraries and headers.
Step 6: Creating config.h by using the configure.win32.bat script.
Step 7: Opening the ARToolKit.dsw file.
Step 8: Arranging the Visual Studio search path settings.

4.3 Other Programs/Tools Used To Support This Project

Other than the environment described above, Microsoft Office XP, an office suite, was used for creating documents and tables; its printing capabilities were useful. IrfanView, a freeware multi-format image viewer/converter for Windows operating systems, was used for managing image files. Another tool used was Deep Exploration, a standalone application to manage 2D, 3D, animation and video assets; it was used to import .3ds models, to render these models, and to change their pivot points and textures. IconArt++ was used to create .ico files, and Enterprise Architect was used to draw class diagrams.

Microsoft Visual Studio 6.0 was used for developing the application. The ARToolKit libraries use C as the programming language. However, C++ is used for this project, as C++ is an enhanced language suited to the object-oriented structure of the application. Visual Studio supported the usage of C++ and enabled easier development of GUIs.

4.4 Test Systems

AR Project was tested and compiled using two separate computer systems:

4.4.1 Test System 1

Test System 01
Mainboard: Asus A7V333
CPU: AMD Athlon XP 1700+ (*)
RAM: 768 MB Kingston 333 MHz DDR SDRAM
HDD: 80 GB Samsung 7200 RPM
Display Adapter: Nvidia Riva TNT2 (**)
Display: Samsung SyncMaster 900 IFT
Operating System: Windows 2000 Advanced Server
Input Device: Apache Webcam
Other: LG 32x CD writer, LG 16x DVD writer, Microsoft optical mouse + multimedia keyboard, high speed USB card reader, 40 GB Toshiba external disk, Surecom network adapter, Mitsumi Bluetooth adapter

Table 3: Test System 1

(*) An Athlon XP 2400+ was used in the performance tests.
(**) An ATI Radeon 9800 Pro was used in the performance tests, as it fully supports DirectX 9.

4.4.2 Test System 2

Test System 02
Mainboard: Intel 855GME chipset
CPU: Intel Centrino 1.60 GHz
RAM: 768 MB Dell 333 MHz DDR SDRAM
HDD: 40 GB Toshiba 4200 RPM
Display Adapter: 64 MB Intel Extreme Graphics (DX9)
Display: 12.1" WSXGA TFT Active-Matrix (***)
Operating System: Windows XP Home SP2
Input Device: D-Link Webcam
Other: NEC 8x DVD writer, Microsoft optical mouse, embedded SD card reader, 40 GB Toshiba external disk

Table 4: Test System 2

(***) The native resolution of this display system is 1280 x

4.5 Patterns

For each square found in the frame, the pattern inside the square is captured and matched against the pre-trained pattern templates. If there is a match, ARToolKit has found one of the AR tracking markers. ARToolKit then uses the known square size and pattern orientation to calculate the position of the real video camera relative to the physical marker: a 3x4 matrix is filled with the video camera's real world coordinates relative to the card. This matrix is then used to set the position of the virtual camera coordinates. The restriction is that a grid cannot be rotationally symmetric with itself or with the other grids in the group; otherwise ARToolKit will not be able to extract the orientation from the target pattern. A minimal code sketch of this detection and registration cycle is given below, after Figure 14.

4.5.1 Creating a Pattern

Our first pattern, seen in Figure 15, is named HAK, referring to Hirokazu Kato's Hiro pattern [1], seen in Figure 14, as it is an abbreviation of the name of this thesis' author, Hakan Bozlu. It also has some other meanings in the Turkish language, including God and justice, thus adding extra meaning to our choice.

Figure 14: Hiro pattern used in ARToolKit examples [1]
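As a minimal sketch of the cycle described at the beginning of this section, the following fragment uses the standard ARToolKit 2.x C API. It assumes that the pattern (patt_id, patt_width, patt_center) and the detection threshold (thresh) have already been set up with arLoadPatt() and the camera calibration; it is illustrative only, not the actual AR Project code.

    ARUint8      *dataPtr;
    ARMarkerInfo *marker_info;
    int           marker_num, k;
    double        patt_trans[3][4];   /* the 3x4 camera transformation matrix */
    double        gl_para[16];

    if ((dataPtr = arVideoGetImage()) != NULL) {
        /* find all black squares and match them against the trained patterns */
        arDetectMarker(dataPtr, thresh, &marker_info, &marker_num);
        arVideoCapNext();
        for (k = 0; k < marker_num; k++) {
            if (marker_info[k].id != patt_id) continue;
            /* fill the 3x4 matrix with the camera coordinates relative to the card */
            arGetTransMat(&marker_info[k], patt_center, patt_width, patt_trans);
            /* convert it to an OpenGL modelview matrix and render the virtual object */
            argConvGlpara(patt_trans, gl_para);
            glMatrixMode(GL_MODELVIEW);
            glLoadMatrixd(gl_para);
            /* ... draw the 3D model here ... */
        }
    }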

The first step in creating a pattern is designing it using the blank marker provided by ARToolKit. While designing, there are some precautions to be taken:

- The design must be unique, to avoid confusion: if the designs of two separate patterns are similar, the system will be confused, because the resolution of the input hardware is not sensitive enough.
- The design must not be symmetrical in either the x axis or the y axis; otherwise the orientation information, which is important for AR Project, will be lost.
- The length of each side must be known, so it can be introduced to our system properly. AR Project needs to know the exact length to calculate distances.
- The design inside the blank marker must be pure black. The pattern recognition mechanism fails to recognize the design otherwise.

Figure 15: The first pattern created during this project

Designing a pattern can be done using any image editing program. Our example is given in Figure 15.

Figure 16: Mechanism Designed for ARToolKit to Recognize Patterns

After that, the .patt file of the pattern should be created using the advanced section of AR Project, which will be discussed in the next chapter. To create the .patt file, as seen in Figure 16, the pattern must be perpendicular to the optical axis of the webcam; in other words, the pattern plane and the webcam plane must be parallel.

4.5.2 Pattern Tests

In order to develop AR Project, we needed to know the constraints of our environment. For that reason, we ran some tests on pattern limitations. The number of patterns that can be used at a time is directly proportional to the minimum size limit of the patterns. Furthermore, the effective radius of our tool is limited by the pattern recognition distance. To calculate these two factors, we ran the tests given below.

Pattern Size Test

To calculate the minimum visible pattern size, we used a rating scale consisting of similar patterns with descending sizes. The rating scale is given in Appendix J. Our tool involves the interaction of the user with patterns, so we decided that the patterns

should be within the reach of the user, and that 100 cm was adequate as the camera-pattern distance. The zoom of the camera was calibrated before the tests. After the tests, we found that the minimum pattern size to be recognized is 23 mm, under normal light conditions and from a distance of 100 cm. Smaller patterns are recognized only as markers. The results were the same for both the 640x480 and the 320x240 resolution tests.

Pattern Distance Test

To calculate the maximum pattern recognition distance, we used standard markers with 80 mm sides. Under normal light conditions, we increased the camera-pattern distance gradually. Beyond 520 cm, the algorithm started to have trouble differentiating similar-looking patterns. The results were the same for both the 640x480 and the 320x240 resolution tests.

4.6 Models

Originally, ARToolKit uses only OpenGL graphics, but as AR Project accepts different model formats, we needed to create 3D model files in order to use all the capabilities of our system. First we need a texture, then a 3D file. Figure 17 shows the first texture file we used.

Figure 17: The source used for the .3ds model file

Figure 18: 3D of our model file

After creating the .3ds file, as seen in Figure 18, we used the texture to texturize our model. While creating a model, there are some precautions to be taken:

- The pivot point of the model should be the same as the center of gravity for symmetrical objects. This ensures the precision of the animation. (A small code sketch of enforcing this is given at the end of this chapter.)
- The size of the model should be normalized. AR Project gives detailed options for scaling models in animations, but rational proportions among 3D models provide ease of use.

The result will be as seen in Figure 19.

Figure 19: 3D model aligned with real world

The scaled and rotated face model is aligned with the environment (the face model and the borders around the markers are computer generated information; the rest is reality).
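As a sketch of the pivot point precaution mentioned above, the following fragment re-centers a loaded .3ds model so that its pivot coincides with the center of its bounding box. It assumes the open source lib3ds 1.x library, which this thesis does not name; in our case the pivot points were adjusted in Deep Exploration, so the code is purely illustrative.

    #include <lib3ds/file.h>
    #include <lib3ds/mesh.h>

    /* Translate every vertex so that the bounding-box center becomes the
       origin, i.e. the pivot point of the model. */
    void center_pivot(Lib3dsFile *file)
    {
        float lo[3] = {  1e30f,  1e30f,  1e30f };
        float hi[3] = { -1e30f, -1e30f, -1e30f };
        Lib3dsMesh *m;
        unsigned i;
        int a;

        /* first pass: bounding box over all vertices of all meshes */
        for (m = file->meshes; m; m = m->next)
            for (i = 0; i < m->points; i++)
                for (a = 0; a < 3; a++) {
                    float v = m->pointL[i].pos[a];
                    if (v < lo[a]) lo[a] = v;
                    if (v > hi[a]) hi[a] = v;
                }

        /* second pass: shift the model so the box center is at the origin */
        for (m = file->meshes; m; m = m->next)
            for (i = 0; i < m->points; i++)
                for (a = 0; a < 3; a++)
                    m->pointL[i].pos[a] -= 0.5f * (lo[a] + hi[a]);
    }

    /* usage: Lib3dsFile *f = lib3ds_file_load("face.3ds"); center_pivot(f); */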

Chapter 5
Design and Technical Details of the Developed System

In this chapter, the design issues and technical details of AR Project are given.

5.1 System Design

The design standard is derived from Military Standard 498 [31] and IEEE [32]. We modified these standards to create a base for our model.

5.1.1 System-wide Design Decisions

- The application uses a webcam as video input.
- The application uses the computer screen as video output.
- The user can define new 3D models to the system.
- The user can define new patterns to the system.
- Different file types can be accepted by the system as 3D model files.
- The video output is displayed in full-screen mode.
- The system can use patterns that are defined by using the ARToolKit pattern definition tool.
- The system uses the OpenGL graphics library for AR and 3D graphical applications.
- The system stores data within text files.
- Database entries are added through GUIs.
- The system lets the user manually override the database.
- AR scenarios can be formed based on rules defined by animations and assignments.

5.1.2 System Architectural Design

System Components

The system is composed of the following modules:

- AR Module
- Pattern Module
- Model Module
- Animation Module
- Scenario Module
- Advanced Options Application Module

- User Interface Module

The architectural design of the modules of the system is explained in the following sections. Each module is explained by identifying the associated software units. The concept of execution is explained in the following part, which gives a brief diagram of the system with the relationships among the modules. Class diagrams for each software unit are given in Appendix B.

AR Module

Class CARProjDlg

The AR system is developed as a dialog-based MFC application using Microsoft Visual C++ 6. The main dialog of the system is defined in the CARProjDlg class; the main user interface implemented in this class is shown in Figure 20. This class is the backbone of the system, in which AR operations and other basic functionalities are conducted. Global instances of each dialog class are created to reach the variables described in the following sections. On initialization, the other dialogs are created. The user can define all parameters of the system by reaching the dialogs via the buttons on this main interface.

When the "Start Augmenting Our World!" button is pressed, the system switches to camera mode for the AR environment. Webcam initialization is done and video capture is started; the .3ds models are also initialized. After the initializations, the system enters the main loop. In each loop, the following operations are done:

- OpenGL parameters are set.
- Markers on the current video capture are detected.
- For each pattern defined, by comparing the IDs of markers and patterns, the patterns in the capture are found; the detected pattern indexes are saved.
- The patterns are transformed with respect to the camera.
- The CheckPatternAssignment function is called to play animations as defined in the scenario_data file, applying advanced controls like disabling

animations with one or two models when those models are included in an active animation with three models.
- The PlayAnimation function is called to decide on the models to be rendered and their transformations according to their references (models or patterns), and to convert AR transformation values to OpenGL transformations.
- For each model, Render3DSModel is called; it sets the transformation change, scales, rotations on the model's own axis, and movements (circular, elliptic) for each individual model, and renders the model on the video capture view with the parameters specified.

Pattern Module

The user interface to update these parameters is shown in Figure 21. The related class that performs the necessary operations on this user interface is CLoadPatternDlg. The variables in this class concerning patterns are as follows:

    char         *model_name;
    char         *model_path;
    ObjectData_T *object;
    int           objectnum;

Struct ObjectData_T

Parameters for patterns are stored using the structure named ObjectData_T below. An example of a text file containing these parameters is shown in Appendix C.

    typedef struct {
        char   name[256];
        int    id;
        int    visible;
        int    collide;
        double marker_coord[4][2];
        double trans[3][4];
        double marker_width;
        double marker_center[2];
        char   path[256];
    } ObjectData_T;

The name[] variable holds the name of the pattern defined, which is also a primary key for the patterns. id is a value specific to the pattern. visible determines whether the pattern was

detected in the previous loop. marker_coord defines the (x, y) coordinates of the vertices of the pattern. trans is the key variable that holds the transformation values of the pattern with respect to the camera, in AR format; this value is converted to OpenGL style in later phases of the program. marker_width is the width of the marker. marker_center holds the coordinates of the center of the pattern. path holds the name of the folder in which the pattern files are kept.

These variables are read from and written to the database by means of the following methods:

    ObjectData_T *read_objdata(char *name, int *objectnum);
    void write_objnum(char *model_name, int *objectnum);
    void write_objdata(char *model_name, CString name, CString path,
                       CString width, int *objectnum);

The user can add new patterns and edit and delete existing patterns by means of the GUI. User interface operations are conducted by using handlers like:

    OnInitDialog
    On(xxx)ButtonClick
    OnSelChange(xxx)ListBoxValues

Model Module

The user interface to update these parameters is shown in Figure 22. The related class that performs the necessary operations on this user interface is CLoadModelDlg. The variables in this class concerning models are as follows:

    char      *loadedmodel_name;
    ModelData *model;
    int        modelnum;

Struct ModelData

Parameters for models are stored using the structure below. An example of a text file containing these parameters is shown in Appendix D.

    typedef struct {
        char name[256];
        char path[256];
        char type;
    } ModelData;

The name[] variable holds the name of the model defined, which is also a primary key for the models. path holds the name of the folder in which the model files are kept. type specifies the file type of the 3D model.

These variables are read from and written to the database by means of the following methods:

    ModelData *read_modeldata(char *name, int *modelnum);
    void write_modelnum(char *loadedmodel_name, int *modelnum);
    void write_modeldata(char *loadedmodel_name, CString name, CString path,
                         int *modelnum);

The user can add new models and delete existing models by means of the GUI. User interface operations are conducted by using handlers like:

    OnInitDialog
    On(xxx)ButtonClick
    OnSelChange(xxx)ListBoxValues

Animation Module

The user interface to update these parameters is shown in Figure 23. The related class that performs the necessary operations on this user interface is CDefineAnimationDlg. The variables in this class concerning animations are as follows:

    char          *animdefinition_name;
    AnimationInfo *animation;
    int            animationnum;

Struct AnimationInfo
Struct ModelAnimInfo

Struct ModelMovement
Struct ModelAxisRotate
Struct ModelScale
Struct ModelTransform

Parameters for animations are stored using the data structures below. An example of a text file containing these parameters is shown in Appendix E.

    typedef struct {
        double x;
        double y;
        double z;
    } ModelTransform;

    typedef struct {
        double scale_x;
        double scale_y;
        double scale_z;
    } ModelScale;

    typedef struct {
        bool   axis_rotate_x;   // determines whether the model turns around the x axis
        bool   axis_rotate_y;
        bool   axis_rotate_z;
        double vel_axis_rotate_x;
        double vel_axis_rotate_y;
        double vel_axis_rotate_z;
    } ModelAxisRotate;

    typedef struct {
        int    movement_type;
        int    movement_axis_1;   // x, y or z
        int    movement_axis_2;   // second axis; only (0,1) or (1,2) combinations can apply; -1 means none
        char   movement_ref[256];
        double movement_vel;
        int    radius_a;
        int    radius_b;
        int    radius_c;          // radius for the third axis
    } ModelMovement;

    typedef struct {
        char            anim_model_name[256];
        ModelTransform  trans;
        char            model_transform_ref[256];
        ModelScale      scale;
        ModelAxisRotate axis_rotate;
        ModelMovement   model_movement_prop;
    } ModelAnimInfo;

typedef struct {
    char anim_name[256];
    ModelAnimInfo model_anim_info[MAX_MODELS_IN_ANIMATION];
} AnimationInfo;

The anim_name[] variable holds the name of the animation defined, which is also a primary key for the animations. In an animation, up to three models can be shown; for this reason, the variable named model_anim_info[] is an array of length 3 (which can be enlarged by changing the value of the definition). anim_model_name[] determines the names of the models in the animation.

A model is translated along the x, y and z axes relative to its reference. These values are stored in trans, which is derived from the ModelTransform struct. The coordinates of the model with respect to the camera are determined according to a reference for the model. This reference can be either a pattern or another model; if the reference is a model, the placement is adjusted according to the dynamic coordinates of that model. The reference value is stored in the model_transform_ref variable.

The scale of each model can be changed through the scale variable, derived from the ModelScale struct. The model can be rotated around its own axes by means of the axis_rotate variable, derived from the ModelAxisRotate struct; the rotation axes and a velocity for each axis can be defined.

Various movements are assigned to a model by setting the movement_type variable, derived from the ModelMovement struct. The movement type can be defined as circular or elliptic. One or two movement axes can be defined, and radius values for each axis are assigned. The velocity of the movement is also specified. If a movement is defined, the model uses the movement_ref variable as the reference; this, too, can be a dynamic model.

The CNewAnimDlg class is used only to display the dialog for entering a new animation name.
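To make these movement parameters concrete, the per-frame position of a model could be evaluated roughly as follows. This is only a sketch built on the structs above; the function name, the movement_type encoding and the time source are assumptions, not the thesis's actual code:

#include <math.h>

/* Returns the model's position at time t as an offset from its (possibly
   dynamic) movement reference. Assumed encoding: movement_type 0 = none,
   1 = circular, 2 = elliptic; movement_axis_* index x/y/z as 0/1/2. */
void eval_position(const ModelMovement *m, double t,
                   const double ref[3], double out[3])
{
    double angle = m->movement_vel * t;    /* the velocity drives the phase */
    out[0] = ref[0]; out[1] = ref[1]; out[2] = ref[2];
    if (m->movement_type == 0) return;     /* none: stay on the reference   */
    out[m->movement_axis_1] += m->radius_a * cos(angle);
    if (m->movement_axis_2 != -1) {        /* -1 means no second axis       */
        double r = (m->movement_type == 2) ? m->radius_b : m->radius_a;
        out[m->movement_axis_2] += r * sin(angle);
    }
}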

These variables are read from and written to the database by means of the following methods:

AnimationInfo *read_animationinfo(char *animdefinition_name, int *animationnum);
void write_animnum(char *loadedanim_name, int *animnum);
void write_animdata(char *loadedanim_name, CString animname,
    CString modelname[MAX_MODELS_IN_ANIMATION],
    double trans_x[MAX_MODELS_IN_ANIMATION],
    double trans_y[MAX_MODELS_IN_ANIMATION],
    double trans_z[MAX_MODELS_IN_ANIMATION],
    CString model_trans_ref[MAX_MODELS_IN_ANIMATION],
    double scale_x[MAX_MODELS_IN_ANIMATION],
    double scale_y[MAX_MODELS_IN_ANIMATION],
    double scale_z[MAX_MODELS_IN_ANIMATION],
    bool axis_rotatex[MAX_MODELS_IN_ANIMATION],
    bool axis_rotatey[MAX_MODELS_IN_ANIMATION],
    bool axis_rotatez[MAX_MODELS_IN_ANIMATION],
    double vel_axis_rotatex[MAX_MODELS_IN_ANIMATION],
    double vel_axis_rotatey[MAX_MODELS_IN_ANIMATION],
    double vel_axis_rotatez[MAX_MODELS_IN_ANIMATION],
    int model_movement_type[MAX_MODELS_IN_ANIMATION],
    int model_movement_axis_1[MAX_MODELS_IN_ANIMATION],
    int model_movement_axis_2[MAX_MODELS_IN_ANIMATION],
    CString model_movement_ref[MAX_MODELS_IN_ANIMATION],
    double model_movement_vel[MAX_MODELS_IN_ANIMATION],
    int model_radius_a[MAX_MODELS_IN_ANIMATION],
    int model_radius_b[MAX_MODELS_IN_ANIMATION],
    int model_radius_c[MAX_MODELS_IN_ANIMATION],
    int *animnum);

The user can add new animations and edit or delete existing animations by means of the GUI. User interface operations are conducted by handlers such as:

OnInitDialog
On(xxx)ButtonClick
OnSelChange(xxx)ListBoxValues
OnSelChange(xxx)ComboBoxValues

Additionally, various support functions and variables are implemented to establish the more complex operations on the GUI of this class.

Scenario Module

The user interface to update these parameters is shown in Figure 17. The related class that carries out the operations on this user interface is CAssg_PattToAnimDlg. The variables of this class concerning assignments are as follows:

char *scenariodefinition_name;
ARAssignment *animtopatt_assignment;
int assignmentnum;

The related structure is ARAssignment. Parameters for assignments are stored using the structure below. An example of a text file containing these parameters can be seen in Appendix F.

typedef struct {
    char assignment_pattname[MAX_PAT_NUM_DEFINED_IN_ASSG][256];
    char anim_name[256];
} ARAssignment;

The assignment_pattname[] variable holds the names of the patterns to which an animation is assigned. It is an array of three pattern names (MAX_PAT_NUM_DEFINED_IN_ASSG is 3): if all 3 patterns are defined, the animation will be shown only when all of them are detected in the view. If only one pattern is given (the others are null), the animation will be shown when that single pattern is detected. The anim_name variable specifies the animation that is activated when the determined patterns are detected.
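In other words, an assignment fires only when every non-empty pattern slot in it is currently visible. A minimal sketch of such a check, assuming the visible flag of the pattern record and illustrative names (this is not the thesis's actual matching code):

#include <string.h>

int assignment_active(const ARAssignment *a,
                      const ObjectData_T *obj, int objectnum)
{
    int i, j, seen;
    for (i = 0; i < MAX_PAT_NUM_DEFINED_IN_ASSG; i++) {
        if (a->assignment_pattname[i][0] == '\0') continue; /* null slot */
        seen = 0;
        for (j = 0; j < objectnum; j++) {
            if (obj[j].visible &&
                strcmp(obj[j].name, a->assignment_pattname[i]) == 0) {
                seen = 1;
                break;
            }
        }
        if (!seen) return 0;  /* a required pattern is not in view */
    }
    return 1;                 /* all required patterns detected    */
}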

These variables are read from and written to the database by means of the following methods:

ARAssignment *read_assignmentdata(char *scenariodefinition_name, int *assignmentnum);
void write_assigndata(char *assign_name, CString assignmentname,
    CString assig_pattnames[MAX_PAT_NUM_DEFINED_IN_ASSG],
    CString animname, int *assignmentnum);

The user can add new assignments and delete existing assignments by means of the GUI. User interface operations are conducted by handlers such as:

OnInitDialog
On(xxx)ButtonClick
OnSelChange(xxx)ComboBoxValues

Additional parameters are used to fill the combo boxes with the values assigned in the CLoadPatternDlg and CDefineAnimationDlg classes.

Advanced Options Application Module

Class: CAdvancedOptions. From this GUI, the database text files are opened for manual editing and the external calibration applications are called. The external calibration application files are as follows:

Define New Pattern: Root\External\mkpatt.exe
Optical See-Through Calibration: Root\External\opticald.exe
Camera Parameters Calibration: Root\External\calib_cparam.exe
Camera Distortion Calibration: Root\External\calib_distortion.exe

The database text files to be edited manually are as follows:

Edit object_data file manually: Root\Data\object_data
Edit model_data file manually: Root\Data\model_data
Edit animation_data file manually: Root\Data\animation_data
Edit scenario_data file manually: Root\Data\scenario_data
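The database files themselves are plain text. For orientation, the stock ARToolKit object_data layout (a pattern count followed by name, pattern file, marker width and marker center for each entry) looks roughly like the sketch below. The actual files used here, shown in Figures 39-41 and the appendices, extend this idea; the values are only illustrative:

#the number of patterns to be recognized
1

#pattern 1
star
Data/star.patt
80.0
0.0 0.0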

5.1.3 Interface Design

In this section, only the main user interface options are presented. In Chapter 6, detailed information about each option is given by means of a case study.

AR Main Interface Identification and Diagram

Figure 20: Main page of the AR Project

The Load Pattern button opens the dialog for adding, editing and deleting patterns. The Load Model File button opens the dialog for adding and deleting models. The Define Animation button opens the dialog for adding and deleting animations. The Assign Animations to Patterns button opens the dialog for creating scenarios by assigning animations to specific patterns. The Advanced Options button opens the dialog for reaching high-level applications for calibration and managing the database. The Start Augmenting Our World button starts the AR application. The Exit Program button terminates the program.

Pattern Interface Identification and Diagram

Figure 21: Load Pattern

Add New Pattern enables the dialogue for defining patterns. The Delete Selected Pattern button deletes the selected pattern. Edit Selected Pattern enables the dialogue for editing patterns. The Add/Update button adds or updates the pattern when active. The Cancel button disables the dialogue for defining/editing patterns. The OK button returns to the main window.

Model Interface Identification and Diagram

Figure 22: Load Model

Add New Model enables the dialogue for defining models. The Delete Selected Model button deletes the selected model. The Add/Update button adds the model when active. The Cancel button disables the dialogue for defining models. The OK button returns to the main window.

Animation Interface Identification and Diagram

Figure 23: Define Animation

The Add New Animation button triggers a pop-up window to enter a name for the animation and then enables the Select Models part. The Add button adds the selected model when active. The Finish button disables the Select Models part and enables the Scale, Transformation, Rotation, Reference for Transformation, Movement and Movement Velocity parts when active. Edit Selected Animation enables the appropriate parts for editing the selected animation. The Delete Selected Animation button deletes the selected animation. The OK button returns to the main window.

Scenario Interface Identification and Diagram

Figure 24: Assign animations to patterns

The New Anims to Patterns Assignment button enables the Enter Assignment Name, Select Patterns and Select Animation parts. The Delete Selected Assignment button deletes the selected assignment. The Add The Assignment button accepts the defined scenario when active. The Cancel button disables the Enter Assignment Name, Select Patterns and Select Animation parts when active. The OK button returns to the main window.

Advanced Options Interface Identification and Diagram

Figure 25: Advanced Options

The Define New Pattern button triggers the external program for defining the printed patterns. The Optical See-Through Calibration button triggers the external program for calibrating HMDs. The Camera Parameters Calibration and Camera Distortion Calibration buttons trigger the external programs for calibrating cameras. The Edit object_data file manually button opens the object_data database text file for manual editing; likewise, the Edit model_data, Edit animation_data and Edit scenario_data buttons open the model_data, animation_data and scenario_data database text files for manual editing.

Video Interface Identification and Diagram

Figure 26: Video Interface Identification and Diagram

The Frame Rate menu changes the frame rate. The Output Size pull-down menu gives options for different resolutions.

5.2 System Development

Our system development phase was as follows: after drawing up the requirements, we designed a GUI on paper, which also produced new ideas. While planning these requirements and the GUI, our guiding thought was a non-technical user and a case study. While implementing the GUI, the code changed, and due to the new code the GUI changed a little more. The feedback from our tests had less force to change the development phase, as they were essentially validation tests. It is fair to say that our SDLC was far from the waterfall model; we developed our application in an iterative, circular way. We used Visual Studio 6.0 to develop our application in C++. Visual Studio's MFC wizard was used for developing the GUI.

The information we store includes only the values of the variables and structures, so it should never be bigger than 100 KB. Realizing that, we preferred to use text files for storing the database. This approach provided us a versatile and easy-to-manipulate environment: we could control and modify the values of variables easily, and furthermore we are not bound to a database management program.

5.3 System Testing

Our testing procedures are divided into two: Stage 1 and Stage 2 testing. The first-stage tests are the ones done by ourselves. We grouped these tests into two: functional testing and GUI testing. They all have their IDs, purposes, dependencies, setups and cleanups stored in tables. All the procedures and validations are explained step by step. The step codes start with a capital A and the validation codes with a capital V. Table 5 can be given as an example of the Stage 1 functional tests and Table 6 as an example of the Stage 1 GUI tests.

5.3.1 Stage 1 Testing

We ran all the tests on both Microsoft Windows 2000 Advanced Server and Microsoft Windows XP with Service Pack 2. The results were Boolean, meaning that there were no results to average, so every test was run only once, provided that no change had been made to the program code.

Functional Testing

In the following sections, test cases for each of the modules are defined. Table 5 shows a sample functional test table.

Functional Testing

TC ID: TC-FN
Purpose: Read the properties of an existing pattern in the system
Dependency: LoadPattern GUI
Setup: Open application, open pattern dialog
Procedure:
[A01] Click Add New Pattern button
[A02] Select the pattern file
[A03] Enter marker width
[A04] Click Add/Update Pattern button
[A05] Open object_data file with Notepad
[V01] Observe that the newly added pattern is written successfully to the object database, as explained in the design section.
Cleanup: Close Notepad, dialog and application

Table 5: A Sample Functional Testing Table

1 AR Module
2 Pattern Module
Test Case 2.1 Define New Pattern
Test Case 2.2 Edit Existing Pattern
Test Case 2.3 Delete Existing Pattern
Model Module
Test Case 2.1 Define New Model
Test Case 2.2 Edit Existing Model
Test Case 2.3 Delete Existing Model
Animation Module
Test Case 2.1 Define New Animation
Test Case 2.2 Edit Existing Animation
Test Case 2.3 Delete Existing Animation
Scenario Module

Test Case 2.1 Define New Scenario
Test Case 2.2 Edit Existing Scenario
Test Case 2.3 Delete Existing Scenario
Advanced Options Application Module

GUI Testing

In the following sections, test cases for each of the modules are defined. Table 6 shows a sample GUI test table.

TC ID: TC-FN
Purpose: Read the properties of an existing pattern in the system
Dependency: LoadPattern GUI
Setup: Open application, open pattern dialog
Procedure:
[A01] Click Add New Pattern button
[A02] Select the pattern file
[A03] Enter marker width
[A04] Click Add/Update Pattern button
[V01] Observe that the newly added pattern is shown successfully on the GUI
Cleanup: Close dialog and application

Table 6: A Sample GUI Testing Table

Test Case 2.1 AR Main GUI
Test Case 2.2 Pattern GUI
Test Case 2.3 Model GUI
Test Case 2.4 Animation GUI
Test Case 2.5 Scenario GUI
Test Case 2.6 Advanced Options GUI
Test Case 2.7 Video Interface GUI

5.3.2 Stage 2 Testing

In the second stage of the tests, a group of technical and non-technical subjects between the ages of 22 and 35 was selected as the testing group. They were introduced to AR and AR Project for 10 minutes, and a 20-minute period was given to examine the tool. Then they were interviewed to get their opinions. All the interviews were recorded and then converted into meaningful information as interview tables. The interview covers:

Personal information about the tester
Whether they found the tool useful for an AR application or not
Whether they found the tool easy to use or not
Their opinion about AR and the tool
Their suggestions for the possible usage of this tool in their area
Their suggestions for the progress of the tool

The interview questions can be examined in Appendix H. A sample copy of an interview table can be examined in Appendix I.

Testers' Profile: We selected 10 testers to test AR Project; all of our testers had at least a bachelor's degree. Their profiles are summarized in Table 7. The testers are coded T1 through T10.

Tester No | Age | Gender | Profession | Education Level | Background
T1 | 27 | M | Engineer | Graduate Student | Technical
T2 | 23 | M | Instruction Technologist | Graduate Student | Non-Technical
T3 | 22 | M | Engineer | Graduate Student | Technical
T4 | 35 | F | Programmer | Graduate Student | Technical
T5 | 35 | M | Academic | PhD Student | Technical
T6 | 26 | M | International Relations | Bachelor | Non-Technical
T7 | 29 | M | IT | Bachelor | Technical
T8 | 27 | F | Economist | PhD Student | Non-Technical
T9 | 30 | F | International Relations | PhD Student | Non-Technical
T10 | 31 | M | Economist | PhD Student | Non-Technical
Average age: 28.5

Table 7: Profile of the Testers

We selected an equal ratio of technical and non-technical testers. Technical testers are the ones holding a bachelor's degree in one of the following areas: engineering, programming or information technologies. The age interval of the testers is from 22 to 35, with an average of 28.5. 70% of the testers are male and 30% are female.

Testers' Opinion about AR

All of the testers were introduced to AR for the first time; the introduction was surprising and entertaining for them. Their opinions were extremely positive; they had the undivided opinion that AR will be the future of bringing computer enhancements into our lives. Most of them used words like definite, certain and sure to emphasize the certainty of their statements. For example, T5's exact opinion was: "Certainly promising, AR will be in our lives for sure."

Testers' Opinion about AR Project

Aside from two testers, all of them found the tool useful. T7, who has a technical background, noted that the tool must be developed further to be used functionally, and added that for now it is only good for demonstrating AR. T9, who has a non-technical background, found the tool a little primitive for real use. Other than that, there was no negative feedback; the testers found the tool fine, open to progress and very successful. T10's (non-technical) exact opinion was: "The tool is entertaining and it is very successful. I really think this tool is useful."

When it came to the user-friendliness of the tool, 70% of the testers were satisfied. The rest (mostly non-technical) were not fully satisfied; they said it would be hard to use the tool had they not been informed first, and that the tool should come with help files and even demo videos for first-time users. Finally, 90% of the testers found the tool effective and ready to use, the exception being T7, who had some doubts as explained before.

Testers' Suggestions for Possible Usages of the Tool

All of the testers were asked about possible usages of the tool. Testers with a technical background were more inclined to come up with ideas. One surprising result was a common answer: almost all of the testers declared that AR Project can be used efficiently in education. They also diversified the alternatives, such as child education or education of the physically handicapped. T5's exact opinion was: "This tool can be used for education, but the tool is not limited to ordinary education techniques. It can be used for pilot education, astronomy education and these types of professional education."

Other than that, many applicable areas were suggested: medicine, astronomy, technical information, areas that need manual skills, assembly, the army, meteorology, architecture, interior decoration, demonstrations, performance, simulations, marketing and advertising.

Ideas for Improvements

All the testers were asked whether they had any suggestions to improve this tool. Testers with technical backgrounds offered more possible ideas for improvement. The rational suggestions are as follows:

The help section could be more detailed
A read-me about the tool could be added

Animations and videos could be supplied to inform the first-time user
Predefined patterns and models could be included in the system, offered as a library or an add-on
The pattern limit (3) could be increased
The video interface could be interactive; the models and scenarios could be changed in real time
A preview for models and patterns could be added
Common file formats like JPEG and bitmap could be used for patterns or models

5.4 What Makes AR Project Different

AR Project is a user-friendly, GUI-based tool built on the ARToolKit libraries. AR Project's contributions are as follows:

AR Project provides a multiple-patterns-to-a-single-model matching algorithm as an enhancement over the ARToolKit libraries
AR Project provides a multiple-patterns-to-multiple-models matching algorithm as an enhancement over the ARToolKit libraries
AR Project provides a multiple-patterns-to-a-single-animation matching algorithm that ARToolKit did not have
AR Project comes with an advanced algorithm that disables subsets of pattern sets. For example, if patterns a, b and c are in view, only the animation matching the a+b+c pattern set is shown; pattern subsets like a+b, a+c, a, b are disabled
AR Project provides the facility to use realistic .3ds files
AR Project is structured so that more model formats, such as .md3 model files, can be used; the source code can easily be modified for different file formats
AR Project provides an easy-to-use GUI
AR Project provides a database mechanism; furthermore, it has facilities to manually change/overwrite the database files
AR Project provides an advanced facility to define animations that the user can tailor:
o The scale of the model for the x, y and z axes individually

o The transformation value of the model for the x, y and z axes individually
o The rotation value of the model around the x, y and z axes individually
o The movement type, which can be none, circular or elliptic
o The reference for transformation
o The movement axes, up to two
o The reference for movement
o The movement velocity
o The radius values for movement
AR Project provides a facility to add, edit or delete patterns
AR Project provides a facility to add or delete models
AR Project provides a facility to assign animations to patterns and to edit or delete these scenarios
AR Project holds all the defined information (patterns, models, animations) for future use; different users can share their databases for collaborative work
AR Project provides a facility to access the external calibration programs

To sum up: AR Project enhances the ARToolKit libraries with brand-new capabilities such as using different file formats. AR Project comes with a user-friendly GUI. AR Project includes algorithms which would otherwise cost months of programming. AR Project is generic, so the user can use the program for different situations. AR Project comes with a database system enabling users to share their work.

Chapter 6

A Case Study with the Developed Environment: Solar System

Although we created a generic program to be as adaptable to any situation as possible, we always kept in mind that this tool would be especially suitable for education. For that reason we preferred our case study to resemble an education program. We chose the solar system in order to exercise the capabilities of AR Project, especially the animation part.

6.1 Running the AR Project Program

The program itself is an executable, so any user can run AR Project without the assistance of any third-party program, provided that the user does not intend to change the source code. The only exception is the possible requirement of a .dll file (msvcrtd.dll), depending on the operating system, which is included in the project folders. After running the program, the main window appears and all the functionality of the program can be accessed using a mouse and/or a keyboard.

6.2 Loading Patterns

We designed 3 patterns for this purpose: crescent, earth and star, as seen in Figure 27.

Figure 27: Solar system patterns. a) Crescent, b) Earth, c) Star

Later we will match crescent.patt with the moon.3ds model, earth.patt with the earth.3ds model and star.patt with the sun.3ds model. While designing, our first aim was to create a resemblance between these models and patterns, to increase the integrity of the system in the eyes of the user. For that purpose, well-known illustrations are used inside standard 8-centimeter squares. Another technical aim is to ensure the uniqueness of the patterns to avoid confusion: if the patterns are similar, the program may mistake one for another while the camera moves.

After the design, the patt files (pattern files) must be created. This is done from Advanced Options > Define New Pattern, triggering the external mkpatt.exe program coded by the ARToolKit creators, so it will not be explained in detail. To create the patt file, we installed a mechanism as seen in Figure 16.

To load patt files, Load Pattern must be selected from the main menu. To add patterns, the Add New Pattern button must be pressed, as seen in Figure 28.

Figure 28: Adding a new pattern (Part of the screen)

Please note that some buttons and sections become inactive after pressing the Add New Pattern button. This prevents the user from making a mistake.

Patterns have 3 characteristics: name, path and marker width. Marker width is measured in millimeters; the default value is 80. The pattern name has no limitations other than those the operating system enforces, but the patt file must be in the /Data/ directory. After adding patterns, we can see the patterns that were added. We can select them using the mouse/keyboard, and deleting or editing them then becomes available through the appropriate buttons.

6.3 Loading Model Files

We created 3 different model files for this case: earth.3ds, moon.3ds and sun.3ds. To create these files, we simply created spheres and texturized them with the appropriate textures.

Figure 29: The texture used in earth.3ds

The earth texture given in Figure 29 and the resulting model file given in Figure 30 can be examined.

Figure 30: The earth.3ds model file.

After creating the model files, Load Models must be selected from the main menu. This menu looks like the former one, but models have two characteristics: model name and path. The model name has no limitations other than those the operating system enforces, but the model file must be in the /Models/ directory.

Figure 31: The properties of .3ds model files seen in the Load Model section. (Part of the screen)

After adding models, we can see the models that were added, as seen in Figure 31. Please note that the system gives the total number and the extension of the model files. The system allows the user to delete model files, but editing is not possible.

6.4 Defining Animations

The actual equatorial radii of the Sun, the Earth and the Moon are about 696,000, 6,378 and 1,734 kilometers respectively. We decided not to follow these ratios, in order to keep everything visible; ratios of 1.80, 1.20 and 0.70 seemed reasonable. (Since volume scales with the cube of the radius, these correspond approximately to 5.8, 1.7 and 0.34 by volume.) To give the Earth its own shape, an oblate spheroid (a.k.a. geoid), one of its scale dimensions is held at 1.35 while the others stay at 1.20.

To produce the combined motions of the Sun, Moon and Earth when all patterns are observed in the camera display, the following animation properties are defined:

Create a new animation by clicking Add New Animation and giving a name in the pop-up dialog box.

Add the sun.3ds, earth.3ds and moon.3ds models by selecting them from the combo box and clicking the Add button one by one, then clicking the Finish button at the end. For each model, define the properties as given below.

For sun.3ds:
Scale values are (X=1.8, Y=1.8, Z=1.8)
Transformation values are zero
Reference for transformation is patt.star, as the sun model will be displayed relative to the star pattern
Rotation values are zero
Movement type is none (thus all other movement properties are discarded)

Therefore the sun pops up when the star pattern is shown on the display, and it stays right on top of the pattern without any movement.

For earth.3ds:
Scale values are (X=1.35, Y=1.2, Z=1.2)
Transformation values are zero
Reference for transformation is discarded, as the movement type is not none for earth.3ds
Rotation around the Z axis is enabled with a value of 3; the others are zero
Movement type is elliptic
Movement axis is Z and the reference for movement is sun.3ds
Movement velocity is 3
Radius values are X=120 and Y=220; the Z value is discarded, as it is not meaningful when the movement axis is Z

Therefore all these variables cause the earth to pop up when the sun model is displayed and to make an elliptic movement around the static sun model.

For moon.3ds:
Scale values are (X=0.70, Y=0.70, Z=0.70)
Transformation values are zero

Reference for transformation is discarded, as the movement type is not none for moon.3ds
Rotation around the Z axis is enabled with a value of 3; the others are zero
Movement type is circular
Movement axis is Z and the reference for movement is earth.3ds
Movement velocity is 4
The radius value is X=120; Y and Z are not meaningful, as the movement type is circular and only one radius value is needed

Therefore all these variables cause the moon to pop up when the earth model is displayed and to make a circular movement around the dynamic earth model, which itself moves around the sun.
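Putting the three definitions together, each frame resolves a chain of references: the sun sits on its pattern, the earth orbits the sun, and the moon orbits the moving earth. A sketch of that chain, reusing the hypothetical eval_position() from the design discussion; the record names and the draw_model_at() helper are illustrative, not the thesis's actual code:

/* Per-frame placement for the three-pattern scenario. sun_anim,
   earth_anim and moon_anim stand for the ModelAnimInfo records
   filled in through the dialog. */
void place_solar_system(double t)
{
    double sun_pos[3] = {0.0, 0.0, 0.0};  /* static, relative to patt.star */
    double earth_pos[3], moon_pos[3];

    /* earth: elliptic movement, reference = sun.3ds */
    eval_position(&earth_anim.model_movement_prop, t, sun_pos, earth_pos);

    /* moon: circular movement, reference = the moving earth */
    eval_position(&moon_anim.model_movement_prop, t, earth_pos, moon_pos);

    /* each model is drawn at its position and additionally spun
       around its own z axis with the velocities defined above */
    draw_model_at(&sun_anim,   sun_pos,   t);
    draw_model_at(&earth_anim, earth_pos, t);
    draw_model_at(&moon_anim,  moon_pos,  t);
}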

6.5 Assigning Animations to Patterns

After creating all the animations, we matched the correct animations with the correct pattern sets, as seen in Table 8, to conclude building our case study.

Pattern(s) | Model(s) | Animation
Sun.patt | Sun.3ds | Sun stays
Moon.patt | Moon.3ds | Moon rotates around itself
Earth.patt | Earth.3ds | Earth rotates around itself
Moon.patt + Sun.patt | Moon.3ds + Sun.3ds | Sun stays, Moon rotates around itself
Earth.patt + Sun.patt | Earth.3ds + Sun.3ds | Sun stays, Earth rotates around the Sun and around itself
Earth.patt + Moon.patt | Earth.3ds + Moon.3ds | Moon rotates around the Earth; both rotate around themselves
Earth.patt + Moon.patt + Sun.patt | Earth.3ds + Moon.3ds + Sun.3ds | Sun stays, Earth rotates around the Sun, Moon rotates around the Earth, and Moon and Earth rotate around themselves

Table 8: The List of Added Animations

The steps are simple. First we select the predefined patterns we want from the drop-down menus, as seen in Figure 32. We can choose up to three patterns. If we do not select any one of the three, it is counted as null.

Figure 32: Assigning animations to patterns. (Part of the screen)

Then we select the corresponding animation from the animation table. After we hit the Assign Animation to Patterns button, the assignment is done and the related information can be seen in the See Current Assignments part. We can define as many combinations as we want. Please note, however, that the algorithm overrides subset animations. For example, sun+earth+moon also includes sun+earth, moon, moon+earth and so on, but only the largest set is valid, to avoid confusion. The system checks for the matching patterns and, at any point of the runtime, runs only the largest set.
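The "largest set wins" rule can be sketched as follows, reusing the hypothetical assignment_active() from the design discussion. The function below is illustrative, not the thesis's actual code:

/* Returns the index of the active assignment naming the most patterns,
   or -1 if no assignment is currently active. */
int pick_largest_assignment(const ARAssignment *assg, int assignmentnum,
                            const ObjectData_T *obj, int objectnum)
{
    int i, j, best = -1, best_count = 0;
    for (i = 0; i < assignmentnum; i++) {
        int count = 0;
        if (!assignment_active(&assg[i], obj, objectnum)) continue;
        for (j = 0; j < MAX_PAT_NUM_DEFINED_IN_ASSG; j++)
            if (assg[i].assignment_pattname[j][0] != '\0') count++;
        if (count > best_count) { best = i; best_count = count; }
    }
    return best;  /* subsets lose to the largest matching set */
}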

6.6 Results

After all the work is done, we click Start Augmenting Our World in the main menu. After configuring the webcam parameters as seen in Figure 33, the Video Interface Identification and Diagram window opens.

Figure 33: Video Interface Identification Diagram Window

While configuring the webcam parameters, setting the highest resolution available would be wise. The window shows plain reality unless a pattern is in sight. Once a pattern or a combination of patterns enters the window, the animations are aligned to the real world according to the rules we defined in Section 6.5. Figure 34 shows the situation when only the sun pattern is in sight, Figure 35 shows the situation when the earth and the moon are in sight, and finally Figure 36 shows the situation when all three patterns are in sight. The other combinations given in Section 6.5 are also applied in real time.

Figure 34: The Sun rotating around itself

Figure 35: Earth and the Moon

Figure 36: The Earth and the Sun

6.7 Advanced Options Tab

The Advanced Options tab is composed of two parts: Calibration and Manual File Edit Operations.

Figure 37: The calibrations part in the Advanced Options section. (Part of the screen)

As seen in Figure 37, the calibrations part is composed of four buttons: Define New Pattern, Optical See-Through Calibration, Camera Parameters Calibration and Camera Distortion Calibration. These buttons trigger the programs with the functions named on them. As these programs were coded by the ARToolKit programmers, their details will not be given here; instead, the relevant reference can be used [1].

Figure 38: The manual file edit operations part in the Advanced Options section. (Part of the screen)

AR Project uses text files as databases. Composed of 4 buttons triggering the relevant text files, the Manual File Edit Operations part gives access to these files so they can be manually overridden, as seen in Figure 38. The object_data file holds the information about patterns, as seen in Figure 39. The number of patterns introduced to the system and the characteristics of all patterns can be viewed/edited from here.

Figure 39: Sample object_data file.

The model_data file holds the information about models, as seen in Figure 40. The number of models introduced to the system and the characteristics of all models can be viewed/edited from here.

Figure 40: Sample model_data file.

The animation_data file holds information about the animations, as seen in Figure 41. A complete sample of the anim_data file can be seen in Appendix E.

Figure 41: Sample anim_data file. (Part of the file)

Chapter 7

Discussion and Conclusion

AR is a promising area; most probably we will accept AR systems as permanent parts of our lives right after the technological obstacles are surpassed, which is about to happen. This study overviewed the current situation of AR, focused on its limitations and, moreover, focused on a C library named ARToolKit, with its advantages and disadvantages. Although ARToolKit provides an inexpensive and easy-to-install solution for indoor AR systems, it comes with its own drawbacks. Our system not only extended its current capabilities but also formed a suitable environment to encourage non-technical persons to use AR systems.

We ran tests with testers from different technical and non-technical backgrounds to get feedback on AR Project. The results were extremely positive; all of the testers were satisfied. Although all of our testers encountered AR for the first time, they immediately accepted and got used to the technology. The common opinion was that AR is a promising area to work in; moreover, it will be a part of our lives soon.

AR Project was considered a success by the testers; it was found to be user-friendly, promising and useful. Many usage areas and improvement suggestions were offered by the testers. As a result, we can clearly say that there should be more tools of this type, aimed even at non-technical persons, to increase the recognition and usage of AR.

Future Work

In this study, we used a webcam and display pair to show the capabilities of AR and built our work fully on that technology. While this provided considerable success, it is also clear that the native environment of AR can be achieved by using Head-Mounted Displays.

As mentioned in Chapter 2, Head-Mounted Displays come along with some technical difficulties. Other than some calibration techniques, however, the design and the code would not differ at all.

Our work could identify more patterns and would be more resistant to occlusions if we had used a different approach to patterns, as ARTag did [33], but this has a large drawback that we did not want to accept: although the patterns would be more meaningful and clear to the system, they would be the exact opposite for the user.

Figure 42: Comparison of ARToolKit and ARTag patterns [1], [33]

Our aim was always to form a comprehensible environment for both the user and the subjects, which binary-like patterns do not allow. But as the required number of patterns grows large, creating a unique design for each pattern would no longer be feasible. Our proposal is a hybrid approach: writing names outside the borders of the markers' black squares, on all four sides. These names would be absolutely meaningless to the computer, but the users would still have a minimal understanding of the patterns.

In the Define Animations section, the available movement types are None, Circular and Elliptic. An advanced graphical section could be added so that the user can draw a route with input devices like the mouse or keyboard. This would improve the capabilities of the program, forming an even more generic environment.

Augmented sound could easily be implemented. A pattern-maker section could be added so that the user would not need an image editing program; thus the dependency of AR Project on external programs would be minimized.

After the tests, we also saw that the following improvements should be made: the help section should be improved, predefined patterns should be added, the pattern limit should be increased, the video interface should be made interactive, previews for models and patterns should be added, and the use of different image file formats should be supported. Last but not least, a web page devoted to this project should be made, so anyone willing to contribute can get the latest version of the code.

References

[1] Hirokazu Kato, Mark Billinghurst, Ivan Poupyrev, ARToolKit version 2.33, Hiroshima City University, Human Interface Technology Laboratory University of Washington, MIC Research Labs ATR International, 2000
[2] Ronald T. Azuma, A Survey of Augmented Reality, Hughes Research Laboratories, 1997
[3] Azuma Ronald, Baillot Yohan, Behringer Reinhold, Feiner Steven, Julier Simon, MacIntyre Blair, Recent Advances in Augmented Reality, IEEE Computer Graphics and Applications, 2001
[4] Gaile Gordon, Mark Billinghurst, Melanie Bell, John Woodfill, Bill Kowalik, Alex Erendi, Janet Tilander, The Use of Dense Stereo Range Data in Augmented Reality, Human Interface Technology Laboratory University of Washington, 2002
[5] Masayuki Kanbara, Takashi Okuma, Haruo Takemura, Naokazu Yokoya, A Stereoscopic Video See-through Augmented Reality System Based on Real-time Vision-based Registration, Graduate School of Information Science Nara Institute of Science and Technology, Electrotechnical Laboratory MITI, Nara Research Center, Telecommunications Advancement Organization of Japan, 2000
[6] Gudrun Klinker, Didier Stricker, Dirk Reiners, Augmented Reality for Exterior Construction Applications, Technische Universität München, Fraunhofer Projektgruppe für Augmented Reality am ZGDV, 2001
[7] Introduction to Augmented Reality
[8] Simon Julier, Steven Feiner, Marco Lanzagorta, Tobias Hollerer, Yohan Baillot, Lawrence Rosenblum, Information Filtering for Mobile Augmented Reality, Advanced Information Technology Naval Research Laboratory, Dept. of Computer Science Columbia University, Defence Science and Technology Organization of Australia, 2000
[9] J. Borenstein, H. R. Everett, L. Feng, Where am I? Sensors and Methods for Mobile Robot Positioning, University of Michigan, 1996
[10] Hannes Kaufmann, Dieter Schmalstieg, Mathematics And Geometry Education With Collaborative Augmented Reality, Interactive Media Systems Group Vienna University of Technology, 2002
[11] Brygg Anders Ullmer, Models and Mechanisms for Tangible User Interfaces, Bachelor of Science University of Illinois
[12] Wilhelm F. Bruns, Complex Construction Kits for Coupled Real and Virtual Engineering Workspaces, artec Research Center for Work Technology Bremen University, 1999
[13] Morten Fjeld, Benedikt M. Voegtli, Augmented Chemistry: An Interactive Educational Workbench, Man-Machine Interaction IHA Swiss Federal Institute of Technology, HyperWerk Fachhochschule Beider Basel, 2002
[14] Wolf-D. Ihlenfeldt, Virtual Reality in Chemistry, Computer-Chemie-Centrum University of Erlangen-Nürnberg, 1997
[15] John T. Bell, H. Scott Fogler, The Investigation and Application of Virtual Reality as an Educational Tool, Department of Chemical Engineering University of Michigan, 1995
[16] Christine M. Byrne, "Water on Tap: the Use of Virtual Reality as an Educational Tool", Doctoral Dissertation for the University of Washington, 1996
[17] Mahoney P. Diana, On the Right Track, Computer Graphics World, pages 16-18, April
[18] Mark A. Livingston, Lawrence J. Rosenblum, Simon J. Julier, Dennis Brown, Yohan Baillot, J. Edward Swan, Joseph L. Gabbard, Deborah Hix, An Augmented Reality System For Military Operations in Urban Terrain, Advanced Information Technology Naval Research Laboratory, ITT Advanced Engineering and Sciences, Systems Research Center Virginia Polytechnic Institute and State University, 2002
[19] Simon Julier, Yohan Baillot, Marco Lanzagorta, Lawrence Rosenblum, Dennis Brown, Urban Terrain Modeling For Augmented Reality Applications, ITT AES/NRL, Naval Research Laboratory
[20] Wendy E. Mackay, Anne-Laure Fayard, Designing Interactive Paper: Lessons from three Augmented Reality Projects, Department of Computer Science Université de Paris-Sud, Electricité de France, 1999
[21] Wendy E. Mackay, Augmented Reality: Dangerous Liaisons or the Best of Both Worlds?, University of Aarhus, 2000
[22] Wayne Piekarski, Bruce H. Thomas, ARQuake - Modifications and Hardware for Outdoor Augmented Reality Gaming, Wearable Computer Laboratory School of Computer and Information Science University of South Australia, 2003
[23] Wouter Alexander de Landgraaf, Interaction between users and Augmented Reality systems: Human-Computer Interaction of the future, Vrije Universiteit Amsterdam
[24] Bruce Thomas, Nicholas Krul, Benjamin Close, Wayne Piekarski, Usability and Playability Issues for ARQuake, University of South Australia, 2002
[25] Wayne Piekarski, Bernard Gunther, Bruce Thomas, Integrating Virtual and Augmented Realities in an Outdoor Application, Advanced Computing Research Centre University of South Australia, 1999
[26] Drexel Hallaway, Steven Feiner, Tobias Hollerer, Bridging the Gaps: Hybrid Tracking for Adaptive Mobile Augmented Reality, Department of Computer Science University of California, Department of Computer Science Columbia University, 2004
[27] Anthony Webster, Steven Feiner, Blair MacIntyre, William Massie, Theodore Krueger, Augmented Reality in Architectural Construction, Inspection, and Renovation, Columbia University, 1999
[28] Peiran Liu, Nicolas D. Georganas, Pierre Boulanger, Designing Real-Time Vision Based Augmented Reality Environments for 3D Collaborative Applications, Multimedia Communication Research Laboratory University of Ottawa, University of Alberta, 2002
[29] Dieter Schmalstieg, Anton Fuhrmann, Gerd Hesina, Zsolt Szalavári, L. Miguel Encarnação, Michael Gervautz, Werner Purgathofer, The Studierstube Augmented Reality Project, Vienna University of Technology, 2002
[30] Nakatsu Ryohei, Multimedia, Art, and Human-Computer Communications, International Conference on Artificial Reality and Telexistence Final Program
[31] United States of America Department of Defense, Military Standard Software Development and Documentation MIL-STD-498, 5 December 1994
[32] IEEE/EIA Standard, Industry Implementation of International Standard ISO/IEC 12207, Software Life Cycle Processes, March
[33] Mark Fiala, ARTag Revision 1, a Fiducial Marker System Using Digital Techniques, Computational Video Group Institute for Information Technology National Research Council Canada, 2004
[34] Thomas B.H., Piekarski W., Glove Based User Interaction Techniques for Augmented Reality in an Outdoor Environment
[35] Weiser Mark, "The Computer for the 21st Century," Scientific American, September 1991
[36] Access Date: 6/
[37] http://www.howstuffworks.com/augmented-reality.htm, Access Date: 4/2005
[38] Access Date: 6/2005
[39] Access Date: 5/2005
[40] Drascic David, Milgram Paul, Perceptual Issues in Augmented Reality, Proc. SPIE Vol. 2653: Stereoscopic Displays and Virtual Reality Systems III, San Jose, California, Feb.
[41] Ing. D., Zlatanova S., "Augmented Reality Technology", TU Delft, December
[42] Höllerer T., Feiner S., Hallaway D., Bell B., User Interface Management Techniques for Collaborative Mobile Augmented Reality, Computers and Graphics 25(5), Elsevier Science Ltd, Oct. 2001, pp.
[43] Umlauf Eike J., Piringer Harald, Reitmayr Gerhard, Schmalstieg Dieter, ARLib: The Augmented Library, The First IEEE International Workshop, Page 2

APPENDICES

Appendix A (ARToolKit Sample Code)

#ifdef _WIN32
#include <windows.h>
#endif
#include <stdio.h>
#include <stdlib.h>
#ifndef __APPLE__
#include <GL/gl.h>
#include <GL/glut.h>
#else
#include <OpenGL/gl.h>
#include <GLUT/glut.h>
#endif
#include <AR/gsub.h>
#include <AR/video.h>
#include <AR/param.h>
#include <AR/ar.h>

/*****************************************************************************/
// modified by Thomas Pintaric, Vienna University of Technology
#ifdef _WIN32
char *vconf = "flipV,showDlg"; // see video.h for a list of supported parameters
#else
char *vconf = "";
#endif
/*****************************************************************************/

int     xsize, ysize;
int     thresh = 100;
int     count  = 0;

char   *cparam_name = "Data/camera_para.dat";
ARParam cparam;

char   *patt_name = "Data/patt.hiro";
int     patt_id;
double  patt_width     = 80.0;
double  patt_center[2] = {0.0, 0.0};
double  patt_trans[3][4];

static void init(void);
static void cleanup(void);
static void keyEvent(unsigned char key, int x, int y);
static void mainLoop(void);
static void draw(void);

int main(int argc, char **argv)
{
    init();
    arVideoCapStart();
    argMainLoop( NULL, keyEvent, mainLoop );
    return (0);
}

static void keyEvent(unsigned char key, int x, int y)
{
    /* quit if the ESC key is pressed */
    if( key == 0x1b ) {
        printf("*** %f (frame/sec)\n", (double)count/arUtilTimer());
        cleanup();
        exit(0);
    }
}

/* main loop */
static void mainLoop(void)
{
    ARUint8      *dataPtr;
    ARMarkerInfo *marker_info;
    int           marker_num;
    int           j, k;

    /* grab a video frame */
    if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
        arUtilSleep(2);
        return;
    }
    if( count == 0 ) arUtilTimerReset();
    count++;

    argDrawMode2D();
    argDispImage( dataPtr, 0, 0 );

    /* detect the markers in the video frame */
    if( arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0 ) {
        cleanup();
        exit(0);
    }

    arVideoCapNext();

    /* check for object visibility */
    k = -1;
    for( j = 0; j < marker_num; j++ ) {
        if( patt_id == marker_info[j].id ) {
            if( k == -1 ) k = j;
            else if( marker_info[k].cf < marker_info[j].cf ) k = j;
        }
    }
    if( k == -1 ) {
        argSwapBuffers();
        return;
    }

    /* get the transformation between the marker and the real camera */
    arGetTransMat(&marker_info[k], patt_center, patt_width, patt_trans);

    draw();

    argSwapBuffers();
}

static void init( void )
{
    ARParam wparam;

    /* open the video path */
    if( arVideoOpen( vconf ) < 0 ) exit(0);
    /* find the size of the window */
    if( arVideoInqSize(&xsize, &ysize) < 0 ) exit(0);
    printf("Image size (x,y) = (%d,%d)\n", xsize, ysize);

    /* set the initial camera parameters */
    if( arParamLoad(cparam_name, 1, &wparam) < 0 ) {
        printf("Camera parameter load error!!\n");
        exit(0);
    }
    arParamChangeSize( &wparam, xsize, ysize, &cparam );
    arInitCparam( &cparam );
    printf("*** Camera Parameter ***\n");
    arParamDisp( &cparam );

    if( (patt_id=arLoadPatt(patt_name)) < 0 ) {
        printf("pattern load error!!\n");
        exit(0);
    }

    /* open the graphics window */
    argInit( &cparam, 1.0, 0, 0, 0, 0 );
}

/* cleanup function called when program exits */
static void cleanup(void)
{
    arVideoCapStop();
    arVideoClose();
    argCleanup();
}

static void draw( void )
{
    double  gl_para[16];
    GLfloat mat_ambient[]     = {0.0, 1.0, 1.0, 1.0};
    GLfloat mat_flash[]       = {0.0, 1.0, 1.0, 1.0};
    GLfloat mat_flash_shiny[] = {50.0};
    GLfloat light_position[]  = {100.0, -200.0, 200.0, 0.0};
    GLfloat ambi[]            = {0.1, 0.1, 0.1, 0.1};
    GLfloat lightZeroColor[]  = {0.9, 0.9, 0.9, 0.1};

    argDrawMode3D();
    argDraw3dCamera( 0, 0 );
    glClearDepth( 1.0 );
    glClear(GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);

    /* load the camera transformation matrix */
    argConvGlpara(patt_trans, gl_para);
    glMatrixMode(GL_MODELVIEW);
    glLoadMatrixd( gl_para );

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_position);
    glLightfv(GL_LIGHT0, GL_AMBIENT, ambi);
    glLightfv(GL_LIGHT0, GL_DIFFUSE, lightZeroColor);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_flash);
    glMaterialfv(GL_FRONT, GL_SHININESS, mat_flash_shiny);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient);
    glMatrixMode(GL_MODELVIEW);
    glTranslatef( 0.0, 0.0, 25.0 );
    glutSolidCube(50.0);

    glDisable( GL_LIGHTING );
    glDisable( GL_DEPTH_TEST );
}

Appendix B Class Diagrams of the Developed System



Chapter 5 Understanding Input. Discovering Computers 2012. Your Interactive Guide to the Digital World Chapter 5 Understanding Input Discovering Computers 2012 Your Interactive Guide to the Digital World Objectives Overview Define input and differentiate among a program, command, and user response Identify

More information

Video Tracking Software User s Manual. Version 1.0

Video Tracking Software User s Manual. Version 1.0 Video Tracking Software User s Manual Version 1.0 Triangle BioSystems International 2224 Page Rd. Suite 108 Durham, NC 27703 Phone: (919) 361-2663 Fax: (919) 544-3061 www.trianglebiosystems.com Table of

More information

An Instructional Aid System for Driving Schools Based on Visual Simulation

An Instructional Aid System for Driving Schools Based on Visual Simulation An Instructional Aid System for Driving Schools Based on Visual Simulation Salvador Bayarri, Rafael Garcia, Pedro Valero, Ignacio Pareja, Institute of Traffic and Road Safety (INTRAS), Marcos Fernandez

More information

Table of Contents Advanced ID Creator User Guide

Table of Contents Advanced ID Creator User Guide Advanced ID Creator User Guide Revision 8.0.3 Table of Contents Chapter 1 Introduction... 1-5 Special Features... 1-5 What s New?... 1-5 Chapter 2 Installing Advanced ID Creator... 2-7 Minimum System Requirements...

More information

REAL TIME MONITORING AND TRACKING SYSTEM FOR AN ITEM USING THE RFID TECHNOLOGY

REAL TIME MONITORING AND TRACKING SYSTEM FOR AN ITEM USING THE RFID TECHNOLOGY Review of the Air Force Academy No 3 (30) 2015 REAL TIME MONITORING AND TRACKING SYSTEM FOR AN ITEM USING THE RFID TECHNOLOGY For the past few years, location systems have become a major studying field,

More information

Instruction Manual. Applied Vision is available for download online at:

Instruction Manual. Applied Vision is available for download online at: Applied Vision TM 4 Software Instruction Manual Applied Vision is available for download online at: www.ken-a-vision.com/support/software-downloads If you require an Applied Vision installation disk, call

More information

Introduction. www.imagesystems.se

Introduction. www.imagesystems.se Product information Image Systems AB Main office: Ågatan 40, SE-582 22 Linköping Phone +46 13 200 100, fax +46 13 200 150 info@imagesystems.se, Introduction TrackEye is the world leading system for motion

More information

Intelligent Monitoring Software

Intelligent Monitoring Software Intelligent Monitoring Software IMZ-NS101 IMZ-NS104 IMZ-NS109 IMZ-NS116 IMZ-NS132 click: sony.com/sonysports sony.com/security Stunning video and audio brought to you by the IPELA series of visual communication

More information

T O B C A T C A S E G E O V I S A T DETECTIE E N B L U R R I N G V A N P E R S O N E N IN P A N O R A MISCHE BEELDEN

T O B C A T C A S E G E O V I S A T DETECTIE E N B L U R R I N G V A N P E R S O N E N IN P A N O R A MISCHE BEELDEN T O B C A T C A S E G E O V I S A T DETECTIE E N B L U R R I N G V A N P E R S O N E N IN P A N O R A MISCHE BEELDEN Goal is to process 360 degree images and detect two object categories 1. Pedestrians,

More information

SMART Meeting Pro 3 software

SMART Meeting Pro 3 software Release notes SMART Meeting Pro 3 software About these release notes These release notes summarize the changes in SMART Meeting Pro 3 software and its service packs. N OT E This software includes SMART

More information

Updated: April 2010. Copyright 2005-2010 DBA Software Inc. All rights reserved. 2 Getting Started Guide

Updated: April 2010. Copyright 2005-2010 DBA Software Inc. All rights reserved. 2 Getting Started Guide Updated: April 2010 Copyright 2005-2010 DBA Software Inc. All rights reserved. 2 Getting Started Guide Table of Contents Welcome 4 Support Center Subscription 5 1. System Requirements 8 2. Installing the

More information

Robot Perception Continued

Robot Perception Continued Robot Perception Continued 1 Visual Perception Visual Odometry Reconstruction Recognition CS 685 11 Range Sensing strategies Active range sensors Ultrasound Laser range sensor Slides adopted from Siegwart

More information

SketchUp Instructions

SketchUp Instructions SketchUp Instructions Every architect needs to know how to use SketchUp! SketchUp is free from Google just Google it and download to your computer. You can do just about anything with it, but it is especially

More information

Boundless Security Systems, Inc.

Boundless Security Systems, Inc. Boundless Security Systems, Inc. sharper images with better access and easier installation Product Overview Product Summary Data Sheet Control Panel client live and recorded viewing, and search software

More information

PowerPoint 2007: Basics Learning Guide

PowerPoint 2007: Basics Learning Guide PowerPoint 2007: Basics Learning Guide What s a PowerPoint Slide? PowerPoint presentations are composed of slides, just like conventional presentations. Like a 35mm film-based slide, each PowerPoint slide

More information

SMART Meeting Pro 3 software

SMART Meeting Pro 3 software Release notes SMART Meeting Pro 3 software About these release notes These release notes summarize the changes in SMART Meeting Pro 3 software and its service packs. Product information SMART Meeting Pro

More information

Data Visualization in Parallel Environment Based on the OpenGL Standard

Data Visualization in Parallel Environment Based on the OpenGL Standard NO HEADER, NO FOOTER 5 th Slovakian-Hungarian Joint Symposium on Applied Machine Intelligence and Informatics January 25-26, 2007 Poprad, Slovakia Data Visualization in Parallel Environment Based on the

More information

Applications > Robotics research and education > Assistant robot at home > Surveillance > Tele-presence > Entertainment/Education > Cleaning

Applications > Robotics research and education > Assistant robot at home > Surveillance > Tele-presence > Entertainment/Education > Cleaning Introduction robulab 10 is a multi-purpose mobile robot designed for various indoor applications, such as research and education, tele-presence, assistance to people staying at home. robulab 10 is a generic

More information

Monitor Wall 4.0. Installation and Operating Manual

Monitor Wall 4.0. Installation and Operating Manual Monitor Wall 4.0 en Installation and Operating Manual Monitor Wall 4.0 Table of Contents en 3 Table of Contents 1 Introduction 4 1.1 About this Manual 4 1.2 Conventions in this Manual 4 1.3 Minimum Installation

More information

Tekla Structures 18 Hardware Recommendation

Tekla Structures 18 Hardware Recommendation 1 (5) Tekla Structures 18 Hardware Recommendation Recommendations for Tekla Structures workstations Tekla Structures hardware recommendations are based on the setups that have been used in testing Tekla

More information

COMP175: Computer Graphics. Lecture 1 Introduction and Display Technologies

COMP175: Computer Graphics. Lecture 1 Introduction and Display Technologies COMP175: Computer Graphics Lecture 1 Introduction and Display Technologies Course mechanics Number: COMP 175-01, Fall 2009 Meetings: TR 1:30-2:45pm Instructor: Sara Su (sarasu@cs.tufts.edu) TA: Matt Menke

More information

Computer Science 474 Spring 2010 Virtual Reality

Computer Science 474 Spring 2010 Virtual Reality VIRTUAL REALITY Virtual Reality (VR) is the use of computer graphics and other technologies to create a simulated environment in which the user interacts. While computer graphics supplies the visual component

More information

Effective Interface Design Using Face Detection for Augmented Reality Interaction of Smart Phone

Effective Interface Design Using Face Detection for Augmented Reality Interaction of Smart Phone Effective Interface Design Using Face Detection for Augmented Reality Interaction of Smart Phone Young Jae Lee Dept. of Multimedia, Jeonju University #45, Backma-Gil, Wansan-Gu,Jeonju, Jeonbul, 560-759,

More information

Force/position control of a robotic system for transcranial magnetic stimulation

Force/position control of a robotic system for transcranial magnetic stimulation Force/position control of a robotic system for transcranial magnetic stimulation W.N. Wan Zakaria School of Mechanical and System Engineering Newcastle University Abstract To develop a force control scheme

More information

Voice Driven Animation System

Voice Driven Animation System Voice Driven Animation System Zhijin Wang Department of Computer Science University of British Columbia Abstract The goal of this term project is to develop a voice driven animation system that could take

More information

Open-Source-based Visualization of Flight Waypoint Tracking Using Flight Manipulation System

Open-Source-based Visualization of Flight Waypoint Tracking Using Flight Manipulation System Open-Source-based Visualization of Flight Waypoint Tracking Using Flight Manipulation System Myeong-Chul Park a, Hyeon-Gab Shin b, Yong Ho Moon b, Seok-Wun Ha b*, a Dept. of Biomedical Electronics, Songho

More information

Computer Requirements

Computer Requirements Installing Pro64 Network Manager It is recommended that you quit all running Windows applications before starting the Aviom Pro64 Network Manager installation process. Check the Aviom website (www.aviom.com)

More information

Medical Image Processing on the GPU. Past, Present and Future. Anders Eklund, PhD Virginia Tech Carilion Research Institute andek@vtc.vt.

Medical Image Processing on the GPU. Past, Present and Future. Anders Eklund, PhD Virginia Tech Carilion Research Institute andek@vtc.vt. Medical Image Processing on the GPU Past, Present and Future Anders Eklund, PhD Virginia Tech Carilion Research Institute andek@vtc.vt.edu Outline Motivation why do we need GPUs? Past - how was GPU programming

More information

GeoVision Setup. Once all the settings for Windows are completed and you have all the hard drives setup you can install GeoVision.

GeoVision Setup. Once all the settings for Windows are completed and you have all the hard drives setup you can install GeoVision. GeoVision Setup Once all the settings for Windows are completed and you have all the hard drives setup you can install GeoVision. Start in order beginning with the drivers. When you install the drivers

More information

Ansur Test Executive. Users Manual

Ansur Test Executive. Users Manual Ansur Test Executive Users Manual April 2008 2008 Fluke Corporation, All rights reserved. All product names are trademarks of their respective companies Table of Contents 1 Introducing Ansur... 4 1.1 About

More information

Multimedia Communication. Slides courtesy of Tay Vaughan Making Multimedia Work

Multimedia Communication. Slides courtesy of Tay Vaughan Making Multimedia Work Multimedia Communication Slides courtesy of Tay Vaughan Making Multimedia Work Outline Multimedia concept Tools for Multimedia communication _Software _Hardware Advanced coding standards Applications What

More information

UNICORN 7.0. Administration and Technical Manual

UNICORN 7.0. Administration and Technical Manual UNICORN 7.0 Administration and Technical Manual Page intentionally left blank Table of Contents Table of Contents 1 Introduction... 1.1 Administrator functions overview... 1.2 Network terms and concepts...

More information

New Inspiron 20 3000 Series (Intel ) All-in- One Desktop

New Inspiron 20 3000 Series (Intel ) All-in- One Desktop Dell recommends Windows. New Inspiron 20 3000 Series (Intel ) All-in- One Desktop All the right features, all in one. Get what you need a computer, monitor and speakers in a compact, space-saving design

More information

Mouse Control using a Web Camera based on Colour Detection

Mouse Control using a Web Camera based on Colour Detection Mouse Control using a Web Camera based on Colour Detection Abhik Banerjee 1, Abhirup Ghosh 2, Koustuvmoni Bharadwaj 3, Hemanta Saikia 4 1, 2, 3, 4 Department of Electronics & Communication Engineering,

More information

BlazeVideo HDTV Player v6.0r User s Manual. Table of Contents

BlazeVideo HDTV Player v6.0r User s Manual. Table of Contents BlazeVideo HDTV Player v6.0r User s Manual Table of Contents Ⅰ. Overview... 2 1.1 Introduction... 2 1.2 Features... 2 1.3 System Requirements... 2 Ⅱ. Appearance & Menus... 4 Ⅲ. Operation Guide... 7 3.1

More information

Wearable Finger-Braille Interface for Navigation of Deaf-Blind in Ubiquitous Barrier-Free Space

Wearable Finger-Braille Interface for Navigation of Deaf-Blind in Ubiquitous Barrier-Free Space Wearable Finger-Braille Interface for Navigation of Deaf-Blind in Ubiquitous Barrier-Free Space Michitaka Hirose Research Center for Advanced Science and Technology, The University of Tokyo 4-6-1 Komaba

More information

Sony Releases the Transparent Lens Eyewear SmartEyeglass Developer Edition

Sony Releases the Transparent Lens Eyewear SmartEyeglass Developer Edition News & Information 1-7-1 Konan, Minato-ku, Tokyo Sony Corporation No. 15-016E February 17, 2015 Sony Releases the Transparent Lens Eyewear SmartEyeglass Developer Edition - Promotes the development of

More information

REAL TIME 3D FUSION OF IMAGERY AND MOBILE LIDAR INTRODUCTION

REAL TIME 3D FUSION OF IMAGERY AND MOBILE LIDAR INTRODUCTION REAL TIME 3D FUSION OF IMAGERY AND MOBILE LIDAR Paul Mrstik, Vice President Technology Kresimir Kusevic, R&D Engineer Terrapoint Inc. 140-1 Antares Dr. Ottawa, Ontario K2E 8C4 Canada paul.mrstik@terrapoint.com

More information

Getting Online with Second Life

Getting Online with Second Life Getting Online with Second Life A guide for students and staff Second life is a virtual world where the content is created and managed by its residents. The OU owns 6 islands in Second Life, three of which

More information

Sensor Fusion and its Applications in Portable Devices. Jay Esfandyari MEMS Product Marketing Manager STMicroelectronics

Sensor Fusion and its Applications in Portable Devices. Jay Esfandyari MEMS Product Marketing Manager STMicroelectronics Sensor Fusion and its Applications in Portable Devices Jay Esfandyari MEMS Product Marketing Manager STMicroelectronics Outline What is Sensor Fusion? What Are the Components of Sensor Fusion? How Does

More information

Introduction to Final Cut Pro 7 - Editing Basics

Introduction to Final Cut Pro 7 - Editing Basics Workshop Objectives Become familiar with the Final Cut Pro workspace, basic editing, capturing footage, using tools, exporting to tape, or QuickTime. Learn effective workflow and file management strategies.

More information

================================================================== CONTENTS ==================================================================

================================================================== CONTENTS ================================================================== Disney Planes Read Me File ( Disney) Thank you for purchasing Disney Planes. This readme file contains last minute information that did not make it into the manual, more detailed information on various

More information

TAMS 2.1 User s Manual. Utah LTAP Center. Contact: Utah LTAP 4111 Old Main Hill Logan, UT. 84322-4111 800-822-8878 www.utahltap.

TAMS 2.1 User s Manual. Utah LTAP Center. Contact: Utah LTAP 4111 Old Main Hill Logan, UT. 84322-4111 800-822-8878 www.utahltap. TAMS 2.1 User s Manual Utah LTAP Center Contact: Utah LTAP 4111 Old Main Hill Logan, UT. 84322-4111 800-822-8878 www.utahltap.org Table of Contents Introduction 1 Initializing the program... 1 Data Needed

More information

A PHOTOGRAMMETRIC APPRAOCH FOR AUTOMATIC TRAFFIC ASSESSMENT USING CONVENTIONAL CCTV CAMERA

A PHOTOGRAMMETRIC APPRAOCH FOR AUTOMATIC TRAFFIC ASSESSMENT USING CONVENTIONAL CCTV CAMERA A PHOTOGRAMMETRIC APPRAOCH FOR AUTOMATIC TRAFFIC ASSESSMENT USING CONVENTIONAL CCTV CAMERA N. Zarrinpanjeh a, F. Dadrassjavan b, H. Fattahi c * a Islamic Azad University of Qazvin - nzarrin@qiau.ac.ir

More information

BUILDING TELEPRESENCE SYSTEMS: Translating Science Fiction Ideas into Reality

BUILDING TELEPRESENCE SYSTEMS: Translating Science Fiction Ideas into Reality BUILDING TELEPRESENCE SYSTEMS: Translating Science Fiction Ideas into Reality Henry Fuchs University of North Carolina at Chapel Hill (USA) and NSF Science and Technology Center for Computer Graphics and

More information

Geomagic Design. Release Notes. Get to Market Faster with Better Products at a Lower Cost V17

Geomagic Design. Release Notes. Get to Market Faster with Better Products at a Lower Cost V17 Geomagic Design Get to Market Faster with Better Products at a Lower Cost Release Notes V17 TABLE OF CONTENTS 1 INTRODUCTION 1 COPYRIGHT 1 2 INSTALLATION 2 SOFTWARE IDENTIFICATION 2 UPGRADING TO GEOMAGIC

More information

Example AR image. Augmented Reality. Augmented Reality. Milgram s Reality- Virtuality continuum. Why Augmented Reality? Is AR easier/harder than VR?

Example AR image. Augmented Reality. Augmented Reality. Milgram s Reality- Virtuality continuum. Why Augmented Reality? Is AR easier/harder than VR? Example AR image Augmented Reality Matt Cooper Youngkwan Cho, STAR system (Many slides based on the MUM2003 Tutorials by Mark Billinghurst and Mark Ollila) Milgram s Reality- Virtuality continuum Real

More information

Silverlight for Windows Embedded Graphics and Rendering Pipeline 1

Silverlight for Windows Embedded Graphics and Rendering Pipeline 1 Silverlight for Windows Embedded Graphics and Rendering Pipeline 1 Silverlight for Windows Embedded Graphics and Rendering Pipeline Windows Embedded Compact 7 Technical Article Writers: David Franklin,

More information

Virtual Reality. man made. reality. sense. world. What is Virtual Reality?

Virtual Reality. man made. reality. sense. world. What is Virtual Reality? Virtual Reality man made reality sense world What is Virtual Reality? Dipl.-Ing. Indra Kusumah Process Technology Fraunhofer IPT Steinbachstrasse 17 D-52074 Aachen Indra.kusumah@ipt.fraunhofer.de www.ipt.fraunhofer.de

More information

Virtual Reality in Chemical Engineering Education

Virtual Reality in Chemical Engineering Education Reprinted from the Proceedings of the American Society for Engineering Education Illinois / Indiana Sectional Conference, Purdue University, March 1995. Virtual Reality in Chemical Engineering Education

More information

In this chapter you will find information on the following subjects:

In this chapter you will find information on the following subjects: 17 1. From XP to Vista Microsoft, the creator of Windows, has published various versions of the Windows operating system over the past two decades. Windows Vista is the latest version, the successor to

More information

WebEx. Remote Support. User s Guide

WebEx. Remote Support. User s Guide WebEx Remote Support User s Guide Version 6.5 Copyright WebEx Communications, Inc. reserves the right to make changes in the information contained in this publication without prior notice. The reader should

More information

Basics of Computational Physics

Basics of Computational Physics Basics of Computational Physics What is Computational Physics? Basic computer hardware Software 1: operating systems Software 2: Programming languages Software 3: Problem-solving environment What does

More information

Appointment Scheduler

Appointment Scheduler EZClaim Appointment Scheduler User Guide Last Update: 11/19/2008 Copyright 2008 EZClaim This page intentionally left blank Contents Contents... iii Getting Started... 5 System Requirements... 5 Installing

More information

MMGD0203 Multimedia Design MMGD0203 MULTIMEDIA DESIGN. Chapter 3 Graphics and Animations

MMGD0203 Multimedia Design MMGD0203 MULTIMEDIA DESIGN. Chapter 3 Graphics and Animations MMGD0203 MULTIMEDIA DESIGN Chapter 3 Graphics and Animations 1 Topics: Definition of Graphics Why use Graphics? Graphics Categories Graphics Qualities File Formats Types of Graphics Graphic File Size Introduction

More information

FSI Machine Vision Training Programs

FSI Machine Vision Training Programs FSI Machine Vision Training Programs Table of Contents Introduction to Machine Vision (Course # MVC-101) Machine Vision and NeuroCheck overview (Seminar # MVC-102) Machine Vision, EyeVision and EyeSpector

More information

Automotive Applications of 3D Laser Scanning Introduction

Automotive Applications of 3D Laser Scanning Introduction Automotive Applications of 3D Laser Scanning Kyle Johnston, Ph.D., Metron Systems, Inc. 34935 SE Douglas Street, Suite 110, Snoqualmie, WA 98065 425-396-5577, www.metronsys.com 2002 Metron Systems, Inc

More information

Implementation of Augmented Reality System for Smartphone Advertisements

Implementation of Augmented Reality System for Smartphone Advertisements , pp.385-392 http://dx.doi.org/10.14257/ijmue.2014.9.2.39 Implementation of Augmented Reality System for Smartphone Advertisements Young-geun Kim and Won-jung Kim Department of Computer Science Sunchon

More information

DMU Viewer CATIA V5R19 TABLE OF CONTENTS

DMU Viewer CATIA V5R19 TABLE OF CONTENTS TABLE OF CONTENTS Introduction...1 DMU Viewer...1 Product Structure Introduction...3 Pull Down Menus...4 Start...4 File...5 Edit...6 View...9 Insert...12 Tools...13 Analyze...16 Window...17 Help...18 Product

More information

COMPUTER SCIENCE High School Standards

COMPUTER SCIENCE High School Standards COMPUTER SCIENCE High School Standards CONTENT STANDARD 1 1. Components Of A Computer System 1.HS.1 1.HS.2 1.HS.3 1.HS.4 Demonstrate the ability to store data on a variety of storage media, i.e., floppy,

More information

Maya 2014 Basic Animation & The Graph Editor

Maya 2014 Basic Animation & The Graph Editor Maya 2014 Basic Animation & The Graph Editor When you set a Keyframe (or Key), you assign a value to an object s attribute (for example, translate, rotate, scale, color) at a specific time. Most animation

More information

Issues of Hybrid Mobile Application Development with PhoneGap: a Case Study of Insurance Mobile Application

Issues of Hybrid Mobile Application Development with PhoneGap: a Case Study of Insurance Mobile Application DATABASES AND INFORMATION SYSTEMS H.-M. Haav, A. Kalja and T. Robal (Eds.) Proc. of the 11th International Baltic Conference, Baltic DB&IS 2014 TUT Press, 2014 215 Issues of Hybrid Mobile Application Development

More information

Using Photorealistic RenderMan for High-Quality Direct Volume Rendering

Using Photorealistic RenderMan for High-Quality Direct Volume Rendering Using Photorealistic RenderMan for High-Quality Direct Volume Rendering Cyrus Jam cjam@sdsc.edu Mike Bailey mjb@sdsc.edu San Diego Supercomputer Center University of California San Diego Abstract With

More information