QUT Digital Repository: http://eprints.qut.edu.au/27216




Dong, Andy, Maher, Mary Lou and Daruwala, Yohann. Where the actions are: bring reality into virtual reality.

The Participants of the CRC for Construction Innovation have delegated authority to the CEO of the CRC to give Participants permission to publish material created by the CRC for Construction Innovation. This delegation is contained in Clause 30 of the Agreement for the Establishment and Operation of the Cooperative Research Centre for Construction Innovation. The CEO of the CRC for Construction Innovation gives permission to the Queensland University of Technology to publish the papers/publications provided in the collection in QUT ePrints provided that the publications are published in full. Icon.Net Pty Ltd retains copyright to the publications. Any other usage is prohibited without the express permission of the CEO of the CRC. The CRC warrants that Icon.Net Pty Ltd holds copyright to all papers/reports/publications produced by the CRC for Construction Innovation.

Where the actions are: bring reality into virtual reality

Andy Dong, +61 2 9351 3031, adon3656@mail.usyd.edu.au
Mary Lou Maher, +61 2 9351 4108, mary@arch.usyd.edu.au
Wai Kong Wan, +61 2 9351 2053, wwan4501@mail.usyd.edu.au
Yohann Daruwala, +61 2 9351 2053, ydar6010@mail.usyd.edu.au

ABSTRACT
This paper describes work in progress on developing design environments that combine wireless and mobile technologies with augmented reality to facilitate bringing context from the physical environment into the virtual models used for design work. One of the challenges for designers in a variety of end-user-oriented design disciplines such as architecture and industrial design has been capturing and replaying the contextual information of the intended domain of the artifact being designed. Either the technology is decidedly low-tech, such as charcoal drawings in a sketchbook; out of reach, such as immersive virtual reality CAVEs; or a make-do with existing technologies, such as a collage of digital photos. This paper describes a novel combination of off-the-shelf technologies that may allow designers to create models using standard computer-aided design applications and augmented reality, combining the current, physical context with the projected, digital context. We demonstrate this approach in the building design domain to address a common problem in building construction: construction defect resolution.

Categories and Subject Descriptors
J.6 [Computer-Aided Engineering]: Computer-aided design.

General Terms
Design, Human Factors.

Keywords
Augmented reality; mobile computing.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. VRST 2005, November 7-9, 2005, Monterey, CA, USA. Copyright 2005 ACM 1-58113-000-0/00/0004 $5.00.

1. INTRODUCTION
Designers are peripatetic workers. Training and education in various design-related disciplines such as architectural design, industrial design, and interaction design encourage this peripatetic tendency, as contextual inquiry is an integral aspect of the design process in these disciplines. Bellotti and Bly distinguish between local mobility and long-distance collaboration in their description of peripatetic designers at a consulting firm [2]. Today's emerging nomadic computing tools and public and private wireless data infrastructure push the potential for leveraging this mobility rather than treating it as a technical difficulty. This enhanced mobility encourages designers out of their office, enabling them to conduct contextual inquiry of users, study environments, and explore places and spaces in situ.

This paper describes research that aims to build mobile design studios that allow designers to design wherever the designing happens and, more importantly, appropriate to the context of the designed artifact. Taking cues from the field of model-based optimization, our philosophy is neither to reproduce the context in the highest-fidelity simulator, such as a virtual reality CAVE, nor to miniaturize desktop computer-aided design applications onto handheld computers. In fact, we believe the first would produce cognitive overload and the second would mask the advantages of nomadic computing.
Rather, our approach is to supplement computer-aided design applications so that designers can build models that stimulate their recollection of the context for which they are designing. Our objective is to develop a model for design environments that allow designers to create models combining the current, physical context with the projected, digital context, that is, the existing context plus the designed artifacts. In this environment designers may experience the emerging concept rather than merely represent the structural embodiment of a concept. Additionally, the system should facilitate the capability for designers to work with others, inside and outside of the studio.

This approach sits between two ends of a spectrum of approaches for embedding physical contexts into virtual models and vice versa. On the one hand, the intended aim of augmented reality is to allow users to visualize virtual reality models in the physical context, bringing the virtual to the physical. Systems which allow designers to model 3D objects while situated in the physical context, such as the Tinmith-Metro system [7], exemplify this approach. On the other hand, pure augmented reality computer-aided design applications such as the ARTHUR system let designers view digital models of to-be-built buildings within the pre-existing urban landscape for urban planning and design, but both the to-be-built buildings and the existing urban environment are presented as 3D augmented reality digital models [3]. The urban streetscape is not presented to the viewers as it actually exists. The intent of the system described in this paper is to bring the physical into the virtual to facilitate the perception of and interaction with both the physical and the digital components in the designer's studio.

The idea of connecting the physical context with the digital model for designers to experience the emerging concept of the design closes the loop on what has been called experience design. The new discipline of experience design deals with the creation of a dialog between the consumers of a product and the product through all touch points with the product. Grefé describes the field of experience design through words and an equation. In words, experience design deals with "the form, content and context of communication occurring over time"; described mathematically, experience design = (form + content + context) time [5]. Grefé, in describing what experience design is and how designers may design experiences, also usefully offers a way to think about how experiences are created. We would modify Grefé's equation to consider that those experiences emerge through the accumulation of reciprocal interactions between a person and content over time, where the content is more than the form of an object and the subject matter of the object. Expressed mathematically, our model of experience is: experience = ∫ (content + context) dt, where content includes the (linguistic) senses of content as subject matter, cognitive object, and person, object or scene. Thus, the computer-aided design application affords the designer's recollection of the context for which they are designing through the repetition of the experience through computation. The model of that experience, and the enactment of new experiences, comes about through the retroactive re-construction of the context with the new designed artifacts: model = experience + computation.

This paper describes our ongoing research on the type of computational systems, based on virtual reality and nomadic computing, that include experience in the physical context. The intent is that those models will assist the designers to create artifacts and associated experiences for wherever the actions are [4], that is, wherever the user may engage in a touch point with the artifact.

2. DESIGNING WITH PHYSICAL CONTEXTS
The system consists of two main parts: context capture and context re-play. The hardware supporting the context capture part consists of a wireless integrated mobile phone (i.e., mobile phone, personal digital assistant, still and moving image digital camera) with a GPS card. The hardware supporting the context re-play part consists of a horizontal, interactive tabletop as an input/output device that combines a projector-based and an augmented reality display system. The following sections describe each of the parts in further detail.
2.1 Capturing Context
The mobile computer, in the form of a handheld computer such as a Tablet PC, a palm-sized personal digital assistant (PDA) such as a Palm, or a smart phone that combines the features of a PDA and a mobile phone, along with wireless networking such as Wi-Fi, WiMAX, and GPRS, provides the basic infrastructure for nomadic computing applications. An increasing number of research projects and commercial products aim to take advantage of the computing power of these nomadic computing devices, such as information management on construction sites [6] and digital moving image capture [8]. Our point of departure is that nomadic computing allows just one aspect of scaling interaction with respect to time [1], where the interaction in this application deals with interacting with the contexts within which a designed artifact will exist.

In the experience capture stage, the user creates a short digital media clip (e.g., digital audio, still image or moving image) and a textual record of the context. The digital media is additionally annotated with metadata about the location of the media using GPS coordinates. This information is then sent to a remote database for storage. The available network communication protocols include HTTP over Wi-Fi or WiMAX, SMS or MMS over GPRS, and Bluetooth. With SMS, the user is limited to sending textual information only, with the GPS coordinates. For Bluetooth, the system sends two data files using Bluetooth file sharing: one file containing the textual data and a reference to the digital clip, and one containing the digital media clip itself.

The capture software is implemented in J2ME for cross-platform portability. The data capture and transmission require the Java Specification Request (JSR) optional packages described in Table 1.

Table 1. J2ME Optional Packages for Capture Phase
Feature      Optional Package   Package Description
File I/O     JSR 75             FileConnection (FC)
SMS/MMS      JSR 205            Wireless Messaging API 2.0
Bluetooth    JSR 82             Bluetooth API

At the moment, there is no J2ME-enabled device that supports all of the required optional packages. The IBM J9 J2ME runtime environment, which runs on portable devices with the Microsoft Windows PocketPC operating system, supports JSR 75 but not JSR 205 or JSR 82. Mobile phones running the Symbian 9.1 operating system, such as the Nokia N91, support JSR 75 and JSR 82 but only JSR 120 (Wireless Messaging API 1.1), which does not include support for MMS. WiMAX is not yet available in Australia. As such, the current implementation of the capture phase only supports wireless data transfer via HTTP over Wi-Fi and GPRS.

Figure 1 shows the data capture software running in the Sun J2ME mobile phone emulator. The user browses the mobile device's file system for the digital media clip to send, inputs the GPS coordinates, and selects the communication protocol as shown in Figure 1(a). Figure 1(b) depicts the type of note that a user may attach to the digital media clip. When the user pushes the Send button, the digital media clip is sent using the requested communication protocol.

Figure 1 Capture data on mobile phone
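The paper does not include source code, but a minimal sketch of the HTTP capture-and-send step might look like the following, assuming a MIDP 2.0/CLDC 1.1 device with the JSR 75 FileConnection optional package and HTTP over Wi-Fi or GPRS. The class name, server URL, and header names are illustrative assumptions, not taken from the system described here.

// Hypothetical sketch of the capture-phase upload. CaptureUploader and
// SERVER_URL are illustrative; the real system could equally use a
// multipart body instead of custom headers for the GPS metadata and note.
import java.io.*;
import javax.microedition.io.*;
import javax.microedition.io.file.FileConnection;

public class CaptureUploader {
    private static final String SERVER_URL = "http://example.org/context/upload"; // assumed endpoint

    /** Sends one digital media clip plus its GPS coordinates and a textual note. */
    public void send(String fileUrl, double lat, double lon, String note) throws IOException {
        FileConnection fc = (FileConnection) Connector.open(fileUrl, Connector.READ);
        HttpConnection http = (HttpConnection) Connector.open(SERVER_URL);
        try {
            http.setRequestMethod(HttpConnection.POST);
            http.setRequestProperty("Content-Type", "application/octet-stream");
            // GPS coordinates and the textual note travel as request headers in this sketch.
            http.setRequestProperty("X-GPS-Lat", Double.toString(lat));
            http.setRequestProperty("X-GPS-Lon", Double.toString(lon));
            http.setRequestProperty("X-Note", note);

            InputStream in = fc.openInputStream();
            OutputStream out = http.openOutputStream();
            byte[] buf = new byte[1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            out.close();
            in.close();

            if (http.getResponseCode() != HttpConnection.HTTP_OK) {
                throw new IOException("Upload failed: " + http.getResponseCode());
            }
        } finally {
            http.close();
            fc.close();
        }
    }
}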

2.2 Replaying Context
The data sent by the mobile device is received by a computer that listens for the data from the mobile device. The service listeners include an HTTP daemon, an SMS/MMS service gateway, and a Bluetooth server. The received data is uploaded to a relational database. This information can then be retrieved and re-played on a horizontal interactive tabletop system.

The horizontal interactive tabletop system consists of a projector-based display and an augmented reality display. The combination of these displays enables a virtual 3D display. The projector displays a 2D view (e.g., a plan view) whereas the augmented reality system displays a 3D perspective view from a section cut in the plan. To display the digital still image in augmented reality, the still image is placed as a texture on a 2D face. The interactive tabletop system includes various tangible interaction devices: a hand-movable vertical screen that changes the camera location for the perspective view, augmented reality patterns printed on blocks that the user may move, augmented reality patterns inside CAD applications that are positioned automatically based on the GPS coordinates of the captured digital media clips that the designer wishes to re-play, and a pen-based interface for moving the augmented reality patterns within the CAD model.

3. DEMONSTRATION
During the construction phase of a building, changes or defects provoke a design issue that needs to be resolved by consulting the design models or drawings in light of information available only on the construction site. Current practice tends to resolve these issues through multiple phone calls and conversations in the design office that try to reconstruct the physical context with words and relate that context to the CAD-based design models.

The resolution of the design issues is complicated by two confounding variables. First, the context, that is, the built environment, is partially built but also partially digital in the form of 2D and 3D CAD models. Thus, the designed solution must satisfy the realities of what has been built, the current context, and what will be built, the projected context. Second, the fix is mired in contractual obligations and stakeholders. Therefore, the designed solution is more likely to arise from intensive sessions of synchronous collaborative design. The design environment provided by our approach, combining the nomadic experience capture with the replaying of the context while designing a solution collaboratively on the horizontal interactive tabletop system, brings the two kinds of information together to facilitate further design development.

The digital images of construction defects are sent via HTTP over GPRS (with the mobile device's inbuilt GPRS modem) to the office. The architects then select the defects to view within a 2D plan view of the site as shown in Figure 2.

Figure 2 Floor plan shown in Microstation

The floor plan (plan view) is projected on the horizontal surface of the interactive tabletop. First, the user must calibrate the system by identifying a known point on the drawing and a known GPS coordinate as the reference point. All subsequent locations in the drawing are calculated relative to the reference point. The digital clip is retrieved along with the GPS coordinates of the digital clip showing the defect.
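The paper does not give the calibration math; the sketch below is one plausible reading, assuming a simple local equirectangular approximation around the reference point, which is adequate over a single building site. The class name, field names, and the drawing-units-per-metre scale factor are illustrative assumptions, not part of the described system.

// Hypothetical sketch of the GPS-to-drawing-coordinate calibration described above.
// Assumes a local flat-earth (equirectangular) approximation around the reference point.
public class GpsCalibration {
    private static final double EARTH_RADIUS_M = 6371000.0;

    private final double refLat, refLon;   // known GPS reference point (degrees)
    private final double refX, refY;       // matching point in drawing units
    private final double unitsPerMetre;    // drawing units per metre

    public GpsCalibration(double refLat, double refLon,
                          double refX, double refY, double unitsPerMetre) {
        this.refLat = refLat;
        this.refLon = refLon;
        this.refX = refX;
        this.refY = refY;
        this.unitsPerMetre = unitsPerMetre;
    }

    /** Maps a captured clip's GPS coordinates to a point in the 2D plan view. */
    public double[] toDrawing(double lat, double lon) {
        double dLat = Math.toRadians(lat - refLat);
        double dLon = Math.toRadians(lon - refLon);
        // East-west distance shrinks with the cosine of the latitude.
        double eastM  = EARTH_RADIUS_M * dLon * Math.cos(Math.toRadians(refLat));
        double northM = EARTH_RADIUS_M * dLat;
        return new double[] { refX + eastM * unitsPerMetre,
                              refY + northM * unitsPerMetre };
    }
}

A clip captured at a given latitude and longitude would then be positioned in the projected plan view at the point returned by toDrawing.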
For each digital image, an augmented reality pattern is automatically displayed in the 3D model. Each pattern is actually a pre-drawn cell that is inserted into the model (we are using the Microstation BASIC API). Once the augmented reality pattern is drawn, the augmented reality system superimposes the digital image of the construction defect on the vertical screen.
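As a rough illustration of the bookkeeping behind this step, the hypothetical structure below pairs each received defect image with a pattern (cell) identifier and its plan-view location. None of these names come from the actual implementation, which drives Microstation directly through its BASIC API.

// Hypothetical office-side bookkeeping: one augmented reality pattern per defect image,
// placed at a location already converted to drawing coordinates (e.g., by the
// calibration sketch above). Class and field names are illustrative.
import java.util.ArrayList;
import java.util.List;

public class DefectMarkerPlacer {
    /** One construction-defect image and the AR pattern that stands in for it. */
    public static class DefectMarker {
        final String imagePath;   // digital media clip showing the defect
        final String patternId;   // augmented reality pattern (pre-drawn cell) name
        final double x, y;        // location in the plan view, drawing units

        DefectMarker(String imagePath, String patternId, double x, double y) {
            this.imagePath = imagePath;
            this.patternId = patternId;
            this.x = x;
            this.y = y;
        }
    }

    private final List<DefectMarker> markers = new ArrayList<DefectMarker>();
    private int nextPattern = 0;

    /** Assigns the next free pattern to an image and records where its cell should be inserted. */
    public DefectMarker place(String imagePath, double x, double y) {
        DefectMarker m = new DefectMarker(imagePath, "PATTERN_" + (nextPattern++), x, y);
        markers.add(m);
        return m;
    }

    public List<DefectMarker> markers() {
        return markers;
    }
}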

Figure 3 Plan view with augmented reality patterns and images

Using a pen-based interface, the user may also move and rotate the images to the proper orientation, if the designer knows the proper orientation. At this point, the bearing and the angle of inclination/declination of the viewer are not captured, due to the unreliable bearing data from the GPS receiver and the lack of a sensor to measure the inclination/declination viewing angle of the digital camera.

Figure 4 User rotates augmented reality pattern with Mimio pen-based interface

4. CONCLUSIONS
This paper describes a design environment that brings and locates relevant physical context into a 3D digital model of a design. The ability to locate and connect remote physical context with a 3D digital model is possible with mobile computing devices and wireless networks. The demonstration application facilitates collaboration among architects, engineers, and building contractors to resolve a design issue due to a construction defect. The challenge in construction defect resolution is that the working context is partially built, partially digital. The combination of the technologies affords the combination of the physical as-built model and the digital model. We are currently outfitting the mobile device with additional sensors and other location-sensing technologies to more precisely locate the digital images in the virtual model, as well as extending the capability for the designer to manipulate the 3D model elements, including the accumulation of physical context information. Eventually, unified computer-aided modeling systems employing nomadic computing and mobile technologies, combining the approach described in this paper with systems such as Tinmith-Metro, would allow designers to work where the actions are.

5. ACKNOWLEDGMENTS
This research was supported in part by the Australian Cooperative Research Centre for Construction Innovation under grant 2002-057-C.

6. REFERENCES
[1] Abowd, G.D. and Mynatt, E.D. Charting past, present, and future research in ubiquitous computing. ACM Transactions on Computer-Human Interaction (TOCHI), 7 (1), 29-58.
[2] Bellotti, V. and Bly, S. Walking away from the desktop computer: distributed collaboration and mobility in a product design team. In CSCW '96: Proceedings of the 1996 ACM Conference on Computer Supported Cooperative Work (Boston, MA, United States, 1996), ACM Press, 209-218.
[3] Broll, W., Lindt, I., Ohlenburg, J., Wittkämper, M., Yuan, C., Novotny, T., gen. Schieck, A.F., Mottram, C. and Strothmann, A. ARTHUR: A collaborative augmented environment for architectural design and urban planning. Journal of Virtual Reality and Broadcasting, 1 (1).
[4] Dourish, P. Where the Action Is: The Foundations of Embodied Interaction. MIT Press, Cambridge, MA, USA, 2001.
[5] Grefé, R. (Form + Content + Context) Time = Experience Design. Gain: AIGA Journal of Design for the Network Economy, 1 (1), 1-4.
[6] Kimoto, K., Endo, K., Iwashita, S. and Fujiwara, M. The application of PDA as mobile computing system on construction management. Automation in Construction, 14 (4), 500-511.
[7] Piekarski, W. and Thomas, B.H. Interactive augmented reality techniques for construction at a distance of 3D geometry. In EGVE '03: Proceedings of the Workshop on Virtual Environments 2003 (Zurich, Switzerland, 2003), 19-28.
[8] Teng, C.-M., Wu, C.-I., Chen, Y.-C., Chu, H.-h. and Hsu, Y.-j. Design and evaluation of mProducer: a mobile authoring tool for personal experience computing. In Proceedings of the 3rd International Conference on Mobile and Ubiquitous Multimedia (College Park, MD, United States, 2004), ACM Press, 141-148.