
EUROCONTROL REPORT on the SPECIFICATIONS OF THE INTERFACE AND SOFTWARE for the VISU AIRPORT PROJECT

1. Stage 2 aims

Following the discussions and the analysis of the needs in work package 1, we can define the specifications for our demonstrator. The second stage of our work, "Specifications of interface and software", covers the specifications of the visual interfaces, of the scenario, and of the scene modelling software. Gilles Fuselier from ADP proposed to focus our demonstrator on the incidents and on the black points (high-risk accident points) that could be avoided thanks to 3D visualisation. Taking these points into account, an area of interest for modelling has been chosen. It gathers several difficult areas already mentioned:

- Buildings harming the scene visibility;
- Taxiway proximity;
- Traffic area;
- Parking area.

We will detail these areas for the demonstrator afterwards.

2. Specifications of interface

2.1 Experimentations about the viewpoints

By rotating the tablet, the observer changes the viewpoint's orientation in a pseudo-natural way. By a relative translation of the tablet, the observer moves the viewpoint to the desired place. Without a keyboard, using only the pen and the interactive screen, the user can easily choose the observed zone, the display functionalities and the memorized configurations of several points of view, and can communicate with other people. Another solution, which could be tested in complement, is the exploitation of speech recognition in order to give simple commands easily and naturally, such as calling up a visualisation point already configured and memorized.

Tablet PC and inertial sensor for the visual interface in the mobile station

The controller is thus immersed in his space of action. The system gives the controller the possibility of changing his observation point according to each situation, as indicated previously. He can also configure a set of viewpoints adapted to his needs, which he can exploit at any moment. Consequently, the third part of the project consists in the design of

the software functions allowing the viewpoint modification, by using a simple mouse, the Tablet-PC's pen, or an inertial sensor connected to the Tablet-PC. The various ways of choosing the viewpoints will have to be tested, improved and validated. The ergonomic aspect of the viewpoint command must be carefully taken into account at the cognitive level. It is never easy to steer a direction (here the viewpoint's direction) by rotation, because rotations (about the three orthogonal axes) are not commutative operations. The use of an inertial sensor will be the solution for a pseudo-natural behaviour of the observer.

2.2 Test description

The test is decomposed into two different tasks:
- Moving within the environment;
- Changing the point of view within the environment.

Material description

All the tests will be carried out on a Tablet-PC laptop (Acer TravelMate C300) equipped with an inertial sensor and a graphical pen (provided with the Tablet-PC). The InterTrax 2 inertial sensor has three degrees of freedom, which allows the user to obtain the orientation of the Tablet-PC. The graphical pen, provided with the tablet, allows a mouse-like control via the tablet's touch-sensitive screen.

Objective and test procedure

The objective of the tests is to find the proper way of navigating with respect to the Visu Airport needs. Different ways of navigation are tested. In order to follow a given scenario, the user will have the possibility to move and to turn the Tablet-PC. Several command laws for viewpoint changing have been implemented, and they will be tested at the ergonomic and functional levels:

- Translation using the sensor, all-or-nothing type.
- Orientation using the sensor, all-or-nothing type.
- Translation using the tablet's keys.
- Orientation using the sensor, all-or-nothing type.
- Translation using the sensor: a first command starts the translation, a second one stops it.
- Orientation using the sensor, in absolute coordinates.
- Translation using the tablet's keys.
- Orientation using the sensor, in absolute coordinates.

- Translation using the graphical pen, drag-and-drop style.
- Orientation using the graphical pen, drag-and-drop style.
- Translation using the graphical pen: the user points to the desired direction.
- Orientation using the graphical pen, in the same way as the translation.

During the tests, the user will run the same scenario with each type of navigation. The different types of navigation will be tested in a random order, to homogenize the results with respect to the users' learning effects.

Scenario

The test scenario consists in making several manoeuvres of translating and rotating the Tablet-PC. To this purpose, the user will alternate a translation action with an orientation one. The user will follow a path in a labyrinth (translation test) and then, once arrived, he must keep a plane in his field of vision (tablet orientation test).

Experiment indicators

The time elapsed during the labyrinth navigation is recorded for every user and every command type, allowing the commands to be compared. For the airplane tracking, a percentage will denote the degree of success, i.e., how often the user keeps the plane in view.

2.3 Experiment results

The results are now being analysed; they will allow choosing the efficient types of command that will be integrated in the demonstrator.

3. Specifications of software

3.1 Introduction

The demonstrator is composed of two visual interfaces (one as a fixed station and another as a mobile one), the operating software for the two interfaces, and the communication between them. This software will make it possible to display an airport with its planes, its ground vehicles and its staff in motion. The synthetic images will be created in real time (25 images per second), starting from a pre-established scenario of the motion of the mobile entities and of the exchanges of information between the two visual interfaces. A number of

complementary information displays will also be realised and integrated. The results of a first set of tests for concept validation will be provided. A final report will be written, drawing the conclusions of the experiments carried out with the demonstrator.

For the demonstrator realisation, carried out in this one-year project, no development on data synchronisation is required. A scenario established in advance will be played to test this new concept of observation. The software development will be carried out starting from the software platform of the French company Virtools, which was designed for animation and virtual reality applications. The application will be tested with a visual interface on a desktop station (a PC micro-computer) and with a visual interface on a mobile station (a Tablet-PC). A final stage of the project consists in carrying out the experiments to test the relevance of the adopted solutions at the technical, ergonomic and functional levels.

3.2 Modelling and animation of the Virtual Environment

From the AutoCAD plans given by ADP, we will extract the needed information from the different file layers. For security and confidentiality reasons around the E5 terminal, ADP gave us some plans with erased areas, but we have most of the areas we discussed in the meetings. On July 2, a photo campaign will be shot, which will then permit producing a coherent database, with the definition of heights and volumes. The 3D modelling will be done in two steps: first, we create all the details of the environment with 3DS Max; then, we optimise the elements for real time according to the needs and uses of the digital model. In a second phase, we create the digital planes and vehicles that operate in the airport area. All those models (environment, roads, planes and vehicles) are textured and lit for real time, and many cameras will be placed at the most important viewpoints of the database. Planes and vehicles are animated with 3DS Max, with defined paths for some and interactive movement for the others.
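The path-based animation mentioned above can be sketched in a few lines. The following Python sketch is purely illustrative, not the project's 3DS Max/Virtools implementation: it shows the principle of moving an entity at constant speed along a predefined waypoint path, sampled once per rendered frame. All names and values are our assumptions.

```python
# Illustrative sketch: a vehicle following a predefined ground path at
# constant speed, as the demonstrator's animated planes and vehicles do.
# Positions are linearly interpolated between successive waypoints.

Waypoint = tuple[float, float]  # (x, y) in metres, airport ground plane

def position_on_path(path: list[Waypoint], speed: float, t: float) -> Waypoint:
    """Return the (x, y) position reached after t seconds at `speed` m/s."""
    travelled = speed * t
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # segment length
        if travelled <= seg:
            f = travelled / seg  # fraction of the current segment covered
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
        travelled -= seg
    return path[-1]  # path finished: hold the last waypoint

# One frame of a 25 fps animation: a tug moving at 5 m/s along a taxiway edge.
taxiway = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
print(position_on_path(taxiway, speed=5.0, t=10.0))  # 50 m along: (50.0, 0.0)
```

At 25 images per second, the renderer would simply evaluate this function with t advancing by 0.04 s per frame.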
The models are exported to Virtools, and we apply interactive behaviours such as real-time visits and the possibility to move automatically to the important viewpoints selected before. We apply a collision behaviour to all the vehicles on the roads; the goal is to prevent the cameras from passing through other models. Behaviours for animation and for the interactive modification of the models will be programmed: for example, interactive plane animation or changing signs in the environment can be applied to the scene. We also develop behaviours for objects appearing and disappearing in the scene, and a map that shows the position of the camera. Finally, different scenarios will be tested to define the best ergonomic possibilities offered to the users.

In addition to the three basic parameters of a plane (position, altitude, speed), we propose the integration of the pitch, roll and yaw angles. These will be integrated in the 3D plane model. They will be used at the same time for a realistic rendering of the plane's behaviour,

as well as for a better prediction of its short-term trajectory, useful for the control activity. The ground vehicles are supposed to transmit their positions in real time (thanks to localisation sensors, typically GPS). Concerning the complexity of the model, there are about ten moving objects in the demonstrator (for example, four planes and six vehicles). This is enough moving objects to show and to validate the different functionalities and scenarios.

3.3 Scenarios

As already stated, the demonstrator will propose two functionality-validation scenarios that can interest the different airport professionals.

Scenario 1: Incident and accident avoidance while vehicles are moving on the airport

The scenario will consist in a 3D visualisation of vehicle circulation on a precise area of Roissy airport (see the modelling section and the attached figure). The Tablet-PC visualisation will be proposed in the vehicles, for driver assistance. The driver will virtually observe the scene and the vehicle traffic, even when a building obstructs his vision or in fog conditions. Vehicles will circulate, and events able to generate incidents or accidents will be simulated. Alerts, of the red-eye type, will be proposed on the 3D display or on a mixed 3D and 2D display (bird's-eye view of the circulation area). This allows decreasing the cognitive load of the driver, even in situations of bad visibility.

The standard scene display on the Tablet-PC is similar to the driver's real view from his vehicle. This is possible if the vehicle is positioned by GPS and its orientation in the airport reference frame is known. Using augmented-reality techniques, one can produce synthetic images similar to the real images, on which additional information can be overlaid: for instance, looking through a building or through vehicles that obstruct the driver's vision, adding alerts and the limits of the security area to the 3D environment, etc.

Area of the Roissy airport to be modelled (grey area on the airport map). The modelled area will contain the taxiway area as well as a part of the landing strips.
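The alert triggering described in this scenario can be sketched as a simple look-ahead test on GPS tracks. The following Python sketch is only illustrative: the function name, the 30 m safety distance and the 10 s horizon are our assumptions, not values from the VISU demonstrator.

```python
import math

def collision_alert(p1, v1, p2, v2, safety_m=30.0, horizon_s=10.0, step_s=0.5):
    """Return True if two mobiles, extrapolated at constant velocity from
    their GPS positions p (m) and velocities v (m/s), come within safety_m
    of each other at any time in the next horizon_s seconds."""
    t = 0.0
    while t <= horizon_s:
        dx = (p1[0] + v1[0] * t) - (p2[0] + v2[0] * t)
        dy = (p1[1] + v1[1] * t) - (p2[1] + v2[1] * t)
        if math.hypot(dx, dy) < safety_m:
            return True  # raise the red-eye alert on the 3D display
        t += step_s
    return False

# A service vehicle crossing in front of a taxiing plane: alert expected.
plane = ((0.0, 0.0), (10.0, 0.0))      # position (m), velocity (m/s)
vehicle = ((80.0, -40.0), (0.0, 5.0))
print(collision_alert(*plane, *vehicle))  # True
```

In the demonstrator this check would run once per simulation step for every pair of mobiles, with the positive result driving the sound, 2D red-eye or 3D flag alerts.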

Scenario 2: Supervision of an airplane's stopover task scheduling

The scenario will consist in the 3D visualisation of stopover tasks on a parking area of an air transport company. The task scheduling supervision for each plane will be visualised in 3D, and their chronological flow will be displayed in another 2D window. Symbolic, textual or vocal information will be proposed for an efficient dialogue between two users: for instance, visualisation of vehicle and plane trajectories, passenger flows, luggage flows, provisional task evolutions, etc. This display will be proposed to a "coordo" (stopover coordinator) on the ground, and also to a supervisor outside the area, on his desktop station, which will show the same scene, but not necessarily from the same point of view. A scenario, with dialogue and communication between the two persons, will be executed during the demonstration. One can test the assistance provided by the 3D visualisation for a better understanding of the scene or of complex events: bad coordination between sub-tasks, one airplane's coordination perturbed by another coordination, etc.

3.4 Validation

Two types of accidents are retained for the demonstrations:
- a) on the taxiway area;
- b) on the airplane parking area.

a) The user must cross the taxiway.
- Airplanes are running on the taxiway, creating a high collision risk between them and the user's vehicle, even if this situation is not very realistic;
- In case of collision risk, an alert system is triggered: a sound alert, a 2D red-eye alert, a 3D flag alert at the intersection position, or a descriptive ground-projected alert showing the collision risk with the other vehicles.

b) On the airplane parking area, a building eclipses the driver's view.
- Whenever a building is eclipsing the view of a vehicle, the building becomes partially transparent on the Tablet-PC. This vehicle has priority over the user's vehicle, therefore the latter must stop (otherwise the collision is visualised).
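The building-transparency rule of case b) amounts to a line-of-sight test against the building's ground footprint. The following Python sketch is purely illustrative: the names, the axis-aligned footprint model and the coarse sampled-segment test are our assumptions, not the demonstrator's actual geometry code.

```python
def sight_line_blocked(driver, vehicle, box):
    """True if the straight line of sight from `driver` to `vehicle`
    (both (x, y) in metres) crosses the building footprint
    box = (xmin, ymin, xmax, ymax). Tested by sampling points along
    the segment; coarse but sufficient for a sketch."""
    (x0, y0), (x1, y1) = driver, vehicle
    xmin, ymin, xmax, ymax = box
    steps = 100
    for i in range(steps + 1):
        t = i / steps
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0)
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return True  # line of sight enters the footprint
    return False

hangar = (40.0, -10.0, 60.0, 10.0)
print(sight_line_blocked((0.0, 0.0), (100.0, 0.0), hangar))   # True:  draw the hangar semi-transparent
print(sight_line_blocked((0.0, 20.0), (100.0, 20.0), hangar)) # False: draw the hangar normally
```

When the test is positive for some hidden vehicle, the renderer would lower the building's opacity on the Tablet-PC so the driver sees the priority vehicle behind it.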
The validation of the first scenario will be made on a vehicle traffic simulation presenting an increased collision risk. The proposed functionalities will be validated by the service providers' (contractors') crew. The validations will be mostly qualitative and less quantitative. At the quantitative level, we will evaluate the accident avoidance rate, with or without the complementary information (alert system, building transparency). At the qualitative level, the testers will answer a questionnaire on the interest of the demonstrator and on the difficulties that it generates.
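As an illustration of this quantitative indicator, the accident avoidance rate with and without the complementary aids could be computed from trial logs as follows. The record format and the figures are invented for the example; they are not measurements from the project.

```python
def avoidance_rates(trials):
    """trials: iterable of (aids_on, avoided) booleans, one per run.
    Returns (rate_with_aids, rate_without_aids) as fractions."""
    stats = {True: [0, 0], False: [0, 0]}  # condition -> [avoided, total]
    for aids_on, avoided in trials:
        stats[aids_on][1] += 1
        stats[aids_on][0] += int(avoided)
    return tuple(stats[c][0] / stats[c][1] for c in (True, False))

# Invented log: 4 runs with alerts/transparency enabled, 4 without.
runs = [(True, True), (True, True), (True, False), (True, True),
        (False, True), (False, False), (False, False), (False, True)]
print(avoidance_rates(runs))  # (0.75, 0.5)
```

Comparing the two fractions over enough runs would show whether the alert system and the building transparency actually improve accident avoidance.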

The validation of the second scenario will also be made on a simulation basis. The proposed functionalities will be validated by the Air France crew in charge of the stopover operations coordination. These will be qualitative validations. One user is supposed to be on the airport area, near a parked airplane (a "coordo"), while another user is supposed to be in the supervision room for the coordination of all the company's airplanes during a stopover (a supervisor). The airport scene visualisation will be made for these two users, the coordo on his Tablet-PC and the supervisor on his desktop computer, under the following conditions:
- One user can visualise the scene from the other user's point of view, for a better mutual comprehension.
- The visualised scene must be complex enough to allow testing the improvement of the dialogue brought by the 3D visualisation.

At the qualitative level, the testers will answer a questionnaire on the interest of the demonstrator and on the difficulties that it generates.

Following the meetings and discussions with executive and staff officers from ADP and Air France, the following functionalities have been identified:
1. Visual knowledge of the stopover timings for traffic management;
2. Stopover-navigation synchronisation for an airplane;
3. Operations compliance control on parking areas;
4. Ground airplane traffic;
5. Accidents (de-icing, cleaning, breakdown service); collision avoidance with other vehicles (bad visibility conditions, buildings or vehicles hiding objects, fog);
6. Red-eye alert systems when reaching the taxiways;
7. Remote supervision;
8. Linking of 2D and textual information.

Scenario 1 is linked to the following requirements retained in WP1: functionalities 4, 5 and 6. Scenario 2 is linked to the following requirements retained in WP1: functionalities 1, 7 and 8. The scenarios do not cover functionalities 2 and 3.

4. Conclusion

Starting from the discussions and the needs analysis of work package 1, we have defined the specifications for the future demonstrator. We still have to analyse the results of the viewpoint handling tests in order to define the viewpoint changing types that will be exploited in the demonstrator.

© 1998 European Organisation for the Safety of Air Navigation (EUROCONTROL). All rights reserved. This document is published by EUROCONTROL in the interest of the exchange of information. It may be copied in whole or in part, provided that this copyright notice and disclaimer are included. The information contained in this document may not be modified without prior written permission from EUROCONTROL. EUROCONTROL makes no warranty, either implied or express, for the information contained in this document, nor does it assume any legal liability or responsibility for the accuracy, completeness or usefulness of this information.