Multimodal Interaction in Virtual Reality




Ipke Wachsmuth, Faculty of Technology, University of Bielefeld

Anthropomorphic Interfaces
- interfaces following human form
- natural human-machine communication
- highly popular in interactive multimedia
- gestures, speech, coverbal gesture

Virtual Reality: 3D viewing, spatial sound, tactile stimuli

Drawbacks in VR
- tedious usage of synthetic models
- deficits in interactive design support
- manipulation not subject to physical laws
- lack of user-friendliness in interaction

"Put this chair there!" - communicating in the situation

Central aspects in our work: multimedia, multimodality, multiple representations, multi-agent systems

Our approach: indirect management techniques, based on gesture and speech

Three things are important in our work toward incorporating gestures as a useful tool in virtual reality:
- measuring gestures as articulated hand and body movements in the context of speech
- interpreting them by way of classifying features and transducing them to an application command via a symbolic notation inherited from sign language
- timing gestures in the context of speech in order to establish correspondence between accented behaviors in the speech and gesture channels
A minimal sketch of these three steps follows.
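
To make the three steps concrete, here is a rough Python sketch of such a measure/interpret/time pipeline; the data formats, the feature and command names, and the 0.3 s window are illustrative assumptions, not the modules of the Bielefeld system.

def measure(tracker_samples):
    """Turn raw hand/body tracker samples into timestamped movement features
    (here: only a forward-movement feature, as a placeholder)."""
    return [{"t": s["t"], "features": ({"MoveA"} if s["dz"] > 0.0 else set())}
            for s in tracker_samples]

def interpret(movement_features):
    """Classify detected features and map them, via a symbolic notation
    inherited from sign language, onto an application command."""
    symbols = set()
    for f in movement_features:
        symbols |= f["features"]
    return "TRANSLATE_FORWARD" if "MoveA" in symbols else None

def time_with_speech(movement_features, accented_words, window=0.3):
    """Pair movement features with accented words occurring within a small
    temporal window, as a basis for multimodal integration."""
    return [(f["t"], word) for f in movement_features
            for t, word in accented_words if abs(f["t"] - t) < window]

if __name__ == "__main__":
    samples = [{"t": 0.4, "dz": 0.05}, {"t": 0.5, "dz": 0.06}]
    feats = measure(samples)
    print(interpret(feats))                            # TRANSLATE_FORWARD
    print(time_with_speech(feats, [(0.6, "dieses")]))  # [(0.4, 'dieses'), (0.5, 'dieses')]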

Gesture Recognition
Work with Timo Sowa and Ian Voss
Experiments with test subjects in a virtual environment

Approach 1: Basic interactions, direct feedback
- Pointing
- Grasp
- GraspRelease
- Rotation
- Translation

Approach 2: Form-based description
- explicit (knowledge-based) approach
- atomic form elements of gesture are composed into gesture words (see the sketch below)

Gesture notation with HamNoSys (symbols for body parts)
Work with Martin Fröhlich and Timo Sowa, SGIM project
For the form description of gestures we work with a scenario-specific subset of HamNoSys (HNS).
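
As an illustration of how atomic form elements could be composed into gesture words, here is a small Python sketch; the GestureWord class and the matching function are my assumptions, while the element labels (BSifinger, EFinA, PalmL, LocShoulder, LocStretched, MoveA, MoveR) are taken from the generated rules shown below.

from dataclasses import dataclass

@dataclass
class GestureWord:
    # illustrative composition of atomic form elements, not the SGIM data model
    name: str
    posture: frozenset   # atomic form elements that must hold in parallel
    movement: tuple      # atomic movement elements that must occur in sequence

POINTING = GestureWord(
    name="pointing",
    posture=frozenset({"BSifinger", "EFinA", "PalmL", "LocShoulder", "LocStretched"}),
    movement=("MoveA", "MoveR"),
)

def matches(word: GestureWord, detected_posture: set, detected_movement: list) -> bool:
    """True if every posture element is detected and the movement elements
    appear as a subsequence of the detected movement stream."""
    if not word.posture <= detected_posture:
        return False
    stream = iter(detected_movement)
    return all(element in stream for element in word.movement)

if __name__ == "__main__":
    posture = {"BSifinger", "EFinA", "PalmL", "LocShoulder", "LocStretched"}
    movement = ["MoveA", "MoveR"]
    print(matches(POINTING, posture, movement))   # True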

(defrule rule32
  ?tmp28 <- (object (is-a MoveA))   ; a forward hand movement
  ?tmp31 <- (object (is-a MoveR))   ; a rightward hand movement
  =>
  (SEQUENCE tmpcl25 ?tmp28 ?tmp31)) ; in sequence

Here a CLIPS rule rule32 has been generated automatically which can assert a token of class tmpcl25 registering a recognized sequence of particular hand movements.

[Figure: HNS parse tree - the input splits into a static part (konfiguration: handform with grundform BSifinger and fingeransatzrichtung EFinA; handstellung with handflaechenorientierung PalmL; lokation with koerperebene LocShoulder and abstand LocStretched) and a dynamic part (aktion: bewegung with the straight simple movements MoveA and MoveR, bracketed as a sequence by BrackSeqL ... BrackSeqR)]

(defrule rule36
  ?tmp16 <- (object (is-a BSifinger))     ; index finger extended
  ?tmp19 <- (object (is-a EFinA))         ; extended forward
  ?tmp22 <- (object (is-a PalmL))         ; palm facing left
  ?tmp23 <- (object (is-a LocShoulder))   ; at shoulder height
  ?tmp24 <- (object (is-a LocStretched))  ; arm stretched out
  ?tmp35 <- (object (is-a tmpcl25))       ; moving as above
  =>
  (PARALLEL DUMMY ?tmp16 ?tmp19 ?tmp22 ?tmp23 ?tmp24 ?tmp35))

Here a rule rule36 has been generated automatically which can assert a token of class DUMMY registering the occurrence of a particular hand posture in combination with the movement described above. (A descriptive gesture name can be substituted for DUMMY.)

From gesture to application (see the sketch below)
- Actuators: abstract placeholders for significant discrete features in a body reference system; normalized (e.g., world coordinates)
- Motion modificators: bind temporarily to actuators and filter motion data into object transformations (via manipulators)
- Manipulators: receive transformation commands and put them into effect in the 3D scene

Multimodal interface: an interface which allows the (sequential or concurrent) use of multiple modalities.
Open input: the beginning and end of an interaction are not known in advance.
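
A minimal Python sketch of the actuator / motion modificator / manipulator chain described above; the class names, the dictionary-based scene object, and the translation example are assumptions for illustration, not the SGIM implementation.

class Actuator:
    """Abstract placeholder for a significant discrete feature (e.g. the tip
    of the right hand) in a body reference system, normalized to world
    coordinates. Feeding it from a real tracker is out of scope here."""
    def __init__(self, name):
        self.name = name
        self.position = (0.0, 0.0, 0.0)

    def update(self, position):
        self.position = position


class Manipulator:
    """Receives transformation commands and puts them into effect on a scene
    object (modeled here as a plain dictionary)."""
    def __init__(self, scene_object):
        self.scene_object = scene_object

    def translate(self, delta):
        x, y, z = self.scene_object["position"]
        dx, dy, dz = delta
        self.scene_object["position"] = (x + dx, y + dy, z + dz)


class TranslationModificator:
    """Binds temporarily to an actuator and filters its motion data into
    object transformations, which it hands to a manipulator."""
    def __init__(self, actuator, manipulator):
        self.actuator = actuator
        self.manipulator = manipulator
        self.last = actuator.position

    def step(self):
        x, y, z = self.actuator.position
        lx, ly, lz = self.last
        self.manipulator.translate((x - lx, y - ly, z - lz))
        self.last = (x, y, z)


if __name__ == "__main__":
    hand = Actuator("right_hand")
    pipe = {"position": (0.0, 0.0, 0.0)}
    modificator = TranslationModificator(hand, Manipulator(pipe))
    hand.update((0.1, 0.0, 0.2))   # new tracker sample arrives
    modificator.step()
    print(pipe["position"])        # (0.1, 0.0, 0.2)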

Gestures: Kinesic Structure
After McNeill, Levy & Pedelty (1990)
[Figure: kinesic hierarchy - a gesture unit (marked by consistent arm use and body posture) contains gesture phrases (marked by consistent head movement); a gesture phrase consists of preparation, hold (pre-stroke), stroke, hold (post-stroke), and retraction]

Stroke cues
- pre/post-stroke hold (kinesic structure)
- strong acceleration of the hands, stops, rapid changes in movement direction
- strong hand tension
- symmetries in two-hand gestures

Timing of gestures and speech
The gesture stroke is often marked by an abrupt stop which is correlated with accented words or syllables; the stroke does not occur after an accented word but simultaneously with it or shortly before.

Multimodal Integration
Two "logistic" problems have to be solved (Srihari, 1995):
- Segmentation problem: How can a system be made to cope with open input? How can units be determined that are processed in one system cycle?
- Correspondence problem: How can cross-references between multiple modalities (speech/gesture) be determined?

Example: "Nimm dieses Rohr, steck es da dran" ("Take this pipe, attach it there"), spoken over roughly 0-3 sec.
=> hypotheses for establishing correspondence between accented behaviors in the speech and gesture channels (sketched below)
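
A small Python sketch of one way to generate such correspondence hypotheses, using the observation that the stroke ends simultaneously with or shortly before the accented word; the 0.3 s tolerance and the timing data are assumed for illustration, not taken from the published integration algorithm.

def correspondences(strokes, accented_words, max_lead=0.3):
    """strokes: list of (stroke_end_time, gesture_label)
    accented_words: list of (word_onset_time, word)
    Returns hypotheses (gesture_label, word) where the stroke ends
    simultaneously with, or at most max_lead seconds before, the word."""
    hypotheses = []
    for stroke_end, gesture in strokes:
        for onset, word in accented_words:
            if 0.0 <= onset - stroke_end <= max_lead:
                hypotheses.append((gesture, word))
    return hypotheses


if __name__ == "__main__":
    # "Nimm dieses Rohr, steck es da dran" with two pointing strokes (assumed times)
    strokes = [(0.8, "pointing@pipe"), (2.3, "pointing@target")]
    accented = [(0.9, "dieses"), (2.4, "da")]
    print(correspondences(strokes, accented))
    # [('pointing@pipe', 'dieses'), ('pointing@target', 'da')]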

Earlier work: VIENA (1996)
"Put this computer on the blue desk."
The things we have learned from investigating these issues help us to advance natural interaction with 3D stereographic scenes in a scenario of virtual construction.

Virtual Construction
Work with Bernhard Jung and Martin Hoffhenke
An innovative approach to prototype the design of mechanical devices and explore their assembly in early stages of industrial development, based on CAD models.

Multimodal Interaction in VR
Work with Marc Latoschik, SGIM project: Speech and Gesture Interfaces for Multimedia

Natural Interaction
Work with Marc Latoschik, SGIM project
- pointing/selecting (deictic)
- turning (mimetic)
- drag and drop
In the first place we have dealt with pointing and turning, etc., commonly classified as deictic and mimetic gestures.

Where we would like to go next: DEIKON (2000++), Deixis in Construction Dialogs (joint project with Hannes Rieser)
- virtual workspace (two-sided 3D projection)
- manipulative gesture (grasp space) and communicative gesture (distant space)
- systematic study of referential acts by coverbal gesture
- aim: to investigate how complex signals originate from speech and gesture and how they are used in reference
- contribution of gestural deixis to making salient or selecting objects and regions
- making an artificial communicator understand and produce (coverbal) deictic gestures in construction dialogs

Deictics & Iconics
Work with Timo Sowa and Marc Latoschik
In the DEIKON project, we have now started to research more sophisticated forms of deictics in construction dialogues that include features indicating shape or orientation, which leads us into iconic gesture.

Lifelike Gesture Synthesis
Work with Stefan Kopp, Articulated Communicator
Another issue in our work is the synthesis of lifelike gesture from symbolic descriptions for an articulated virtual figure, where natural motion and timing are central aspects.

Find lab showcase & papers:
www.techfak.uni-bielefeld.de/techfak/ags/wbski/
www.techfak.uni-bielefeld.de/~ipke/
WBS team: Bernhard Jung, Martin Hoffhenke, Martin Fröhlich, Stefan Kopp, Timo Sowa, Ian Voß, Marc Latoschik