An Animation Definition Interface: Rapid Design of MPEG-4 Compliant Animated Faces and Bodies




Erich Haratsch, Technical University of Munich, erich@lis.e-technik.tu-muenchen.de
Jörn Ostermann, AT&T Labs Research, osterman@research.att.com

Abstract

Many real-time animation programs, including MPEG-4 terminals with face and body animation capabilities, will run a proprietary renderer using a proprietary face or body model. Usually, the animation of a proprietary model is not compatible with the MPEG-4 requirements. Furthermore, implementing or modifying animation parameters like smiles or eyebrow movement in these renderers is cumbersome and time consuming. In this contribution, a process is proposed that allows the fast definition of animation parameters for proprietary models and their inclusion into proprietary real-time rendering software. The proprietary model is read into any commercially available modeler, and this modeler is used to define the behavior of the different animation parameters. For each animation parameter, the modified model is stored. The animation definition interface, a model analysis software, compares the original model with the animated model and extracts the essential animation parameters. These parameters are stored in tables and are used by the real-time animation program to generate the designed expression.

1. Introduction

Currently, ISO/IEC JTC1/SC29/WG11, the same Moving Picture Experts Group (MPEG) of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) that developed MPEG-1 and MPEG-2, is developing the new standard MPEG-4 [1]. Among other items, MPEG-4 strives to define a standardized interface that allows the animation of face and body models within an MPEG-4 terminal [2]. Due to the fast advances in computer graphics hardware, it is not foreseen that MPEG-4 will standardize the face and body models themselves. Instead, face and body definition parameters (FDP, BDP) are defined for specifying the shape and surface of a model [7]. For the animation of the models, face and body animation parameters (FAP, BAP) are standardized. These animation parameters include low-level parameters like raise left outer eyebrow and tongue roll as well as high-level parameters like joy. Assuming that different terminals allow for models with different degrees of complexity, a process is required that allows the rapid development of models suited for animation. The use of standardized file formats like VRML would allow the use of commonly available modeling software (modelers) like COSMO Worlds or Alias/Wavefront PowerAnimator to design animations.

However, formats like VRML 1 and VRML 2 [3][4][5] and libraries like Open Inventor [6] allow the definition of animation parameters for transforms like rotation or scaling of rigid objects, but not for the components of flexibly connected objects. Face and body animation requires flexible deformation. Since this is currently not easily implemented using Open Inventor or VRML 2-based application programming interfaces (APIs), the real-time renderer must be proprietary. Usually, the real-time renderer can read and write VRML or Open Inventor files. However, the definitions of animations like smiles are built into the renderer. Convenient editors for defining the animation capabilities are missing.

In this contribution, the interface between a modeler (here Alias/Wavefront PowerAnimator) and a real-time renderer (here AT&T's virtual operator) is described that allows the rapid definition, modification and implementation of animation parameters. Since the interface reads VRML files from the modeler, it is independent of the modeler. The interface writes a VRML file and one accompanying table for each defined animation parameter, thus making this information easy to integrate into proprietary renderers.

2.0 Animation Definition Interface

The proposed animation definition interface (ADI) between the modeler and the real-time renderer assumes that the animated models are described as wireframes. VRML 2 wireframes are defined using IndexedFaceSets. The definition of an animation parameter is given in an animation definition table (ADT) computed by the ADI. The interface takes as its input several VRML objects describing static models with a topology appropriate for the renderer [10]. Figure 1 shows how the proposed system is integrated with the modeler and the renderer. The model of the renderer is exported as a VRML file and read into the modeler. In order to design the behavior of the model for one animation parameter, the model is deformed using the tools of the modeler. Usually, restrictions on the topology of the model exist. For simplicity, we assume that the model is deformed only by moving relevant vertices and not by changing its topology. The modeler exports the deformed model as a VRML file.
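The paper does not prescribe a storage format for an ADT. As a purely illustrative aid, the following C++ sketch shows the information such a table has to carry for one animation parameter; all type and field names are hypothetical.

```cpp
// Illustrative only: a possible in-memory layout for one animation definition
// table (ADT). The ADI would produce one such table per animation parameter.
#include <cstddef>
#include <string>
#include <vector>

struct Vec3 {
    float x, y, z;
};

// One affected vertex: which vertex of the IndexedFaceSet moves, and by how
// much when the animation parameter is at amplitude 1.0.
struct VertexDisplacement {
    std::size_t vertexIndex;   // index into the IndexedFaceSet coordinate array
    Vec3        displacement;  // displacement for parameter value 1.0
};

// The table produced by the ADI for a single animation parameter.
struct AnimationDefinitionTable {
    std::string                     parameterName;  // e.g. a FAP such as a smile
    std::vector<VertexDisplacement> entries;        // only vertices that actually move
};
```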

Figure 1: Animation Definition Interface (ADI): The model is defined in a VRML file; the effects of animation parameters are defined in animation definition tables (ADT) referencing vertices of the VRML file. The modeler is used to generate VRML files with the object in different animated positions. The renderer reads the VRML file and the tables. Then, the model can be animated using animation parameters like MPEG-4 FAPs.

The ADI compares the output of the modeler with its input, the model exported from the renderer. By comparing the vertex positions of the two models, the vertices affected by the newly designed animation parameter are identified. For each affected vertex, the ADI computes a 3D displacement vector defining the deformation and exports this information in an animation definition table. The renderer reads the VRML file of the model and the table in order to learn the definition of the new animation parameter. Now, the renderer can use the newly defined animation as required by the animation parameters. The displacement applied to each affected vertex is obtained by scaling the 3D displacement vector given in the ADT by the current value of the animation parameter.
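Both the comparison and the table-driven deformation are simple vertex-wise operations. The following C++ sketch is again purely illustrative; it reuses the hypothetical Vec3 and VertexDisplacement types from the previous sketch and assumes that both models share the same topology and vertex ordering.

```cpp
// Illustrative sketch of the ADI comparison step and of the renderer-side
// application of the resulting table. Vec3 and VertexDisplacement are the
// hypothetical types from the previous sketch.
#include <cmath>
#include <cstddef>
#include <vector>

// Compare the neutral model with the model deformed for one animation
// parameter and record a displacement for every vertex that moved.
std::vector<VertexDisplacement> buildTable(const std::vector<Vec3>& neutral,
                                           const std::vector<Vec3>& deformed,
                                           float epsilon = 1e-6f)
{
    std::vector<VertexDisplacement> table;
    for (std::size_t i = 0; i < neutral.size() && i < deformed.size(); ++i) {
        const Vec3 d{deformed[i].x - neutral[i].x,
                     deformed[i].y - neutral[i].y,
                     deformed[i].z - neutral[i].z};
        // Only vertices that actually moved enter the ADT.
        if (std::fabs(d.x) > epsilon || std::fabs(d.y) > epsilon || std::fabs(d.z) > epsilon)
            table.push_back({i, d});
    }
    return table;
}

// Renderer side: scale each stored displacement by the current parameter value
// and add it to the neutral vertex positions.
void applyParameter(std::vector<Vec3>& vertices,
                    const std::vector<VertexDisplacement>& table,
                    float parameterValue)
{
    for (const VertexDisplacement& e : table) {
        vertices[e.vertexIndex].x += parameterValue * e.displacement.x;
        vertices[e.vertexIndex].y += parameterValue * e.displacement.y;
        vertices[e.vertexIndex].z += parameterValue * e.displacement.z;
    }
}
```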

2.1 Approximation of Non-Linear Deformations by Straight Lines

The converter as described above only allows the renderer to create deformations by moving vertices along the defined 3D displacement vectors. While this might be sufficient for simple actions like move left eyebrow up, complex motions like smile or tongue roll up are not sufficiently modeled by linearly moving vertices. Therefore, we propose to create several VRML files for different phases of the animation, thus allowing for a piece-wise linear approximation of complex deformations (Figure 2).

Figure 2: An arbitrary motion trajectory is piece-wise linearly approximated.

For a smile, writing three files with smile=0.3, smile=0.7, and smile=1.0 is sufficient to allow for a subjectively pleasant piece-wise linear approximation of this relatively complex deformation.
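A renderer can evaluate such a piece-wise linear approximation by storing one measured displacement per key amplitude and interpolating between neighbouring keys. The sketch below is only an illustration of this idea for a single vertex; the key values and names are assumptions, and Vec3 is the hypothetical type from the earlier sketches.

```cpp
// Illustrative piece-wise linear evaluation of one vertex displacement from
// several key amplitudes (e.g. smile = 0.3, 0.7 and 1.0), each measured by the
// ADI from its own VRML file.
#include <vector>

struct DisplacementKey {
    float amplitude;     // parameter value at which the deformed VRML file was saved
    Vec3  displacement;  // vertex displacement measured at that amplitude
};

// keys must be sorted by increasing amplitude (all > 0); the trajectory
// implicitly starts at amplitude 0 with zero displacement.
Vec3 evaluateDisplacement(const std::vector<DisplacementKey>& keys, float value)
{
    Vec3  prev{0.0f, 0.0f, 0.0f};
    float prevAmplitude = 0.0f;
    for (const DisplacementKey& k : keys) {
        if (value <= k.amplitude) {
            // Interpolate linearly inside the current segment.
            const float t = (value - prevAmplitude) / (k.amplitude - prevAmplitude);
            return { prev.x + t * (k.displacement.x - prev.x),
                     prev.y + t * (k.displacement.y - prev.y),
                     prev.z + t * (k.displacement.z - prev.z) };
        }
        prevAmplitude = k.amplitude;
        prev          = k.displacement;
    }
    return prev;  // clamp values beyond the last key
}
```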

2.2 Application Example

The procedure outlined above was used to define the entire set of MPEG-4 FAPs for a proprietary face animation renderer. The model is an extension of Parke's model [9]. The FAPs integrate nicely with the model's talking capability [8] (Figure 3).

Figure 3: MPEG-4 will allow the animation of computer graphics heads by synthetic speech and animation parameters. Here, several frames from the text "Speech synthesis by AT&T" are shown.

Figure 4: A smile as defined with the Animation Definition Interface.

Animated sequences using different personalities will be shown at the conference (Figure 4). Although this example shows only the animation of the wireframe by deformation, the process can be extended to allow the definition of animation parameters for appearance attributes like surface color and texture maps.

2.3 Flexible Deformations on OpenGL-Based Graphics Subsystems

Most newly available graphics boards for PCs and workstations support rendering based on the OpenGL engine. Similarly, VRML 2 browsers and Open Inventor are based on OpenGL [11]. It is therefore essential to enable real-time deformations of models rendered on an OpenGL engine, and it is imperative to use hardware-supported functions of OpenGL as much as possible. OpenGL does not allow a wireframe to be deformed by moving parts of a wireframe or IndexedFaceSet. Therefore, the CPU has to update the vertex positions of the wireframe according to the animation parameters as defined in the table [12]. However, we can still take full advantage of the speed of the OpenGL rendering engine for global motions, lighting, texture mapping, etc.
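How this division of labour between CPU and OpenGL might look per frame is sketched below. The routine is purely illustrative: Vec3, VertexDisplacement and applyParameter() are the hypothetical pieces from the earlier sketches, and only standard OpenGL 1.x vertex-array calls are used.

```cpp
// Illustrative per-frame routine: the CPU re-applies the table-driven
// deformation to a plain vertex array, and OpenGL is only handed the updated
// positions; global transforms, lighting and texturing stay in the GL engine.
#include <GL/gl.h>
#include <vector>

void drawDeformedMesh(const std::vector<Vec3>&               neutralVertices,
                      std::vector<Vec3>&                     workVertices,   // scratch copy, same size
                      const std::vector<VertexDisplacement>& smileTable,
                      float                                  smileValue,
                      const std::vector<GLuint>&             triangleIndices)
{
    // 1. CPU: start from the neutral shape and add the weighted displacements
    //    of every active animation parameter (only one shown for brevity).
    workVertices = neutralVertices;
    applyParameter(workVertices, smileTable, smileValue);

    // 2. OpenGL: hand the updated vertex array to the rendering engine.
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, sizeof(Vec3), workVertices.data());
    glDrawElements(GL_TRIANGLES,
                   static_cast<GLsizei>(triangleIndices.size()),
                   GL_UNSIGNED_INT,
                   triangleIndices.data());
    glDisableClientState(GL_VERTEX_ARRAY);
}
```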

3. Conclusions

A process was defined that allows the rapid definition of new animation parameters for proprietary renderers, even allowing for peculiarities of proprietary models. In a first step, a proprietary model is animated in a standard modeler. The animated models are saved as VRML files. The output of the Animation Definition Interface is the model and a table describing the new animation parameter. This information is read by the renderer and used whenever the animation parameter is required. The proposed process with the ADI can easily be used to generate new shapes from the original model.

4. References

1. Leonardo Chiariglione (convenor), "MPEG", http://drogo.cselt.stet.it/mpeg.
2. Peter K. Doenges, Tolga K. Capin, Fabio Lavagetto, Jörn Ostermann, Igor S. Pandzic, Eric D. Petajan, "MPEG-4: Audio/Video & Synthetic Graphics/Audio for Mixed Media", Signal Processing: Image Communication, accepted for publication in 1997.
3. "Virtual Reality Modeling Language, Version 2.0", ISO/IEC JTC1/SC24, ISO/IEC CD 14772.
4. "The VRML 2.0 Specification", http://www.vrml.org/vrml2.0/final.
5. J. Hartman, Josie Wernecke, The VRML 2.0 Handbook, Addison-Wesley, New York, 1996.
6. Open Inventor Architecture Group, Open Inventor C++ Reference Manual, Addison-Wesley, New York, 1994.
7. "SNHC Verification Model 4.0", ISO/IEC JTC1/SC29/WG11 N1666, Bristol meeting, April 1997.
8. M. Cohen, D. Massaro, "Modeling coarticulation in synthetic visual speech", in N. M. Thalmann and D. Thalmann, editors, Models and Techniques in Computer Animation, pp. 141-155, Springer Verlag, Tokyo, 1993.
9. Frederic I. Parke, Keith Waters, Computer Facial Animation, A K Peters Ltd, Wellesley, Massachusetts, Chapter 6, 1996.
10. L. Chen, J. Ostermann, "Animated talking head with personalized 3D head model", 1997 Workshop on Multimedia Signal Processing, Princeton, NJ, USA, June 1997.
11. Jackie Neider et al., OpenGL Programming Guide, Addison-Wesley, New York, 1993.
12. E. Haratsch, J. Ostermann, "Parameter based animation of 3D head models", submitted to Picture Coding Symposium PCS'97, Berlin, Germany, September 1997.