
Kinect Exercise HCI Course Fall 2015 Maria Husmann husmann@inf.ethz.ch CNB E 104.1

Agenda
- Introducing Kinect
- What it does
- How it works
- Kinect SDK for Windows
- Skeletal Tracking
- Kinect Tutorial (Example Code)
- Introducing the exercise
- Demo

Introducing Kinect
- Kinect is a camera-based, motion-sensing input device
- Developed by Microsoft
- First announced in June 2009 at E3 as «Project Natal»
- Launched in November 2010 exclusively for the Xbox 360 video game console
- In February 2012, Microsoft released the commercial Kinect SDK for Windows (current version 1.8)

What it does
- Provides full-body 3D motion capture, face tracking, and speech recognition
- Skeletal tracking of up to 6 persons (Kinect v2)
- Has a built-in microphone array to record (and locate) voices
- Promises to deliver a «Natural User Interface» experience: your body is the controller
- The Kinect Effect (advertisement)

Old Kinect Sensor (v1) diagram: 3D depth sensors, RGB camera, multi-array mic, motorized tilt

Source: buildinsider.net

Source: ifixit.com

New is better: Kinect v2
- Improved sensors: 3x depth fidelity, wider field of view, 1080p color camera (30 fps)
- Tracks more people and more skeletal joints: hand tips, thumbs, and shoulder center
- Lighting-independent IR (can see in the dark)
- Advanced face tracking
- Current devkit: SDK 2.0

Kinect (v2) Data Sources

Body (skeletal) Tracking
- Recognizes up to 6 people in the field of view
- Maximum of two (six with v2) players tracked with a full body at once
- Center position only for the other 4 players (v1)
- Two range modes (Kinect v1)
  - Default: between 0.8 m and 4 m (practical range: 1.2 to 3.5 m)
  - Near range: between 0.4 m and 3 m (practical range: 0.8 to 2.5 m)
- One mode (Kinect v2): between 0.5 m and 4.5 m

Joints
- Coordinates are expressed in meters; the center of the Kinect is the reference plane
- Each player has a set of 20 joints (25 in v2)
- Each joint has attributes (see the code sketch below)
  - Associated state: Tracked, Not Tracked, or Inferred
  - Joint normal (v2): describes rotation
- Inferred: occluded, clipped, or low-confidence joints
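For illustration, assuming the Kinect v2 SDK and a non-null, tracked Body named trackedBody (as obtained in the tutorial code later in these slides), reading a single joint and its state could look like this sketch:

// Sketch only: reading one joint of a tracked body (Kinect v2 SDK)
Joint head = trackedBody.Joints[JointType.Head];

if (head.TrackingState == TrackingState.Tracked)
{
    // Position is a CameraSpacePoint in meters, relative to the sensor
    float x = head.Position.X;
    float y = head.Position.Y;
    float z = head.Position.Z;
}
else if (head.TrackingState == TrackingState.Inferred)
{
    // Occluded, clipped, or low-confidence joint; use with care
}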

New Joints!

Advanced tracking with Kinect v2
- Hand states: Kinect v2 tries to recognize different states for each hand: Open, Closed, Lasso, Unknown, and NotTracked (see the code sketch below)
- Hand state is tracked for the two bodies closest to the sensor
- Lean tracking: determines how much a body is leaning away from vertical
  - Values range between -1 and 1 in both directions; 1 roughly corresponds to 45 degrees of lean
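As a sketch, assuming a non-null, tracked Body named trackedBody, hand states and lean can be read as shown here; the 0.5 threshold and the reactions in the comments are assumptions, not part of the exercise framework:

// Sketch only: reading hand states and lean (Kinect v2 SDK)
HandState leftHand = trackedBody.HandLeftState;
HandState rightHand = trackedBody.HandRightState;

if (leftHand == HandState.Closed && rightHand == HandState.Closed)
{
    // Both hands closed, e.g. usable as a "grab" pose
}

// Lean: X is sideways lean, Y is forward/backward lean, both in [-1, 1]
PointF lean = trackedBody.Lean;
if (trackedBody.LeanTrackingState == TrackingState.Tracked && Math.Abs(lean.X) > 0.5f)
{
    // Body is leaning noticeably to one side (roughly more than 22 degrees)
}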

Kinect Tutorial

Kinect Initialisation

// Requires the Kinect for Windows SDK 2.0 (namespace Microsoft.Kinect)
KinectSensor sensor;
BodyFrameReader bodyReader;
Body[] bodies;

void MyKinectApplication_Loaded(object sender, RoutedEventArgs e)
{
    // Get the default Kinect sensor (there can only be one anyway)
    sensor = KinectSensor.GetDefault();

    // Request a new reader for the body frames (skeletal tracking)
    bodyReader = sensor.BodyFrameSource.OpenReader();

    // Allocate buffer for bodies (skeletons)
    bodies = new Body[6];

    // Start the Kinect sensor
    sensor.Open();

    // Register event handler to process body frames
    bodyReader.FrameArrived += bodyReader_FrameArrived;
}

Processing Body Frames

// Requires "using System.Linq;" for the query below
Body trackedBody;

void bodyReader_FrameArrived(object sender, BodyFrameArrivedEventArgs e)
{
    // Retrieve the current body frame and dispose of it after the block
    using (BodyFrame bframe = e.FrameReference.AcquireFrame())
    {
        // Check whether the frame has not expired yet (i.e. it is available)
        if (bframe != null)
        {
            // Copy the body data into our buffer
            bframe.GetAndRefreshBodyData(bodies);

            // Get the first tracked body
            trackedBody = (from s in bodies where s.IsTracked select s).FirstOrDefault();
        }

        if (trackedBody != null)
        {
            JointOrientation headOrientation = trackedBody.JointOrientations[JointType.Head];
            CameraSpacePoint headPosition = trackedBody.Joints[JointType.Head].Position;
            // ... do stuff
        }
    }
}

The Kinect Exercise

Introducing the exercise
- Task: control a PowerPoint presentation with your body
- Due to the nature of the project, you are required to program in C#
- You can work in teams of 4-5
- Teams will be allocated at least 2 lab slots of 2-3 hours each (more slots can be allocated to your team if desired)
- Fun exercise with moderate effort

How the exercise is run
- All programming is done with Visual Studio in our lab on real Kinect for Windows hardware
- 2 labs are available, each equipped with a workstation and a Kinect for Windows sensor
- Visual Studio 2012 is pre-installed
- You are responsible for managing your source code; you may be assigned to a different machine each time

Implement «RecognizeGestures»

/// <summary>
/// TODO This is where you are supposed to implement your pose/gesture recognition
/// </summary>
/// <param name="trackedBody">Tracked body data</param>
private void RecognizeGestures(Body trackedBody)
{
    // TODO Add your pose/gesture recognition code here!
}

Called each time a new body frame has been processed (up to 30 frames per second). You may change other parts of the code as well if you want to add more/extended functionality. A simple pose check is sketched below.
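As an illustration only, not the supplied framework's intended solution, a single-pose check inside RecognizeGestures could compare joint heights. The "hands above head" pose is the example mentioned on the Technical challenges slide; the variable names and the empty reaction are assumptions:

// Sketch only: recognizing a "both hands above head" pose
private void RecognizeGestures(Body trackedBody)
{
    if (trackedBody == null) return;

    CameraSpacePoint head = trackedBody.Joints[JointType.Head].Position;
    CameraSpacePoint leftHand = trackedBody.Joints[JointType.HandLeft].Position;
    CameraSpacePoint rightHand = trackedBody.Joints[JointType.HandRight].Position;

    // Y points upwards and is measured in meters relative to the sensor,
    // so comparing Y values works regardless of where the user stands
    bool handsAboveHead = leftHand.Y > head.Y && rightHand.Y > head.Y;

    if (handsAboveHead)
    {
        // React to the pose here, e.g. remember it as one step of a pseudo-gesture
    }
}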

Goal of this Exercise
- Design gestures for at least 4 different PowerPoint operations: Next Slide, Previous Slide, First Slide, Last Slide
- Get to know the Kinect SDK for Windows and implement some poses and/or gestures
- You will be given a small code framework (you do not have to start from scratch), available by the end of this week
- Multi-user operations are possible but tricky! (You will need to change the supplied framework)

Technical challenges
- Constant stream of data; no «touch down» or «gesture start» events
- Coordinates are in 3D space and differ depending on where the users are standing
- Our proposal: pseudo-gestures
  - Try to recognize different poses (e.g. hands above head)
  - A pseudo-gesture is a sequence of different poses
  - Implement some kind of state machine (a sketch follows below)
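A minimal sketch of such a state machine, assuming a hypothetical pseudo-gesture ("right hand raised above the head, then lowered again" triggers the next slide); the gesture itself, the 0.1 m margin, and OnNextSlide() are illustrative assumptions, not part of the supplied framework:

// Sketch only: a pose state machine for one hypothetical pseudo-gesture
enum GestureState { Idle, RightHandRaised }

GestureState state = GestureState.Idle;

void RecognizePseudoGesture(Body trackedBody)
{
    CameraSpacePoint head = trackedBody.Joints[JointType.Head].Position;
    CameraSpacePoint rightHand = trackedBody.Joints[JointType.HandRight].Position;

    // Pose test with a small margin (0.1 m) to avoid jitter at the boundary
    bool handAboveHead = rightHand.Y > head.Y + 0.1f;

    switch (state)
    {
        case GestureState.Idle:
            if (handAboveHead)
                state = GestureState.RightHandRaised;   // first pose of the sequence seen
            break;

        case GestureState.RightHandRaised:
            if (!handAboveHead)
            {
                OnNextSlide();                          // pose sequence completed
                state = GestureState.Idle;
            }
            break;
    }
}

void OnNextSlide() { /* hypothetical: trigger the PowerPoint "next slide" operation */ }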

Time Schedule
- 23.09.2015 - 01.10.2015 (1 week)
  - Explore the Kinect SDK, design suitable gestures
  - Register your group with Maria (husmann@inf.ethz.ch) by Friday, 25.09.
  - Send an e-mail with the subject "Kinect Exercise" that contains a group name and the names of all members of your group
  - Please also specify preferred dates/times: Mo-Fr, 9-12/12-15/15-18
- 29.09.2015 - 13.10.2015 (~2 weeks)
  - Implementation
  - Preparation of a small presentation (ca. 5 minutes per team)
  - You need to send your Visual Studio projects and accompanying presentations to Maria no later than 13.10.2015
- 14.10.2015
  - Presentation and live demonstration of your system in the exercise session

Showtime! Demo

References
- Learning Resources and Documentation: http://www.microsoft.com/en-us/kinectforwindows/develop/learn.aspx
- Official MSDN Documentation: https://msdn.microsoft.com/en-us/library/dn799271.aspx
- Programming Guide: https://msdn.microsoft.com/en-us/library/dn782037.aspx
- Video Tutorials (Lecture 2 is the most useful for this exercise; try to open the link in Chrome): http://www.microsoftvirtualacademy.com/training-courses/programmingkinect-for-windows-v2-jump-start