A Microsoft Kinect based virtual rehabilitation system


Erdenetsogt Davaasambuu 1, Chia-Chi Chiang 2, John Y. Chiang 2, Yung-Fu Chen 1, Sukhbaatar Bilgee 3
1 Department of Healthcare Administration, Central Taiwan University of Science and Technology, Taichung 40601, Taiwan
2 Department of Computer Science and Engineering, National Sun Yat-sen University, Kaohsiung 80424, Taiwan
3 School of Information and Telecommunication Technology, Mongolian University of Science and Technology, Ulaanbaatar, Mongolia

Abstract

Virtual reality technology is now widely applied in physical rehabilitation therapy. The ability of the Microsoft Kinect to track joint positions may be useful for rehabilitation, both in a clinical setting and at home. This research explores the potential and the limitations of the Kinect for e-rehabilitation. We evaluated tools that could help promote physical rehabilitation at home by reducing the frequency of hospital visits, thereby reducing healthcare costs. A prototype system was developed to evaluate six different gestures that are useful for e-rehabilitation. The experimental results show that the system can recognize gestures with an accuracy of 88-92.2%. The questionnaire outcomes reveal that the designed gestures were perceived as fun and easy to operate (p<0.05). Participants generally liked the controller-free interface and smooth movement operations provided by the designed system.

Keywords: Virtual reality; E-rehabilitation; Kinect; Gesture recognition

I. Introduction

A. Overview and Background

Virtual reality is an artificial environment that is created with special software and presented to the user in such a way that the user is able to recognize himself and operate in that environment [1]. An increasing trend in gaming is the use of interfaces that require physical activity for an optimal user experience (e.g. the Sony EyeToy or the Nintendo Wii).
Rehabilitation forms an essential component of the therapeutic continuum for patients with multiple injuries or motor disabilities. Effective rehabilitation programs assist patients in optimizing their level of physical, psychological, and social function, while also reducing the length of patient stay, re-admission rates, and use of primary care resources [2]. Virtual reality (VR) offers a possible solution. VR is a technology that allows the user to interact directly with a computer-simulated environment. This technology, developed initially for military training, has now become widely available through video games [3]. The potential of VR interfaces to create an environment that encourages high repetition intensity has been exploited by numerous vocational training programs [4], such as laparoscopic surgical skill training.

VR is an immersive, interactive, three-dimensional computer experience occurring in real time. Virtual reality can simulate real-life tasks [5] and brings several evident benefits for rehabilitation:

1) Specificity and adaptability to each patient and disease;
2) Repeatability;
3) Ability to promote patient engagement;
4) Tele-rehabilitation and remote data access;
5) Safety [6].

VR can be precisely adapted to the patient's therapy and made specific to it. VR environments can provide realistic training for the patient in different scenarios and phases of rehabilitation. It is now conceivable that computer-based rehabilitation programs could be developed using current, widely available, affordable virtual reality platforms, such as the Microsoft Kinect. This research investigated the possibility of using the Microsoft Kinect to aid such therapy.

The Kinect is a peripheral device developed by Microsoft for use with its Xbox 360 gaming platform. Using its depth, image, and audio sensors, the device allows users to control games using just their bodies.
Instead of playing video games using conventional hand-held controllers, players can stand in front of the Kinect and be the controller themselves. The Kinect enables this by following users' movements through tracking and identifying their joints. Positions of a player's joints in three-dimensional space are obtained from the sensor data and are used to follow the motion of the player [7].

This study aims to use image processing technology to design a system that helps motivate people with motor disabilities to exercise more often and to improve their motor proficiency and quality of life [8]. The Kinect could also be a useful tool for at-home therapy. Therapy in the home is more flexible and convenient for the patient and allows for more frequent repetition of exercises. In order to stimulate neural reactivation in regions of the brain that control movement, exercises must be repeated many times every day [9]. While therapy sessions alone often cannot fulfill the required frequency of practice, at-home exercises can achieve this goal. The sophisticated tools and technologies offered at a doctor's office are generally too expensive to have in a home, but the Kinect is low in cost and readily available.

B. Research and Goals

The Microsoft Kinect is a technology that is greatly changing the way games are played. Through the combination of its depth map and its skeleton tracking abilities, it enables a massive range of applications. It was a clear choice to attempt to use this technology to produce a low-cost and effective toolkit for biomedical engineers. The overarching goal of this research is to determine whether the Kinect and the data it supplies are suitable for use in virtual rehabilitation. To answer this question, a prototype application was developed. Patients follow a set of exercises shown on the TV screen, while the collected data are fed back to the doctors supervising the patient, thereby decreasing the number of visits to the hospital and making the process convenient for both the patient and the doctor.

C. Related Work

Even before the release of the Kinect, several projects used Nintendo's Wii video gaming system. Such gesture interaction technologies are not new; however, their recent availability as interfaces within affordable mass-market gaming products can be seen as evidence of a broadening usage beyond entertainment alone. Often, Nintendo's Balance Board is used as an input device, for example in various games for balance training [13]. The Kinect, on the other hand, is small and affordable enough to be used in virtually any home environment, and it does not require patients to wear anything that could limit their movement [7].

Other studies have also identified the Kinect's potential for use in physical therapy. Chang et al. developed a Kinect-based rehabilitation system to assist therapists in their work with students who had motor disabilities [8]. The program used the motion tracking data provided by the Kinect to determine whether the patient's movements reached the rehabilitation standards and to allow the therapist to view rehabilitation progress. The Kinect has been used for medical purposes outside of physical therapy as well [7]. Rizzo and others at the University of Southern California studied how video games that require player movement could motivate persons at risk for obesity to engage in physical activity [11]. To demonstrate the concept, they developed a system using the Kinect in which the popular computer game World of Warcraft could be controlled with user gestures instead of mouse and keyboard commands.

II. Technical Details

A. Introduction to Microsoft Kinect

The Microsoft Kinect is a set of sensors developed as a peripheral device for use with the Xbox 360 gaming console. The Kinect has an RGB camera and an infrared depth sensor, consisting of a projector and an infrared-sensitive camera on the same bar. The RGB camera has a resolution of 640×480 pixels, while the infrared camera uses a matrix of 320×240 or 640×480 pixels.
Using image, audio, and depth sensors, it can detect movements, identify the speech of players, and allow them to play games using only their own bodies as the controls. Unlike previous attempts at gesture- or movement-based controls, it does not require the player to wear any kind of accessory to enable the device to track the player's movements. The depth, image, and audio sensors are housed in a horizontal bar attached to a base with a motorized pivot that allows the sensor bar to tilt up and down (Figure 1). Together, these parts make up the Kinect device.

Figure 1. Kinect sensors: 1. depth sensor, 2. RGB camera, 3. microphone array, and 4. motorized base.

Two sensors make up the depth component of the Kinect: an infrared projector and a monochrome CMOS sensor (label 1, Figure 1). Together, these form the basis for gesture recognition and skeleton tracking. The infrared projector shines a grid of infrared light on the field of view, and a depth map is created from the rays that the sensor receives as reflections of that light off objects in the scene.

B. Kinect Drivers and SDKs

Microsoft did not initially release any drivers or SDKs to enable the Kinect to be used with a personal computer, and at first the company discouraged efforts by the computing community to enable this. Later, Microsoft modified its position, stating that the USB port used for connecting the device to the Xbox had intentionally been left open [12]. This attracted many developers, who initiated open source projects to develop drivers, SDKs, and APIs for use with personal computers. The explosion of these efforts was doubtless aided by the $2,000 prize offered by Adafruit Industries for the first open source Kinect driver. Currently, two main frameworks have been developed: OpenNI, developed by PrimeSense, and the official Microsoft Kinect SDK.

The Microsoft Kinect SDK offers several important advantages over the open source tools. Among the biggest are joint tracking without calibration and support for audio. It enables the tracking of 20 joints, as shown in Figure 2a, for which X, Y, and Z coordinates (Figure 2b) are given in meters for each joint position. The labels in the figure are placed on the positive direction of each axis. Microsoft's joint tracking algorithm identifies joint positions by processing a depth image. The algorithm first produces a joint guess for each pixel in the depth image, along with a confidence level for each pixel. The main purpose of the software developed here was to demonstrate that useful virtual rehabilitation software tools can be built with the Kinect.
Figure 2. Microsoft SDK skeleton: joint positions and skeleton reference system.

C. Kinect Rehabilitation Software Implementation

Skeleton tracking is the processing of depth image data to establish the positions of various skeleton joints on a human form. For example, skeleton tracking determines where a user's head, hands, and center of mass are, and it provides the X, Y, and Z coordinates of these skeleton points.

The personalized gesture-based carousel spinning menu (Figure 3) provides patients with an easy interface for adjusting the main program to their individual needs. Using the right hand, the user can swipe the next item in from the right to the left, or swipe an item back from the left to the right. Raising the right hand above the head enters and loads a new exercise form. Patients perform various movements and carry out the rehabilitation activities prescribed by their therapists. This program enhances the efficiency of rehabilitation by increasing the patient's (especially an elderly person's) muscle endurance and ability to perform daily tasks independently.

Figure 3. Screenshot of the main screen.

The menu also integrates speech commands with multiple features. Implementing voice commands requires:
- instantiating a SpeechRecognitionEngine;
- creating a grammar containing the words to recognize;
- implementing an event handler for the SpeechRecognized event that takes the word command with the highest confidence.

The available commands are:
1. Next: in the main carousel menu, change the current item to the next item.
2. Back: jump to the previous menu item.
3. Select: select an item.
4. Up: change the Kinect angle by tilting up.
5. Down: change the Kinect angle by tilting down.

All voice commands must have a confidence level above 0.92 to be accepted. In gestural interfaces, pure gestures, poses, and tracking can be combined to create interaction idioms.
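As an illustration of the voice-command flow just described, the sketch below accepts a recognized word only when it is in the grammar and its confidence exceeds the 0.92 threshold. This is illustrative Python, not the system's actual C# SpeechRecognitionEngine code; the function and variable names are assumptions.

```python
# Illustrative sketch only: mirrors the voice-command rules described in the
# text (a grammar of five words, confidence threshold of 0.92). The real
# system uses the C# SpeechRecognitionEngine; names here are assumptions.

GRAMMAR = {"next", "back", "select", "up", "down"}
CONFIDENCE_THRESHOLD = 0.92

def dispatch(word, confidence):
    """Return the accepted command in lowercase, or None if rejected."""
    word = word.lower()
    if word in GRAMMAR and confidence > CONFIDENCE_THRESHOLD:
        return word
    return None

print(dispatch("Next", 0.95))    # accepted -> "next"
print(dispatch("Select", 0.80))  # rejected (low confidence) -> None
```

A real handler would react only to the highest-confidence recognition result, as the text describes.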

The Microsoft Kinect SDK does not include a gesture detection engine; developers must therefore define and detect gestures themselves, which gives them flexibility in creating movements and detecting them. Gesture detection can be relatively simple or intensely complex, depending on the gesture. There are three basic approaches to detecting gestures: algorithmic, neural network, and by example. Each of these techniques has its strengths and weaknesses, and the methodology a developer chooses will depend on the gesture, the needs of the project, the time available, and the development skill. The algorithmic approach is the simplest and easiest to implement, whereas neural network and exemplar systems are complex and non-trivial. Our approach is based on the algorithmic method. The implemented gestures include jump, hand tracking, walking, joint angle calculation, swipe, and sit/stand, as well as many other simple gestures that can be checked and recognized.

The Law of Cosines is used to calculate the angle between joints. The largest angle calculable by the Law of Cosines is 180°; when calculating angles between joints, an additional point is needed to determine angles from 180° to 360° [13]. From the skeleton tracking data, we can draw a triangle using any two joint points, with the third point of the triangle derived from the other two. Knowing the coordinates of each point of the triangle means that we know the length of each side, but no angle values. As shown in Figure 5, by applying the Law of Cosines, the value of any desired angle can be obtained:

c² = a² + b² − 2ab·cos(C)    (1)

Figure 5. Law of Cosines.

Figure 4. Schema of deployed modules.

Microsoft Kinect promotes the concept that users themselves are the input devices. With skeleton data, applications can do the same things a mouse or touch device can. The difference is that the depth component allows the user and the application to interact in a way that has never been done before. Let us explore the mechanics through which the Kinect can control and interact with user interfaces.
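The Law of Cosines angle computation described above can be sketched as follows. This is an illustrative Python version (the system itself reads joint data from the Kinect SDK), covering the 0° to 180° range noted in the text:

```python
import math

def joint_angle(joint_a, vertex, joint_c):
    """Angle (degrees) at `vertex` formed by two other 3-D joint positions,
    via the Law of Cosines: C = acos((a^2 + b^2 - c^2) / (2ab))."""
    def dist(p, q):
        return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))
    a = dist(vertex, joint_a)   # side from the vertex to the first joint
    b = dist(vertex, joint_c)   # side from the vertex to the second joint
    c = dist(joint_a, joint_c)  # side opposite the vertex angle
    return math.degrees(math.acos((a**2 + b**2 - c**2) / (2 * a * b)))

# Example: a right angle at the elbow (shoulder above, wrist to the side)
print(joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0)))  # 90.0
```

Angles beyond 180° would need the additional reference point mentioned above to disambiguate direction.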
Custom gestures can be created and detected using skeleton data frames. For example, to detect whether the user is walking, the algorithm checks five joint points: the right and left knees, the right and left feet, and the hip center. Two Boolean values, right-rise and left-rise, track which leg is raised above the other, comparing the z-axis positions of the knees against the hip center. If all conditions are satisfied, the step count is incremented by one. Most movement calculations are based on two or more joint locations, distances, and joint angles [14]. The distance between two points is calculated for 2- and 3-dimensional points, respectively, using Eqs. (2) and (3):

d = √((x₂ − x₁)² + (y₂ − y₁)²)    (2)

d = √((x₂ − x₁)² + (y₂ − y₁)² + (z₂ − z₁)²)    (3)

Calculations on the joint points give the values for a, b, and c. The unknown is angle C, which can be obtained by the formula C = cos⁻¹((a² + b² − c²) / (2ab)). The system developed is shown in Figure 6 and Figure 7.

Figure 6. Screenshots of the developed application: hand game and angle pose.

III. Experimental Setup

The components needed for setting up the experimental platform are as follows:
- Kinect Xbox 360 sensor
- Core 2 Duo 2.0 GHz processor
- Windows 7 OS
- .NET Framework 4.0
- Microsoft Speech Platform Runtime, v. 11
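The walk-in-place detection described above (checking which knee is raised and counting alternations) can be sketched as follows. This is a simplified illustration with an assumed height margin, not the authors' exact five-joint algorithm, which also checks the feet and the hip-center depth:

```python
# Simplified walk-in-place step counter (illustrative only). A step is
# counted each time the raised leg alternates between left and right;
# the 5 cm margin is an assumption, not a value from the paper.

def count_steps(frames, margin=0.05):
    """frames: sequence of (left_knee_y, right_knee_y) heights in meters.
    A knee counts as raised when it is `margin` higher than the other."""
    steps = 0
    raised = None  # which leg is currently raised: "left", "right", or None
    for left_y, right_y in frames:
        if left_y > right_y + margin:
            current = "left"
        elif right_y > left_y + margin:
            current = "right"
        else:
            continue  # no leg clearly raised in this frame
        if raised is not None and current != raised:
            steps += 1  # the raised leg alternated: one step completed
        raised = current
    return steps

print(count_steps([(0.50, 0.40), (0.40, 0.50), (0.50, 0.40)]))  # 2
```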

- Kinect for Windows SDK 1.5

Figure 7. Screenshots of the developed system: walking exercise and jumping exercise.

The gestures detected by the Kinect sensor are connected to a keyboard controller for playing games. The experimental framework evaluates each gesture performed by the patient and generates virtual keyboard commands. There are a total of five keyboard input commands: Up, Down, Right, Left, and Jump. In this experiment, we focus on the game ExtremeTuxRacer. It is an open source project offering high-quality graphics, sound, and gameplay. In addition to the keyboard controller, it can also be controlled by full body motion through the Kinect virtual keyboard. The purpose of the experiment is to turn a keyboard-based game into one controlled by full body motion or gestures.

ExtremeTuxRacer is a racing game in which the user controls a penguin descending a snow-covered mountain route. The challenge for the player is to maximize his or her score by targeting, catching, or collecting the fish strategically placed along the way, while avoiding the trees and arriving at the destination in the shortest time. The thrill of the game comes from the feeling of speed, the risk of hitting obstacles, and the possibility of jumping to save time. The fun of the game comes simply from the empowered direct control of an artifact in a virtual environment [15]. Five buttons are needed to play the game, providing the functions of accelerating, braking, turning left and right, and jumping. Each button is executed by a body gesture. The basic ideas, which come from the KinDrive project, are as follows:
- Swinging the hands to the right executes the right button, as shown in Figure 8.
- Swinging the hands to the left executes the left button, as indicated in Figure 8.
- When the player is detected as standing, the player accelerates, as illustrated in Figure 9.
- When the player's position is recognized as sitting, the player brakes, as demonstrated in Figure 9.
- When a jumping gesture is detected during the game, a jumping operation is executed.

Figure 8. Swing gesture, and left and right commands.

Figure 9. Stand and sit gestures.

The Kinect keyboard is not only for playing ExtremeTuxRacer; it can also be used to play other racing games. Figure 10 shows the gameplay. These motor tasks all target upper limb and full body movements, such as shoulder and elbow movements, jumping, walking, sitting, and hand tracking. Most of the experiments were conducted with the subject located at a distance of about 1.5 meters from the Kinect sensor. Five subjects (3 male and 2 female; ages 21 to 22; heights 162-178 cm) performed each type of experiment 10 to 20 times. Table 3 shows the subjects used in the tests with their genders and heights.
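The gesture-to-virtual-keyboard bridge described above can be sketched as a simple lookup from recognized gesture to key command. The gesture names and the `send_key` stub below are assumptions for illustration; a real implementation would inject OS-level key events.

```python
# Illustrative sketch of mapping recognized gestures onto the five virtual
# key commands used for ExtremeTuxRacer. send_key() is a stub standing in
# for OS key-event injection; here it just returns the key name.

GESTURE_TO_KEY = {
    "stand": "Up",          # accelerate
    "sit": "Down",          # brake
    "swipe_right": "Right",
    "swipe_left": "Left",
    "jump": "Jump",
}

def send_key(key):
    """Stub for key-event injection into the running game."""
    return key

def on_gesture(gesture):
    """Translate a recognized gesture into a key command, or ignore it."""
    key = GESTURE_TO_KEY.get(gesture)
    return send_key(key) if key is not None else None

print(on_gesture("stand"))  # Up
print(on_gesture("wave"))   # None (unmapped gestures are ignored)
```

Because the mapping is a plain table, the same bridge can drive other racing games, as the text notes.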

A. Jumping experiment

Two main types of tests were performed, both based on a jump exercise that is frequently employed in rehabilitation and diagnosis. In this test, the subject starts by preparing the pose, facing the Kinect sensor and the extended screen. In each of these experiments, three joint positions were recorded, and the data were written to a file for later analysis. Three joints, i.e. the left knee, right knee, and hip center, are enough to recognize whether a jump occurred. Figure 11 shows a graph of the vertical joint variation for a subject performing the jump exercise.

Figure 11. Variation of jump.

The accuracy of recognizing jump exercises is 0.922, while all the other exercises are detected with an accuracy of at least 0.88. Another important measurement is the distance between the Kinect and the patient. For every exercise using skeleton data, when comparing data collected at distances varying from 1.5 to 4 meters, gesture recognition and skeleton tracking were most reliable at 2 to 2.8 meters. As seen in Figure 12, when the subject was too close to or too far from the sensor, the probability of movement recognition changed sharply and became inconsistent. Approximately two hundred attempts were used for the log file analysis and the calculation of movement detection. These results align closely with Microsoft's recommendation that skeleton tracking works best at a distance of 1.2 to 3.5 meters.

Figure 12. Accuracy of distance-shifting correlation.

The gesture set is limited to seven core movements to accommodate activity impairments among the target audience and to limit the amount of sensorimotor learning necessary to interact with the system. Hand gestures (raise-hand swipes) were highlighted as the easiest to perform, especially when the movements were natural and already known from daily life; as one participant pointed out, "Raise right hand [was easiest] because it is just natural, it is quite strong."
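The jump recognition described above can be sketched as follows. For brevity, this illustration watches only the hip-center height against its starting baseline (the detector described in the text also uses both knees), and the rise threshold is an assumption:

```python
# Illustrative jump detector: flag a jump when the hip center rises a set
# amount above the initial standing pose, matching the vertical joint
# variation described in the text. The 10 cm threshold is assumed.

def detect_jump(hip_y_series, rise_threshold=0.10):
    """hip_y_series: hip-center Y coordinates (meters) over time.
    Returns True if the hip rises `rise_threshold` above the first frame."""
    baseline = hip_y_series[0]
    return any(y - baseline > rise_threshold for y in hip_y_series)

print(detect_jump([0.95, 0.96, 1.12, 1.20, 0.97]))  # True  (a jump)
print(detect_jump([0.95, 0.96, 0.97, 0.96]))        # False (standing still)
```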
TABLE 2. Overview of the gesture set.

Gesture 1: Walk in place
Gesture 2: Raise right hand
Gesture 3: Jump
Gesture 4: Swipe
Gesture 5: Angle pose
Gesture 6: Hand pointer
Gesture 7: Detect sit down or stand up

B. Questionnaire Results

Usage attitudes were assessed using a questionnaire after the subjects had used the e-rehabilitation system. The outcomes were tested with a one-sample t-test, with significance defined as p<0.05. The five volunteers were asked to answer the questions in the questionnaire after having used the system. Statistical analysis revealed that the designed gestures were perceived as fun and easy to operate (p<0.05),

as well as neither tiresome nor difficult to operate (p<0.05). Participants generally liked the controller-free interface and smooth movement operations provided by the designed system.

TABLE 3. Descriptive statistics for the perceived suitability of the gesture set used for the exercises (5 = completely agree). *Significance with p<0.05 (one-sample t-test).

Item (N=5)                                            Mean   SD
Performing the gestures was fun*                      4.08   0.94
Performing the gestures was tiresome*                 2.05   0.56
Performing the gestures was easy*                     3.91   0.74
Performing the gestures was difficult*                2.07   0.56
Playing the game was fun*                             4.08   0.53
Playing the game was effective for rehabilitation*    4.11   0.62

IV. Conclusion and Future Work

The Kinect is expected to be a useful tool for home rehabilitation. Rehabilitation at home is more flexible, inexpensive, and convenient for users, and allows for more frequent repetition of exercises. This is becoming a popular approach in rehabilitation, and one that is fun for the patient. This preliminary study covers the user interface, a log-analysis reporting system, and an approach to complex movements. Currently, the Kinect SDK is supported by the NeoAxis and Neat game engines; these solutions offer 3D virtual training and provide an interesting form of rehabilitation. The advantage of a computer game is that it can be extended to provide rehabilitation functions, making the training activity more attractive and interesting. In conclusion, in this study we propose a system that uses the Kinect sensor to control a game through body posture and movement. It is expected to be interesting and effective in enhancing home rehabilitation with a body motion controller.

References

[1] What is virtual reality. http://www.virtual-reality-rehabilitation.com/a/virtualreality/what-is-virtual-reality [Online].
[2] National Institute for Health and Clinical Excellence. Rehabilitation after critical illness.
[3] D. P. Butler and K. Willett, "Wii-habilitation: Is there a role in trauma?" Injury, 2010.
[4] R. J. Stone, "Applications of virtual environments: an overview," in K. M. Stanney (ed.), Handbook of Virtual Environments. Mahwah, NJ: Erlbaum, 2002.
[5] S. Adamovich, "A virtual reality based exercise system for hand rehabilitation post-stroke," Presence, Special Issue on Virtual Rehabilitation, vol. 14, no. 2, pp. 161-174.
[6] A. De Mauro, "Virtual reality based rehabilitation and game technology," eHealth & Biomedical Applications, Vicomtech.
[7] K. LaBelle, "Evaluation of Kinect joint tracking for clinical and in-home stroke rehabilitation tools," thesis, 2011.
[8] Y.-J. Chang, S.-F. Chen, and J.-D. Huang, "A Kinect-based system for physical rehabilitation: A pilot study for young adults with motor disabilities," Research in Developmental Disabilities, vol. 32, no. 6, pp. 2566-2570, November-December 2011.
[9] A. Hayes, P. Dukes, and L. F. Hodges, "A virtual environment for post-stroke motor rehabilitation," Clemson University, Clemson.
[10] H. Bateni, "Changes in balance in older adults based on use of physical therapy vs the Wii Fit gaming system: a preliminary study," Physiotherapy, 2011.
[11] A. Rizzo, B. Lange, E. A. Suma, and M. Bolas, "Virtual reality and interactive digital game technology: New tools to address obesity and diabetes," Journal of Diabetes Science and Technology, vol. 5, no. 2, pp. 258-264, March 2011.
[12] D. Wilson, TechEye, November 2010. [Online].
[13] R. Boian et al., "Virtual reality-based system for ankle rehabilitation post stroke," Proceedings of the First International Workshop on Virtual Reality Rehabilitation, pp. 77-86, 2002.
[14] J. Webb and J. Ashley, Beginning Kinect Programming with the Microsoft Kinect SDK, 2012.
[15] B. Herbelin, J. Ciger, and A. L. Brooks, "Customization of gaming technology and prototyping of rehabilitation applications," 2008.
[16] P. Weiss, R. Kizony, U. Feintuch, and N. Katz, "Virtual reality in neurorehabilitation," Textbook of Neural Repair and Neurorehabilitation, vol. 2, pp. 182-197.