FINGER TRACKING VIRTUAL MOUSE




The International Journal of Computer Science & Applications (TIJCSA)
Volume 2, No. 03, ISSN 2278-1080
RESEARCH PAPER - Available Online at http://www.journalofcomputerscience.com/

FINGER TRACKING VIRTUAL MOUSE

Shweta Jain, School of Computer Science and Engineering, VIT University, Vellore, India (jain.shweta45@gmail.com)
Shrikant Ujjainkar, School of Computer Science and Engineering, VIT University, Vellore, India (shrikant41@gmail.com)
Prashant Mandloi, School of Computer Science and Engineering, VIT University, Vellore, India (nrprashant123@gmail.com)
Jaisankar N, School of Computer Science and Engineering, VIT University, Vellore, India (njaisankar@vit.ac.in)

Abstract: In this paper, a virtual mouse based on tracking the movement of a finger is presented. The idea is for users to control their computer simply by moving their fingers over any real object or in the air. An intensity-based approach is used to detect an arbitrarily shaped, uniformly colored 2D area on which the hand operates, and the fingertip is then detected and tracked effectively. A fast and robust method for tracing the position of the fingertip is proposed. The method consists of five steps. First, the raw RGB image is converted into the YUV color format. Second, the skin-color pixels of the fingertip are detected by pixel classification using the chrominance component of the input from a webcam with a CMOS image sensor. Third, undetected areas are removed to reinforce the regions of skin-color pixels. Fourth, the window is partitioned to minimize computation. Fifth, a fingertip tracking algorithm is applied to locate the fingertip position. A clicking action is also implemented via a specific movement. The method is used to implement a virtual mouse on a Windows-based operating system.

2013, http://www.journalofcomputerscience.com - TIJCSA All Rights Reserved

Keywords: Virtual mouse, YUV color format, RGB raw image.

1. Introduction

In the field of computer vision, many prototypes of intelligent vision-based interface systems have been developed that are more intuitive and cost-effective than conventional interface devices such as the mouse and the keyboard. This paper describes such a system, which detects and tracks the fingertip in a 2D plane, and then demonstrates its application as a virtual mouse. The mouse interface of a computer system or other electronic appliance is traditionally based on direct contact with some electronic device. The freedom of moving fingers through the air to control a computer is frequently imagined in popular movies. The main objective of this work is to design a system that allows users to operate their mouse through the movement of their fingers in the air or on a handmade panel. It provides an alternative solution for convenient device control, which encourages the application of virtual devices. A main problem in designing a virtual mouse is accurate positioning. In this paper we use the fingertip as the pointer of the mouse, and the movement of the finger is tracked by a fast and robust detection algorithm. Furthermore, the mouse clicking functions are also implemented. All these algorithms are realized on a Windows system.

2. Related Work

The conventional mouse has many flaws: existing systems consist of complex hardware, and finger-mouse problems are encountered repeatedly. Other techniques, such as hMouse [5], have also been proposed, but the finger tracking virtual mouse is a comparatively more convenient technique. In the work on hMouse [5], the system is implemented in C++ on a standard PC, using a typical web camera mounted on top of the monitor as video input and VisGenie as the video-based information visualization platform. It processes 11 to 13 fps on a desktop with a 2.80 GHz Pentium 4 CPU and 1.0 GB RAM.
The pure detection part processes 13 to 14 fps. It works on the Windows XP operating system with 320 x 420 video frame sizes in RGB channels. The average CPU usage is 35% to 43%. In the work at [10], a prototype of a vision-based fingertip detection and tracking system on a 2D plane, based on the intensity of captured images, was developed. This approach, combined with the method of grid sampling, achieves speeds up to 30 frames per second with images of resolution 640 x 480 pixels. The system uses an arbitrarily shaped panel of uniform color and a webcam, objects that are inexpensive and easily available in present-day organizations. In the work at [6], a new interscopic multi-touch paradigm that combines traditional 2D interaction performed in monoscopic mode with 3D interaction and stereoscopic projection is introduced. The challenges and potentials of multi-touch interfaces for interacting with interscopic data are presented, along with two different

application scenarios, i.e., city & landscape planning and medical exploration, that can be interfaced with user interfaces based on iMUTS. Both applications highlighted two new multi-touch interaction metaphors that benefit from multi-touch in combination with monoscopic or stereoscopic projection: the Window-on-the-World navigation method and direct volume deformation. The work at [2] argues that, instead of focusing on analogue methods to compensate for offset variation, research in logarithmic sensors should aim to minimise bias variation, so that offset (or offset and gain) variation suffices to model FPN, and to minimise average bias, so that colour rendition in dim lighting improves. As the mask depends on the spectral responses of photodiodes and overlaid filters and does not seem to vary across pixels, it may be estimated once for a process (a common practice with conventional linear cameras) rather than for every sensor. In the work at [8], a random selection of 20 training samples for each gesture, drawn from the entire set of all 10 users, is used to train HMMs, and the remaining samples are used to determine the overall classification rate of the system. The classification rates for gestures using two, three, four and five fingers are 93.8%, 91.4%, 92.1% and 88.8%, for an average gesture recognition rate of 91.5%, and the result is statistically significant (p < .0001). These results show that the technique classifies gestures with reasonable accuracy for practical use. As the HMM can be trained using samples from more than one user, it can also be stored within an application or shared across users to avoid re-training.

3. Proposed System

3.1 RGB to YUV Conversion

The first stage of this YUV finger tracking algorithm is to obtain the YUV data, so the first step is to convert the raw RGB image data to YUV. There is a standard transformation from RGB to YUV [1]. Luminance and color-difference coded signals (YUV) are used by many component video systems.
Conversion from RGB (Red-Green-Blue) is necessary to feed a device requiring YUV input. There are many converters, but they use only constant transformation coefficients and are optimized by only one parameter, namely the maximum input data frequency.

3.2 Pixel Classification

The first stage of the algorithm involves the use of color information in a fast, low-level region classification process. The aim is to classify the pixels of the input image into skin color and non-skin color. To do so, we have devised a skin-color reference region in YCrCb color space. A skin-color region can be identified by the presence of a certain set of chrominance (i.e., Cr and Cb) values distributed narrowly and consistently in the YCrCb color space. With this skin-color reference region, pixel classification can begin. Since we utilize only the color information, the classification requires only the chrominance component of the input image. The calibration technique described in [2] is used.

3.3 Removal of Noise

This stage considers the image produced by the previous stage to contain a region corrupted by noise. The noise may appear as small holes in the finger region, due to undetected finger features such as edges or shadow areas, or as objects with a skin-color appearance in the background scene. Therefore, this stage performs simple morphological operations: dilation to fill in any small holes in the selected area and erosion to remove any small objects in the background area. The intention is not necessarily to remove the noise entirely but to reduce its amount and size. This is derived from the observation that the finger color is very uniform, so the skin-color pixels belonging to the finger region will appear in a large cluster, while the skin-color pixels belonging to the background may appear as large clusters or small isolated objects. There are many works on the restoration of images corrupted by impulse noise. The median filter [9] was once the most popular nonlinear filter for removing impulse noise, because of its good denoising power and computational efficiency. An algorithm to determine the median of noise-free pixels in the neighborhood of a pixel of interest is now presented. The median of the noise-free pixels is used to modify the pixel corrupted by impulse noise. This median is computed separately for each color component in the following steps:

Step 1: Take a window of size a x a centered on the pixel of interest in the corrupted image, where a is any natural number.
Step 2: Arrange all the pixels of the window as a vector. Sort the vector in increasing order and compute the median of the sorted vector.
Step 3: Calculate the difference between each window pixel and the median of the vector.
Step 4: Collect all the window pixels whose differences are less than or equal to a parameter into a new vector.
Step 5: Sort the new vector and obtain the median of the sorted vector.
3.4 Window Searching

Color comparison and finger tracking over all pixels are computationally expensive. Owing to the limited processing power of the application processor, an exhaustive search of the whole frame is impossible, so it is necessary to reduce the search area. This is done by defining a certain search region around the last detected position of the finger. Another benefit of this method is a reduction in detection error. The project first assumes that the displacement of the finger between two consecutive frames will not exceed a particular search region. Any movement outside the search region is not treated as a displacement of the finger; in other words, it is neglected. Therefore, detection errors can be eliminated. Another assumption is that the time for a finger to move from one corner to the diagonally opposite corner will not exceed a preset time, so that the search region within that time covers the displacement of the finger. If the search window is small, fast movement will not work, as the finger is no longer inside the window to be found. For normal-speed movement, the mouse pointer can still follow the finger, meaning that the finger is still moving within the search window. If the search window is too large, there are some disadvantages. One is the increased processing time and resource usage. Another is an increase in the

error detection possibility. The advantage is that fast movement can also be detected. Since the recognition is still under improvement, the window size can be optimized once the recognition part is fully developed.

3.5 Finger Tracking

This method searches for the fingertip starting from the top left of the window. When the finger appears in the window, the top of the fingertip is tracked and its position is returned as the mouse coordinate. As with a mouse, a click function is necessary for control. This algorithm uses Frustrated Total Internal Reflection (FTIR) technology [4] and the gestures described in [3, 8]. Using a finger as a mouse, the click function is defined by a motion; the clicking motion is checked by GestureAnalyzer [3]. The new coordinates of the tip location are compared with the previous coordinates, and the difference between them is added to the previous coordinates. This results in mouse movement. The speed of the mouse can be controlled by a multiplying factor T. Mathematically:

Q(x,t) = T * (R(x,t) - R(x,t-1)) + Q(x,t-1)
Q(y,t) = T * (R(y,t) - R(y,t-1)) + Q(y,t-1)

where Q denotes the location of the mouse pointer on screen, R denotes the coordinates of the fingertip, x and y denote the horizontal and vertical axes respectively, T is the multiplying factor, and t denotes the present time. The following triggers are used to provide multi-touch interaction to the application:

onTouchEnter: { cursorId, pos (new position) }
onTouchMove: { cursorId, pos, moveVector, accelerationVector }
onTouchLeave: { cursorId }

Here, cursorId is a sequential number generated when the user touches and destroyed when the finger is lifted. The events of clicking the left and right virtual mouse buttons [5] are handled by the following C++ code, with the threshold of the button-click trigger set at roll angles of +/-0.6 radian.
    if (OnTracking) {
        CalculateRollAngle();
        if (RollAngle < -0.6) {
            GetCursorPos( &lMousePos );
            ::mouse_event(MOUSEEVENTF_LEFTDOWN, lMousePos.x, lMousePos.y, 0, 0);
            ::mouse_event(MOUSEEVENTF_LEFTUP, lMousePos.x, lMousePos.y, 0, 0);
        } else if (RollAngle > 0.6) {
            GetCursorPos( &rMousePos );
            ::mouse_event(MOUSEEVENTF_RIGHTDOWN, rMousePos.x, rMousePos.y, 0, 0);
            ::mouse_event(MOUSEEVENTF_RIGHTUP, rMousePos.x, rMousePos.y, 0, 0);
        }
    }

4. Results

This paper presented a new approach that uses the free movement of a finger in the air to control the mouse pointer. It consists of several image processing algorithms, including RGB to YUV conversion, pixel classification, window searching and a finger tracking method. The image processing is based on the YUV color space: pixels belonging to skin color exhibit similar chrominance values in YUV space, so with this skin-color map the pixels of the input image can be classified into skin color and non-skin color. Consequently, the resulting image can be used for finger tracking. A mouse coordinate is extracted after the fingertip has been tracked, and a clicking-recognition algorithm is also designed. A prototype device was built with a webcam; it can be connected to a computer and act as a real mouse. The prominence of the work is that the whole system runs on an embedded system, which is far cheaper and highly mobile. The design can be implemented in a number of other products, such as those for use by disabled people. Future work on the system can include the integration of a character-recognition system, and various extensions and improvements to the fingertip detection and tracking system can be explored. The programming is done in OpenCV and the application is developed. The screenshots are as follows:

Figure 1: Property Sheet dialog box in which the values for the webcam are provided and the coordinates of pixels are displayed.

Figure 2: The windows for Capture Device and Mono, in which a blob is detected as the fingertip and converted to black and white.

Figure 3: The windows for smoothing and background removal, in which the blob is smoothed by rearranging pixels and the noise is removed.

Figure 4: The windows for brightness, contrast and rectify, in which the brightness and contrast have been set, after which rectification is done.

Figure 5: The calibration window in which the pixels are represented.

Figure 6: The final values of all the pixel coordinates, represented in an XML sheet.

References:

[1] V. Hlukhov and A. Melnik, "The Real-Time Digital Color Converter Core for Xilinx FPGA".
[2] D. Joseph and S. Collins, "Modelling, Calibration and Rendition of Colour Logarithmic CMOS Image Sensors", Department of Engineering Science, University of Oxford, OX1 3PJ, United Kingdom.
[3] M. Thornlund, "Gesture Analyzing for Multi-Touch Interfaces".
[4] J. Han, "Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection", Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology, 2005.
[5] Y. Fu and T. S. Huang, "hMouse: Head Tracking Driven Virtual Computer Mouse", Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, 405 North Mathews Avenue, Urbana, IL 61801.
[6] J. Schöning, F. Steinicke, A. Krüger and K. Hinrichs, "Poster: Interscopic Multi-Touch Surfaces: Using Bimanual Interaction for Intuitive Manipulation of Spatial Data".
[7] T. E. Hansen, "Multi-Touch User Interfaces", Department of Computer Science, University of Iowa.

[8] S. Damaraju and A. Kerne, "Multitouch Gesture Learning and Recognition System", Interface Ecology Lab, Texas A&M University.
[9] V. R. Vijay Kumar, S. Manikandan, D. Ebenezer, P. T. Vanathi and P. Kanagasabapathy, "High Density Impulse Noise Removal in Color Images Using Median Controlled Adaptive Recursive Weighted Median Filter".
[10] A. Sanghi, H. Arora, K. Gupta and V. B. Vats, "A Fingertip Detection and Tracking System as a Virtual Mouse, a Signature Input Device and an Application Selector", Proceedings of the 7th International Caribbean Conference on Devices, Circuits and Systems.
