A Study on SURF Algorithm and Real-Time Tracking Objects Using Optical Flow




, pp. 233-237 http://dx.doi.org/10.14257/astl.2014.51.53

A Study on SURF Algorithm and Real-Time Tracking Objects Using Optical Flow

Giwoo Kim 1, Hye-Youn Lim 1 and Dae-Seong Kang 1
1 Department of Electronics Engineering, Dong-A University, 37 Nakdong-Daero 550beon-gil, Saha-gu, Busan, Korea
dskang@dau.ac.kr

Abstract. This paper presents the SURF (Speeded-Up Robust Features) algorithm and real-time object tracking using optical flow. SURF is a very fast algorithm for finding the features of an object image in real time. Using it, we compare the features of a reference object image with the features extracted from the live camera image and match the two sets. We then apply an optical flow algorithm to track the object's location, so the object is tracked accurately in real time whenever it appears in the camera's view. In this paper we use LK (Lucas-Kanade) optical flow, whose computational cost can be reduced by restricting it to a chosen area. With this combination we can track an object quickly and accurately in real time.

Keywords: SURF, Optical flow, Matching, Feature detection

1 Introduction

As terrorism has increased around the world in recent years, the demand for security-related services and software has grown in importance. For example, with the help of CCTV footage we can track and find a person or object in public places such as shopping malls; similarly, the black boxes of missing planes and ships are used to diagnose mechanical problems. Various techniques have been developed based on different approaches and applications. Here, we use the SURF and optical flow algorithms to find and track objects.

SURF is a robust local feature detector, first presented by Herbert Bay et al. in 2006, that can be used in computer vision tasks such as object recognition. It is partly inspired by the SIFT (Scale-Invariant Feature Transform) descriptor, and the standard version of SURF is several times faster than SIFT.
SURF is based on sums of 2D Haar wavelet responses and makes efficient use of integral images. It uses an integer approximation to the determinant-of-Hessian blob detector, which can be computed extremely quickly with an integral image (three integer operations per box sum). For its descriptor, it uses the sum of the Haar wavelet responses around the point of interest; again, these can be computed with the aid of the integral image.

Optical flow is the pattern of apparent motion of image objects between two consecutive frames, caused by the movement of the object or the camera. It is a 2D vector field in which each vector is a displacement showing the movement of points from the first frame to the second.

ISSN: 2287-1233 ASTL Copyright 2014 SERSC
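As an illustrative sketch (not the paper's implementation), the integral-image trick works as follows: after one pass builds a summed-area table, any rectangular box sum costs just three additions/subtractions, regardless of box size:

```python
import numpy as np

def integral_image(img):
    # Summed-area table with a zero border:
    # ii[y, x] = sum of img[0:y, 0:x]
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_sum(ii, y0, x0, y1, x1):
    # Sum of img[y0:y1, x0:x1] in three integer operations
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
```

The box filters approximating the Hessian are built from a few such box sums, which is why SURF's detection cost is independent of filter size.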

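To make the optical flow definition above concrete, here is a minimal single-window Lucas-Kanade step in NumPy (an illustrative sketch under the brightness-constancy assumption, not the authors' pyramidal implementation): it solves the small least-squares system built from the spatial and temporal gradients inside one window.

```python
import numpy as np

def lk_flow(prev, curr, y, x, win=7):
    # Estimate the displacement (dx, dy) of the window centered at
    # (y, x) between two frames by solving the Lucas-Kanade
    # least-squares system  A @ [dx, dy] = -It  over the window.
    h = win // 2
    p = prev.astype(np.float64)
    c = curr.astype(np.float64)
    # Spatial gradients (np.gradient returns d/dy first, then d/dx)
    Iy, Ix = np.gradient(p)
    It = c - p  # temporal gradient
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # [dx, dy]
```

On a horizontal intensity ramp shifted one pixel to the right, this recovers dx close to 1; a practical tracker repeats the step over an image pyramid to handle large motions.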
2 Related Theory

2.1 SURF Algorithm

The SURF algorithm is composed of three consecutive steps [1][2]. The first step is interest point detection, the second is building the descriptor associated with each interest point, and the last is descriptor matching. Similarly to the SIFT method, the first two steps rely on a scale-space representation and on first- and second-order differential operators. The originality of the SURF method is that these operations are speeded up by the use of an integral image and box-filter techniques. While a conventional scale space is obtained by convolving the initial image with Gaussians, the discrete box-space is obtained analogously by convolving the original image with box filters at several discrete sizes. In the detection step, the local maxima of a Hessian-like operator, the box Hessian operator, applied to the box-space are computed to select interest point candidates. These candidates are validated if their response is above a given threshold. Both the size and the location of each candidate are then refined by an iterative procedure that locally fits a quadratic function. Typically, a few hundred interest points are detected in a one-megapixel image.

The purpose of the second step is to build a descriptor that is invariant to viewpoint changes of the local neighborhood of the interest point. The location of the point in the box-space already provides scale and translation invariance. To achieve rotation invariance, a dominant orientation is defined from the local gradient orientation distribution, estimated with Haar wavelets. Using a spatial localization grid, a 64-dimensional descriptor is then built, corresponding to a local histogram of Haar wavelet responses.

Classically, the third step matches the descriptors of the two compared images.
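The matching step can be sketched as a brute-force nearest-neighbor search over the 64-dimensional descriptors. A minimal illustration follows (the ratio test and its 0.7 threshold are common additions from the SIFT literature, not stated in this paper):

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.7):
    # Brute-force matching by Euclidean distance: each descriptor in
    # desc_a is paired with its nearest neighbor in desc_b, and the
    # pair is kept only if the best distance is clearly smaller than
    # the second-best (Lowe's ratio test, an illustrative choice).
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        first, second = np.argsort(dists)[:2]
        if dists[first] < ratio * dists[second]:
            matches.append((i, int(first)))
    return matches
```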
Exhaustive comparison is performed by computing the Euclidean distance between all potential matching pairs.

2.2 Optical Flow Algorithm

Optical flow algorithms are usually used for tracking an object [3]. They divide into two types: dense optical flow and sparse optical flow. Dense optical flow is not easy to compute. For example, the interior of a uniformly colored object looks the same from the previous frame to the present frame; only the edges may change, and even then only those perpendicular to the direction of motion. As a result, dense methods must interpolate between the points that are easily tracked in order to solve for the remaining points. This motivates the alternative option, sparse optical flow [4]. Algorithms of this nature rely on some means of specifying beforehand the subset of points that are to be tracked. Our method uses the LK (Lucas-Kanade) algorithm with an image pyramid. The LK algorithm rests on three assumptions. The first is brightness constancy: a pixel from the image of an object in the scene does not change in appearance as it moves from frame to frame. The second is temporal persistence: the image motion of a surface patch changes slowly in time. The third is spatial coherence: neighboring points belong to the same surface, have similar motion, and project to nearby points in the image.

3 Proposed Algorithm

We propose an algorithm that combines SURF with optical flow: SURF finds an object in real time, and optical flow makes it possible to track the object in real time. Fig. 1 shows the proposed algorithm. First, through the SURF feature detector, we extract features in real time and compare them with the SURF features of the reference object image. After detection, the descriptor and interest-point processing make it possible to detect the object. An object is considered matched when the reference image features and the real-time image features share more than fifty common points.

Fig. 1. The diagram of the proposed system algorithm.

Fig. 2. The result of matching based on SURF. (a) shows the 5th frame and (d) the 15th frame; the object moved from left to right.

Fig. 3. The results of applying optical flow. (a) is optical flow in real time; (b) and (c) show the object moving; (d) shows which pixels moved.

Fig. 2 shows the SURF matching result. We can see the object moving and its coordinates changing. After matching, we record the time and coordinates, which makes it possible to compare the first and last images and thus to calculate the object's location and direction. At the same time, the optical flow is also calculated. To visualize it, we draw flow lines as in Fig. 3, where (a) shows optical flow in real time, (b) is the previous image and (c) the present image, and (d) shows which pixels moved.

Fig. 4 presents the final result. In each figure, the object's direction is indicated with arrows. When we focus on the object, almost all the arrows lie near the object area, while the background area contains few arrows; this means the background is fixed in the camera's view, which makes it possible to obtain the object's direction. This result shows that by combining the SURF and optical flow algorithms we can track an object quickly and accurately.

Fig. 4. The final result of the proposed method. (a) presents the object moving from south to east; (b) presents the object moving from south to west.

4 Conclusion and Future Research

In this paper, we proposed an algorithm that combines the SURF and optical flow algorithms in real time. In our real-time experiments, we could obtain an object's location and direction quickly and accurately. In these experiments the object was tracked with a webcam against a background that stayed the same at all times. In the future, we aim to track objects while the camera is moving, so that the background moves along with the object.

Acknowledgments. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2011-0011735).

References

1. Bay, H., et al.: Speeded-up robust features (SURF). Computer Vision and Image Understanding 110(3), 346--359 (2008)
2. Gossow, D., Decker, P., Paulus, D.: An evaluation of open source SURF implementations. In: RoboCup 2010: Robot Soccer World Cup XIV, pp. 169--179. Springer, Berlin Heidelberg (2011)
3. Horn, B.K.P., Schunck, B.G.: Determining optical flow. In: 1981 Technical Symposium East. International Society for Optics and Photonics (1981)

4. Bruhn, A., Weickert, J., Schnörr, C.: Lucas/Kanade meets Horn/Schunck: combining local and global optic flow methods. International Journal of Computer Vision 61(3), 211--231 (2005)