
Gait Recognition and Carried Object Detection

X. Pergin Sheni, Dr. D. Sharmila

Abstract


The term gait recognition typically refers to the identification of people in image sequences by the way they walk. Gait is determined by the physical characteristics of each individual and is therefore believed to be as unique to a person as a fingerprint. Gait is also one of the few biometrics that can be measured at a distance, which makes it useful in surveillance applications. In this paper, our objective is to develop robust methods for extracting discriminant gait features automatically and passively from low-resolution video. We also determine whether the person is carrying an object.
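This abstract does not give implementation details, but as an illustration of one common way to extract a discriminant gait feature passively from low-resolution video, the Python/OpenCV sketch below averages size-normalized binary silhouettes into a gait energy image (GEI). It assumes OpenCV-style background subtraction; the video path, frame size, and threshold are illustrative placeholders, not the authors' method. Deviations of an observed silhouette from such an averaged walking template are also one common cue for judging whether a subject is carrying an object.

import cv2
import numpy as np

def gait_energy_image(video_path, size=(128, 64)):
    """Compute a simple gait energy image (GEI) by averaging
    size-normalized binary silhouettes over a walking sequence.
    `size` is (rows, cols) of the normalized silhouette."""
    cap = cv2.VideoCapture(video_path)
    bg_subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    accumulator = np.zeros(size, dtype=np.float64)
    count = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Foreground mask from background subtraction, binarized
        mask = bg_subtractor.apply(frame)
        _, silhouette = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
        # Keep only the largest foreground blob (the walking person)
        contours, _ = cv2.findContours(silhouette, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        crop = silhouette[y:y + h, x:x + w]
        # Resize so silhouettes align across frames (dsize is width, height)
        crop = cv2.resize(crop, (size[1], size[0]))
        accumulator += crop / 255.0
        count += 1
    cap.release()
    return accumulator / max(count, 1)

# Example usage with a hypothetical clip:
# gei = gait_energy_image("walking_sequence.avi")

The resulting GEI can be flattened into a feature vector and compared across subjects (for example with nearest-neighbour matching after dimensionality reduction); this is only one of several silhouette-based feature choices.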

Keywords


biometrics, surveillance applications, keystroke dynamics





This work is licensed under a Creative Commons Attribution 3.0 License.