Advances in Motion Detection and Tracking of Human as a Target
J. Biju1, D. Shanthi2

1J. Biju, Assistant Professor, Department of Computer Science and Engineering, Theni Kammavar Sangam College of Technology, Theni, Tamilnadu, India.
2Dr. D. Shanthi, Professor and Head, Department of Computer Science and Engineering, PSNA College of Engineering & Technology, Dindigul, Tamilnadu, India.
Manuscript received on 28 August 2019. | Revised Manuscript received on 05 September 2019. | Manuscript published on 30 September 2019. | PP: 3470-3473 | Volume-8 Issue-11, September 2019. | Retrieval Number: K25610981119/2019©BEIESP | DOI: 10.35940/ijitee.K2561.0981119
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license.

Abstract: Motion estimation of a target is a major area of high computational complexity in video processing. It is the process of discovering the motion patterns that describe the transformation from one frame to the next in a video sequence. Because image data in a sequence remains largely unchanged between frames apart from the target's motion, it is reasonable to carry out motion estimation only where movement is present, and exploiting this statistical redundancy across an image sequence requires estimating that motion. Motion estimation is useful for improving video compression, for stereo correspondence, for object tracking, and for computing optical flow, and many precise methods have been proposed in the framework of one or more of these applications. Most motion estimation algorithms either operate directly in the image domain or rely on a similarity metric that measures how alike two pixels, or two patches of pixels, are. In this paper, a review of a variety of motion estimation techniques is presented.
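As a concrete illustration of the similarity-metric approach mentioned above, the following is a minimal sketch (not taken from the paper) of exhaustive block matching between two frames, using the sum of absolute differences (SAD) as the patch-similarity metric. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def block_match(prev, curr, block=8, search=4):
    """Illustrative exhaustive block matching (not the paper's method).

    For each `block`-sized patch of the current frame, search a window of
    +/- `search` pixels in the previous frame for the displacement that
    minimises the sum of absolute differences (SAD). Returns a dict
    mapping block origin (row, col) -> displacement (dy, dx) into `prev`.
    """
    h, w = curr.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block].astype(np.int32)
            best, best_sad = (0, 0), None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    # Skip candidate windows that fall outside the frame.
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    cand = prev[y:y + block, x:x + block].astype(np.int32)
                    sad = int(np.abs(target - cand).sum())
                    if best_sad is None or sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            vectors[(by, bx)] = best
    return vectors
```

The exhaustive search here is the simplest baseline; the faster search strategies surveyed in review papers of this kind (three-step search, diamond search, and the like) reduce the number of SAD evaluations while keeping the same similarity metric.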
Keywords: Optical flow, Target tracking, Segmentation, Motion detection, Representation of targets
Scope of the Article: Human Computer Interaction (HCI)