Regular Paper

[OPTICAL REVIEW Vol. 21, No. 3 (2014) 304-312]
© 2014 The Japan Society of Applied Physics

Eigenspace-Based Tracking for Feature Points

Chen PENG*, Qian CHEN, and Wei-xian QIAN

Jiangsu Key Laboratory of Spectral Imaging and Intelligent Sense, Nanjing 210094, China

(Received September 24, 2013; Revised February 21, 2014; Accepted March 24, 2014)

Feature point tracking deals with image streams that change over time. Most existing feature point tracking algorithms consider only two adjacent frames at a time, discarding the feature information accumulated over previous frames. In this paper, we present a new eigenspace-based tracking method that learns an eigenspace representation of training features online and locates the target feature point with a Gauss–Newton-style search method. A coarse-to-fine processing strategy is introduced to handle large affine transformations. Several simulations and experiments on real images demonstrate the effectiveness of the proposed feature tracking algorithm under large pose changes and temporary occlusions.

Key words: feature points, visual tracking, eigenspace methods, occlusion
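The eigenspace representation referred to above is, at its core, a PCA subspace of the tracked feature's appearance: candidate patches are judged by how well the learned basis reconstructs them. As a rough illustrative sketch only (a batch formulation, not the paper's online update or Gauss–Newton search; all function names are hypothetical), the idea might look like:

```python
import numpy as np

def learn_eigenspace(patches, k=5):
    """Learn a k-dimensional eigenspace from vectorized feature patches.

    patches: (n_patches, n_pixels) array of training patches.
    Returns the mean patch and the top-k principal directions.
    """
    mean = patches.mean(axis=0)
    centered = patches - mean
    # SVD of the centered data; rows of vt are principal directions
    # sorted by decreasing singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]

def reconstruction_error(patch, mean, basis):
    """Distance of a candidate patch from the learned eigenspace.

    A small error means the patch is well explained by the learned
    appearance model; the tracker would prefer such candidates.
    """
    c = patch - mean
    coeffs = basis @ c          # project onto the subspace
    recon = basis.T @ coeffs    # reconstruct from the projection
    return np.linalg.norm(c - recon)
```

In the online setting described in the paper, the basis would instead be updated incrementally as new frames arrive, so that the model adapts to gradual appearance change while temporary occlusions (which reconstruct poorly) can be detected and rejected.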


*E-mail address: 309040626@njust.edu.cn
