TY - JOUR
T1 - Robust face tracking via collaboration of generic and specific models
AU - Wang, Peng
AU - Ji, Qiang
N1 - Funding Information:
Manuscript received February 5, 2007; revised March 4, 2008. This work was supported in part by the Rensselaer Polytechnic Institute from the Technical Support Working Group (TSWG) of the United States under Grant N41756-03-C-4028. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Zoltan Kato. P. Wang is with Siemens Corporate Research, Princeton, NJ 08540 USA. Q. Ji is with the Department of Electrical, Computer, and Systems Engineering, Rensselaer Polytechnic Institute, Troy, NY 12180 USA. Digital Object Identifier 10.1109/TIP.2008.924287
PY - 2008/7
Y1 - 2008/7
N2 - Significant appearance changes of objects under different orientations could cause loss of tracking, "drifting." In this paper, we present a collaborative tracking framework to robustly track faces under large pose and expression changes and to learn their appearance models online. The collaborative tracking framework probabilistically combines measurements from an offline-trained generic face model with measurements from online-learned specific face appearance models in a dynamic Bayesian network. In this framework, generic face models provide the knowledge of the whole face class, while specific face models provide information on individual faces being tracked. Their combination, therefore, provides robust measurements for multiview face tracking. We introduce a mixture of probabilistic principal component analysis (MPPCA) model to represent the appearance of a specific face under multiple views, and we also present an online EM algorithm to incrementally update the MPPCA model using tracking results. Experimental results demonstrate that the collaborative tracking and online learning methods can handle large pose changes and are robust to distractions from the background.
AB - Significant appearance changes of objects under different orientations could cause loss of tracking, "drifting." In this paper, we present a collaborative tracking framework to robustly track faces under large pose and expression changes and to learn their appearance models online. The collaborative tracking framework probabilistically combines measurements from an offline-trained generic face model with measurements from online-learned specific face appearance models in a dynamic Bayesian network. In this framework, generic face models provide the knowledge of the whole face class, while specific face models provide information on individual faces being tracked. Their combination, therefore, provides robust measurements for multiview face tracking. We introduce a mixture of probabilistic principal component analysis (MPPCA) model to represent the appearance of a specific face under multiple views, and we also present an online EM algorithm to incrementally update the MPPCA model using tracking results. Experimental results demonstrate that the collaborative tracking and online learning methods can handle large pose changes and are robust to distractions from the background.
KW - Collaborative tracking
KW - Generic face model
KW - Mixture of probabilistic principal component analysis (MPPCA)
KW - Multiview face tracking
KW - Online learning
UR - https://www.scopus.com/pages/publications/45949094205
U2 - 10.1109/TIP.2008.924287
DO - 10.1109/TIP.2008.924287
M3 - Article
C2 - 18586626
AN - SCOPUS:45949094205
SN - 1057-7149
VL - 17
SP - 1189
EP - 1199
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
IS - 7
ER -