Optimal mean two-dimensional principal component analysis with F-norm minimization

Qianqian Wang, Quanxue Gao, Xinbo Gao, Feiping Nie

Research output: Contribution to journal › Article › peer-review

35 Scopus citations

Abstract

Two-dimensional principal component analysis (2DPCA) employs the squared F-norm as the distance metric for feature extraction and is widely used in pattern analysis and recognition, especially face image analysis. However, it is sensitive to outliers because the squared F-norm greatly amplifies their influence in the criterion function. To handle this problem, we propose a robust formulation of 2DPCA, namely optimal mean 2DPCA with F-norm minimization (OMF-2DPCA). In OMF-2DPCA, the distance in the spatial (attribute) dimensions is measured by the F-norm, while the summation over different data points uses the 1-norm. Moreover, we center the data using an optimized mean rather than a fixed mean, which further improves the robustness of the method. To solve OMF-2DPCA, we propose a fast iterative algorithm that has a closed-form solution in each iteration. Experimental results on face image databases demonstrate its effectiveness and advantages.
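
The abstract describes the method only at a high level. The Python sketch below illustrates one plausible reading of it: an iteratively reweighted scheme in which each sample's F-norm residual induces a weight (here 1/(2·||E_i||_F)), and each iteration updates the mean and the projection in closed form. The function name omf_2dpca, the exact reweighting rule, and the eigen-decomposition step are illustrative assumptions drawn from the abstract, not the authors' published algorithm.

    import numpy as np

    def omf_2dpca(X, k, n_iter=50, eps=1e-8):
        # Illustrative sketch only (not the authors' exact algorithm).
        # X: array of shape (n, h, w) holding n image matrices.
        # k: number of projection directions (columns of W).
        # Returns an orthonormal projection W (w x k) and a mean matrix b (h x w).
        n, h, w = X.shape
        weights = np.ones(n)                          # per-sample weights induced by the 1-norm summation
        W = np.linalg.qr(np.random.randn(w, k))[0]    # random orthonormal initialization
        for _ in range(n_iter):
            # optimized (weighted) mean under the current weights
            b = np.tensordot(weights, X, axes=(0, 0)) / weights.sum()
            Xc = X - b
            # weighted scatter matrix over the attribute dimension
            S = sum(wi * (Ai.T @ Ai) for wi, Ai in zip(weights, Xc))
            # closed-form step: top-k eigenvectors of the weighted scatter
            evals, evecs = np.linalg.eigh(S)
            W = evecs[:, np.argsort(evals)[::-1][:k]]
            # per-sample reconstruction residual measured by the F-norm (not squared)
            res = np.array([np.linalg.norm(Ai - Ai @ W @ W.T) for Ai in Xc])
            # reweighting: samples with large residuals (likely outliers) get small weights
            weights = 1.0 / (2.0 * np.maximum(res, eps))
        return W, b

Used on a stack of face images, e.g. W, b = omf_2dpca(images, k=10), the projected features would be (X_i - b) @ W; the down-weighting of large-residual samples is what limits the influence of outliers relative to squared-F-norm 2DPCA.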

Original language: English
Pages (from-to): 286-294
Number of pages: 9
Journal: Pattern Recognition
Volume: 68
DOIs
State: Published - 1 Aug 2017
Externally published: Yes

Keywords

  • 2DPCA
  • Dimensionality reduction
  • F-norm
  • Optimized mean

