Capped ℓp-Norm LDA for Outliers Robust Dimension Reduction

Zheng Wang, Feiping Nie, Canyu Zhang, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Linear discriminant analysis (LDA) is an effective strategy for the long-standing 'curse of dimensionality' problem, which poses many obstacles to the storage and analysis of high-dimensional data. However, the learned projections are easily corrupted, especially when the training set contains outlier samples whose distribution deviates from the global one. In many real-world applications, outlier samples contaminated by noisy signals or spot-like corruption degrade classification and clustering performance. To address this issue, we propose a novel capped ℓp-norm LDA model for dimension reduction that is specifically robust against outliers. The proposed method integrates a capped ℓp-norm loss into the objective, which not only suppresses light outliers but also works well even when the training set is seriously contaminated. Furthermore, we derive an alternating, iteratively re-weighted optimization algorithm to minimize the proposed capped ℓp-norm objective, together with rigorous convergence proofs. Extensive experiments on synthetic and real-world datasets demonstrate the robustness of the proposed method against outliers.
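Only the abstract is available here, but the core idea admits a short sketch. Below is a minimal, illustrative Python implementation of one way an iteratively re-weighted LDA with a capped ℓp-norm loss could be organized: samples whose per-sample loss hits the cap receive zero weight and stop influencing the scatter matrices. The function names (`capped_lp_weights`, `capped_lp_lda`), the choice of residual, and the specific update rules are assumptions made for illustration, not the authors' published algorithm.

```python
import numpy as np
from scipy.linalg import eigh


def capped_lp_weights(residuals, p=1.0, eps=1.0):
    """Per-sample IRW weights for a capped ell_p loss min(r^p, eps).

    Samples whose loss reaches the cap (likely outliers) get weight 0,
    so they no longer influence the scatter matrices. (Illustrative form.)
    """
    r = np.maximum(residuals, 1e-8)   # avoid division by zero
    w = 0.5 * p * r ** (p - 2.0)      # standard IRW weight for r^p
    w[r ** p >= eps] = 0.0            # cap: drop heavy outliers entirely
    return w


def capped_lp_lda(X, y, k, p=1.0, eps=1.0, n_iter=20):
    """Sketch of an iteratively re-weighted, outlier-robust LDA.

    X: (n, d) data, y: (n,) integer labels, k: target dimension.
    Returns a (d, k) projection matrix W. Hypothetical formulation.
    """
    n, d = X.shape
    classes = np.unique(y)
    w = np.ones(n)                    # start with uniform sample weights
    W = np.eye(d)[:, :k]
    for _ in range(n_iter):
        # Weighted within-class (Sw) and between-class (Sb) scatter.
        Sw = 1e-6 * np.eye(d)         # small ridge for numerical stability
        Sb = np.zeros((d, d))
        mu = np.average(X, axis=0, weights=w) if w.sum() > 0 else X.mean(0)
        for c in classes:
            idx = y == c
            wc = w[idx]
            if wc.sum() == 0:
                continue              # whole class capped out this round
            mc = np.average(X[idx], axis=0, weights=wc)
            Xc = X[idx] - mc
            Sw += (Xc * wc[:, None]).T @ Xc
            Sb += wc.sum() * np.outer(mc - mu, mc - mu)
        # Projection: top-k generalized eigenvectors of (Sb, Sw).
        evals, evecs = eigh(Sb, Sw)
        W = evecs[:, ::-1][:, :k]
        # Re-weight by each sample's residual to its projected class mean.
        res = np.empty(n)
        for c in classes:
            idx = y == c
            mc = np.average(X[idx], axis=0,
                            weights=np.maximum(w[idx], 1e-8))
            res[idx] = np.linalg.norm((X[idx] - mc) @ W, axis=1)
        w = capped_lp_weights(res, p=p, eps=eps)
    return W
```

With p = 1 this behaves like an ℓ1-style robust LDA until a sample's loss exceeds eps, at which point the cap removes it outright; both hyperparameters would need tuning in practice.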

Original language: English
Article number: 9146276
Pages (from-to): 1315-1319
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 27
DOIs
State: Published - 2020

Keywords

  • capped ℓp-norm
  • image classification
  • non-convex optimization
  • Robust dimension reduction
