Capped ℓp-Norm LDA for Outliers Robust Dimension Reduction

Zheng Wang, Feiping Nie, Canyu Zhang, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

16 Citations (Scopus)

Abstract

Linear discriminant analysis (LDA) is an effective strategy for addressing the long-standing 'curse of dimensionality', which poses many obstacles to the storage and analysis of high-dimensional data. However, the learned projections are easily distorted, especially when the training set contains outlier samples whose distribution deviates from that of the data as a whole. In many real-world applications, outlier samples contaminated by noisy signals or spot-like corruption degrade classification and clustering performance. To address this issue, we propose a novel capped ℓp-norm LDA model for dimension reduction that is specifically robust to outliers. The proposed method integrates a capped ℓp-norm loss into the objective, which not only suppresses mild outliers but also works well even when the training set is seriously contaminated. Furthermore, we derive an alternating iterative re-weighted optimization algorithm to minimize the proposed capped ℓp-norm objective, with rigorous convergence proofs. Extensive experiments on synthetic and real-world datasets demonstrate the robustness of the proposed method against outliers.
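For intuition, capped-norm losses in the robust learning literature truncate each sample's contribution to the objective at a fixed threshold. A minimal sketch of such a loss, assuming a projection matrix W, the class mean μ_{c_i} of the class containing sample x_i, and a cap parameter ε (these symbols are illustrative; the paper's exact objective may differ), is

    ℓ(x_i) = min( ‖Wᵀ(x_i − μ_{c_i})‖_p^p , ε ),

so any sample whose projected within-class residual exceeds ε contributes only the constant ε, preventing heavily contaminated samples from dominating the solution. An iterative re-weighted scheme of the kind the abstract describes would then alternate between solving a weighted LDA-type subproblem and updating per-sample weights derived from the current residuals, with capped samples effectively receiving zero weight.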

Original language: English
Article number: 9146276
Pages (from-to): 1315-1319
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 27
DOI
Publication status: Published - 2020
