Max–Min Robust Principal Component Analysis

Sisi Wang, Feiping Nie, Zheng Wang, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Principal Component Analysis (PCA) is a powerful unsupervised dimensionality reduction algorithm whose squared ℓ2-norm formulation cleverly connects reconstruction error and projection variance; existing improved PCA methods consider only one of the two, which limits their performance. To alleviate this problem, we propose a novel Max–Min Robust Principal Component Analysis via binary weight, which ingeniously combines reconstruction error and projection variance to learn the projection matrix more accurately, and uses the ℓ2-norm as the evaluation criterion to make the model rotation invariant. In addition, we design binary weights that remove outliers, improving the robustness of the model and endowing it with the ability to detect anomalies. We then develop an efficient iterative optimization algorithm to solve the resulting problem. Extensive experimental results show that our model outperforms related state-of-the-art PCA methods.
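The record does not reproduce the paper's objective or update rules, but the abstract's description (a projection learned from both reconstruction error and projection variance, binary weights that drop outliers, and an alternating iterative solver) suggests a scheme along the following lines. This is a minimal illustrative sketch under assumed choices: the function name `robust_pca_binary_weights`, the per-sample ℓ2 reconstruction error used as the outlier score, and the `n_outliers` parameter are placeholders for illustration, not the authors' actual formulation. It relies only on the standard fact that, for an orthonormal projection W, minimizing reconstruction error on the kept samples is equivalent to maximizing their projected variance.

```python
import numpy as np

def robust_pca_binary_weights(X, k, n_outliers, n_iters=30, seed=0):
    """Illustrative alternating scheme (not the paper's exact algorithm):
    learn a k-dimensional projection while a binary weight vector excludes
    the samples currently flagged as outliers.

    X : (n, d) data matrix, rows are samples.
    k : target dimensionality.
    n_outliers : assumed number of samples to discard per iteration.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    s = np.ones(n, dtype=bool)                        # binary sample weights (True = keep)
    W = np.linalg.qr(rng.standard_normal((d, k)))[0]  # random orthonormal initialization

    for _ in range(n_iters):
        # Step 1: fix the binary weights, update the projection matrix.
        # For orthonormal W, ||X - X W W^T||_F^2 = ||X||_F^2 - ||X W||_F^2,
        # so minimizing reconstruction error on the kept samples equals
        # maximizing their projected variance; W is given by the top-k
        # eigenvectors of their covariance.
        Xk = X[s]
        mu = Xk.mean(axis=0)
        C = (Xk - mu).T @ (Xk - mu)
        eigvals, eigvecs = np.linalg.eigh(C)          # ascending eigenvalues
        W = eigvecs[:, -k:]                           # top-k eigenvectors

        # Step 2: fix W, update the binary weights by flagging the samples
        # with the largest l2 reconstruction error as outliers.
        R = (X - mu) - (X - mu) @ W @ W.T
        err = np.linalg.norm(R, axis=1)
        s = np.ones(n, dtype=bool)
        s[np.argsort(err)[-n_outliers:]] = False

    return W, s
```

The returned boolean vector doubles as an anomaly indicator (False marks a sample treated as an outlier), which mirrors the anomaly-detection ability mentioned in the abstract.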

Original language: English
Pages (from-to): 89-98
Number of pages: 10
Journal: Neurocomputing
Volume: 521
DOI
Publication status: Published - 7 Feb 2023
