TY - JOUR
T1 - Discrete Robust Principal Component Analysis via Binary Weights Self-Learning
AU - Nie, Feiping
AU - Wang, Sisi
AU - Wang, Zheng
AU - Wang, Rong
AU - Li, Xuelong
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2023/11/1
Y1 - 2023/11/1
N2 - Principal component analysis (PCA) is a typical unsupervised dimensionality reduction algorithm, and one of its important weaknesses is that the squared ℓ2-norm cannot overcome the influence of outliers. Existing norm-based robust PCA methods have the following two drawbacks. First, the objective function of PCA based on the ℓ1-norm lacks rotational invariance and has limited robustness to outliers, and its solution mostly relies on a greedy search strategy, which is computationally expensive. Second, robust PCA based on the ℓ2,1-norm and the ℓ2,p-norm essentially learns probability weights for the data, which only weakens the influence of outliers on learning the projection matrix but cannot eliminate it completely; moreover, its ability to detect anomalies is very poor. To solve these problems, we propose a novel discrete robust principal component analysis (DRPCA). By self-learning binary weights, the influence of outliers on the projection matrix and the data center estimation can be completely eliminated, and anomaly detection can be performed directly. In addition, an alternating iterative optimization algorithm is designed to solve the proposed problem and update the binary weights automatically. Finally, our proposed model is successfully applied to anomaly detection, and experimental results demonstrate the superiority of our proposed method compared with state-of-the-art methods.
AB - Principal component analysis (PCA) is a typical unsupervised dimensionality reduction algorithm, and one of its important weaknesses is that the squared ℓ2-norm cannot overcome the influence of outliers. Existing norm-based robust PCA methods have the following two drawbacks. First, the objective function of PCA based on the ℓ1-norm lacks rotational invariance and has limited robustness to outliers, and its solution mostly relies on a greedy search strategy, which is computationally expensive. Second, robust PCA based on the ℓ2,1-norm and the ℓ2,p-norm essentially learns probability weights for the data, which only weakens the influence of outliers on learning the projection matrix but cannot eliminate it completely; moreover, its ability to detect anomalies is very poor. To solve these problems, we propose a novel discrete robust principal component analysis (DRPCA). By self-learning binary weights, the influence of outliers on the projection matrix and the data center estimation can be completely eliminated, and anomaly detection can be performed directly. In addition, an alternating iterative optimization algorithm is designed to solve the proposed problem and update the binary weights automatically. Finally, our proposed model is successfully applied to anomaly detection, and experimental results demonstrate the superiority of our proposed method compared with state-of-the-art methods.
KW - Anomaly detection
KW - binary weights
KW - reconstruction
KW - robust principal component analysis (PCA)
UR - http://www.scopus.com/inward/record.url?scp=85127811488&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2022.3155607
DO - 10.1109/TNNLS.2022.3155607
M3 - Article
C2 - 35380971
AN - SCOPUS:85127811488
SN - 2162-237X
VL - 34
SP - 9064
EP - 9077
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 11
ER -