TY - JOUR
T1 - ℓ2,p-norm based PCA for image recognition
AU - Wang, Qianqian
AU - Gao, Quanxue
AU - Gao, Xinbo
AU - Nie, Feiping
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2018/3
Y1 - 2018/3
N2 - Recently, many ℓ1-norm-based PCA approaches have been developed to improve the robustness of PCA. However, most existing approaches solve the optimal projection matrix by maximizing the ℓ1-norm-based variance and do not best minimize the reconstruction error, which is the true goal of PCA. Moreover, they do not have rotational invariance. To handle these problems, we propose a generalized robust metric learning for PCA, namely, ℓ2,p-PCA, which employs the ℓ2,p-norm as the distance metric for the reconstruction error. The proposed method not only is robust to outliers but also retains PCA's desirable properties. For example, the solutions are the principal eigenvectors of a robust covariance matrix, and the low-dimensional representations have rotational invariance. These properties are not shared by ℓ1-norm-based PCA methods. A new iterative algorithm is presented to solve ℓ2,p-PCA efficiently. Experimental results illustrate that the proposed method is more effective and robust than PCA, PCA-L1 greedy, PCA-L1 nongreedy, and HQ-PCA.
AB - Recently, many ℓ1-norm-based PCA approaches have been developed to improve the robustness of PCA. However, most existing approaches solve the optimal projection matrix by maximizing the ℓ1-norm-based variance and do not best minimize the reconstruction error, which is the true goal of PCA. Moreover, they do not have rotational invariance. To handle these problems, we propose a generalized robust metric learning for PCA, namely, ℓ2,p-PCA, which employs the ℓ2,p-norm as the distance metric for the reconstruction error. The proposed method not only is robust to outliers but also retains PCA's desirable properties. For example, the solutions are the principal eigenvectors of a robust covariance matrix, and the low-dimensional representations have rotational invariance. These properties are not shared by ℓ1-norm-based PCA methods. A new iterative algorithm is presented to solve ℓ2,p-PCA efficiently. Experimental results illustrate that the proposed method is more effective and robust than PCA, PCA-L1 greedy, PCA-L1 nongreedy, and HQ-PCA.
KW - Dimensionality reduction
KW - Principal component analysis
KW - ℓ1-norm
UR - http://www.scopus.com/inward/record.url?scp=85035777768&partnerID=8YFLogxK
U2 - 10.1109/TIP.2017.2777184
DO - 10.1109/TIP.2017.2777184
M3 - Article
C2 - 29989986
AN - SCOPUS:85035777768
SN - 1057-7149
VL - 27
SP - 1336
EP - 1346
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
IS - 3
ER -