TY - GEN
T1 - Low-rank graph regularized sparse coding
AU - Zhang, Yupei
AU - Liu, Shuhui
AU - Shang, Xuequn
AU - Xiang, Ming
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2018.
PY - 2018
Y1 - 2018
N2 - In this paper, we propose a solution to the instability problem of sparse coding based on low-rank representation (LRR), a promising technique for discovering the subspace structures of data. Graph regularized sparse coding has been extensively studied as a way to preserve the locality of high-dimensional observations. In practice, however, data are often corrupted by noise, so samples from the same class may not lie in neighboring regions. To this end, we present a novel method for robust sparse representation, dubbed low-rank graph regularized sparse coding (LogSC). LogSC uses LRR to capture the multiple subspace structures of the data and aims to preserve this structure in the resulting sparse codes. Unlike traditional methods, our method learns the sparse codes and the LRR jointly rather than separately, and it preserves the global structure of the data rather than only the local structure. The resulting sparse codes are therefore not only robust to corrupted samples, thanks to the LRR, but also discriminative, owing to the preservation of the multiple subspaces. The optimization problem of LogSC can be effectively tackled by the linearized alternating direction method with adaptive penalty. To evaluate our approach, we apply LogSC to image clustering and classification and also probe it in noisy scenes. Encouraging experimental results on public image data sets demonstrate the discriminative power, robustness, and usability of the proposed LogSC.
AB - In this paper, we propose a solution to the instability problem of sparse coding based on low-rank representation (LRR), a promising technique for discovering the subspace structures of data. Graph regularized sparse coding has been extensively studied as a way to preserve the locality of high-dimensional observations. In practice, however, data are often corrupted by noise, so samples from the same class may not lie in neighboring regions. To this end, we present a novel method for robust sparse representation, dubbed low-rank graph regularized sparse coding (LogSC). LogSC uses LRR to capture the multiple subspace structures of the data and aims to preserve this structure in the resulting sparse codes. Unlike traditional methods, our method learns the sparse codes and the LRR jointly rather than separately, and it preserves the global structure of the data rather than only the local structure. The resulting sparse codes are therefore not only robust to corrupted samples, thanks to the LRR, but also discriminative, owing to the preservation of the multiple subspaces. The optimization problem of LogSC can be effectively tackled by the linearized alternating direction method with adaptive penalty. To evaluate our approach, we apply LogSC to image clustering and classification and also probe it in noisy scenes. Encouraging experimental results on public image data sets demonstrate the discriminative power, robustness, and usability of the proposed LogSC.
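N1 - Editorial note (illustrative only, not part of the cited record): a generic objective in the spirit of low-rank graph regularized sparse coding could combine dictionary learning, an l1 sparsity penalty, a nuclear-norm low-rank representation, and a coupling term that transfers the low-rank structure onto the codes. All symbols (X data, D dictionary, S codes, Z low-rank representation, E error) and weights below are assumptions for this sketch, not the paper's exact formulation:
      \min_{D,\,S,\,Z,\,E}\; \tfrac{1}{2}\lVert X - DS\rVert_F^2 + \lambda \lVert S\rVert_1 + \alpha \lVert Z\rVert_* + \beta \lVert E\rVert_{2,1} + \gamma \lVert S - SZ\rVert_F^2 \quad \text{s.t.}\quad X = XZ + E
      Such a joint problem would typically be handled by a linearized alternating direction scheme with adaptive penalty, updating D, S, Z, and E in turn, consistent with the optimization method named in the abstract.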
KW - Image clustering and classification
KW - Laplacian sparse coding
KW - Low-rank representation
KW - Multiple subspaces preserving
UR - http://www.scopus.com/inward/record.url?scp=85051921847&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-97304-3_14
DO - 10.1007/978-3-319-97304-3_14
M3 - Conference contribution
AN - SCOPUS:85051921847
SN - 9783319973036
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 177
EP - 190
BT - PRICAI 2018
A2 - Kang, Byeong-Ho
A2 - Geng, Xin
PB - Springer Verlag
T2 - 15th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2018
Y2 - 28 August 2018 through 31 August 2018
ER -