TY - JOUR
T1 - Large-Scale Affine Matrix Rank Minimization With a Novel Nonconvex Regularizer
AU - Wang, Zhi
AU - Liu, Yu
AU - Luo, Xin
AU - Wang, Jianjun
AU - Gao, Chao
AU - Peng, Dezhong
AU - Chen, Wu
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2022/9/1
Y1 - 2022/9/1
N2 - Low-rank minimization aims to recover a matrix of minimum rank subject to linear system constraints. It arises in various data analysis and machine learning areas, such as recommender systems, video denoising, and signal processing. Nuclear norm minimization is a dominant approach to handling it. However, such a method ignores the differences among the singular values of the target matrix. To address this issue, nonconvex low-rank regularizers have been widely used. Unfortunately, existing methods suffer from different drawbacks, such as inefficiency and inaccuracy. To alleviate such problems, this article proposes a flexible model with a novel nonconvex regularizer. Such a model not only promotes low rankness but also can be solved much faster and more accurately. With it, the original low-rank problem can be equivalently transformed into the resulting optimization problem under the rank restricted isometry property (rank-RIP) condition. Subsequently, Nesterov's rule and inexact proximal strategies are adopted to achieve a novel algorithm that is highly efficient in solving this problem at a convergence rate of $O(1/K)$, with $K$ being the iteration count. Besides, the asymptotic convergence rate is also analyzed rigorously by adopting the Kurdyka-Łojasiewicz (KL) inequality. Furthermore, we apply the proposed optimization model to typical low-rank problems, including matrix completion, robust principal component analysis (RPCA), and tensor completion. Exhaustive empirical studies on data analysis tasks, i.e., synthetic data analysis, image recovery, personalized recommendation, and background subtraction, indicate that the proposed model outperforms state-of-the-art models in both accuracy and efficiency.
AB - Low-rank minimization aims to recover a matrix of minimum rank subject to linear system constraints. It arises in various data analysis and machine learning areas, such as recommender systems, video denoising, and signal processing. Nuclear norm minimization is a dominant approach to handling it. However, such a method ignores the differences among the singular values of the target matrix. To address this issue, nonconvex low-rank regularizers have been widely used. Unfortunately, existing methods suffer from different drawbacks, such as inefficiency and inaccuracy. To alleviate such problems, this article proposes a flexible model with a novel nonconvex regularizer. Such a model not only promotes low rankness but also can be solved much faster and more accurately. With it, the original low-rank problem can be equivalently transformed into the resulting optimization problem under the rank restricted isometry property (rank-RIP) condition. Subsequently, Nesterov's rule and inexact proximal strategies are adopted to achieve a novel algorithm that is highly efficient in solving this problem at a convergence rate of $O(1/K)$, with $K$ being the iteration count. Besides, the asymptotic convergence rate is also analyzed rigorously by adopting the Kurdyka-Łojasiewicz (KL) inequality. Furthermore, we apply the proposed optimization model to typical low-rank problems, including matrix completion, robust principal component analysis (RPCA), and tensor completion. Exhaustive empirical studies on data analysis tasks, i.e., synthetic data analysis, image recovery, personalized recommendation, and background subtraction, indicate that the proposed model outperforms state-of-the-art models in both accuracy and efficiency.
KW - Inexact proximal step
KW - low-rank minimization
KW - matrix completion
KW - novel nonconvex regularizer
KW - robust principal component analysis (RPCA)
KW - tensor completion
UR - http://www.scopus.com/inward/record.url?scp=85102308641&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2021.3059711
DO - 10.1109/TNNLS.2021.3059711
M3 - Article
C2 - 33646960
AN - SCOPUS:85102308641
SN - 2162-237X
VL - 33
SP - 4661
EP - 4675
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 9
ER -