TY - GEN
T1 - Hierarchical Low-Rank Model with Double Tensor Structural Sparsity for Tensor Completion
AU - Bu, Yuanyang
AU - Zhao, Yongqiang
AU - Zhang, Xun
N1 - Publisher Copyright:
© 2024 Technical Committee on Control Theory, Chinese Association of Automation.
PY - 2024
Y1 - 2024
N2 - In the field of tensor completion, existing models that employ a single-layer low-rank structure have achieved strong results. To exploit the intrinsic low-rankness of tensor data more fully, a novel hierarchical low-rank model incorporating double tensor structural sparsity is presented, devised specifically for the completion of third-order tensors. The model captures the first-layer low-rankness of the original tensor through tensor factorization via the t-product, and characterizes the second-layer low-rankness of each factor tensor with a novel tensor structural sparsity regularizer. A surrogate theorem is established, asserting that the hierarchical low-rank model offers a more precise representation of tensor multi-rank. An efficient learning algorithm for the model is also developed. Comprehensive experimental results demonstrate the superiority of the hierarchical low-rank model over competing tensor completion methods that consider only single-layer low-rankness.
AB - In the field of tensor completion, existing models that employ a single-layer low-rank structure have achieved strong results. To exploit the intrinsic low-rankness of tensor data more fully, a novel hierarchical low-rank model incorporating double tensor structural sparsity is presented, devised specifically for the completion of third-order tensors. The model captures the first-layer low-rankness of the original tensor through tensor factorization via the t-product, and characterizes the second-layer low-rankness of each factor tensor with a novel tensor structural sparsity regularizer. A surrogate theorem is established, asserting that the hierarchical low-rank model offers a more precise representation of tensor multi-rank. An efficient learning algorithm for the model is also developed. Comprehensive experimental results demonstrate the superiority of the hierarchical low-rank model over competing tensor completion methods that consider only single-layer low-rankness.
KW - Hierarchical Low-rankness
KW - Structural Sparsity
KW - Tensor Completion
KW - Tensor Factorization
UR - http://www.scopus.com/inward/record.url?scp=85205478225&partnerID=8YFLogxK
U2 - 10.23919/CCC63176.2024.10662763
DO - 10.23919/CCC63176.2024.10662763
M3 - Conference contribution
AN - SCOPUS:85205478225
T3 - Chinese Control Conference, CCC
SP - 7818
EP - 7823
BT - Proceedings of the 43rd Chinese Control Conference, CCC 2024
A2 - Na, Jing
A2 - Sun, Jian
PB - IEEE Computer Society
T2 - 43rd Chinese Control Conference, CCC 2024
Y2 - 28 July 2024 through 31 July 2024
ER -