Enhanced Sparsity Prior Model for Low-Rank Tensor Completion

Jize Xue, Yongqiang Zhao, Wenzhi Liao, Jonathan Cheung-Wai Chan, Seong G. Kong

Research output: Contribution to journal › Article › peer-review

78 Scopus citations

Abstract

Conventional tensor completion (TC) methods generally assume that the sparsity of tensor-valued data lies in the global subspace. This so-called global sparsity prior is measured by the tensor nuclear norm. Such an assumption is not reliable for recovering low-rank (LR) tensor data, especially when a considerable number of elements are missing. To mitigate this weakness, this article presents an enhanced sparsity prior model for LRTC that uses both local and global sparsity information in a latent LR tensor. Specifically, we adopt a doubly weighted strategy for the nuclear norm along each mode to characterize the global sparsity prior of the tensor. Unlike traditional tensor-based local sparsity descriptions, the proposed factor gradient sparsity prior in the Tucker decomposition model describes the underlying subspace local smoothness of real-world tensor objects, simultaneously characterizing the local piecewise structure over all dimensions. Moreover, the proposed local sparsity prior does not require minimizing the rank of a tensor. Extensive experiments on synthetic data, real-world hyperspectral images, and face modeling data demonstrate that the proposed model outperforms state-of-the-art techniques in prediction capability and efficiency.
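The two priors sketched in the abstract can be made concrete with a small example. The following is an illustrative sketch (not the authors' code): it computes a reweighted nuclear norm over mode-n unfoldings as a stand-in for the doubly weighted global sparsity prior, and an L1 norm of first-order differences of a Tucker factor as a proxy for the factor gradient sparsity prior. The weighting scheme and the example factor matrix `U` are assumptions for illustration only.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def weighted_nuclear_norm(matrix, eps=1e-3):
    """Reweighted nuclear norm: singular values scaled by weights
    1/(sigma + eps), a common nonconvex surrogate that favors sparser
    spectra (the paper uses a doubly weighted variant along each mode)."""
    sigma = np.linalg.svd(matrix, compute_uv=False)
    weights = 1.0 / (sigma + eps)
    return float(np.sum(weights * sigma))

def factor_gradient_sparsity(factor):
    """L1 norm of first-order differences down the rows of a Tucker
    factor matrix -- a proxy for the factor gradient sparsity prior,
    which penalizes non-smooth subspace factors."""
    return float(np.sum(np.abs(np.diff(factor, axis=0))))

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))            # toy 3-way tensor
global_prior = sum(weighted_nuclear_norm(unfold(T, m))
                   for m in range(T.ndim))     # summed over all modes
U = rng.standard_normal((4, 2))                # hypothetical Tucker factor
local_prior = factor_gradient_sparsity(U)
print(global_prior, local_prior)
```

In a completion objective these two terms would be combined with a data-fidelity term on the observed entries; the sketch only evaluates the priors themselves.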

Original language: English
Article number: 8941238
Pages (from-to): 4567-4581
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 11
DOIs
State: Published - Nov 2020

Keywords

  • Enhanced sparsity prior
  • factor gradient sparsity
  • global and local sparsities
  • low-rank (LR) tensor completion (TC)
  • Tucker decomposition
