TY - JOUR
T1 - When Laplacian Scale Mixture Meets Three-Layer Transform
T2 - A Parametric Tensor Sparsity for Tensor Completion
AU - Xue, Jize
AU - Zhao, Yongqiang
AU - Bu, Yuanyang
AU - Chan, Jonathan Cheung-Wai
AU - Kong, Seong G.
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2022/12/1
Y1 - 2022/12/1
N2 - Recently, tensor sparsity modeling has achieved great success in the tensor completion (TC) problem. In real applications, the sparsity of a tensor can be rationally measured by low-rank tensor decomposition. However, existing methods either suffer from limited modeling power in estimating the accurate rank or have difficulty in depicting the hierarchical structure underlying such data ensembles. To address these issues, we propose a parametric tensor sparsity measure model, which encodes the sparsity of a general tensor by Laplacian scale mixture (LSM) modeling based on a three-layer transform (TLT) for the factor subspace prior with Tucker decomposition. Specifically, the sparsity of a tensor is first transformed into the factor subspace, and then factor sparsity in the gradient domain is used to express the local similarity within each mode. To further refine the sparsity, we adopt LSM via the transform learning scheme to self-adaptively depict deeper-layer structured sparsity, in which the transformed sparse matrices, in the sense of a statistical model, can be modeled as the product of a Laplacian vector and a hidden positive scalar multiplier. We refer to this method as parametric tensor sparsity delivered by LSM-TLT. Using a progressive transformation operator, we formulate the LSM-TLT model and apply it to the TC problem, and an alternating direction method of multipliers (ADMM)-based optimization algorithm is designed to solve it. The experimental results on RGB images, hyperspectral images (HSIs), and videos demonstrate that the proposed method outperforms state-of-the-art methods.
AB - Recently, tensor sparsity modeling has achieved great success in the tensor completion (TC) problem. In real applications, the sparsity of a tensor can be rationally measured by low-rank tensor decomposition. However, existing methods either suffer from limited modeling power in estimating the accurate rank or have difficulty in depicting the hierarchical structure underlying such data ensembles. To address these issues, we propose a parametric tensor sparsity measure model, which encodes the sparsity of a general tensor by Laplacian scale mixture (LSM) modeling based on a three-layer transform (TLT) for the factor subspace prior with Tucker decomposition. Specifically, the sparsity of a tensor is first transformed into the factor subspace, and then factor sparsity in the gradient domain is used to express the local similarity within each mode. To further refine the sparsity, we adopt LSM via the transform learning scheme to self-adaptively depict deeper-layer structured sparsity, in which the transformed sparse matrices, in the sense of a statistical model, can be modeled as the product of a Laplacian vector and a hidden positive scalar multiplier. We refer to this method as parametric tensor sparsity delivered by LSM-TLT. Using a progressive transformation operator, we formulate the LSM-TLT model and apply it to the TC problem, and an alternating direction method of multipliers (ADMM)-based optimization algorithm is designed to solve it. The experimental results on RGB images, hyperspectral images (HSIs), and videos demonstrate that the proposed method outperforms state-of-the-art methods.
KW - Hierarchical representation
KW - Laplacian scale mixture (LSM)
KW - tensor completion (TC)
KW - three-layer transform (TLT) sparsity
KW - Tucker decomposition
UR - http://www.scopus.com/inward/record.url?scp=85123793057&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2021.3140148
DO - 10.1109/TCYB.2021.3140148
M3 - Article
C2 - 35081033
AN - SCOPUS:85123793057
SN - 2168-2267
VL - 52
SP - 13887
EP - 13901
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 12
ER -