TY - GEN
T1 - Fast Low-Rank Approximation of Matrices via Randomization with Application to Tensor Completion
AU - Kaloorazi, M. F.
AU - Ahmadi-Asl, S.
AU - Chen, J.
AU - Rahardja, S.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The approximation of voluminous datasets that admit a low-rank structure by ones of considerably lower rank has recently found many practical applications in science and engineering. Randomized algorithms have emerged as a powerful choice, owing to their efficacy and efficiency, particularly in exploiting parallelism on modern architectures. In this paper, we present a fast randomized rank-revealing algorithm tailored for low-rank matrix approximation and decomposition. Unlike previous works, which apply deterministic decompositional algorithms such as the singular value decomposition (SVD), pivoted QR, and QLP, we use a randomized algorithm to factorize the compressed matrix. We furnish bounds for the rank-revealing property of the proposed algorithm. In addition, we utilize the proposed algorithm to develop an efficient algorithm for low-rank tensor decomposition, namely the tensor-SVD. We apply our proposed algorithms to various classes of multidimensional synthetic and real-world datasets.
AB - The approximation of voluminous datasets that admit a low-rank structure by ones of considerably lower rank has recently found many practical applications in science and engineering. Randomized algorithms have emerged as a powerful choice, owing to their efficacy and efficiency, particularly in exploiting parallelism on modern architectures. In this paper, we present a fast randomized rank-revealing algorithm tailored for low-rank matrix approximation and decomposition. Unlike previous works, which apply deterministic decompositional algorithms such as the singular value decomposition (SVD), pivoted QR, and QLP, we use a randomized algorithm to factorize the compressed matrix. We furnish bounds for the rank-revealing property of the proposed algorithm. In addition, we utilize the proposed algorithm to develop an efficient algorithm for low-rank tensor decomposition, namely the tensor-SVD. We apply our proposed algorithms to various classes of multidimensional synthetic and real-world datasets.
KW - Low-rank approximation
KW - multilinear algebra
KW - pivoted QLP
KW - randomized algorithm
KW - tensor-SVD
UR - http://www.scopus.com/inward/record.url?scp=85214889536&partnerID=8YFLogxK
U2 - 10.1109/ICSPCC62635.2024.10770371
DO - 10.1109/ICSPCC62635.2024.10770371
M3 - Conference contribution
AN - SCOPUS:85214889536
T3 - 2024 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2024
BT - 2024 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 14th IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2024
Y2 - 19 August 2024 through 22 August 2024
ER -