Conjugate Gradient and Variance Reduction Based Online ADMM for Low-Rank Distributed Networks

Yitong Chen, Danqi Jin, Jie Chen, Cédric Richard, Wen Zhang

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Modeling the relationships that may connect optimal parameter vectors is essential for the performance of parameter estimation methods in distributed networks. In this paper, we consider a low-rank relationship and introduce matrix factorization to promote this low-rank property. To devise a distributed algorithm that does not require any prior knowledge about the low-rank space, we first formulate local optimization problems at each node, which are subsequently addressed using the Alternating Direction Method of Multipliers (ADMM). Three subproblems naturally arise from ADMM, each resolved in an online manner with low computational costs. Specifically, the first one is solved using stochastic gradient descent (SGD), while the other two are handled using the conjugate gradient method to avoid matrix inversion operations. To further enhance performance, a variance reduction algorithm is incorporated into the SGD. Simulation results validate the effectiveness of the proposed algorithm.
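The abstract notes that two of the ADMM subproblems are solved with the conjugate gradient method to avoid explicit matrix inversion. A minimal sketch of that idea is shown below; the matrix `A` and right-hand side `b` are hypothetical stand-ins for the symmetric positive-definite system arising in an ADMM subproblem, not the paper's actual formulation.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A
    iteratively, without ever forming A^{-1}."""
    n = b.shape[0]
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # exact line search step
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # conjugate direction update
        rs_old = rs_new
    return x

# Hypothetical SPD system standing in for an ADMM subproblem
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)   # symmetric positive definite
b = rng.standard_normal(5)
x = conjugate_gradient(A, b)
```

Each iteration costs only matrix-vector products, which is why this route is attractive for online, per-node updates compared with a direct inverse.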

Original language: English
Pages (from-to): 706-710
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 32
DOI
Publication status: Published - 2025
