Distributed Online Learning Over Multitask Networks With Rank-One Model

Yitong Chen, Danqi Jin, Jie Chen, Cedric Richard, Wen Zhang, Gongping Huang, Jingdong Chen

Research output: Contribution to journal › Article › peer-review

Abstract

Modeling multitask relations in distributed networks has garnered considerable interest in recent years. In this paper, we present a novel rank-one model, in which all the optimal vectors to be estimated are scaled versions of a common unknown vector. Building on this rank-one relation, we formulate a constrained centralized optimization problem and, after a decoupling step, solve it in a distributed manner using the projected gradient descent method. To compute the required projection efficiently, we propose replacing the computationally intensive singular value decomposition with the inexpensive power method. Additionally, local estimates targeting the same optimal vector are combined within each neighborhood to further improve their accuracy. Theoretical analyses of the proposed algorithm are conducted for star topologies, and conditions are derived that guarantee its stability in both the mean and mean-square senses. Finally, simulation results are presented to demonstrate the effectiveness of the proposed algorithms.
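The abstract's key computational idea, replacing a full SVD with the power method when projecting onto the set of rank-one matrices, can be illustrated in isolation. The sketch below is not the paper's distributed algorithm; it only shows, under generic assumptions, that power iteration on AᵀA recovers the dominant singular triplet and hence the best rank-one approximation, matching the SVD-based projection.

```python
import numpy as np

def rank_one_projection_power(A, n_iter=100, seed=1):
    """Project A onto the set of rank-one matrices via the power method.

    Power iteration on A^T A converges to the top right singular vector v;
    the matching left vector and singular value follow from A v. This avoids
    computing a full SVD when only the rank-one projection is needed.
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = A.T @ (A @ v)       # one step of power iteration on A^T A
        v /= np.linalg.norm(v)  # renormalize to prevent overflow/underflow
    Av = A @ v
    sigma = np.linalg.norm(Av)  # dominant singular value
    u = Av / sigma              # matching left singular vector
    return sigma * np.outer(u, v)

# Check against the SVD-based rank-one projection on a random matrix.
A = np.random.default_rng(0).standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A)
P_svd = s[0] * np.outer(U[:, 0], Vt[0])
P_pow = rank_one_projection_power(A)
print(np.linalg.norm(P_svd - P_pow))
```

The sign ambiguity of singular vectors cancels in the outer product, so the two projections agree up to numerical precision whenever the dominant singular value is separated from the rest of the spectrum.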

Original language: English
Pages (from-to): 314-328
Number of pages: 15
Journal: IEEE Transactions on Signal and Information Processing over Networks
Volume: 11
DOIs
State: Published - 2025

Keywords

  • Combination matrix
  • distributed optimization
  • multitask diffusion strategy
  • power method
  • rank-one model

