Iteratively Re-Weighted Method for Sparsity-Inducing Norms

Feiping Nie, Zhanxuan Hu, Xiaoqian Wang, Xuelong Li, Heng Huang

Research output: Contribution to journal › Article › peer-review


Abstract

Among the large body of recently developed algorithms for machine learning and data mining, a class of models using nonconvex/non-smooth sparsity-inducing norms achieves promising results on many challenging tasks. An important problem faced by such models is finding an effective solution for an objective function with one or multiple intractable terms. Although a large number of optimization approaches have been developed, most of them are tailored to a specific model. Moreover, these approaches generally introduce additional parameters and do not guarantee convergence. In this work, we first revisit some representative nonconvex/non-smooth machine learning models and then unify them into a generic formulation. Theoretically, we develop a simple yet efficient optimization framework, namely the Iteratively Re-Weighted method (IRW), to solve this class of models, and we provide the corresponding convergence analysis. In particular, we validate the proposed method on two challenging machine learning tasks: multi-task regression and feature selection.
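The paper's generic formulation is not reproduced in this record, but the iteratively re-weighted idea can be illustrated on one representative instance: ℓ2,1-norm regularized multi-task regression, min_W ||XW - Y||_F^2 + λ||W||_{2,1}. The sketch below is an assumption-laden illustration, not the authors' exact IRW algorithm; the function name `irw_l21_regression`, the smoothing constant `eps`, and the closed-form ridge-like sub-problem are choices made here for a self-contained example.

```python
import numpy as np

def irw_l21_regression(X, Y, lam=1.0, n_iter=50, eps=1e-8):
    """Illustrative iteratively re-weighted solver (not the paper's exact IRW) for
    min_W ||XW - Y||_F^2 + lam * ||W||_{2,1}.

    X: (n_samples, n_features), Y: (n_samples, n_tasks).
    Returns W of shape (n_features, n_tasks) with row-wise sparsity.
    """
    n, d = X.shape
    W = np.zeros((d, Y.shape[1]))
    XtX = X.T @ X
    XtY = X.T @ Y
    for _ in range(n_iter):
        # Re-weighting step: weight each row by 1 / (2 * ||w_i||_2), smoothed by eps
        row_norms = np.sqrt(np.sum(W ** 2, axis=1) + eps)
        D = np.diag(1.0 / (2.0 * row_norms))
        # The weighted sub-problem is a ridge-like system with a closed-form solution
        W = np.linalg.solve(XtX + lam * D, XtY)
    return W

# Toy usage: 100 samples, 20 features, 3 tasks, only 5 informative features
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
W_true = np.zeros((20, 3))
W_true[:5] = rng.standard_normal((5, 3))
Y = X @ W_true + 0.01 * rng.standard_normal((100, 3))
W_hat = irw_l21_regression(X, Y, lam=10.0)
print(np.round(np.linalg.norm(W_hat, axis=1), 2))  # rows with near-zero norm act as pruned features
```

Each iteration alternates between re-computing the row weights from the current estimate and solving the resulting smooth weighted least-squares sub-problem, which is the general pattern the IRW framework applies to the broader class of sparsity-inducing norms discussed in the paper.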

Original language: English
Pages (from-to): 7045-7055
Number of pages: 11
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 35
Issue number: 7
DOIs
State: Published - 1 Jul 2023

Keywords

  • Clustering
  • feature selection
  • low rank
  • multi-task learning
  • sparse

