Abstract
Among the large body of recently developed algorithms for machine learning and data mining, a class of models using nonconvex/non-smooth sparsity-inducing norms achieves promising results on many challenging tasks. An important problem such models face is finding an effective solution for an objective function with one or more intractable terms. Although a large number of optimization approaches have been developed, most are tailored to a specific model. Moreover, these approaches generally introduce additional parameters and no longer guarantee convergence. In this work, we first revisit several representative nonconvex/non-smooth machine learning models and then unify them into a generic formulation. Theoretically, we develop a simple yet efficient optimization framework, namely the Iteratively Re-Weighted method (IRW), to solve this class of models, and we provide the corresponding convergence analysis. In particular, we validate the proposed method on two challenging machine learning tasks: multi-task regression and feature selection.
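The abstract's core idea, replacing a non-smooth sparsity term with a weighted quadratic surrogate that is re-solved at each iteration, can be illustrated on a simple case. The sketch below is not the paper's generic IRW framework; it is a minimal, commonly known iteratively reweighted least-squares loop for an L1-penalized regression, with the function name `irw_lasso` and all parameter values chosen here for illustration.

```python
import numpy as np

def irw_lasso(X, y, lam=0.1, eps=1e-6, n_iter=50):
    """Iteratively reweighted sketch for min ||Xw - y||^2 + lam * ||w||_1.

    Each iteration replaces the non-smooth L1 term with the quadratic
    surrogate sum_i w_i^2 / (2|w_i^old| + eps), then solves the
    resulting ridge-like linear system in closed form.
    """
    n, d = X.shape
    w = np.zeros(d)
    XtX, Xty = X.T @ X, X.T @ y
    for _ in range(n_iter):
        # Reweighting step: |w_i| is majorized near the current iterate;
        # eps keeps the weights finite when a coordinate reaches zero.
        D = np.diag(1.0 / (2.0 * np.abs(w) + eps))
        # Weighted least-squares subproblem with a closed-form solution.
        w = np.linalg.solve(XtX + lam * D, Xty)
    return w
```

Small coordinates receive ever-larger weights and are driven toward zero, which is how the reweighting scheme induces sparsity without ever handling the non-smooth norm directly.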
| Original language | English |
|---|---|
| Pages (from-to) | 7045-7055 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 35 |
| Issue number | 7 |
| DOIs | |
| State | Published - 1 Jul 2023 |
Keywords
- Clustering
- feature selection
- low rank
- multi-task learning
- sparse