Elaborate multi-task subspace learning with discrete group constraint

Wei Chang, Feiping Nie, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

In multi-task learning (MTL), multiple related tasks are learned simultaneously, sharing information to improve generalization performance. However, most MTL methods assume that all tasks are indeed related and suitable for joint learning. In real situations this assumption may not hold, leading to the problem of negative transfer. In this paper, we therefore focus not only on robustly learning the common feature structure shared across tasks, but also on determining with which other tasks each task should share. Combining this goal with the idea of subspace learning, we propose an elaborate multi-task subspace learning model (EMTSL) with a discrete group structure constraint, which clusters the learned tasks into a set of groups. By adopting the Schatten p-norm instead of the trace norm, EMTSL better approximates the low-rank constraint and also avoids trivial solutions. Furthermore, we design an efficient algorithm based on the re-weighted method to solve the proposed model, and we provide a convergence analysis of this algorithm. Experimental results on both synthetic and real-world datasets demonstrate the superiority of our method.

Original language: English
Article number: 109515
Journal: Pattern Recognition
Volume: 139
DOIs
State: Published - Jul 2023

Keywords

  • Multi-task learning
  • Negative transfer
  • Re-weighted method
  • Subspace learning

