Calibrated multi-task learning

Feiping Nie, Zhanxuan Hu, Xuelong Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

30 Scopus citations

Abstract

This paper proposes a novel algorithm, named Non-Convex Calibrated Multi-Task Learning (NC-CMTL), for learning multiple related regression tasks jointly. Instead of utilizing the nuclear norm, NC-CMTL adopts a non-convex low-rank regularizer to explore the shared information among different tasks. In addition, considering that the regularization parameter for each regression task depends on its noise level, we replace the least squares loss function with the square-root loss function. Computationally, as the proposed model has a non-smooth loss function and a non-convex regularization term, we construct an efficient re-weighted method to optimize it. Theoretically, we first present the convergence analysis of the constructed method, and then prove that the derived solution is a stationary point of the original problem. In particular, the regularizer and optimization method used in this paper are also suitable for other rank minimization problems. Numerical experiments on both synthetic and real data illustrate the advantages of NC-CMTL over several state-of-the-art methods.
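For intuition only, the sketch below shows one way such a calibrated multi-task objective could be written: an un-squared (square-root) l2 residual per task plus a non-convex low-rank penalty on the stacked weight matrix. This is not the authors' code; the function name nc_cmtl_objective, the parameters lam and p, and the Schatten-p quasi-norm (used here as a stand-in for the unspecified non-convex regularizer) are all illustrative assumptions.

import numpy as np

def nc_cmtl_objective(W, Xs, ys, lam=1.0, p=0.5, eps=1e-8):
    # Calibrated data-fitting term: the un-squared (square-root) l2 residual
    # per task, so each task's effective regularization adapts to its own
    # noise level rather than relying on a single shared parameter.
    loss = sum(np.linalg.norm(X @ w - y) for X, w, y in zip(Xs, W.T, ys))
    # Non-convex low-rank surrogate (illustrative choice): singular values of
    # the d x T weight matrix raised to a power p < 1.
    singular_values = np.linalg.svd(W, compute_uv=False)
    reg = np.sum((singular_values + eps) ** p)
    return loss + lam * reg

# Tiny usage example on synthetic data: 3 related tasks, 20 features each.
rng = np.random.default_rng(0)
d, T = 20, 3
W = rng.standard_normal((d, T))
Xs = [rng.standard_normal((50, d)) for _ in range(T)]
ys = [X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(50) for X in Xs]
print(nc_cmtl_objective(W, Xs, ys))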

Original language: English
Title of host publication: KDD 2018 - Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Publisher: Association for Computing Machinery
Pages: 2012-2021
Number of pages: 10
ISBN (Print): 9781450355520
DOIs
State: Published - 19 Jul 2018
Event: 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2018 - London, United Kingdom
Duration: 19 Aug 2018 - 23 Aug 2018

Publication series

Name: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining

Conference

Conference: 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2018
Country/Territory: United Kingdom
City: London
Period: 19/08/18 - 23/08/18

Keywords

  • Calibration
  • Loss function
  • Multi-Task Learning
  • Regression
