Robust Supervised and Semisupervised Least Squares Regression Using ℓ2,p-Norm Minimization

Jingyu Wang, Fangyuan Xie, Feiping Nie, Xuelong Li

Research output: Contribution to journal › Article › peer-review

15 Scopus citations

Abstract

Least squares regression (LSR) is widely applied in statistics owing to its closed-form solution, and it can be used in supervised, semisupervised, and multiclass learning. However, LSR begins to fail and its discriminative ability cannot be guaranteed when the original data are corrupted by noise. In practice, such noise is unavoidable and can greatly distort the error construction in LSR. To cope with this problem, a robust supervised LSR (RSLSR) is proposed to eliminate the effect of noise and outliers. The loss function adopts the $\ell_{2,p}$-norm ($0 < p \leq 2$) instead of the squared loss. In addition, a probability weight is attached to each sample to determine whether the sample is a normal point. Its physical meaning is clear: if the point is normal, the probability value is 1; otherwise, the weight is 0. To solve the resulting concave problem effectively, an iterative algorithm is introduced, in which additional weights are added to penalize normal samples with large errors. RSLSR is also extended to robust semisupervised LSR (RSSLSR) to fully exploit the limited labeled samples. Extensive classification experiments on corrupted data illustrate the robustness of the proposed methods.
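The abstract describes the method only at a high level. The sketch below illustrates one plausible instantiation of the mechanism it names: binary probability weights that flag normal points versus outliers, combined with iteratively reweighted least squares (IRLS) for an $\ell_{2,p}$-norm loss. The function name rslsr_irls, the n_outliers parameter, the warm start, and the ridge stabilizer are assumptions for illustration, not the authors' published algorithm.

```python
import numpy as np

def rslsr_irls(X, Y, p=1.0, n_outliers=0, n_iter=50, eps=1e-8):
    """Hypothetical IRLS sketch for robust LSR with an l2,p-norm loss
    (0 < p <= 2) and binary probability weights (assumed, not the paper's
    exact updates).

    X : (n, d) data matrix (append a constant column for a bias term).
    Y : (n, c) one-hot label matrix.
    n_outliers : assumed number of points to flag as outliers (weight 0).
    Returns W : (d, c) projection matrix.
    """
    n, d = X.shape
    s = np.ones(n)                            # probability weights: 1 = normal, 0 = outlier
    W = np.linalg.lstsq(X, Y, rcond=None)[0]  # plain LSR warm start
    for _ in range(n_iter):
        E = X @ W - Y                         # per-sample residual rows
        r = np.linalg.norm(E, axis=1)         # l2 norm of each residual
        # IRLS weight from d||e||^p / d||e||^2 = (p/2)||e||^(p-2);
        # for p < 2 this downweights samples with large errors.
        u = 0.5 * p * np.maximum(r, eps) ** (p - 2)
        if n_outliers > 0:                    # flag the largest residuals as outliers
            s = np.ones(n)
            s[np.argsort(r)[-n_outliers:]] = 0.0
        D = s * u                             # combined sample weights
        XtD = X.T * D                         # equals X^T diag(D)
        # weighted normal equations; small ridge added only for stability
        W = np.linalg.solve(XtD @ X + eps * np.eye(d), XtD @ Y)
    return W
```

With a one-hot Y, a test point would be classified as np.argmax(X_test @ W, axis=1). This only mirrors the mechanism the abstract describes and is not guaranteed to match the paper's update rules or convergence analysis.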

Original language: English
Pages (from-to): 8389-8403
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 11
DOIs
State: Published - 1 Nov 2023

Keywords

  • least squares regression (LSR)
  • robust
  • supervised and semisupervised classification
  • ℓ2,p-norm
