Robust Supervised and Semisupervised Least Squares Regression Using ℓ2,p-Norm Minimization

Jingyu Wang, Fangyuan Xie, Feiping Nie, Xuelong Li

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

Least squares regression (LSR) is widely applied in statistics because it admits a closed-form solution and can be used in supervised, semisupervised, and multiclass learning. However, LSR begins to fail, and its discriminative ability cannot be guaranteed, when the original data are corrupted and noisy. In practice, noise is unavoidable and can greatly distort the error construction in LSR. To cope with this problem, a robust supervised LSR (RSLSR) is proposed to eliminate the effect of noise and outliers. The loss function adopts the $\ell_{2,p}$-norm ($0 < p \leq 2$) instead of the squared loss. In addition, a probability weight is attached to each sample to determine whether the sample is a normal point. Its physical meaning is clear: if the point is normal, the probability value is 1; otherwise, the weight is 0. To solve the resulting concave problem effectively, an iterative algorithm is introduced, in which additional weights are added to penalize normal samples with large errors. We also extend RSLSR to robust semisupervised LSR (RSSLSR) to fully utilize the limited labeled samples. Extensive classification experiments on corrupted data illustrate the robustness of the proposed methods.
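The abstract describes replacing the squared loss with an $\ell_{2,p}$-norm loss and solving the resulting problem iteratively with per-sample reweighting. The sketch below is an illustrative iteratively reweighted least squares (IRLS) routine for minimizing $\sum_i \lVert W^\top x_i + b - y_i\rVert_2^p$; it is an assumption based on standard $\ell_{2,p}$ reweighting, not the paper's exact RSLSR algorithm (in particular, it omits the 0/1 probability weights used for outlier removal). The function name `robust_lsr` and all parameters are hypothetical.

```python
import numpy as np

def robust_lsr(X, Y, p=1.0, n_iter=30, eps=1e-8):
    """Illustrative IRLS sketch for the l2,p regression loss.

    Approximately minimizes sum_i ||W^T x_i + b - y_i||_2^p
    (0 < p <= 2) by alternating a weighted closed-form LSR solve
    with the weight update d_i = (p/2) * ||r_i||_2^(p-2), where
    r_i is the current residual of sample i. Large residuals get
    small weights, which suppresses outliers.

    X: (n, d) samples; Y: (n, c) targets (e.g., one-hot labels).
    Returns the (d+1, c) matrix W with the bias as its last row.
    """
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])      # absorb bias b
    D = np.ones(n)                            # per-sample weights
    for _ in range(n_iter):
        # Weighted least squares: (Xa^T D Xa) W = Xa^T D Y.
        A = Xa.T @ (D[:, None] * Xa) + eps * np.eye(d + 1)
        W = np.linalg.solve(A, Xa.T @ (D[:, None] * Y))
        # Reweight: small residual -> large weight, and vice versa.
        R = Xa @ W - Y
        norms = np.maximum(np.linalg.norm(R, axis=1), eps)
        D = (p / 2.0) * norms ** (p - 2)
    return W
```

With p = 1 this reduces to an $\ell_{2,1}$-type loss, and samples with large residuals (e.g., corrupted targets) are progressively downweighted, so the fit is driven by the clean majority of the data.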

Original language: English
Pages (from-to): 8389-8403
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 11
DOI
Publication status: Published - 1 Nov 2023
