Feature Learning Viewpoint of Adaboost and a New Algorithm

Fei Wang, Zhongheng Li, Fang He, Rong Wang, Weizhong Yu, Feiping Nie

Research output: Contribution to journal › Article › peer-review

64 Scopus citations

Abstract

The AdaBoost algorithm is notable for its resistance to overfitting. Understanding this phenomenon is a fascinating and fundamental theoretical problem. Many studies have attempted to explain it from the statistical view and from margin theory. In this paper, the phenomenon is examined from a feature learning viewpoint through the proposed AdaBoost+SVM algorithm, which offers a clear explanation of AdaBoost's resistance to overfitting. First, we use the AdaBoost algorithm to learn the base classifiers. Then, instead of combining the base classifiers directly, we treat their outputs as features and feed them to an SVM classifier. This yields new combining coefficients and a bias, which are used to construct the final classifier. We justify this construction and present a theorem showing that as the dimension of these features increases, the performance of the SVM does not degrade, which explains AdaBoost's resistance to overfitting.
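The two-stage procedure described in the abstract can be sketched as follows. This is a minimal illustration assuming scikit-learn and a synthetic dataset; it is not the authors' reference implementation, only a hedged example of training AdaBoost, treating the base classifiers' outputs as features, and learning the final coefficients and bias with a linear SVM.

```python
# Sketch of the AdaBoost+SVM idea: AdaBoost learns the base classifiers,
# their predictions become features, and a linear SVM learns the final
# combining coefficients and bias. Illustrative only (assumes scikit-learn).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: learn base classifiers with AdaBoost (default base learner: decision stumps).
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_tr, y_tr)

# Step 2: map each sample to the vector of base-classifier outputs in {-1, +1}.
def base_features(X):
    return np.column_stack([2 * h.predict(X) - 1 for h in ada.estimators_])

# Step 3: learn the combining coefficients and bias with a linear SVM,
# instead of using AdaBoost's own weighted vote.
svm = LinearSVC(C=1.0)
svm.fit(base_features(X_tr), y_tr)
print("test accuracy:", svm.score(base_features(X_te), y_te))
```

Adding more base classifiers only enlarges the feature vector fed to the SVM, which mirrors the theorem stated above: increasing the feature dimension does not make the SVM's performance worse.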

Original language: English
Article number: 8868178
Pages (from-to): 149890-149899
Number of pages: 10
Journal: IEEE Access
Volume: 7
DOIs
State: Published - 2019

Keywords

  • AdaBoost
  • feature learning
  • overfitting
  • SVM
