Decision Tree SVM: An extension of linear SVM for non-linear classification

Feiping Nie, Wei Zhu, Xuelong Li

Research output: Contribution to journal › Article › peer-review

62 Citations (Scopus)

Abstract

The kernel trick is widely used to extend the Support Vector Machine (SVM) to linearly inseparable data, yielding kernel SVM. However, kernel SVM has a high computational cost in practice, which makes it unsuitable for large-scale data. Moreover, kernel SVM introduces hyper-parameters, e.g., the bandwidth of the Gaussian kernel. Since these hyper-parameters strongly influence the final performance of kernel SVM and are hard to tune, especially on large-scale data, one may need to put considerable effort into finding good enough values, and improper settings often make classification performance even lower than that of linear SVM. Inspired by recent progress on linear SVM for large-scale data, we propose a well-designed classifier, Decision Tree SVM (DTSVM), to efficiently handle large-scale linearly inseparable data. DTSVM has a much lower computational cost than kernel SVM, and it introduces almost no hyper-parameters except a few thresholds that can be fixed in practice. Comprehensive experiments on large-scale datasets demonstrate the superiority of the proposed method.
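The abstract does not spell out how DTSVM builds its tree, so the following is only an illustrative sketch of the general idea it describes: a tree whose internal nodes route samples with linear models and whose leaves classify with plain linear SVMs, avoiding kernels entirely. The k-means splitting rule, the class name `DTLinearSVM`, and all parameter choices here are assumptions for illustration, not the paper's algorithm.

```python
# Illustrative sketch (assumed design, not the paper's DTSVM): a binary tree
# whose internal nodes route samples with a linear SVM and whose leaves
# classify with a linear SVM, so only linear models are ever trained.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

class DTLinearSVM:
    def __init__(self, max_depth=2, min_samples=20):
        self.max_depth = max_depth
        self.min_samples = min_samples
        self.router = None   # internal-node linear SVM (None => leaf)
        self.leaf = None     # leaf linear SVM, or a constant label (int)

    def fit(self, X, y, depth=0):
        # Stop on depth limit, small node, or pure node: fit a leaf.
        if depth >= self.max_depth or len(y) < self.min_samples or len(set(y)) == 1:
            return self._fit_leaf(X, y)
        # Assumed splitting rule: cluster the node's samples into two groups,
        # then learn a linear router that reproduces the split at test time.
        groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        router = LinearSVC(dual=False).fit(X, groups)
        route = router.predict(X)
        if len(set(route)) < 2:  # degenerate split: fall back to a leaf
            return self._fit_leaf(X, y)
        self.router = router
        self.children = [
            DTLinearSVM(self.max_depth, self.min_samples)
            .fit(X[route == g], y[route == g], depth + 1)
            for g in (0, 1)
        ]
        return self

    def _fit_leaf(self, X, y):
        if len(set(y)) == 1:
            self.leaf = int(y[0])            # pure node: constant prediction
        else:
            self.leaf = LinearSVC(dual=False).fit(X, y)
        return self

    def predict(self, X):
        if self.router is None:              # leaf node
            if isinstance(self.leaf, int):
                return np.full(len(X), self.leaf)
            return self.leaf.predict(X)
        route = self.router.predict(X)       # internal node: dispatch
        out = np.empty(len(X), dtype=int)
        for g in (0, 1):
            mask = route == g
            if mask.any():
                out[mask] = self.children[g].predict(X[mask])
        return out
```

On data such as concentric circles, where a single linear SVM is near chance level, this piecewise-linear tree can fit the non-linear boundary while training only linear models, which is the efficiency argument the abstract makes against kernel SVM.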

Original language: English
Pages (from-to): 153-159
Number of pages: 7
Journal: Neurocomputing
Volume: 401
DOI
Publication status: Published - 11 Aug 2020
