New primal SVM solver with linear computational cost for big data classifications

Feiping Nie, Yizhen Huang, Xiaoqian Wang, Heng Huang

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-review

49 Scopus citations

Abstract

Support Vector Machines (SVM) are among the most popular classification techniques in machine learning, so designing fast primal SVM algorithms for large-scale datasets has been a hot research topic in recent years. This paper presents a new L2-norm regularized primal SVM solver using Augmented Lagrange Multipliers, with linear computational cost for Lp-norm loss functions. The most computationally intensive step of the proposed algorithm, which determines its complexity, is simply matrix-by-vector multiplication, and it can be easily parallelized on a multi-core server. We implement and integrate our algorithm into the interfaces and framework of the well-known LibLinear software toolbox. Experiments show that our algorithm delivers stable performance and is on average faster than state-of-the-art solvers such as SVMperf, Pegasos, and the LibLinear implementations of the TRON, PCD and DCD algorithms.
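For illustration only, the sketch below shows a primal L2-regularized SVM iteration whose per-iteration cost is dominated by two matrix-by-vector products (X @ w and X.T @ r), which is the kind of linear-cost, easily parallelized step the abstract describes. It uses plain gradient descent on the squared-hinge loss rather than the paper's Augmented Lagrange Multiplier scheme; the function name primal_sqhinge_svm, the step size, iteration count, and toy data are assumptions made for this sketch and are not taken from the paper.

import numpy as np

def primal_sqhinge_svm(X, y, lam=1e-2, lr=0.5, n_iter=300):
    # Gradient descent on the L2-regularized, averaged squared-hinge primal SVM.
    # Illustrative sketch only; NOT the ALM solver of the paper. The dominant
    # per-iteration work is the two matrix-by-vector products below, i.e.
    # O(nnz(X)) cost per iteration.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = 1.0 - y * (X @ w)           # matrix-by-vector product
        active = np.maximum(margins, 0.0)     # squared-hinge residuals
        grad = lam * w - X.T @ (y * active) / n   # matrix-by-vector product
        w -= lr * grad
    return w

# Toy usage on random separable data with labels in {-1, +1}
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20))
y = np.sign(X @ rng.standard_normal(20))
w = primal_sqhinge_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))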

Original language: English
Title of host publication: 31st International Conference on Machine Learning, ICML 2014
Publisher: International Machine Learning Society (IMLS)
Pages: 1883-1891
Number of pages: 9
ISBN (Electronic): 9781634393973
State: Published - 2014
Externally published: Yes
Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
Duration: 21 Jun 2014 - 26 Jun 2014

Publication series

Name: 31st International Conference on Machine Learning, ICML 2014
Volume: 3

Conference

Conference: 31st International Conference on Machine Learning, ICML 2014
Country/Territory: China
City: Beijing
Period: 21/06/14 - 26/06/14
