Multi-View Scaling Support Vector Machines for Classification and Feature Selection

Jinglin Xu, Junwei Han, Feiping Nie, Xuelong Li

Research output: Contribution to journal › Article › peer-review

55 Scopus citations

Abstract

With the explosive growth of data, multi-view data are widely used in many fields, such as data mining, machine learning, and computer vision. Because such data typically have a complex structure, i.e., many categories, multiple descriptive perspectives, and high dimensionality, formulating an accurate and reliable framework for multi-view classification is a challenging task. In this paper, we propose a novel multi-view classification method that uses multiple multi-class Support Vector Machines (SVMs) with a novel collaborative strategy. Each multi-class SVM embeds a scaling factor that re-adjusts the weight allocation over all features, which helps highlight the more important and discriminative features. Furthermore, we adopt the decision function values to integrate the multiple multi-class learners and introduce a confidence score across classes to determine the final classification result. In addition, through a series of mathematical deductions, we reduce the proposed model to a solvable problem and solve it with an alternating iterative optimization method. We evaluate the proposed method on several image and face datasets, and the experimental results demonstrate that it performs better than other state-of-the-art learning algorithms.
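To illustrate the fusion step described in the abstract, the following is a minimal sketch, not the authors' implementation: it trains one multi-class linear SVM per view (using scikit-learn's LinearSVC as a stand-in base learner) and combines the per-view decision function values through a softmax-style confidence score to pick the final class. The paper's embedded scaling factors and the alternating iterative optimization are not reproduced here, and the function names (fit_view_svms, fuse_predict) are hypothetical.

```python
# Illustrative sketch only: one multi-class linear SVM per view, fused by
# averaging softmax-normalized decision-function values across views.
# Assumes more than two classes, so decision_function returns (n, n_classes).
import numpy as np
from sklearn.svm import LinearSVC

def fit_view_svms(views_train, y_train, C=1.0):
    """Fit one multi-class linear SVM per view.

    views_train: list of arrays, one (n_samples, d_v) feature matrix per view.
    """
    return [LinearSVC(C=C).fit(X_v, y_train) for X_v in views_train]

def fuse_predict(svms, views_test):
    """Average per-view confidence scores and return the most confident class."""
    scores = []
    for clf, X_v in zip(svms, views_test):
        d = clf.decision_function(X_v)                # (n, n_classes) decision values
        e = np.exp(d - d.max(axis=1, keepdims=True))  # softmax as a confidence proxy
        scores.append(e / e.sum(axis=1, keepdims=True))
    fused = np.mean(scores, axis=0)                   # average confidence across views
    return svms[0].classes_[fused.argmax(axis=1)]
```

In this sketch the views are weighted equally at fusion time; the paper instead learns the feature weighting jointly via the scaling factors inside each SVM.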

Original language: English
Article number: 8664197
Pages (from-to): 1419-1430
Number of pages: 12
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 32
Issue number: 7
DOIs
State: Published - 1 Jul 2020

Keywords

  • Multiple views
  • classification
  • feature selection
  • multi-class support vector machines
