A general kernelization framework for learning algorithms based on kernel PCA

Changshui Zhang, Feiping Nie, Shiming Xiang

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, a general kernelization framework for learning algorithms is proposed via a two-stage procedure: first, transform the data by kernel principal component analysis (KPCA), and then run the learning algorithm directly on the transformed data. Although a few learning algorithms have previously been kernelized by this procedure, why and under what conditions the procedure is feasible had not been studied. In this paper, we present the kernelization framework explicitly and give a rigorous justification showing that, under some mild conditions, kernelization under this framework is equivalent to the traditional kernel method. We show that these mild conditions are satisfied by most learning algorithms, so most learning algorithms can be kernelized under this framework without being reformulated in inner-product form, which is a common yet vital step in traditional kernel methods. Enlightened by this framework, we also propose a novel kernel method based on low-rank KPCA, which can remove noise in the feature space, speed up the kernel algorithm, and improve its numerical stability. Experiments are presented to verify the validity and effectiveness of the proposed methods.
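To make the two-stage procedure concrete, here is a minimal sketch using scikit-learn. The RBF kernel, the number of components, and logistic regression as the downstream learner are illustrative assumptions for this sketch, not the paper's experimental setup; any learning algorithm satisfying the paper's mild conditions could take the second stage.

```python
# Two-stage kernelization sketch: (1) map data with KPCA, (2) run an
# ordinary learning algorithm on the transformed data. Kernel, gamma,
# component count, and the classifier are illustrative choices.
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = make_moons(n_samples=400, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    # Stage 1: truncating to n_components keeps only the leading kernel
    # principal components, i.e., the low-rank KPCA variant the abstract
    # mentions for denoising, speed, and numerical stability.
    KernelPCA(n_components=10, kernel="rbf", gamma=2.0),
    # Stage 2: a plain (linear) learner applied directly to the KPCA
    # output, with no inner-product reformulation of the algorithm itself.
    LogisticRegression(),
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

Note that the second stage is untouched: the kernelization comes entirely from the KPCA preprocessing, which is the point of the framework.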

Original language: English
Pages (from-to): 959-967
Number of pages: 9
Journal: Neurocomputing
Volume: 73
Issue number: 4-6
DOIs
State: Published - Jan 2010
Externally published: Yes

Keywords

  • Kernel method
  • Kernel PCA
  • Learning algorithm
  • Two-stage framework
