Online dictionary learning for kernel LMS

Wei Gao, Jie Chen, Cédric Richard, Jianguo Huang

Research output: Contribution to journal › Article › peer-review

85 Scopus citations

Abstract

Adaptive filtering algorithms operating in reproducing kernel Hilbert spaces have demonstrated superiority over their linear counterparts for nonlinear system identification. Unfortunately, an undesirable characteristic of these methods is that the order of the filters grows linearly with the number of input data. This dramatically increases the computational burden and memory requirement. A variety of strategies based on dictionary learning have been proposed to overcome this severe drawback. However, no theoretical work in the literature rigorously analyzes the problem of updating the dictionary in a time-varying environment. In this paper, we present an analytical study of the convergence behavior of the Gaussian least-mean-square algorithm in the case where the statistics of the dictionary elements only partially match the statistics of the input data. This theoretical analysis highlights the need for updating the dictionary in an online way, by discarding obsolete elements and adding appropriate ones. We introduce a kernel least-mean-square algorithm with ℓ1-norm regularization to automatically perform this task. The stability in the mean of this method is analyzed, and the improvement in performance due to this dictionary adaptation is confirmed by simulations.
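The mechanism the abstract describes can be illustrated with a minimal sketch: a kernel LMS filter with a Gaussian kernel, where each update takes a stochastic-gradient step (appending a new kernel centered at the current input) followed by an ℓ1 soft-thresholding (proximal) step that shrinks coefficients and discards dictionary elements whose coefficients reach zero. This is only an illustrative toy in the spirit of the forward-backward splitting approach mentioned in the keywords, not the paper's actual algorithm; all class, parameter, and function names are hypothetical.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    # Gaussian (RBF) kernel between two scalar inputs
    return np.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))

class SparseKLMS:
    """Illustrative sketch of KLMS with l1-regularized dictionary pruning.
    Not the paper's algorithm; parameter names are assumptions."""

    def __init__(self, eta=0.5, lam=1e-4, sigma=0.5):
        self.eta = eta        # step size
        self.lam = lam        # l1 regularization weight
        self.sigma = sigma    # kernel bandwidth
        self.dictionary = []  # kernel centers
        self.alpha = []       # corresponding coefficients

    def predict(self, x):
        # Filter output: weighted sum of kernels centered on dictionary elements
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for a, c in zip(self.alpha, self.dictionary))

    def update(self, x, d):
        e = d - self.predict(x)
        # Gradient step: add a new kernel centered at the current input
        self.dictionary.append(x)
        self.alpha.append(self.eta * e)
        # Proximal step: soft-threshold coefficients and drop zeroed elements
        thr = self.eta * self.lam
        kept = [(np.sign(a) * (abs(a) - thr), c)
                for a, c in zip(self.alpha, self.dictionary) if abs(a) > thr]
        self.alpha = [a for a, _ in kept]
        self.dictionary = [c for _, c in kept]
        return e

# Identify a toy nonlinear system d = sin(3x) from streaming samples
rng = np.random.default_rng(0)
model = SparseKLMS()
errors = []
for _ in range(500):
    x = rng.uniform(-1.0, 1.0)
    errors.append(abs(model.update(x, np.sin(3.0 * x))))
```

In this sketch the prediction error shrinks as samples arrive, while the soft-thresholding step keeps the dictionary smaller than the number of processed inputs, mimicking how ℓ1 regularization counters the linear growth of the filter order noted in the abstract.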

Original language: English
Article number: 6800092
Pages (from-to): 2765-2777
Number of pages: 13
Journal: IEEE Transactions on Signal Processing
Volume: 62
Issue number: 11
DOIs
State: Published - 1 Jun 2014
Externally published: Yes

Keywords

  • Nonlinear adaptive filtering
  • online forward-backward splitting
  • reproducing kernel
  • sparsity
