Kernel Risk-Sensitive Loss: Definition, Properties and Application to Robust Adaptive Filtering

Badong Chen, Lei Xing, Bin Xu, Haiquan Zhao, Nanning Zheng, José C. Príncipe

Research output: Contribution to journal › Article › peer-review

160 Scopus citations

Abstract

Nonlinear similarity measures defined in kernel space, such as correntropy, can extract higher-order statistics of data and offer potentially significant performance improvement over their linear counterparts, especially in non-Gaussian signal processing and machine learning. In this paper, we propose a new similarity measure in kernel space, called the kernel risk-sensitive loss (KRSL), and present some of its important properties. We apply the KRSL to adaptive filtering, investigate its robustness, and then develop the minimum KRSL (MKRSL) algorithm and analyze its mean-square convergence performance. Compared with correntropy, the KRSL offers a more efficient performance surface, enabling a gradient-based method to achieve faster convergence and higher accuracy while still maintaining robustness to outliers. Simulation results confirm the theoretical analysis and the superior performance of the new algorithm.
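The abstract's claims can be illustrated with a short sketch. Assuming the KRSL of an error e takes the commonly stated form (1/λ)·E[exp(λ(1 − κ_σ(e)))] with a Gaussian kernel κ_σ, a stochastic-gradient adaptive filter under this cost down-weights large errors (the kernel term vanishes for outliers) while behaving like LMS near zero error. The function and parameter names below are illustrative, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(e, sigma):
    """Gaussian kernel evaluated at the error e."""
    return np.exp(-e**2 / (2.0 * sigma**2))

def krsl(e, sigma=1.0, lam=1.0):
    """Empirical KRSL of an error vector e (assumed form, see lead-in)."""
    return np.mean(np.exp(lam * (1.0 - gaussian_kernel(e, sigma)))) / lam

def mkrsl_filter(x, d, order=4, eta=0.05, sigma=1.0, lam=2.0):
    """Sketch of an MKRSL-style adaptive FIR filter (stochastic gradient on KRSL)."""
    w = np.zeros(order)
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]   # regressor: most recent sample first
        e = d[n] - w @ u           # instantaneous prediction error
        # Gradient weighting: grows for moderate errors, but -> 0 for large
        # outliers because the Gaussian kernel kills the whole factor.
        g = np.exp(lam * (1.0 - gaussian_kernel(e, sigma))) * gaussian_kernel(e, sigma)
        w = w + eta * g * e * u
    return w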

Original language: English
Article number: 7857046
Pages (from-to): 2888-2901
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 65
Issue number: 11
DOIs
State: Published - 1 Jun 2017

Keywords

  • Correntropy
  • kernel risk-sensitive loss
  • risk-sensitive criterion
  • robust adaptive filtering

