Abstract
In the context of channel estimation under non-Gaussian impulse noise, traditional non-kernel-space methods are prone to divergence, while many kernel-space methods fail to fully exploit the a priori information embedded in the channel. To address this, we introduce in this paper a robust sparse recursive adaptive filtering algorithm, termed convex regularized recursive kernel risk-sensitive loss (CR-RKRSL). By combining the KRSL with a convex-function constraint term, the proposed algorithm makes full use of the channel's a priori information. Furthermore, we examine the theoretical properties of the proposed algorithm, presenting its convergence conditions and an expression for the steady-state error. Extensive simulation results demonstrate that CR-RKRSL outperforms the APSA, LHCAF, PRMCC, CR-RMC, and RZAMCC algorithms, exhibiting superior robustness and faster convergence, particularly in scenarios involving highly sparse systems.
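For orientation, the kernel risk-sensitive loss referenced in the abstract is a standard cost from the robust adaptive filtering literature. The sketch below gives that standard definition and an illustrative way a convex regularizer can be attached to an exponentially weighted recursive KRSL cost; the forgetting factor $\beta$, regularization weight $\gamma$, convex function $f(\cdot)$, desired signal $d_i$, and regressor $\mathbf{x}_i$ are assumed notation for illustration, not necessarily the paper's exact formulation.

```latex
% Standard KRSL for an error e, with Gaussian kernel \kappa_\sigma and
% risk-sensitive parameter \lambda > 0. The regularized recursive cost J_n
% below is an assumed illustrative form (not the authors' exact equations):
% an exponentially weighted sum of KRSL terms plus a convex penalty f(.)
% (e.g., the \ell_1 norm, which exploits channel sparsity).
\begin{align}
  \kappa_\sigma(e) &= \exp\!\left(-\frac{e^2}{2\sigma^2}\right), \\
  \ell_{\mathrm{KRSL}}(e) &= \frac{1}{\lambda}\exp\!\big(\lambda\,(1-\kappa_\sigma(e))\big), \\
  J_n(\mathbf{w}) &= \sum_{i=1}^{n} \beta^{\,n-i}\,
      \ell_{\mathrm{KRSL}}\!\big(d_i - \mathbf{w}^{\mathsf{T}}\mathbf{x}_i\big)
      + \gamma\, f(\mathbf{w}),
      \qquad 0 < \beta \le 1,\ \gamma \ge 0.
\end{align}
```

Under this assumed form, the bounded growth of $\ell_{\mathrm{KRSL}}$ for large errors is what suppresses impulsive outliers, while the convex term $f(\mathbf{w})$ injects the channel's a priori (sparsity) structure into the recursive update.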
Original language | English |
---|---|
Article number | 109568 |
Journal | Signal Processing |
Volume | 223 |
DOIs | |
State | Published - Oct 2024 |
Keywords
- Convex regularized
- Kernel risk-sensitive loss (KRSL)
- Normalized mean squared deviation (NMSD)