Acoustic source localization in strong reverberant environment by parametric Bayesian dictionary learning

Lu Wang, Yanshan Liu, Lifan Zhao, Qiang Wang, Xiangyang Zeng, Kean Chen

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Sparse representation techniques have become increasingly promising for localizing sound sources in reverberant environments, where multipath channel effects can be accurately characterized by the image model. In this paper, a dictionary is constructed by discretizing the interior space of the enclosure and is parameterized by the unknown energy reflection ratio. More specifically, each atom of the dictionary characterizes a specific source-to-microphone multipath channel. Source localization can then be reformulated as a joint sparse signal recovery and parametric dictionary learning problem. In particular, a sparse Bayesian framework is used for modeling, and its solution is obtained by the variational Bayesian expectation-maximization technique. Moreover, joint sparsity in the frequency domain is exploited to improve dictionary learning performance. A remarkable advantage of this approach is that no laborious parameter tuning is required and statistical information is provided. Numerical simulation results show that, compared with other state-of-the-art methods, the proposed algorithm achieves high source localization accuracy, low sidelobes, and high robustness for multiple sources at low computational complexity in strongly reverberant environments.
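The idea in the abstract can be illustrated with a toy sketch. Below, each dictionary atom models a two-path image-model channel (direct path plus a single wall reflection scaled by an unknown reflection ratio `beta`), and localization is done by a brute-force search over candidate positions and `beta` values. This is a hedged, minimal stand-in for the paper's method: the geometry, the single-reflection channel, and the matched-filter search replace the full image model and the variational Bayesian EM dictionary learning, and all names (`atom`, `localize`, `wall_x`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

C = 343.0  # speed of sound (m/s)

def atom(src, mics, freqs, beta, wall_x=5.0):
    """Multipath channel atom: direct path plus one image source
    mirrored across a wall at x = wall_x, attenuated by the
    reflection ratio beta (illustrative two-path image model)."""
    image = np.array([2.0 * wall_x - src[0], src[1]])
    a = np.zeros((len(freqs), len(mics)), dtype=complex)
    for m, mic in enumerate(mics):
        d0 = np.linalg.norm(src - mic)    # direct-path distance
        d1 = np.linalg.norm(image - mic)  # reflected-path distance
        a[:, m] = (np.exp(-2j * np.pi * freqs * d0 / C) / d0
                   + beta * np.exp(-2j * np.pi * freqs * d1 / C) / d1)
    return a / np.linalg.norm(a)

def localize(obs, grid, mics, freqs, beta_grid):
    """Joint grid search over (position, beta): pick the atom best
    matched to the observation. A crude stand-in for the joint sparse
    recovery / parametric dictionary learning step in the paper."""
    best_score, best_idx, best_beta = -np.inf, None, None
    for beta in beta_grid:
        for n, src in enumerate(grid):
            score = abs(np.vdot(atom(src, mics, freqs, beta), obs))
            if score > best_score:
                best_score, best_idx, best_beta = score, n, beta
    return best_idx, best_beta

# Toy setup: 3 microphones, 3 frequencies, a 3x3 grid of candidates.
mics = [np.array([0.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 0.0])]
freqs = np.array([500.0, 800.0, 1100.0])
grid = [np.array([x, y]) for x in (1.0, 2.0, 3.0) for y in (1.0, 2.0, 3.0)]

# Synthesize a noiseless observation from a known source and beta,
# then recover both by the joint search.
true_idx, true_beta = 4, 0.6
obs = atom(grid[true_idx], mics, freqs, true_beta)
idx, beta = localize(obs, grid, mics, freqs, np.linspace(0.2, 0.9, 8))
```

Exploiting multiple frequencies jointly, as in the loop over `freqs`, is what the abstract refers to as joint sparsity in the frequency domain: the same sparse support (source position) must explain the observations at every frequency simultaneously.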

Original language: English
Pages (from-to): 232-240
Number of pages: 9
Journal: Signal Processing
Volume: 143
State: Published - Feb 2018

Keywords

  • Parametric dictionary learning
  • Reverberant environment
  • Source localization
  • Sparse Bayesian method
