TY - JOUR
T1 - Sparse Bayesian DOA estimation using hierarchical synthesis lasso priors for off-grid signals
AU - Yang, Jie
AU - Yang, Yixin
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2020
Y1 - 2020
N2 - Within the conventional sparse Bayesian learning (SBL) framework, only Gaussian scale mixtures have been adopted to model the sparsity-inducing priors that guarantee exact inverse recovery. In light of the relative scarcity of formal SBL tools for enforcing a proper sparsity profile on signal vectors, we explore the use of hierarchical synthesis lasso (HSL) priors for representing the same small subset of features among multiple responses. We outline a viable approximation to this particular choice of sparse prior, leading to tractable marginalization over all weights and hyperparameters. We then discuss how the statistical variables of the hierarchical Bayesian model can be estimated via an adaptive updating formula, and include a refined one-dimensional search procedure that markedly improves direction-of-arrival (DOA) estimation performance when off-grid DOAs are taken into account. With these modifications, we show that exploiting HSL priors is very effective in encouraging sparseness. Numerical simulations also verify the superiority of the proposal in terms of convergence speed and root-mean-squared estimation error, as compared with traditional and more recent sparse Bayesian algorithms.
AB - Within the conventional sparse Bayesian learning (SBL) framework, only Gaussian scale mixtures have been adopted to model the sparsity-inducing priors that guarantee exact inverse recovery. In light of the relative scarcity of formal SBL tools for enforcing a proper sparsity profile on signal vectors, we explore the use of hierarchical synthesis lasso (HSL) priors for representing the same small subset of features among multiple responses. We outline a viable approximation to this particular choice of sparse prior, leading to tractable marginalization over all weights and hyperparameters. We then discuss how the statistical variables of the hierarchical Bayesian model can be estimated via an adaptive updating formula, and include a refined one-dimensional search procedure that markedly improves direction-of-arrival (DOA) estimation performance when off-grid DOAs are taken into account. With these modifications, we show that exploiting HSL priors is very effective in encouraging sparseness. Numerical simulations also verify the superiority of the proposal in terms of convergence speed and root-mean-squared estimation error, as compared with traditional and more recent sparse Bayesian algorithms.
KW - direction of arrival estimation
KW - hierarchical Bayesian model
KW - hierarchical synthesis lasso priors
KW - off-grid
KW - sparse Bayesian learning
UR - http://www.scopus.com/inward/record.url?scp=85079778101&partnerID=8YFLogxK
U2 - 10.1109/TSP.2020.2967665
DO - 10.1109/TSP.2020.2967665
M3 - Article
AN - SCOPUS:85079778101
SN - 1053-587X
VL - 68
SP - 872
EP - 884
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
M1 - 8963635
ER -