Abstract
Non-negative matrix factorization (NMF) aims to find non-negative representations of non-negative data. Among NMF algorithms, the alternating direction method of multipliers (ADMM) is popular for its strong performance. However, we find that ADMM becomes unstable and performs poorly on real-world data such as speech signals. In this paper, we address this problem by developing a class of regularized ADMM algorithms for NMF. Efficient and robust learning rules are obtained by incorporating ℓ1-norm and Frobenius-norm regularization, and prior knowledge that the data follow a Laplacian distribution is exploited so that the problem admits a unique solution. We evaluate this class of ADMM algorithms on a source separation task, using both synthetic and real speech signals, under different cost functions: Euclidean distance (EUD), Kullback-Leibler (KL) divergence, and Itakura-Saito (IS) divergence. Results demonstrate that the proposed algorithms converge faster and yield more stable and accurate results than the original ADMM algorithm.
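To make the construction concrete, the following is a minimal NumPy sketch of one such regularized ADMM scheme for the Euclidean cost. The function name `admm_nmf` and the parameters `rho`, `l1`, and `fro` are illustrative choices, not the paper's notation, and the KL/IS variants require an additional splitting of the data-fit term that is omitted here.

```python
import numpy as np

def admm_nmf(V, rank, n_iter=200, rho=1.0, l1=0.0, fro=0.0, seed=0):
    """Sketch of regularized ADMM for Euclidean NMF.

    Minimizes 0.5*||V - W H||_F^2 + l1*||H||_1 + fro*||H||_F^2
    subject to W >= 0, H >= 0, via the splitting W = Wp, H = Hp.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    Wp, Hp = W.copy(), H.copy()   # non-negative auxiliary variables
    aW = np.zeros_like(W)         # scaled dual variables
    aH = np.zeros_like(H)
    I = np.eye(rank)
    for _ in range(n_iter):
        # H-update: ridge-style least squares (Frobenius term adds 2*fro*I)
        H = np.linalg.solve(W.T @ W + (rho + 2 * fro) * I,
                            W.T @ V + rho * (Hp - aH))
        # W-update: least squares against the current H
        W = np.linalg.solve(H @ H.T + rho * I,
                            H @ V.T + rho * (Wp - aW).T).T
        # Projection: the l1 term becomes non-negative soft-thresholding
        Hp = np.maximum(H + aH - l1 / rho, 0.0)
        Wp = np.maximum(W + aW, 0.0)
        # Dual ascent on the splitting constraints
        aH += H - Hp
        aW += W - Wp
    return Wp, Hp
```

The ℓ1 penalty enters only through the projection step, where it shifts the soft-threshold before clipping to the non-negative orthant, while the Frobenius penalty acts as a ridge term that keeps the least-squares subproblems well conditioned.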
Original language | English
---|---
Pages (from-to) | 1498-1502
Number of pages | 5
Journal | Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
Volume | 2015-January
State | Published - 2015
Event | 16th Annual Conference of the International Speech Communication Association, INTERSPEECH 2015, Dresden, Germany, 6-10 Sep 2015
Keywords
- Alternating direction method of multipliers
- Beta-divergence
- Regularized non-negative matrix factorization
- Source separation