Robust Bilinear Probabilistic PCA Using a Matrix Variate t Distribution

Jianhua Zhao, Xuan Ma, Lei Shi, Zhen Wang

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Bilinear probabilistic principal component analysis (BPPCA) was introduced recently as a model-based dimension reduction technique for matrix data. However, BPPCA is based on the Gaussian assumption and hence is vulnerable to potential outlying matrix-valued observations. In this article, we present a new robust extension of BPPCA, called BPPCA using a matrix variate t distribution (tBPPCA), which is built upon a matrix variate t distribution. Like the multivariate t, this distribution offers an additional robustness tuning parameter that can downweight outliers. By introducing a Gamma-distributed latent weight variable, this distribution can be represented hierarchically. With this representation, two efficient accelerated expectation–maximization (EM)-like algorithms for parameter estimation are developed. Experiments on a number of synthetic and real datasets are conducted to understand tBPPCA and compare it with several closely related competitors, including its vector-based counterpart. The results reveal that tBPPCA is generally more robust and accurate in the presence of outliers. Moreover, the expected latent weights under tBPPCA can be used effectively for outlier detection, and are much more reliable than those of the vector-based counterpart due to tBPPCA's better robustness.
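The hierarchical representation mentioned in the abstract can be illustrated with a small sketch. The code below is not the paper's implementation; it assumes, for simplicity, identity row and column covariances, so a matrix variate t draw reduces to a Gamma scale mixture of matrix normals, and the expected latent weight E[tau | X] has the closed form (nu + d*q) / (nu + ||X - M||_F^2). Outlying matrices receive small expected weights, which is the downweighting mechanism the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_matrix_t(M, nu, n):
    """Draw n matrix-variate t samples (identity row/column covariances
    assumed) via the Gamma scale mixture:
    tau ~ Gamma(nu/2, rate=nu/2),  X | tau ~ MatrixNormal(M, I/tau, I)."""
    d, q = M.shape
    X = np.empty((n, d, q))
    for i in range(n):
        # numpy uses shape/scale parametrization: scale = 1/rate = 2/nu
        tau = rng.gamma(nu / 2.0, 2.0 / nu)
        X[i] = M + rng.standard_normal((d, q)) / np.sqrt(tau)
    return X

def expected_weight(X, M, nu):
    """E[tau | X] under the hierarchical model with identity covariances:
    (nu + d*q) / (nu + squared Frobenius distance from the mean).
    Small values flag outlying matrix observations."""
    d, q = X.shape
    delta = np.sum((X - M) ** 2)
    return (nu + d * q) / (nu + delta)

# Hypothetical usage: an observation at the mean vs. a far-away one.
M = np.zeros((3, 2))
nu = 5.0
w_inlier = expected_weight(M, M, nu)           # delta = 0, weight is large
w_outlier = expected_weight(M + 10.0, M, nu)   # delta = 600, weight is tiny
```

With nu = 5 and a 3-by-2 mean matrix, the inlier weight is (5 + 6) / 5 = 2.2 while the outlier weight is 11 / 605, so the outlier's contribution to an EM-style weighted update would be heavily discounted, consistent with the robustness behavior the abstract reports.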

Original language: English
Pages (from-to): 10683-10697
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 12
DOI
Publication status: Published - 1 Dec. 2023
