Robust Bilinear Probabilistic PCA Using a Matrix Variate t Distribution

Jianhua Zhao, Xuan Ma, Lei Shi, Zhen Wang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Bilinear probabilistic principal component analysis (BPPCA) was recently introduced as a model-based dimension reduction technique for matrix data. However, BPPCA rests on a Gaussian assumption and hence is vulnerable to outlying matrix-valued observations. In this article, we present a robust extension of BPPCA, called tBPPCA, which is built upon a matrix variate t distribution. Like the multivariate t, this distribution offers an additional robustness tuning parameter that can downweight outliers. By introducing a Gamma-distributed latent weight variable, the distribution admits a hierarchical representation, from which two efficient accelerated expectation–maximization (EM)-like algorithms for parameter estimation are developed. Experiments on a number of synthetic and real datasets are conducted to understand tBPPCA and compare it with several closely related competitors, including its vector-based counterpart. The results reveal that tBPPCA is generally more robust and accurate in the presence of outliers. Moreover, the expected latent weights under tBPPCA can be used effectively for outlier detection, and are much more reliable than those of its vector-based counterpart owing to its better robustness.
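The hierarchical (Gamma scale-mixture) representation mentioned in the abstract can be sketched in a few lines: draw a scalar weight u from a Gamma distribution and then draw the matrix observation from a matrix normal whose row covariance is scaled by 1/u. The sketch below is a minimal illustration of this standard construction, not the paper's implementation; the names `M`, `U`, `V`, and `nu` (mean, row covariance, column covariance, degrees of freedom) are our notation, and `expected_weight` uses the standard conditional-weight formula for t-type models, of the kind the abstract suggests for outlier detection.

```python
import numpy as np

def sample_matrix_t(M, U, V, nu, rng):
    """One matrix variate t draw via the scale mixture:
    u ~ Gamma(nu/2, rate=nu/2), then X | u ~ MN(M, U/u, V)."""
    u = rng.gamma(shape=nu / 2.0, scale=2.0 / nu)  # scale = 1/rate
    A = np.linalg.cholesky(U / u)                  # row-covariance factor
    B = np.linalg.cholesky(V)                      # column-covariance factor
    Z = rng.standard_normal(M.shape)
    return M + A @ Z @ B.T

def expected_weight(X, M, U, V, nu):
    """Standard conditional weight E[u | X] = (nu + d*q) / (nu + delta2),
    where delta2 = tr(U^{-1}(X-M)V^{-1}(X-M)^T); small for outliers."""
    d, q = X.shape
    R = X - M
    delta2 = np.trace(np.linalg.solve(U, R) @ np.linalg.solve(V, R.T))
    return (nu + d * q) / (nu + delta2)

# Hypothetical toy setup: 3x2 matrices, identity covariances, nu = 5.
rng = np.random.default_rng(0)
M, U, V = np.zeros((3, 2)), np.eye(3), np.eye(2)
X = sample_matrix_t(M, U, V, nu=5.0, rng=rng)
w_inlier = expected_weight(np.zeros((3, 2)), M, U, V, nu=5.0)
w_outlier = expected_weight(10.0 * np.ones((3, 2)), M, U, V, nu=5.0)
```

Here `w_outlier` comes out much smaller than `w_inlier`, illustrating how the expected latent weight automatically downweights gross outliers.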

Original language: English
Pages (from-to): 10683-10697
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 12
DOIs
State: Published - 1 Dec 2023

Keywords

  • Dimension reduction
  • expectation–maximization (EM)
  • matrix data
  • matrix variate t distribution
  • principal component analysis
