Supervised Dimensionality Reduction Methods via Recursive Regression

Yun Liu, Rui Zhang, Feiping Nie, Xuelong Li, Chris Ding

Research output: Contribution to journal › Article › peer-review

14 Scopus citations

Abstract

In this article, the recursive problems of both orthogonal linear discriminant analysis (OLDA) and orthogonal least squares regression (OLSR) are investigated. In contrast to prior work, the associated recursive problems are addressed via a novel recursive regression method, which achieves dimensionality reduction in the orthogonal complement space heuristically. For the OLDA, an efficient method is developed to obtain the associated optimal subspace, which is closely related to the orthonormal basis of the optimal solution to the ridge regression. For the OLSR, a scalable subspace is introduced to formulate an original OLSR with optimal scaling (OS). By further relaxing the proposed problem into a convex parameterized orthogonal quadratic problem, an effective approach is derived, such that not only the optimal subspace is achieved but the OS is also obtained automatically. Accordingly, two supervised dimensionality reduction methods are proposed by obtaining heuristic solutions to the recursive problems of the OLDA and the OLSR.
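The abstract notes that the OLDA optimal subspace is closely related to the orthonormal basis of the ridge regression solution. The following minimal Python sketch illustrates that connection only; it is not the authors' algorithm. The one-hot class indicator matrix, the regularization parameter lam, and the QR orthonormalization step are all assumptions not specified in the abstract.

```python
import numpy as np

def olda_subspace_via_ridge(X, y, lam=1e-3):
    """Hypothetical sketch: an OLDA-style subspace from ridge regression.

    X   : (d, n) data matrix, one sample per column.
    y   : (n,) integer class labels.
    lam : ridge regularization parameter (assumed, not from the paper).
    """
    d, n = X.shape
    classes = np.unique(y)
    # One-hot class indicator matrix Y of shape (n, c) -- an assumed encoding.
    Y = (y[:, None] == classes[None, :]).astype(float)
    # Center the data, as is standard for LDA-type methods.
    Xc = X - X.mean(axis=1, keepdims=True)
    # Ridge regression solution W = (X X^T + lam I)^{-1} X Y, shape (d, c).
    W = np.linalg.solve(Xc @ Xc.T + lam * np.eye(d), Xc @ Y)
    # Orthonormal basis of span(W) via thin QR: the candidate subspace.
    Q, _ = np.linalg.qr(W)
    return Q  # (d, c) matrix with orthonormal columns

# Usage: project data onto the learned subspace.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 100))
y = rng.integers(0, 3, size=100)
Q = olda_subspace_via_ridge(X, y)
Z = Q.T @ X  # reduced representation, shape (3, 100)
```

QR is used here purely as a convenient way to orthonormalize the column span of the ridge solution; the paper's recursive regression method for constructing the subspace in the orthogonal complement space is not reproduced in this sketch.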

Original language: English
Article number: 8861104
Pages (from-to): 3269-3279
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 9
DOIs
State: Published - Sep 2020

Keywords

  • Optimal scaling (OS)
  • orthogonal least squares regression (OLSR)
  • orthogonal linear discriminant analysis (OLDA)
  • recursive regression
  • supervised dimensionality reduction
