Orthogonal least squares regression for feature extraction

Haifeng Zhao, Zheng Wang, Feiping Nie

Research output: Contribution to journal › Article › peer-review

46 Citations (Scopus)

Abstract

In many data mining applications, dimensionality reduction is a primary technique for mapping high-dimensional data to a lower-dimensional space. To preserve more local structure information, we propose a novel orthogonal least squares regression model for feature extraction in this paper. The main contributions of this paper are as follows. First, the new least squares regression method is constructed under an orthogonal constraint, which preserves more discriminant information in the subspace. Second, whereas the optimization problem of classical least squares regression can be solved easily, the proposed objective function is an unbalanced orthogonal Procrustes problem whose solution is difficult to obtain directly, so we present a novel iterative optimization algorithm to obtain the optimal solution. Third, we provide a proof of the convergence of our iterative algorithm. Additionally, experimental results show that our iterative algorithm obtains a globally optimal solution even though the optimization problem is non-convex. Both theoretical analysis and empirical studies demonstrate that our method reduces data dimensionality more effectively than conventional methods.
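To make the model concrete, below is a minimal sketch of one way to fit an orthogonally constrained least squares regression of the form min over W (with W^T W = I) and bias b of ||X W + 1 b^T - Y||_F^2, which is an unbalanced orthogonal Procrustes problem when W is a tall rectangular matrix. The function name orthogonal_lsr, the centering step, and the majorization-style (generalized power iteration) update are illustrative assumptions, not necessarily the exact algorithm derived in the paper.

import numpy as np

def orthogonal_lsr(X, Y, n_iter=200, seed=0):
    """Illustrative sketch: orthogonally constrained least squares regression.

    X: (n, d) data matrix, Y: (n, c) targets (e.g. one-hot class indicators).
    Returns W of shape (d, c) with orthonormal columns and a bias b (c,),
    so that extracted features are X @ W + b.
    """
    n, d = X.shape
    c = Y.shape[1]
    x_mean, y_mean = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - x_mean, Y - y_mean      # centering absorbs the bias term

    A = Xc.T @ Xc                        # d x d, positive semidefinite
    B = Xc.T @ Yc                        # d x c
    alpha = np.linalg.eigvalsh(A)[-1]    # largest eigenvalue of A
    A_tilde = alpha * np.eye(d) - A      # PSD relaxation term for the surrogate

    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((d, c)))   # orthonormal initialization
    for _ in range(n_iter):
        M = A_tilde @ W + B              # proportional to the surrogate gradient at W
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W = U @ Vt                       # Procrustes step: closest orthonormal matrix to M
    b = y_mean - W.T @ x_mean            # closed-form bias from the centering
    return W, b

Each iteration solves a balanced Procrustes subproblem with a single SVD, so the least squares objective is non-increasing under this majorize-minimize scheme; the learned W gives the extracted low-dimensional features X @ W + b.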

Original language: English
Pages (from-to): 200-207
Number of pages: 8
Journal: Neurocomputing
Volume: 216
DOI
Publication status: Published - 5 Dec 2016
