Multiview feature analysis via structured sparsity and shared subspace discovery

Yan Shuo Chang, Feiping Nie, Ming Yu Wang

Research output: Contribution to journal › Letter › peer-review


Abstract

Since combining features from heterogeneous data sources can significantly boost classification performance in many applications, it has attracted much research attention over the past few years. Most existing multiview feature analysis approaches learn features in each view separately, ignoring knowledge shared across views. Different views of features may have intrinsic correlations that can benefit feature learning. We therefore assume that multiple views share subspaces from which common knowledge can be discovered. In this letter, we propose a new multiview feature learning algorithm that aims to exploit common features shared by different views. To achieve this goal, we propose a feature learning algorithm in a batch mode, by which the correlations among different views are taken into account. Multiple transformation matrices for different views are simultaneously learned in a joint framework. In this way, our algorithm can exploit potential correlations among views as supplementary information that further improves performance. Since the proposed objective function is nonsmooth and difficult to solve directly, we propose an iterative algorithm for effective optimization. Extensive experiments have been conducted on a number of real-world data sets. Experimental results demonstrate superior classification performance against all the compared approaches. The convergence guarantee has also been validated experimentally.
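The abstract does not state the objective function, but structured-sparsity multiview learning of this kind is commonly formulated as jointly learning per-view transformation matrices under an l2,1-norm penalty, solved by iterative reweighting. The sketch below is a minimal illustration under that assumption; the function name, the concatenated-view formulation, and the regularization parameter `gamma` are hypothetical and not taken from the letter itself.

```python
import numpy as np

def multiview_l21_learning(Xs, Y, gamma=0.1, n_iter=30, eps=1e-8):
    """Hypothetical sketch: jointly learn transformation matrices W_v for
    views X_v by minimizing ||[X_1, ..., X_V] W - Y||_F^2 + gamma * ||W||_{2,1},
    where W stacks the per-view matrices. The nonsmooth l2,1 term is handled
    with the standard iteratively reweighted scheme: alternate a closed-form
    least-squares update of W with an update of a diagonal reweighting matrix D.
    """
    X = np.hstack(Xs)                 # concatenate views column-wise
    d, c = X.shape[1], Y.shape[1]
    D = np.eye(d)                     # initial reweighting matrix
    W = np.zeros((d, c))
    for _ in range(n_iter):
        # Closed-form update: W = (X^T X + gamma * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        # D_ii = 1 / (2 ||w_i||_2); eps guards against zero rows
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
    return W
```

Rows of `W` driven toward zero by the penalty correspond to features discarded jointly across all output dimensions, which is the usual motivation for the l2,1 structure in feature selection.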

Original language: English
Pages (from-to): 1986-2003
Number of pages: 18
Journal: Neural Computation
Volume: 29
Issue number: 7
DOIs
State: Published - 1 Jul 2017
