An iterative locally linear embedding algorithm

Deguang Kong, Chris Ding, Heng Huang, Feiping Nie

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

32 Scopus citations

Abstract

Locally linear embedding (LLE) is a popular dimension-reduction method. In this paper, we systematically improve the two main steps of LLE: (A) learning the graph weights W, and (B) learning the embedding Y. We propose a sparse nonnegative W learning algorithm. We propose a weighted formulation for learning Y and show that the results are identical to normalized-cuts spectral clustering. We further propose to iterate the two steps of LLE repeatedly to improve the results. Extensive experimental results show that the iterative LLE algorithm significantly improves both classification and clustering results.
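For context, the baseline two-step LLE procedure (neighborhood reconstruction weights W, then embedding Y from the bottom eigenvectors) that the paper builds on can be sketched in NumPy as follows. This is the standard Roweis–Saul formulation, not the paper's sparse nonnegative or iterative variant; the function name `lle` and its parameters are illustrative.

```python
import numpy as np

def lle(X, n_neighbors=5, n_components=2, reg=1e-3):
    """Standard two-step LLE: (A) learn weights W, (B) learn embedding Y."""
    n = X.shape[0]
    # Find k nearest neighbors of each point by Euclidean distance.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nbrs = np.argsort(d2, axis=1)[:, :n_neighbors]

    # Step A: reconstruction weights minimizing ||x_i - sum_j W_ij x_j||^2
    # subject to sum_j W_ij = 1, solved per point via the local Gram matrix.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                          # centered neighbors
        G = Z @ Z.T                                    # local Gram matrix
        G += reg * np.trace(G) * np.eye(n_neighbors)   # regularize if singular
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()                    # enforce sum-to-one

    # Step B: embedding from the bottom eigenvectors of M = (I - W)^T (I - W),
    # discarding the constant eigenvector with eigenvalue ~0.
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:n_components + 1]
```

The paper's contribution modifies both steps: step A is replaced by a sparse nonnegative weight-learning algorithm, step B by a weighted formulation equivalent to normalized cuts, and the two steps are then alternated rather than run once.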

Original language: English
Title of host publication: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Pages: 1647-1654
Number of pages: 8
State: Published - 2012
Externally published: Yes
Event: 29th International Conference on Machine Learning, ICML 2012 - Edinburgh, United Kingdom
Duration: 26 Jun 2012 - 1 Jul 2012

Publication series

Name: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Volume: 2

Conference

Conference: 29th International Conference on Machine Learning, ICML 2012
Country/Territory: United Kingdom
City: Edinburgh
Period: 26/06/12 - 1/07/12
