Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering
- Yoshua Bengio
- Jean-François Paiement
- Pascal Vincent
- Olivier Delalleau
- Nicolas Le Roux
- Marie Ouimet
In Advances in Neural Information Processing Systems 16, MIT Press, 2004.
Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for the given training points, with no straightforward extension to out-of-sample examples short of recomputing the eigenvectors. This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, and Multi-Dimensional Scaling (for dimensionality reduction), as well as Spectral Clustering. The framework is based on viewing these algorithms as learning the eigenfunctions of a data-dependent kernel. Numerical experiments show that these out-of-sample generalizations incur a level of error comparable to the variability of the embedding itself under different choices of training data.
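The core idea can be sketched as a Nyström-style formula: eigendecompose the kernel matrix on the training points once, then embed a new point by evaluating the learned eigenfunctions on it, i.e. by taking kernel values between the new point and the training set and projecting them onto the leading eigenvectors. The sketch below is illustrative only and uses a Gaussian kernel as the data-dependent kernel; the paper derives a specific kernel for each of the listed algorithms, and the function names here are hypothetical.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel between rows of A and rows of B
    # (stand-in for the algorithm-specific data-dependent kernel).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_embedding(X, dim=2, sigma=1.0):
    # Eigendecompose the kernel matrix on the n training points.
    K = gaussian_kernel(X, X, sigma)
    eigvals, eigvecs = np.linalg.eigh(K)  # ascending order
    # Keep the `dim` leading eigenpairs.
    lam = eigvals[-dim:][::-1]
    V = eigvecs[:, -dim:][:, ::-1]
    return lam, V

def extend(x_new, X, lam, V, sigma=1.0):
    # Nystrom-style out-of-sample extension: evaluate the learned
    # eigenfunctions at a new point instead of recomputing eigenvectors,
    #   f_k(x) = (sqrt(n) / lam_k) * sum_i V[i, k] * K(x, x_i)
    n = len(X)
    k_x = gaussian_kernel(np.atleast_2d(x_new), X, sigma)[0]
    return np.sqrt(n) / lam * (V.T @ k_x)
```

A useful sanity check of this construction: applied to a training point x_i, the extension reproduces that point's training-set coordinates (up to the sqrt(n) normalization), since K V = V diag(lam) on the training set.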