Abstract
The Gaussian Process Latent Variable Model (GPLVM) [1] is an attractive model
for dimensionality reduction, but the optimization of the GPLVM likelihood with
respect to the latent point locations is difficult, and prone to local optima. Here
we start from the insight that in the GPLVM, we should have that k(x_i, x_j) ≈ s_ij,
where k(x_i, x_j) is the kernel function evaluated at latent points x_i and x_j, and s_ij
is the corresponding estimate from the data. For an isotropic covariance function this relationship can be inverted to yield an estimate of the interpoint distances {d_ij} in the latent space, and these can be fed into a multidimensional scaling
(MDS) algorithm. This yields an initial estimate of the latent locations, which can
be subsequently optimized in the usual GPLVM fashion. We compare two variants
of this approach to the standard PCA initialization and to the ISOMAP algorithm
[2], and show that our initialization converges to the best GPLVM likelihoods on
all six tested motion capture data sets.
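The initialization described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an isotropic RBF kernel k(d) = exp(-d²/(2ℓ²)), inverts it to obtain squared latent distances from a similarity matrix S (the function name `mds_init`, the `lengthscale` parameter, and the clipping constant are choices made here for illustration), and then applies classical MDS via double-centering and an eigendecomposition.

```python
import numpy as np

def mds_init(S, lengthscale=1.0, latent_dim=2):
    """MDS-based initial latent locations from similarity estimates S.

    Inverts an isotropic RBF kernel k(d) = exp(-d^2 / (2 l^2)) to turn
    similarities s_ij into squared interpoint distances, then runs
    classical MDS on the resulting distance matrix.
    """
    # Clip to avoid log(0); similarities must lie in (0, 1] for the inversion.
    S = np.clip(S, 1e-12, 1.0)
    D2 = -2.0 * lengthscale**2 * np.log(S)  # squared latent distances

    # Classical MDS: double-center the squared-distance matrix.
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J

    # Keep the top latent_dim eigenvectors, scaled by sqrt of eigenvalues.
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:latent_dim]
    scales = np.sqrt(np.maximum(w[idx], 0.0))
    return V[:, idx] * scales
```

The returned coordinates would then serve as the starting point for the usual gradient-based GPLVM optimization; how s_ij is estimated from the observed data (and which of the paper's two variants is used) is left unspecified here.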
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the NIPS 2010 workshop on Challenges of Data Visualization |
| Number of pages | 6 |
| Publication status | Published - 2010 |