
Coupled embedding of sequential processes using Gaussian process models

Posted on: 2010-10-26
Degree: Ph.D
Type: Dissertation
University: Rutgers The State University of New Jersey - New Brunswick
Candidate: Moon, Kooksang
Full Text: PDF
GTID: 1448390002985630
Subject: Computer Science
Abstract/Summary:
In this dissertation we consider the task of making predictions from high-dimensional sequential data. Problems of this type arise in many practical scenarios, such as estimating 3D human figure motion from a sequence of images or predicting implied volatility trends from sequences of option market indicators in financial time-series analysis. However, direct predictions of this type are typically infeasible due to the high dimensionality of both the input and the output data, as well as the presence of temporal dependencies. To address this task we present a novel approach to subspace modeling of dyadic high-dimensional sequences that have a co-occurrence or regression relationship. Statistical reasoning suggests that predictions made through low-dimensional subspaces may improve the performance of predictive models if such subspaces are properly selected. We show that such optimal predictive subspaces can be selected, and that doing so is largely analogous to designing a particular family of Gaussian processes (GPs). As a consequence, many of the models we consider here can be seen as generalizations of the well-known GP regressors.

We first study the role of dynamics in subspace modeling of a single sequence and propose a new family of marginal auto-regressive (MAR) models that can describe the space of all stable auto-regressive sequences. We use the MAR priors in a Gaussian process latent variable model (GPLVM) framework to represent the nonlinear dimensionality reduction process under a dynamic constraint. To model the low-dimensional embedding in prediction tasks, we propose two alternative approaches: a generative model and a direct predictive, discriminative model. For the generative approach, we extend the framework of probabilistic latent semantic analysis (PLSA) to a sequential setting. This dynamic PLSA approach yields a new generative model that learns a pair of mapping functions between the subspace and the two data sequences under a dynamic prior. For the discriminative approach, we address the problem of learning optimal regressors that maximally reduce the dimension of the input while preserving the information necessary to predict the target values, building on the concept of sufficient dimensionality reduction. Instead of the iterative solutions of previous approaches, we show how a globally optimal solution can be obtained in closed form by formulating a related problem in a setting reminiscent of GP regression. In experiments on a variety of vision and financial time-series prediction problems, the two proposed models achieve significant gains in prediction accuracy as well as interpretability compared to other dimensionality reduction and regression schemes.
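The GP regression machinery that the abstract builds on admits a closed-form posterior mean, which is what makes the globally optimal, non-iterative solutions mentioned above possible. The sketch below is a minimal, self-contained NumPy illustration of that closed form; the RBF kernel choice, function names, and hyperparameter values are our own assumptions for exposition, not the dissertation's actual models.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between the rows of A and B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior_mean(X_train, y_train, X_test, noise_var=1e-2):
    """Closed-form GP regression posterior mean: K_* (K + sigma^2 I)^{-1} y."""
    K = rbf_kernel(X_train, X_train)
    K_star = rbf_kernel(X_test, X_train)
    alpha = np.linalg.solve(K + noise_var * np.eye(len(X_train)), y_train)
    return K_star @ alpha

# Toy usage: recover a noisy sine function from scattered 1-D inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
X_new = np.linspace(-3, 3, 200)[:, None]
y_pred = gp_posterior_mean(X, y, X_new)
```

In a GPLVM-style setting such as the one described above, the training inputs would themselves be latent low-dimensional coordinates optimized jointly with the kernel, with a dynamic (e.g., MAR) prior constraining their trajectory; the single linear solve here replaces the iterative updates used by earlier sufficient-dimensionality-reduction approaches.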
Keywords/Search Tags: Model, Sequential, Prediction, Gaussian, Process, Dimensional