Large-scale Bayesian Inference for Collaborative Filtering

Posted in Science on September 07, 2008


The Netflix Prize problem provides an excellent testing ground for machine learning: the problem is large in scale and the data are complex and noisy, so relatively complex models with careful regularization are likely needed to get reasonable predictions. A Bayesian modeling approach seems ideal for the task, provided it can be scaled up to the size of the Netflix data set, where extremely high-dimensional Bayesian expectations may have to be approximated.

In this talk, an ordinal regression low-rank matrix decomposition model is presented. We use a variational Bayes (VB) inference algorithm to demonstrate that large-scale Bayesian inference is feasible. The model also highlights some of the general limitations of VB: the more accurate expectation propagation/expectation consistent (EP/C) inference cannot be applied to this bilinear model without further approximations. We therefore propose a hybrid approach with EP/C-inspired modifications of the VB algorithm.

We compare the different variational approximations with a Laplace approximation, a MAP approximation, and Hamiltonian MCMC. For the latter, drawing a single sample takes around six hours of computing time on a 1 GHz processor, even with fast C++ code, so there is a very clear case for deterministic approximate inference. Another good feature of the Netflix data is the size of the test set, which makes even small differences in performance statistically significant.
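To make the setup concrete, here is a minimal sketch of mean-field variational Bayes updates for a low-rank matrix decomposition, using a plain Gaussian likelihood in place of the ordinal regression link discussed in the talk; the function name, hyperparameters (k, tau, lam), and initialization are illustrative assumptions, not the speaker's implementation.

    import numpy as np

    def vb_matrix_factorization(R, mask, k=10, tau=2.0, lam=1.0, n_iter=20):
        # Mean-field VB for a Gaussian low-rank model R ~ U V^T + noise.
        # R: ratings (n_users x n_items); mask: 1 where a rating is observed.
        # tau: observation precision; lam: prior precision on the factors.
        n, m = R.shape
        U_mean = 0.1 * np.random.randn(n, k)
        V_mean = 0.1 * np.random.randn(m, k)
        U_cov = np.tile(np.eye(k), (n, 1, 1))
        V_cov = np.tile(np.eye(k), (m, 1, 1))
        for _ in range(n_iter):
            # Alternate closed-form Gaussian updates for user and item factors.
            for A_mean, A_cov, B_mean, B_cov, Rs, Ms in (
                    (U_mean, U_cov, V_mean, V_cov, R, mask),
                    (V_mean, V_cov, U_mean, U_cov, R.T, mask.T)):
                # Second moments E[b b^T] of the fixed side's factors.
                EbbT = B_cov + B_mean[:, :, None] * B_mean[:, None, :]
                for i in range(A_mean.shape[0]):
                    obs = Ms[i].astype(bool)
                    prec = lam * np.eye(k) + tau * EbbT[obs].sum(axis=0)
                    A_cov[i] = np.linalg.inv(prec)
                    A_mean[i] = tau * A_cov[i] @ (B_mean[obs].T @ Rs[i, obs])
        return U_mean, V_mean

Predicted ratings are then the inner products of the posterior mean factors, U_mean @ V_mean.T. Replacing the Gaussian likelihood with an ordinal one breaks the conjugacy that these closed-form updates rely on, which is why the full model calls for the further approximation steps mentioned above.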

Author: Ole Winther, Technical University of Denmark

Watch Video

Tags: Science, Lectures, Computer Science, Machine Learning, VideoLectures.Net, Bayesian Learning