Nonparametric Bayesian Models in Machine Learning

Posted in Science on September 07, 2008

Bayesian methods make it possible to handle uncertainty in a principled manner, sidestep the problem of overfitting, and incorporate domain knowledge. However, most parametric models are too limited to adequately model complex real-world problems. Interest has therefore shifted to nonparametric models, which can capture much richer and more complex probability distributions. This talk will review some of the core nonparametric tools for regression and classification (Gaussian processes; GPs) and for density estimation (Dirichlet process mixtures). We will then focus on extensions of these basic tools (such as mixtures of GPs, warped GPs, and GPs for ordinal regression) and on approximation methods that allow efficient inference in these models (such as expectation propagation; EP).
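To give a flavour of the first tool the abstract mentions, here is a minimal sketch of Gaussian process regression with a squared-exponential kernel, using only NumPy. The function names, the fixed hyperparameters (lengthscale, signal variance, noise level), and the toy data are illustrative choices, not anything specific to the lecture; a real implementation would also learn the hyperparameters, e.g. by maximizing the marginal likelihood.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    sqdist = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Standard GP regression: posterior mean and variance at test inputs,
    # computed via a Cholesky factorization for numerical stability.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss_diag = np.diag(rbf_kernel(x_test, x_test))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = K_ss_diag - np.sum(v**2, axis=0)
    return mean, var

# Toy example: three noisy observations of sin(x).
x_train = np.array([-1.0, 0.0, 1.0])
y_train = np.sin(x_train)
mean, var = gp_posterior(x_train, y_train, np.array([0.0, 2.0]))
```

Note the behaviour this illustrates: near the training inputs the posterior mean tracks the data and the variance is small, while far from the data the prediction reverts toward the prior with large variance, which is exactly the kind of calibrated uncertainty the Bayesian treatment provides.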

Author: Zoubin Ghahramani, University of Cambridge

Watch Video

Tags: Science, Lectures, Computer Science, Machine Learning, VideoLectures.Net, Bayesian Learning