Exponential Families in Feature Space

Posted in Science on July 27, 2008


In this introductory course we discuss how log-linear models can be extended to feature space. Log-linear models have long been studied by statisticians under the name of the exponential family of probability distributions. We provide a unified framework in which many existing kernel algorithms appear as special cases, and which also yields natural generalizations of these algorithms. In particular, we show how to recover Gaussian Processes, Support Vector Machines, multi-class discrimination, and sequence annotation (via Conditional Random Fields). We also show how to deal with missing data and how to perform MAP estimation for Conditional Random Fields in feature space. The requisite background is covered briskly in the first two lectures; knowledge of linear algebra and familiarity with functional analysis will be helpful.
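To make the exponential-family form concrete, here is a minimal sketch of a log-linear model over a small finite domain: p(x | θ) = exp(⟨φ(x), θ⟩ − g(θ)), where g is the log-partition function. The domain, the sufficient statistics φ, and the parameter values below are illustrative choices, not taken from the lecture.

```python
import math

# Toy finite domain and sufficient statistics (illustrative assumptions).
X = [0, 1, 2, 3]

def phi(x):
    # Sufficient statistics phi(x); here a simple polynomial feature map.
    return [x, x * x]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def log_partition(theta):
    # g(theta) = log sum_x exp(<phi(x), theta>) normalizes the model.
    return math.log(sum(math.exp(dot(phi(x), theta)) for x in X))

def prob(x, theta):
    # p(x | theta) = exp(<phi(x), theta> - g(theta))
    return math.exp(dot(phi(x), theta) - log_partition(theta))

theta = [0.5, -0.25]
probs = [prob(x, theta) for x in X]
assert abs(sum(probs) - 1.0) < 1e-12  # normalized by construction

# Key exponential-family identity: the mean of phi under p equals the
# gradient of g -- the basis of maximum-likelihood / moment matching.
mean_phi = [sum(p * phi(x)[i] for p, x in zip(probs, X)) for i in range(2)]
```

The same structure carries over to feature space: replacing φ(x) with an implicit kernel feature map is what connects these models to the kernel algorithms discussed in the course.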

Author: Alexander J. Smola, Australian National University - ANU

Tags: Science, Lectures, Computer Science, Machine Learning, VideoLectures.Net, Linear Models