Convergence of MDL and Bayesian Methods

Posted in Science on September 08, 2008

We introduce a complexity measure that we call KL-complexity. Based on this concept, we present a general information exponential inequality that bounds the statistical complexity of certain deterministic and randomized estimators. We show that simple and clean finite-sample convergence bounds can be obtained from this approach. In particular, we are able to improve some classical results concerning the convergence of MDL density estimation and Bayesian posterior distributions.
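As a rough sketch of the central quantity (an assumption on my part, since the abstract does not define it): for a randomized estimator described by a posterior distribution $\hat{\pi}$ over models, with prior $\pi$, the KL-complexity is plausibly the Kullback-Leibler divergence between the two,

$$
\mathrm{KL}(\hat{\pi} \,\|\, \pi) \;=\; \int \ln\frac{d\hat{\pi}}{d\pi}\, d\hat{\pi},
$$

which reduces to the familiar description-length penalty $-\ln \pi(\hat{\theta})$ when $\hat{\pi}$ is a point mass at a deterministic estimate $\hat{\theta}$ over a countable model family. This is the standard way such a complexity term unifies MDL-style and Bayesian-style bounds; consult the lecture or paper for the exact definition used.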

Author: Tong Zhang, Yahoo! Research

Tags: Science, Lectures, Computer Science, Machine Learning, VideoLectures.Net, Bayesian Learning