# Dirichlet Processes and Nonparametric Bayesian Modelling

Bayesian modeling is a principled approach to updating the degree of belief in a hypothesis in light of prior knowledge and available evidence. Prior knowledge and evidence are combined using Bayes' rule to obtain the posterior distribution over hypotheses. In most cases of interest to machine learning, the prior knowledge is formulated as a prior distribution over parameters and the evidence corresponds to the observed data; applying Bayes' rule then permits inference about new data. As more data are observed, the posterior parameter distribution becomes increasingly concentrated and the influence of the prior distribution diminishes. Under some assumptions (in particular, that the likelihood model is correct and that the true parameters have positive prior probability), the posterior distribution converges to a point distribution located at the true parameters. The challenges in Bayesian modeling are, first, to find suitable application-specific statistical models and, second, to (approximately) solve the resulting inference equations.
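The updating and concentration behavior described above can be sketched with a simple conjugate example (not part of the original text, and the Beta-Binomial coin model here is an illustrative assumption): a Beta prior over a coin's head probability is updated with binomial count data, and the posterior variance shrinks as the sample grows.

```python
# Illustrative sketch: conjugate Beta-Binomial updating.
# Prior Beta(a, b) over the head probability theta; after
# observing `heads` heads in `tosses` tosses, Bayes' rule
# gives the posterior Beta(a + heads, b + tosses - heads).

def posterior_params(a, b, heads, tosses):
    """Update a Beta(a, b) prior with binomial evidence."""
    return a + heads, b + tosses - heads

def beta_mean_var(a, b):
    """Mean and variance of a Beta(a, b) distribution."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

# Uniform prior Beta(1, 1); hypothetical data with a 70% head rate.
a0, b0 = 1.0, 1.0
a_small, b_small = posterior_params(a0, b0, 70, 100)     # 100 tosses
a_large, b_large = posterior_params(a0, b0, 700, 1000)   # 1000 tosses

m_small, v_small = beta_mean_var(a_small, b_small)
m_large, v_large = beta_mean_var(a_large, b_large)

# With more data the posterior concentrates: the variance shrinks
# and the mean approaches the empirical rate 0.7, while the prior's
# influence (the +1 pseudo-counts) becomes negligible.
print(round(m_small, 3), round(m_large, 3))
print(v_large < v_small)
```

The same concentration argument underlies the convergence claim in the paragraph: for a correct likelihood model, the posterior mass piles up around the data-generating parameter as the sample size grows.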

*Author: Volker Tresp, Siemens*