# Learning Rules with Adaptor Grammars

[note: apologies for the overscanned slides - you can find full resolution slides at http://www.cog.brown.edu/~mj/papers/johnson09-learning-rules-g.pdf ]

Presented by Mark Johnson, Brown University.

Nonparametric Bayesian methods are interesting because they may provide a way of learning the appropriate units of generalization, as well as each generalization's probability or weight. Adaptor Grammars are a framework for stating a variety of hierarchical nonparametric Bayesian models in which the units of generalization can be viewed as kinds of PCFG rules. This talk describes the mathematical and computational properties of Adaptor Grammars, linguistic applications such as word segmentation and syllabification, and the MCMC algorithms we use to sample them.
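To give a flavor of the "adaptor" idea, here is a minimal toy sketch (not the talk's actual model or sampler) of a Chinese Restaurant Process adaptor: it caches previously generated items (here, candidate "words") and reuses each one with probability proportional to how often it has been used, falling back to a base generator for novel items. All names (`crp_sample`, `base_word`) and the parameter settings are illustrative assumptions.

```python
import random
from collections import Counter

def crp_sample(counts, alpha, base_sample):
    """Draw from a toy Chinese Restaurant Process adaptor:
    with probability alpha / (n + alpha) generate a fresh item
    from the base distribution; otherwise reuse a cached item
    with probability proportional to its count."""
    n = sum(counts.values())
    if random.random() < alpha / (n + alpha):
        return base_sample()          # "new table": call the base generator
    r = random.randrange(n)           # "old table": pick one past draw uniformly
    for item, c in counts.items():
        r -= c
        if r < 0:
            return item

def base_word():
    """Toy base distribution: geometric-length strings over {a, b},
    standing in for a base PCFG that generates candidate words."""
    w = random.choice("ab")
    while random.random() < 0.5:
        w += random.choice("ab")
    return w

random.seed(0)
counts = Counter()
for _ in range(50):
    w = crp_sample(counts, alpha=1.0, base_sample=base_word)
    counts[w] += 1

# Frequently reused strings behave like cached "rules" with
# rich-get-richer dynamics, the core property an adaptor adds to a PCFG.
print(counts.most_common(3))
```

The rich-get-richer behavior is why the cached units, rather than only the base grammar's rules, come to carry most of the probability mass, which is the sense in which the units of generalization are themselves learned.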

Joint work with Sharon Goldwater and Tom Griffiths.

*Google Tech Talks, July 6, 2009*