# Advanced Statistical Learning Theory

This set of lectures complements the statistical learning theory course and focuses on recent advances in classification.

- PAC-Bayesian bounds: a simple derivation and a comparison with Rademacher averages.
- Local Rademacher complexities with the classification loss, Talagrand's inequality, and Tsybakov's noise conditions.
- Properties of loss functions for classification (their influence on approximation and estimation errors, and their relationship with noise conditions).
- Applications to SVMs: estimation and approximation properties, and the role of the eigenvalues of the Gram matrix.
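
For orientation on the first topic, one common form of a PAC-Bayesian bound (a McAllester-style statement; the notation here is assumed, not taken from the lectures: prior $\pi$, posterior $\rho$, true and empirical risks $R$ and $\hat R_n$) says that with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for all posteriors $\rho$,

```latex
\mathbb{E}_{h \sim \rho}\, R(h)
\;\le\;
\mathbb{E}_{h \sim \rho}\, \hat R_n(h)
\;+\;
\sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln \frac{2\sqrt{n}}{\delta}}{2n}} .
```

The Rademacher averages it is compared against measure the capacity of a class $\mathcal{F}$ directly, via $\mathcal{R}_n(\mathcal{F}) = \mathbb{E} \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(X_i)$ with i.i.d. signs $\sigma_i \in \{-1,+1\}$; the KL term above plays the analogous complexity role for randomized (Gibbs) classifiers.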

*Author: Olivier Bousquet, Google*