Nonparametric Bayes Tutorial

This tutorial took place at the 2015 Machine Learning Summer School (MLSS) at the Max Planck Institute for Intelligent Systems in Tübingen, Germany. See this link for the latest versions and videos of this tutorial.

Monday, July 20: 1.5 hours
Tuesday, July 21: 1 hour
Friday, July 24: 1.5 hours

Instructor:
  Professor Tamara Broderick


Description

Nonparametric Bayesian methods use infinite-dimensional mathematical structures to allow the practitioner to learn more from their data as the size of their data set grows. What does that mean, and how does it work in practice? In this tutorial, we'll cover why machine learning and statistics need more than just parametric Bayesian inference. We'll introduce foundational nonparametric Bayesian models such as the Dirichlet process and the Chinese restaurant process, and touch on the wide variety of models available in nonparametric Bayes. Along the way, we'll see what exactly nonparametric Bayesian methods are and what they accomplish.
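As a concrete illustration of the "learn more as data grows" idea, here is a minimal sketch (not part of the tutorial materials) of sampling a partition from the Chinese restaurant process. Customer n+1 joins an existing table with probability proportional to that table's occupancy and starts a new table with probability proportional to a concentration parameter alpha; the function name and parameters below are illustrative choices, not a reference implementation.

```python
import random

def crp_partition(num_customers, alpha, seed=0):
    """Sample a random partition from the Chinese restaurant process.

    Each new customer joins existing table k with probability
    n_k / (n + alpha) (n_k = occupancy of table k, n = customers so far)
    or starts a new table with probability alpha / (n + alpha).
    """
    rng = random.Random(seed)
    tables = []       # tables[k] = number of customers at table k
    assignments = []  # assignments[i] = table index of customer i
    for n in range(num_customers):
        # Draw a point uniformly in [0, n + alpha); the first n units of
        # mass are split among occupied tables, the last alpha units
        # correspond to opening a new table.
        r = rng.uniform(0, n + alpha)
        cum = 0.0
        for k, count in enumerate(tables):
            cum += count
            if r < cum:
                tables[k] += 1
                assignments.append(k)
                break
        else:
            tables.append(1)
            assignments.append(len(tables) - 1)
    return assignments, tables

assignments, tables = crp_partition(1000, alpha=1.0)
print(len(tables))  # number of occupied tables; grows roughly like alpha * log(n)
```

The key nonparametric feature is visible in the last line: the number of tables (clusters) is not fixed in advance but grows, slowly, with the number of customers, so larger data sets can support more complex partitions.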

Materials

Note: you may find more updated versions of this tutorial here.

Prerequisites

What we won't cover

Gaussian processes are an important branch of nonparametric Bayesian modeling, but we won't have time to cover them here. Instead, we'll focus on the discrete side of nonparametric Bayesian inference, built on Poisson point processes.