6.435 Bayesian Modeling and Inference

Note: In 2018 and earlier, this course ran under the temporary number 6.882; since 2019, it has been 6.435.

Spring 2021
Location: We will be meeting virtually on Zoom. We will post details on Piazza and email registered students before the first class meeting.
Times: Tuesday, Thursday 2:30–4:00 PM
First class: Tuesday, February 16

Instructor:
  Professor Tamara Broderick
  Office Hours: Thursday, 4–5 PM, on Zoom
  Email:

TAs:
  Tin ("Stan") Nguyen, Will Stephenson
  Office Hours: Tuesday, 4–5 PM, on Zoom
  Email:


Introduction

As both the number and size of data sets grow, practitioners are interested in learning increasingly complex information and interactions from data. Probabilistic modeling in general, and Bayesian approaches in particular, provide a unifying framework for flexible modeling that includes prediction, estimation, and coherent uncertainty quantification. In this course, we will cover modern challenges of Bayesian inference, including (but not limited to) model construction, handling large or complex data sets, and the speed and quality of approximate inference. We will also study Bayesian nonparametric methods, wherein model complexity grows with the size of the data; these methods allow us to learn, for example, a greater diversity of topics as we read more documents from Wikipedia, or to identify more friend groups as we process more of Facebook's network structure.
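
As a quick illustration of the coherent uncertainty quantification mentioned above, here is a minimal sketch (in Python, with made-up illustrative data; not course material) of the conjugate Beta-Bernoulli model, where the posterior over a coin's bias is available in closed form:

  # Minimal sketch: Bayesian inference for a coin's bias under a
  # Beta-Bernoulli model. Conjugacy gives a closed-form Beta posterior,
  # whose spread quantifies our remaining uncertainty about the bias.
  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(0)
  data = rng.binomial(1, 0.7, size=50)   # 50 simulated flips; true bias 0.7 (made up)

  a, b = 1.0, 1.0                        # Beta(1, 1) prior: uniform on [0, 1]
  posterior = stats.beta(a + data.sum(), b + len(data) - data.sum())

  print("posterior mean:", posterior.mean())
  print("95% credible interval:", posterior.interval(0.95))

Observing more flips concentrates the posterior, so the credible interval shrinks; this is the sense in which Bayesian inference keeps estimation and uncertainty in a single framework.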

Logistics

Our course Piazza page is here: https://piazza.com/mit/spring2021/6435
All communication about the course will be through Piazza, so make sure to sign up there as soon as possible.

Note that this class relies heavily on discussion and active student participation, so we will hold class synchronously. To encourage open discussion, we will not record any classes.

Nothing will be formally due or graded during the first week of class.

Description

This course will cover Bayesian modeling and inference at an advanced graduate level. A tentative list of topics (which may change depending on our interests) is as follows:

Prerequisites

Requirements: Graduate-level familiarity with machine learning/statistics and probability is required. (E.g., at MIT: 6.437, or 6.438, or [6.867 and 6.436].) We will assume familiarity with graphical models, exponential families, finite-dimensional Gaussian mixture models, expectation maximization, linear and logistic regression, and hidden Markov models.