6.882 Bayesian Modeling and Inference

Spring 2017
Room 56-154
Tuesday, Thursday 2:30–4:00 PM
First class: Tuesday, February 7

Instructor:
  Professor Tamara Broderick
  Office Hours: Thursday, 4–5pm, 32-G498
  Email:

TA:
  Dr. Trevor Campbell
  Office Hours: Tuesday, 4–5pm, 32-G531
  Email:


Introduction

As both the number of data sets and data set sizes grow, practitioners are interested in learning increasingly complex information and interactions from data. Probabilistic modeling in general, and Bayesian approaches in particular, provide a unifying framework for flexible modeling that includes prediction, estimation, and coherent uncertainty quantification. In this course, we will cover the modern challenges of Bayesian inference, including (but not limited to) speed of approximate inference, making use of distributed architectures, streaming data, and complex data interactions. We will study Bayesian nonparametric models, wherein model complexity grows with the size of the data; this allows us to learn, e.g., a greater diversity of topics as we read more documents from Wikipedia, identify more friend groups as we process more of Facebook's network structure, etc.
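
To make the "model complexity grows with the data" idea concrete, here is a small illustrative sketch (not course material): the Chinese restaurant process, the clustering behavior induced by a Dirichlet process prior. The number of occupied "tables" (clusters) grows with the number of data points, roughly like α·log(n). The function name `crp` and parameter choices below are our own for illustration.

```python
import random

def crp(n, alpha, seed=0):
    """Simulate table assignments from a Chinese restaurant process.

    Customer i starts a new table with probability alpha / (i + alpha),
    and otherwise joins an existing table with probability proportional
    to its current occupancy. Returns the number of occupied tables.
    """
    rng = random.Random(seed)
    counts = []  # number of customers seated at each table
    for i in range(n):
        r = rng.uniform(0, i + alpha)
        if r < alpha:
            counts.append(1)          # start a new table
        else:
            r -= alpha
            for j, c in enumerate(counts):
                r -= c
                if r < 0:
                    counts[j] += 1    # join existing table j
                    break
    return len(counts)

# With alpha = 1, the expected number of tables after n customers is
# the harmonic number H_n, which grows like log(n).
for n in (10, 100, 1000):
    print(n, crp(n, alpha=1.0))
```

This is the sense in which a Bayesian nonparametric model can learn a greater diversity of topics or friend groups as more data arrives: the prior places positive probability on new clusters at every step, rather than fixing their number in advance.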

Piazza Site

Our course Piazza page will be updated shortly: https://piazza.com/mit/spring2017/6882

Description

This course will cover Bayesian modeling and inference at an advanced graduate level. A tentative list of topics (which may change depending on our interests) is as follows:

Prerequisites

Requirements: 6.867 or a more advanced graduate-level machine learning course (6.437 and 6.438 are great); 6.041B or a more advanced probability background; and 18.06 or a more advanced linear algebra background. Alternatively, see the instructor for permission.

A graduate-level familiarity with statistics, machine learning, and probability is required. We will assume familiarity with graphical models, exponential families, finite-dimensional Gaussian mixture models, expectation maximization, linear and logistic regression, and hidden Markov models.

Assessment