
Probabilistic Graphical Models

Spring 2012

Overview

A graphical model is a probabilistic model in which the conditional dependencies between the random variables are specified via a graph. Graphical models provide a flexible framework for modeling large collections of variables with complex interactions, as evidenced by their wide range of applications, including machine learning, computer vision, speech, and computational biology. This course will provide a comprehensive survey of learning and inference methods in graphical models.
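To make the idea of "dependencies specified via a graph" concrete, here is a minimal sketch in Python (not part of the course material) of a toy Bayesian network over three binary variables, Rain, Sprinkler, and GrassWet; all probability values are made-up numbers chosen only to illustrate how the joint distribution factorizes along the graph.

```python
# A minimal illustrative sketch (not from the course): a toy Bayesian network
# with edges Rain -> Sprinkler, Rain -> GrassWet, Sprinkler -> GrassWet.
# The graph says the joint distribution factorizes as
#   P(R, S, W) = P(R) * P(S | R) * P(W | R, S).
# All numbers below are made up for illustration.

# Conditional probability tables (CPTs)
P_R = {True: 0.2, False: 0.8}                      # P(Rain = r)
P_S_given_R = {True: {True: 0.01, False: 0.99},    # P_S_given_R[r][s] = P(Sprinkler = s | Rain = r)
               False: {True: 0.40, False: 0.60}}
P_W_given_RS = {                                   # P_W_given_RS[(r, s)][w] = P(GrassWet = w | Rain = r, Sprinkler = s)
    (True, True): {True: 0.99, False: 0.01},
    (True, False): {True: 0.80, False: 0.20},
    (False, True): {True: 0.90, False: 0.10},
    (False, False): {True: 0.00, False: 1.00},
}

def joint(r, s, w):
    """Joint probability given by the factorization implied by the graph."""
    return P_R[r] * P_S_given_R[r][s] * P_W_given_RS[(r, s)][w]

# Inference by brute-force enumeration: P(Rain = True | GrassWet = True)
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print("P(Rain = True | GrassWet = True) =", num / den)
```

The point is only that the graph lets the full joint table be written as a product of small local tables, which is what makes modeling large collections of variables feasible; efficient inference and learning in such models is the subject of the course.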

General information

Lecture: Thursday 5:00-6:50pm
Room: Warren Weaver Hall 312

Instructor:
David Sontag   
dsontag {@ | at} cs.nyu.edu
Grader:
Chris Alberti
chris.alberti {@ | at} gmail.com

Office hours: Tuesday 5-6pm and by appointment. Location: 715 Broadway, 12th floor, Room 1204

Grading: problem sets (70%) + exam (30%). See the Problem Set policy below.

Book: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman, MIT Press (2009).

Mailing list: To subscribe to the class list, follow instructions here.


Schedule

Week 1, Jan 26: Introduction, Bayesian networks [Slides]
    Readings: Chapters 1-3, Appendix A; How to write a spelling corrector (optional)
    Assignments: ps1 due Feb 9 at 5pm. [Solutions]

Week 2, Feb 2: Undirected graphical models [Slides]
    Readings: Chapter 4 (except for 4.6); Introduction to Probabilistic Topic Models (optional)

Week 3, Feb 9: Dual decomposition and NLP applications [Slides] (Guest lecture by Sasha Rush)
    Readings: Introduction to Dual Decomposition for Inference (sections 1.1-1.4); Dual Decomposition for NLP (optional)

Week 4, Feb 16: Conditional random fields [Slides]
    Readings: Section 4.6; An Introduction to Conditional Random Fields (section 2); Original paper introducing CRFs (optional)
    Assignments: ps2 due Feb 23 at 5pm. [Solutions]

Week 5, Feb 23: Exact inference [Slides]
    Readings: Sections 9-9.4, 9.6.1, 9.7-9.8, Chapter 10
    Assignments: ps3 (data) due March 19 at 10am

Week 6, March 1: Exact inference (continued) [Slides, Notes]
    Readings: Sections 13.1-13.3, 13.5-13.5.2, 13.7-13.9 (also relevant: readings from week 3)

Week 7, March 8: LP relaxations for MAP inference [Slides, Notes]
    (No class March 15, Spring break)
    Readings: Chapter 8; Introduction to Dual Decomposition for Inference (sections 1.5, 1.6)
    Assignments: ps4 (data) due March 30 at 5pm

Week 8, March 22: Variational inference [Slides]
    Readings: Chapter 11

Week 9, March 29: Monte-Carlo methods for inference
    Readings: Chapter 12
    Assignments: ps5 (data) due April 12 at 5pm [Solutions]

Week 10, April 3, 7-9pm: Learning (Bayesian networks) [Slides]
    Note special date, time, and location! Class will be in 719 Broadway, Room 1221
    Readings: Chapters 16, 17; Section 18.1

Week 11, April 12: Learning (unobserved data, EM) [Slides]
    Readings: Sections 19.1, 19.2; The Expectation Maximization Algorithm: A short tutorial; Latent Dirichlet Allocation (sections A.3, A.4)
    Assignments: ps6 (data) due May 3 at 5pm [Solutions]

Week 12, April 19: Learning (Markov networks) [Slides]
    Readings: Chapter 20 (except for 20.7); Notes on pseudo-likelihood; An Introduction to Conditional Random Fields (section 4); Recent paper on approximate maximum entropy learning in MRFs (optional)

Week 13, April 26: Learning (structured prediction) [Slides]

Week 14, May 3: Advanced topics (spectral algorithms) [Slides]
    Readings: Notes on spectral learning of hidden Markov models (optional); A Method of Moments for Mixture Models and Hidden Markov Models (optional)

Week 15, May 10: Final exam (in class)



Acknowledgements: Many thanks to the Toyota Technological Institute, Hebrew University, UC Berkeley, and Stanford University for sharing material used in the slides and homeworks.

Prerequisites

This is a graduate-level course. Students should previously have taken one of the following classes:

In addition, students should have a solid understanding of basic concepts from probability and algorithms (e.g., dynamic programming, graphs, shortest paths, complexity).

These prerequisites may be waived in some cases (please e-mail instructor).

Problem Set policy

I expect you to try solving each problem set on your own. However, if you get stuck on a problem, I encourage you to collaborate with other students in the class, subject to the following rules:

  1. You may discuss a problem with any student in this class and work together on solving it. This can involve brainstorming, verbally discussing the problem, and going through possible solutions together, but it should not involve one student telling another a complete solution.

  2. Once you solve the homework, you must write up your solutions on your own, without looking at other people's write-ups or giving your write-up to others.

  3. In your solution for each problem, you must write down the name of each person with whom you discussed it. This will not affect your grade.

  4. Do not consult solution manuals or other people's solutions from similar courses.

