Probabilistic Graphical Models
Spring 2012
Overview
A graphical model is a probabilistic model in which the conditional dependencies between the random variables are specified via a graph. Graphical models provide a flexible framework for modeling large collections of variables with complex interactions, as evidenced by their wide range of applications, including machine learning, computer vision, speech processing, and computational biology. This course will provide a comprehensive survey of learning and inference methods in graphical models.
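As an illustrative sketch (not part of the course materials), the idea of specifying dependencies via a graph can be shown with a classic three-variable Bayesian network, Rain → WetGrass ← Sprinkler, whose joint distribution factorizes as P(R, S, W) = P(R) P(S) P(W | R, S). The probability values below are made-up example numbers:

```python
# Illustrative sketch of a Bayesian network's factorization:
# Rain -> WetGrass <- Sprinkler, with joint
# P(R, S, W) = P(R) * P(S) * P(W | R, S).
# All probability values are made-up example numbers.

from itertools import product

P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.3, False: 0.7}
# Conditional probability table P(WetGrass=True | Rain, Sprinkler)
P_wet_given = {
    (True, True): 0.99,
    (True, False): 0.9,
    (False, True): 0.8,
    (False, False): 0.05,
}

def joint(r, s, w):
    """Joint probability via the Bayesian-network factorization."""
    p_w = P_wet_given[(r, s)]
    return P_rain[r] * P_sprinkler[s] * (p_w if w else 1 - p_w)

# Marginal P(WetGrass=True), computed by summing out Rain and Sprinkler
p_wet = sum(joint(r, s, True) for r, s in product([True, False], repeat=2))
print(round(p_wet, 4))  # -> 0.4054
```

The graph structure is what licenses this factorization: instead of a full joint table over eight states, the model is specified by three small local tables, one per node given its parents.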
General information
Lecture: Thursday 5:00-6:50pm
Office hours: Tuesday 5-6pm and by appointment
Location: 715 Broadway, 12th floor, Room 1204
Grading: problem sets (70%) + exam (30%); see the problem set policy below
Book: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman, MIT Press (2009)
Mailing list: To subscribe to the class list, follow instructions here.
Schedule
Week | Date | Topic | Readings | Assignments
1 | Jan 26 | Introduction, Bayesian networks [Slides] | Chapters 1-3, Appendix A; How to write a spelling corrector (optional) | |
2 | Feb 2 | Undirected graphical models [Slides] | Chapter 4 (except 4.6); Introduction to Probabilistic Topic Models (optional) | |
3 | Feb 9 | Dual decomposition and NLP applications [Slides] (Guest lecture by Sasha Rush) | Introduction to Dual Decomposition for Inference (sections 1.1-1.4); Dual Decomposition for NLP (optional) | |
4 | Feb 16 | Conditional random fields [Slides] | Section 4.6; An Introduction to Conditional Random Fields (section 2); Original paper introducing CRFs (optional) | |
5 | Feb 23 | Exact inference [Slides] | Sections 9-9.4, 9.6.1, 9.7-9.8; Chapter 10 | |
6 | March 1 | Exact inference (continued) [Slides, Notes] | Sections 13.1-13.3, 13.5-13.5.2, 13.7-13.9 (also relevant: readings from week 3) | |
7 | March 8 (no class March 15, Spring break) | LP relaxations for MAP inference [Slides, Notes] | Chapter 8; Introduction to Dual Decomposition for Inference (sections 1.5, 1.6) | |
8 | March 22 | Variational inference [Slides] | Chapter 11 | |
9 | March 29 | Monte Carlo methods for inference | Chapter 12 | |
10 | April 3, 7-9pm (note special date, time, and location: 719 Broadway, Room 1221) | Learning (Bayesian networks) [Slides] | Chapters 16, 17; Section 18.1 | |
11 | April 12 | Learning (unobserved data, EM) [Slides] | Sections 19.1, 19.2; The Expectation Maximization Algorithm: A Short Tutorial; Latent Dirichlet Allocation (sections A.3, A.4) | |
12 | April 19 | Learning (Markov networks) [Slides] | Chapter 20 (except 20.7); Notes on pseudo-likelihood; An Introduction to Conditional Random Fields (section 4); Recent paper on approximate maximum entropy learning in MRFs (optional) | |
13 | April 26 | Learning (structured prediction) [Slides] | | |
14 | May 3 | Advanced topics (spectral algorithms) [Slides] | Notes on spectral learning of hidden Markov models (optional); A Method of Moments for Mixture Models and Hidden Markov Models (optional) | |
15 | May 10 | Final exam (in class) | | |
Prerequisites
This is a graduate-level course. Students should previously have taken one of the following classes:

These prerequisites may be waived in some cases (please e-mail the instructor).
Problem set policy
I expect you to try solving each problem set on your own. However, if you get stuck on a problem, I encourage you to collaborate with other students in the class, subject to the following rules: