Tommi S. Jaakkola, Ph.D.
Professor of Electrical Engineering and Computer Science
MIT Computer Science and Artificial Intelligence Laboratory
Stata Center, Bldg 32-G470
Cambridge, MA 02139
Email: tommi at csail.mit.edu
Research synopsis
On the theoretical side, our research focuses on statistical inference and estimation, the development of principled approximation methods for problems with limited computational resources, and the analysis and development of algorithms for modern estimation problems, such as those involving predominantly incomplete data. The applied side of our work involves primarily functional genomics (transcriptional regulation), large-scale inference problems, and information retrieval.
Students/postdocs (* denotes postdoc)
Keshav Dhandhania, Andreea Gane, Tatsu Hashimoto, Jean Honorio*, Paresh Malalur, Jonas Mueller, David Reshef, Yu Xin
Tutorials
NIPS 2011 tutorial with Amir Globerson on LP relaxations: Part 1 (Jaakkola) [pdf], Part 2 (Globerson) [pdf]
Recent papers
Low-rank tensors for scoring dependency structures. In Association for Computational Linguistics, 2014. [pdf]
Steps to excellence: Simple inference with refined scoring of dependency trees. In Association for Computational Linguistics, 2014. [pdf]
A unified framework for consistency of regularized loss minimizers. In Proceedings of the 31st International Conference on Machine Learning, 2014. [pdf]
Learning with maximum a-posteriori perturbation models. In Artificial Intelligence and Statistics, 2014. [pdf]
Active boundary annotation using random map perturbations. In Artificial Intelligence and Statistics, 2014. [pdf]
Tight bounds for the expected risk of linear classifiers and PAC-Bayes finite-sample guarantees. In Artificial Intelligence and Statistics, 2014. [pdf]
On measure concentration of random maximum a-posteriori perturbations. In Proceedings of the 31st International Conference on Machine Learning, 2014.
Discovery of directional and nondirectional pioneer transcription factors by modeling DNase profile magnitude and shape. Nature Biotechnology, 32(2):171–178, 2014. [pdf]
Learning efficient random maximum a-posteriori predictors with non-decomposable loss functions. In Advances in Neural Information Processing Systems, 2013. [pdf]
On sampling from the Gibbs distribution with random maximum a posteriori perturbations. In Advances in Neural Information Processing Systems, 2013. [pdf]
Two-sided exponential concentration bounds for Bayes error rate and Shannon entropy. In Proceedings of the 30th International Conference on Machine Learning, 2013. [pdf]
Inverse covariance estimation for high-dimensional data in linear time and space: Spectral methods for Riccati and sparse models. In Proceedings of the 29th Conference on Uncertainty in Artificial Intelligence, 2013. [pdf]
On the partition function and random maximum a-posteriori perturbations. In Proceedings of the 29th International Conference on Machine Learning (ICML), 2012. [pdf]
Lineage-based identification of cellular states and expression programs. In Proceedings of the 20th Annual International Conference on Intelligent Systems for Molecular Biology (ISMB), 2012.
Approximate inference in additive factorial HMMs with application to energy disaggregation. Proceedings of the 15th International Conference on Artificial Intelligence and Statistics, JMLR W&CP, 22:1472–1482, 2012. [pdf]
Primal-dual methods for sparse constrained matrix completion. Proceedings of the 15th International Conference on Artificial Intelligence and Statistics, JMLR W&CP, 22:1323–1331, 2012. [pdf]
