You can view all the papers in reverse chronological order, browse sets of papers in broad categories such as machine learning, natural language processing, chemistry, computational biology, or physics, or view papers in more specific areas including inference, semi-supervised learning, information retrieval, or reinforcement learning.
Inference papers
Y. Zhang, T. Lei, R. Barzilay, and T. Jaakkola.
Greed is good if randomized: New inference for dependency parsing.
In Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014.
[pdf]
T. Lei, Y. Xin, Y. Zhang, R. Barzilay, and T. Jaakkola.
Low-rank tensors for scoring dependency structures.
In Annual Meeting of the Association for Computational Linguistics (ACL), 2014.
[pdf]
Y. Zhang, T. Lei, R. Barzilay, T. Jaakkola, and A. Globerson.
Steps to excellence: Simple inference with refined scoring of dependency trees.
In Annual Meeting of the Association for Computational Linguistics (ACL), 2014.
[pdf]
A. Gane, T. Hazan, and T. Jaakkola.
Learning with maximum a-posteriori perturbation models.
In Proceedings of the 17th International Conference on Artificial Intelligence and Statistics, 2014.
[pdf]
T. Hazan, S. Maji, and T. Jaakkola.
On sampling from the Gibbs distribution with random maximum a-posteriori perturbations.
In Advances in Neural Information Processing Systems, 2013.
[pdf]
T. Hazan and T. Jaakkola.
On the partition function and random maximum a-posteriori perturbations.
In Proceedings of the 29th International Conference on Machine Learning (ICML), 2012.
[pdf]
O. Meshi, T. Jaakkola, and A. Globerson.
Convergence rate analysis of MAP coordinate minimization algorithms.
In Advances in Neural Information Processing Systems, 2012.
Z. Kolter and T. Jaakkola.
Approximate inference in additive factorial HMMs with application to energy disaggregation.
In Proceedings of the 15th International Conference on Artificial Intelligence and Statistics, JMLR W&CP 22:1472--1482, 2012.
[pdf]
D. Sontag, A. Globerson, and T. Jaakkola.
Introduction to dual decomposition for inference.
In S. Sra, S. Nowozin, and S. Wright, Eds., Optimization for Machine Learning. MIT Press, 2010.
[pdf]
D. Sontag, O. Meshi, T. Jaakkola, and A. Globerson.
More data means less inference: A pseudo-max approach to structured learning.
In Advances in Neural Information Processing Systems 23, 2010.
[pdf]
A. Rush, D. Sontag, M. Collins, and T. Jaakkola.
On dual decomposition and linear programming relaxations for natural language processing.
In Conference on Empirical Methods in Natural Language Processing (EMNLP), 2010.
[pdf]
T. Koo, A. Rush, M. Collins, T. Jaakkola, and D. Sontag.
Dual decomposition for parsing with non-projective head automata.
In Conference on Empirical Methods in Natural Language Processing (EMNLP), 2010.
[pdf]
O. Meshi, D. Sontag, T. Jaakkola, and A. Globerson.
Learning efficiently with approximate inference via dual losses.
In Proceedings of the 27th International Conference on Machine Learning, 2010.
[pdf]
T. Jaakkola, D. Sontag, A. Globerson, and M. Meila.
Learning Bayesian network structure using LP relaxations.
In Proceedings of the 13th International Conference on Artificial Intelligence and Statistics, 2010.
[pdf] [slides]
D. Sontag and T. Jaakkola.
Tree block coordinate descent for MAP in graphical models.
In Proceedings of the 12th International Conference on Artificial Intelligence and Statistics, 2009.
[pdf]
D. Sontag, A. Globerson, and T. Jaakkola.
Clusters and coarse partitions in LP relaxations.
In Advances in Neural Information Processing Systems 21, 2008.
[pdf]
D. Sontag, T. Meltzer, A. Globerson, T. Jaakkola, and Y. Weiss.
Tightening LP relaxations for MAP using message passing.
In Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, 2008.
[pdf]
D. Sontag and T. Jaakkola.
New outer bounds on the marginal polytope.
In Advances in Neural Information Processing Systems 20, 2007.
[pdf]
A. Globerson and T. Jaakkola.
Fixing max-product: Convergent message passing algorithms for MAP LP-relaxations.
In Advances in Neural Information Processing Systems 20, 2007.
[pdf]
A. Globerson and T. Jaakkola.
Convergent propagation algorithms via oriented trees.
In Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence, 2007.
[pdf]
D. Sontag and T. Jaakkola.
On iteratively constraining the marginal polytope for approximate inference and MAP.
Technical report, 2007.
[pdf]
A. Globerson and T. Jaakkola.
Approximate inference using conditional entropy decompositions.
In Proceedings of the 11th International Conference on Artificial Intelligence and Statistics, 2007.
[pdf]
A. Globerson and T. Jaakkola.
Approximate inference using planar graph decomposition.
In Advances in Neural Information Processing Systems 19, 2006.
[pdf]
M. Wainwright, T. Jaakkola, and A. Willsky.
MAP estimation via agreement on (hyper)trees: Message-passing and linear-programming approaches.
IEEE Transactions on Information Theory, 51(11):3697--3717, 2005.
[pdf]
M. Wainwright, T. Jaakkola, and A. Willsky.
A new class of upper bounds on the log partition function.
IEEE Transactions on Information Theory, 51:2313--2335, 2005.
[pdf]
M. Wainwright, T. Jaakkola, and A. Willsky.
Tree consistency and bounds on the performance of the max-product algorithm and its generalizations.
Statistics and Computing, 14(2):143--166, 2004.
[pdf]
M. Wainwright, T. Jaakkola, and A. Willsky.
Tree-based parameterization framework for analysis of belief propagation and related algorithms.
IEEE Transactions on Information Theory, 2002.
M. J. Wainwright, T. Jaakkola, and A. S. Willsky.
Exact MAP estimates by (hyper)tree agreement.
In Advances in Neural Information Processing Systems 15, 2002.
[ps.gz]
M. J. Wainwright, T. Jaakkola, and A. S. Willsky.
A new class of upper bounds on the log partition function.
In Proceedings of the 18th Conference on Uncertainty in Artificial Intelligence, 2002.
[ps.gz]
M. Wainwright, T. Jaakkola, and A. Willsky.
Tree-based reparameterization for approximate estimation on loopy graphs.
In Advances in Neural Information Processing Systems 14, 2001.
[pdf]
M. J. Wainwright, T. Jaakkola, and A. S. Willsky.
Tree-based reparameterization framework for approximate estimation in graphs with cycles.
LIDS Technical Report P-2510, 2001.
[ps.gz]
T. Jaakkola.
Tutorial on variational approximation methods.
In Advanced Mean Field Methods: Theory and Practice. MIT Press, 2000.
[ps]
B. Frey, R. Patrascu, T. Jaakkola, and J. Moran.
Sequentially fitting inclusive trees for inference in noisy-OR networks.
In Advances in Neural Information Processing Systems 13. MIT Press, 2000.
[ps]
T. Jaakkola and M. Jordan.
Variational probabilistic inference and the QMR-DT database.
Journal of Artificial Intelligence Research, 10:291--322, 1999.
[ps] [pdf]
M. Jordan, Z. Ghahramani, T. Jaakkola, and L. Saul.
An introduction to variational methods for graphical models.
Machine Learning, 37(2):183--233, 1999.
[ps]
C. Bishop, N. Lawrence, T. Jaakkola, and M. Jordan.
Approximating posterior distributions in belief networks using mixtures.
In Advances in Neural Information Processing Systems 10, 1997.
[ps]
T. Jaakkola.
Variational methods for inference and estimation in graphical models.
PhD thesis, MIT, 1997.
[ps]
T. Jaakkola and M. Jordan.
Improving the mean field approximation via the use of mixture distributions.
In Proceedings of the NATO ASI on Learning in Graphical Models. Kluwer, 1997.
[ps]
L. Saul, T. Jaakkola, and M. Jordan.
Mean field theory for sigmoid belief networks.
Journal of Artificial Intelligence Research, 4:61--76, 1996.
[ps] [pdf]
T. Jaakkola and M. Jordan.
Recursive algorithms for approximating probabilities in graphical models.
In Advances in Neural Information Processing Systems 9, 1996.
[ps]
T. Jaakkola and M. Jordan.
Computing upper and lower bounds on likelihoods in intractable networks.
In Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence, pages 340--348, 1996.
[ps]
T. Jaakkola, L. Saul, and M. Jordan.
Fast learning by bounding likelihoods in sigmoid type belief networks.
In Advances in Neural Information Processing Systems 8, 1995.
[ps]