
I am a Professor in MIT's Department of Electrical Engineering and Computer Science, a member of CSAIL, and affiliated with LIDS and ORC. I am also an investigator in the MIT Institute for Foundations of Data Science (MIFODS).
[Short Bio] [Honors and Awards]
Research Interests: Theory of Computation, and its interface with Economics and Game Theory, Machine Learning, Statistics and Probability Theory, and Computational Biology.
Publications
Recent Publications
 Constantinos Daskalakis, Themis Gouleakis, Christos Tzamos, Manolis Zampetakis: Computationally and Statistically Efficient Truncated Regression.
In the 32nd Annual Conference on Learning Theory, COLT 2019. pdf.
 Yuval Dagan, Constantinos Daskalakis, Nishanth Dikkala, Siddhartha Jayanti: Generalization and learning under Dobrushin's condition.
In the 32nd Annual Conference on Learning Theory, COLT 2019. arXiv.
 Constantinos Daskalakis, Nishanth Dikkala, Ioannis Panageas: Regression from Dependent Observations.
In the 51st Annual ACM Symposium on the Theory of Computing, STOC 2019. arXiv
 Andrew Ilyas, Ajil Jalal, Eirini Asteri, Constantinos Daskalakis, Alexandros G. Dimakis: The Robust Manifold Defense: Adversarial Training using Generative Models. arXiv.
 Constantinos Daskalakis, Ioannis Panageas: Last-Iterate Convergence: Zero-Sum Games and Constrained Min-Max Optimization.
In the 10th Innovations in Theoretical Computer Science (ITCS) conference, ITCS 2019. arXiv.
 Constantinos Daskalakis, Ioannis Panageas: The Limit Points of (Optimistic) Gradient Descent in Min-Max Optimization.
In the 32nd Annual Conference on Neural Information Processing Systems (NeurIPS), NeurIPS 2018. arXiv.
 Constantinos Daskalakis, Nishanth Dikkala, Siddhartha Jayanti: HOGWILD!-Gibbs can be Pan-Accurate.
In the 32nd Annual Conference on Neural Information Processing Systems (NeurIPS), NeurIPS 2018. arXiv.
 Nima Anari, Constantinos Daskalakis, Wolfgang Maass, Christos Papadimitriou, Amin Saberi, Santosh Vempala: Smoothed Analysis of Discrete Tensor Decomposition and Assemblies of Neurons.
In the 32nd Annual Conference on Neural Information Processing Systems (NeurIPS), NeurIPS 2018. arXiv.
 Jayadev Acharya, Arnab Bhattacharyya, Constantinos Daskalakis, Saravanan Kandasamy: Learning and Testing Causal Models with Interventions.
In the 32nd Annual Conference on Neural Information Processing Systems (NeurIPS), NeurIPS 2018. arXiv.
 Constantinos Daskalakis, Themis Gouleakis, Christos Tzamos, Manolis Zampetakis: Efficient Statistics, in High Dimensions, from Truncated Samples.
In the 59th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2018. arXiv.
 Shipra Agrawal, Constantinos Daskalakis, Vahab Mirrokni, Balasubramanian Sivan: Robust Repeated Auctions under Heterogeneous Buyer Behavior.
In the 19th ACM conference on Economics and Computation, EC 2018. arXiv.
 Constantinos Daskalakis, Nishanth Dikkala, Nick Gravin: Testing Symmetric Markov Chains from a Single Trajectory.
In the 31st Annual Conference on Learning Theory, COLT 2018. arXiv.
 Constantinos Daskalakis, Christos Tzamos, Manolis Zampetakis: Bootstrapping EM via EM and Convergence Analysis in the Naive Bayes Model.
In the 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018.
 Constantinos Daskalakis, Andrew Ilyas, Vasilis Syrgkanis, Haoyang Zeng: Training GANs with Optimism.
In the 6th International Conference on Learning Representations, ICLR 2018. arXiv.
 Constantinos Daskalakis, Christos Tzamos and Manolis Zampetakis: A Converse to Banach's Fixed Point Theorem and its CLS Completeness.
In the 50th Annual ACM Symposium on the Theory of Computing, STOC 2018. arXiv
 Constantinos Daskalakis, Gautam Kamath and John Wright: Which Distribution Distances are Sublinearly Testable?
In the 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018. arXiv
 Constantinos Daskalakis, Nishanth Dikkala and Gautam Kamath: Testing Ising Models.
In the 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018. arXiv
 Constantinos Daskalakis, Nishanth Dikkala and Gautam Kamath: Concentration of Multilinear Functions of the Ising Model with Applications to Network Data.
In the 31st Annual Conference on Neural Information Processing Systems (NIPS), NIPS 2017. arXiv
 Yang Cai and Constantinos Daskalakis: Learning Multi-Item Auctions with (or without) Samples.
In the 58th IEEE Symposium on Foundations of Computer Science (FOCS), FOCS 2017. arXiv
 Constantinos Daskalakis and Yasushi Kawase: Optimal Stopping Rules for Sequential Hypothesis Testing.
In the 25th Annual European Symposium on Algorithms (ESA), ESA 2017. pdf
 Bryan Cai, Constantinos Daskalakis and Gautam Kamath: Priv'IT: Private and Sample Efficient Identity Testing.
In the 34th International Conference on Machine Learning, ICML 2017. arXiv
 Constantinos Daskalakis, Christos Tzamos and Manolis Zampetakis: Ten Steps of EM Suffice for Mixtures of Two Gaussians.
In the 30th Annual Conference on Learning Theory, COLT 2017. Preliminary version presented at NIPS 2016 Workshop on Non-Convex Optimization for Machine Learning. arXiv
 Constantinos Daskalakis and Qinxuan Pan: Square Hellinger Subadditivity for Bayesian Networks and its Applications to Identity Testing.
In the 30th Annual Conference on Learning Theory, COLT 2017. arXiv
Selected Publications
Equilibrium Complexity:
 Constantinos Daskalakis, Paul W. Goldberg and Christos H. Papadimitriou: The Complexity of Computing a Nash Equilibrium.
In the 38th ACM Symposium on Theory of Computing, STOC 2006.
Journal version as SIAM Journal on Computing, 39(1):195–259, May 2009. (Invited, special issue for STOC 2006.) pdf
Expository article in Communications of the ACM, 52(2):89–97, 2009. (Invited.) pdf
 Constantinos Daskalakis: On the Complexity of Approximating a Nash Equilibrium.
In the 22nd Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2011.
Journal version as ACM Transactions on Algorithms (TALG), 9(3):23, 2013. (Invited, special issue for SODA 2011.) pdf
 Constantinos Daskalakis and Christos Papadimitriou: Approximate Nash Equilibria in Anonymous Games.
Journal of Economic Theory, 156:207–245, 2015. pdf
Journal version of papers in FOCS 2007, FOCS 2008, and STOC 2009.
Phylogenetics:
 Constantinos Daskalakis, Elchanan Mossel and Sebastien Roch: Optimal Phylogenetic Reconstruction.
In the 38th ACM Symposium on Theory of Computing, STOC 2006. arXiv
Journal version as Probability Theory and Related Fields, 149(1-2):149–189, 2011.
Coding Theory:
 Sanjeev Arora, Constantinos Daskalakis and David Steurer: Message-Passing Algorithms and Improved LP Decoding.
In the 41st ACM Symposium On Theory of Computing, STOC 2009.
Journal version as IEEE Transactions on Information Theory, 58(12):7260–7271, 2012. pdf
Game Theory:
 Constantinos Daskalakis and Qinxuan Pan: A Counter-Example to Karlin's Strong Conjecture for Fictitious Play.
In the 55th IEEE Symposium on Foundations of Computer Science, FOCS 2014. arXiv
 Yang Cai, Ozan Candogan, Constantinos Daskalakis and Christos Papadimitriou: Zero-sum Polymatrix Games: A Generalization of Minmax.
Mathematics of Operations Research, 41(2):648–655, 2016. pdf
Probability Theory:
 Constantinos Daskalakis and Christos Papadimitriou: Sparse Covers for Sums of Indicators.
Probability Theory and Related Fields, 162(3):679–705, 2015. arXiv
 Constantinos Daskalakis, Anindya De, Gautam Kamath and Christos Tzamos: A Size-Free CLT for Poisson Multinomials and its Applications.
In the 48th ACM Symposium on Theory of Computing, STOC 2016. arXiv
 Constantinos Daskalakis, Nishanth Dikkala and Gautam Kamath: Concentration of Multilinear Functions of the Ising Model with Applications to Network Data.
In the 31st Annual Conference on Neural Information Processing Systems (NIPS), NIPS 2017. arXiv
Mechanism Design:
 Yang Cai, Constantinos Daskalakis and Matt Weinberg: An Algorithmic Characterization of Multi-Dimensional Mechanisms.
In the 44th ACM Symposium on Theory of Computing, STOC 2012. ECCC Report.
 Yang Cai, Constantinos Daskalakis and Matt Weinberg: Understanding Incentives: Mechanism Design becomes Algorithm Design.
In the 54th IEEE Symposium on Foundations of Computer Science, FOCS 2013. arXiv
 Constantinos Daskalakis, Alan Deckelbaum and Christos Tzamos: Mechanism Design via Optimal Transport.
In the 14th ACM Conference on Electronic Commerce, EC 2013. Best Paper and Best Student Paper Award.
pdf
 Constantinos Daskalakis, Alan Deckelbaum and Christos Tzamos: Strong Duality for a Multiple-Good Monopolist.
In the 16th ACM Conference on Economics and Computation (EC), EC 2015. arXiv
Journal version as Econometrica, 85(3):735–767, 2017.
 Yang Cai, Constantinos Daskalakis and Christos H. Papadimitriou: Optimum Statistical Estimation with Strategic Data Sources.
In the 28th Annual Conference on Learning Theory (COLT), COLT 2015. arXiv
 Constantinos Daskalakis, Christos Papadimitriou and Christos Tzamos: Does Information Revelation Improve Revenue?
In the 17th ACM Conference on Economics and Computation (EC), EC 2016. Invited to Special Issue. pdf
 Constantinos Daskalakis and Vasilis Syrgkanis: Learning in Auctions: Regret is Hard, Envy is Easy.
In the 57th IEEE Symposium on Foundations of Computer Science (FOCS), FOCS 2016. arXiv
 Yang Cai and Constantinos Daskalakis: Learning Multi-Item Auctions with (or without) Samples.
In the 58th IEEE Symposium on Foundations of Computer Science (FOCS), FOCS 2017. arXiv
Learning and Statistics:
 Constantinos Daskalakis, Ilias Diakonikolas and Rocco A. Servedio: Learning Poisson Binomial Distributions.
In the 44th ACM Symposium on Theory of Computing, STOC 2012. arXiv
Algorithmica, 72(1):316–357, 2015. (Invited, special issue on New Theoretical Challenges in Machine Learning.) arXiv
 Constantinos Daskalakis, Christos Tzamos and Manolis Zampetakis: Ten Steps of EM Suffice for Mixtures of Two Gaussians.
In the 30th Annual Conference on Learning Theory, COLT 2017. Preliminary version presented at NIPS 2016 Workshop on Non-Convex Optimization for Machine Learning. arXiv
 Constantinos Daskalakis, Christos Tzamos and Manolis Zampetakis: A Converse to Banach's Fixed Point Theorem and its CLS Completeness.
In the 50th Annual ACM Symposium on the Theory of Computing, STOC 2018. arXiv
 Constantinos Daskalakis, Andrew Ilyas, Vasilis Syrgkanis, Haoyang Zeng: Training GANs with Optimism.
In the 6th International Conference on Learning Representations, ICLR 2018. arXiv.
 Constantinos Daskalakis, Themis Gouleakis, Christos Tzamos, Manolis Zampetakis: Efficient Statistics, in High Dimensions, from Truncated Samples.
In the 59th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2018. arXiv.
 Andrew Ilyas, Ajil Jalal, Eirini Asteri, Constantinos Daskalakis, Alexandros G. Dimakis: The Robust Manifold Defense: Adversarial Training using Generative Models.
arXiv, 2017.
Property Testing and Statistics:
 Jayadev Acharya, Constantinos Daskalakis and Gautam Kamath: Optimal Testing for Properties of Distributions.
In the 29th Annual Conference on Neural Information Processing Systems (NIPS), NIPS 2015. Spotlight Paper. arXiv
 Constantinos Daskalakis and Qinxuan Pan: Square Hellinger Subadditivity for Bayesian Networks and its Applications to Identity Testing.
In the 30th Annual Conference on Learning Theory, COLT 2017. arXiv
 Bryan Cai, Constantinos Daskalakis and Gautam Kamath: Priv'IT: Private and Sample Efficient Identity Testing.
In the 34th International Conference on Machine Learning, ICML 2017. arXiv
 Constantinos Daskalakis, Gautam Kamath and John Wright: Which Distribution Distances are Sublinearly Testable?
In the 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018. arXiv
Research Highlights, Slides, Videos
My research focuses on algorithms, game theory, probability, learning and statistics. Here are some highlights.
Complexity of Equilibria: My Ph.D. research examined whether rational, strategic agents may arrive,
through their interaction, at a state where no single one of them would be better off by changing their own strategy unless others did
so as well. Such a state is called a Nash equilibrium, in honor of John Nash, who showed that such a state always exists. It is commonly
used in Game Theory to predict the outcome of the interaction of strategic individuals in situations of conflict. With Paul
Goldberg and Christos Papadimitriou, I showed that in complex systems a Nash equilibrium can be computationally unachievable.
This implies that it is not always justifiable or relevant to study the Nash equilibria of a system. Here is a simplified exposition
of our article that we wrote for CACM's February 2009 issue. Here is also a survey article that I wrote on the complexity of Nash equilibria,
which appeared in a Computer Science Review special volume dedicated to Christos Papadimitriou's
work.
I more recently showed that even arriving at an approximate Nash equilibrium can be computationally intractable.
My dissertation was awarded the 2008 ACM Doctoral Dissertation Award. Paul Goldberg, Christos Papadimitriou, and I also received the Game Theory and Computer Science Prize from the
Game Theory Society. The citation for our paper reads: "This paper made key conceptual and technical contributions in an illustrious line of work on the complexity of computing Nash
equilibrium. It also highlights the necessity of constructing practical algorithms that compute equilibria efficiently on important subclasses of games."
Here is a blog post by Paul from the congress. The same paper was also awarded the 2011 SIAM Outstanding Paper Prize.
Tutorial on Complexity of Equilibria (Video) at the Simons Institute for the Theory of Computing
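To make the defining property concrete: a strategy profile is a Nash equilibrium exactly when neither player can gain by a unilateral deviation, and for mixed strategies it suffices to check deviations to pure strategies. The toy sketch below (the Rock-Paper-Scissors payoff matrix and tolerance are illustrative, and this is unrelated to the algorithms in the papers above) verifies this condition directly:

```python
# Verify the definition of a (mixed) Nash equilibrium on Rock-Paper-Scissors.
# Row player's payoff matrix; the game is zero-sum, so the column player
# receives the negation of each entry.
A = [[0, -1,  1],   # Rock     vs (Rock, Paper, Scissors)
     [1,  0, -1],   # Paper
     [-1, 1,  0]]   # Scissors

def expected_payoff(payoff, x, y):
    """Row player's expected payoff when row mixes with x and column with y."""
    n = len(payoff)
    return sum(payoff[i][j] * x[i] * y[j] for i in range(n) for j in range(n))

def is_nash(payoff, x, y, tol=1e-9):
    """Check whether (x, y) is a Nash equilibrium of the zero-sum game `payoff`."""
    n = len(payoff)
    vx = expected_payoff(payoff, x, y)   # row's value; column's value is -vx
    for i in range(n):                   # row deviates to pure strategy i
        if sum(payoff[i][j] * y[j] for j in range(n)) > vx + tol:
            return False
    for j in range(n):                   # column deviates to pure strategy j
        if -sum(payoff[i][j] * x[i] for i in range(n)) > -vx + tol:
            return False
    return True

uniform = [1/3, 1/3, 1/3]
print(is_nash(A, uniform, uniform))      # True: uniform play is an equilibrium
print(is_nash(A, [1, 0, 0], uniform))    # False: column gains by playing Paper
print(is_nash(A, [1, 0, 0], [0, 1, 0]))  # False: row gains by playing Scissors
```

Verifying a given profile like this is easy; the hardness results above concern finding such a profile in large games.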
Mechanism Design: Mechanism design, sometimes called "inverse Game Theory," studies how to design incentives that promote a desired objective in the
outcome of the strategic interaction between agents. Examples include designing an election protocol that ensures the election of the best candidate,
designing a procedure for allocating public resources in order to maximize the social welfare from the allocation, and designing an auction for
selling paintings that maximizes the auctioneer's revenue. In all these cases, the design challenge lies in the fact that the designer's objective is not
aligned with that of each strategic agent: a voter wants to maximize the chance that her favorite candidate is elected, while a bidder wants to maximize his own utility.
So the right incentives need to be put in place in order to incentivize the agents to behave in a manner that promotes the designer's objective.
Mechanism design has received a lot of study in Economics since the 1970s, and has found a plethora of applications in practice
such as in the design of online markets and the sale of radio spectrum to telecoms by governments, but many challenges remain. My group has focused on the challenge of
multi-dimensional mechanism design,
which targets mechanism design settings in which the preferences of the agents are multi-dimensional, e.g., selling multiple items in an auction. While maximizing social welfare in this setting can be achieved with the
celebrated VCG mechanism, the understanding of other objectives such as revenue optimization is quite limited. With Yang Cai and Matt Weinberg, I provide an
algorithmic framework
for computing optimal mechanisms for arbitrary objectives. As a special application of our framework, we provide an
algorithmic generalization
of Myerson's celebrated single-item revenue-optimal auction to the multi-item setting. Given the auctioneer's information about
the bidders, in the form of a distribution over their preferences over items and bundles of items, our algorithmic framework efficiently
computes a revenueoptimal auction.
My work with Alan Deckelbaum and Christos Tzamos obtains analytical characterizations of optimal multi-item mechanisms in the single-buyer setting, using optimal transport theory. See our Econometrica paper,
whose preliminary form received the Best
Paper Award at the ACM Conference on Economics and Computation in 2013.
SLIDES on Algorithmic Bayesian Mechanism Design from an EC 2014 tutorial with Cai and Weinberg.
Tutorial on Algorithmic Mechanism Design (Video) at the Simons Institute for the Theory of Computing
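To make the single-item baseline concrete: for i.i.d. bidders with values drawn uniformly from [0, 1], Myerson's revenue-optimal auction allocates the item to the highest bidder whose virtual value phi(v) = v - (1 - F(v))/f(v) = 2v - 1 is nonnegative, and charges the smallest bid that would still have won. A minimal sketch of this special case (the function names and the Uniform[0,1] assumption are illustrative; this is not the general multi-item framework discussed above):

```python
# Myerson's revenue-optimal single-item auction for i.i.d. Uniform[0,1] bidders.

def phi(v):
    """Virtual value v - (1 - F(v))/f(v), which equals 2v - 1 for Uniform[0,1]."""
    return 2 * v - 1

def phi_inv(t):
    """Inverse virtual value: the bid whose virtual value is t."""
    return (t + 1) / 2

def myerson_uniform(bids):
    """Return (winner_index, payment); winner_index is None if no sale.
    The item goes to the highest bidder with nonnegative virtual value,
    who pays the smallest bid that would still have won."""
    virtuals = [phi(b) for b in bids]
    best = max(range(len(bids)), key=lambda i: virtuals[i])
    if virtuals[best] < 0:               # nobody clears the reserve phi_inv(0) = 0.5
        return None, 0.0
    others = [v for i, v in enumerate(virtuals) if i != best]
    threshold = max([0.0] + others)      # must beat both the reserve and the runner-up
    return best, phi_inv(threshold)

print(myerson_uniform([0.9, 0.7, 0.3]))  # (0, 0.7): pays the runner-up's bid
print(myerson_uniform([0.6, 0.2]))       # (0, 0.5): only the reserve binds
print(myerson_uniform([0.4, 0.3]))       # (None, 0.0): no bid clears the reserve
```

In this i.i.d. uniform case the mechanism reduces to a second-price auction with reserve 1/2; the algorithmic frameworks above handle the far harder multi-item and asymmetric settings.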
Statistics: Statistics has traditionally focused on the asymptotic analysis of tests, as the number of samples tends to infinity. Moreover,
statistical tests typically only detect certain types of deviations
from the null hypothesis, or are designed to select between a null and an alternative hypothesis that are fixed distributions
or parametric families of distributions. Motivated by applications where the
sample size is small
relative to the support of the distribution, as arises naturally in high-dimensional settings,
we rethink the foundations of Statistics in the small-sample regime. We also place emphasis on accommodating composite and nonparametric hypotheses, and aim at detecting all deviations from the null.
In a NIPS'15 spotlight paper with Acharya and Kamath, we provide statistical tests for
testing nonparametric properties of distributions in the nonasymptotic regime and with tight control for type II errors.
A talk on this work can be viewed here.
Here is also a survey I wrote on nonasymptotic statistical hypothesis testing with Kamath and Wright.
Our more recent work focuses on high-dimensional testing problems (see Bayes nets, Ising 1 and Ising 2),
on testing causal models, and on testing while respecting the privacy of the sample.
Here are SLIDES from a tutorial I gave at Greek Stochastics Theta.
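To convey the flavor of small-sample identity testing, here is a toy tester built around a mean-corrected chi-square-style statistic; subtracting the count N_i from (N_i - n*p_i)^2 reduces the statistic's bias at small sample sizes (its expectation is exactly zero under the null with Poissonized sampling). The distributions and sample size are illustrative, and no calibrated rejection threshold is implemented; this is a simplification, not the tests from the papers above:

```python
import random
from collections import Counter

def identity_statistic(samples, p):
    """Z = sum_i ((N_i - n*p_i)^2 - N_i) / (n*p_i) over a hypothesized
    distribution p on {0, ..., k-1}; Z is near 0 under the null and grows
    like n times the chi-square distance under an alternative."""
    n = len(samples)
    counts = Counter(samples)
    return sum(((counts[i] - n * pi) ** 2 - counts[i]) / (n * pi)
               for i, pi in enumerate(p))

random.seed(0)
p = [0.25, 0.25, 0.25, 0.25]  # null hypothesis: uniform over 4 symbols
q = [0.55, 0.15, 0.15, 0.15]  # a far alternative

null_samples = random.choices(range(4), weights=p, k=200)
alt_samples = random.choices(range(4), weights=q, k=200)

print(identity_statistic(null_samples, p))  # small: consistent with the null
print(identity_statistic(alt_samples, p))   # large and positive
```

A real test would compare Z against a threshold chosen from the desired error probabilities and sample size, which is where the sample-optimal analyses above come in.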
