I am a Professor at IST Austria. My research focuses on high-performance algorithms, spanning from theory and lower bounds to practical implementations.
Before IST, I was a researcher at ETH Zurich and at Microsoft Research, Cambridge, UK. Prior to that, I was a Postdoctoral Associate at MIT CSAIL, working with Prof. Nir Shavit. I received my PhD from EPFL, under the guidance of Prof. Rachid Guerraoui.
My research is supported by a 2018 ERC Starting Grant and the Austrian FWF, with generous additional support from Amazon and Google.
I am also a Principal Machine Learning Researcher at Neural Magic.
You can find a list of recent projects and publications here.
Open Positions:
- We have open PhD and postdoc positions. Applicants with experience in optimization theory or distributed systems implementation are particularly encouraged, although all strong applicants will receive full consideration.
For questions about the positions and the application process, please contact me via email at dan.alistarh@ist.ac.at. Applications should include a CV, a publication list, a one-page statement describing your motivation and research interests, and links to your publications.
Recent/future service:
- Distributed Computing Journal, Journal of Machine Learning Research, PODC 2022, PPoPP 2022, NeurIPS 2022, MLSYS 2023
- DCC 2021 (Program Chair), DISC 2021, ICDCS 2021 (Track Chair), PPoPP 2021, MLSys 2021
- FOCS 2020, PODC 2020, PPoPP 2020, AAAI 2020, MLSys 2020
- NeurIPS 2019, DISC 2019, ICML 2019, ISCA 2019 (external), SysML 2019, PPoPP 2019
- NIPS 2018, DISC 2018, PODC 2018, ICDCS 2018, DISC 2017, ALGOCLOUD 2017 (Program Chair)
News:
- Jen and Alex’s work on characterizing bias in pruned models has been accepted to CVPR 2023.
- Our work on one-shot quantization of GPT-scale models (GPTQ) and compression-aware optimization (CrAM) will appear in the proceedings of ICLR 2023.
- Elias’s work on one-shot compression of deep neural networks appeared at NeurIPS 2022.
- Our work on compressing large language models using second-order information will appear at EMNLP 2022.
- Our paper on neural network compression with speedup guarantees (SPDY) appeared at ICML 2022, while our paper on the transferability of sparse features appeared at CVPR 2022.
- Our former intern Sasha Voitovych presented our work on leader election in graphical population protocols at PODC 2022.
- Our group had five papers accepted at NeurIPS 2021, on efficient approximations of second-order information, sparse DNN training with guarantees, decentralized training of DNNs, and upper and lower bounds for fundamental problems in distributed optimization.
- Our paper on Lower Bounds for Leader Election under Bounded Contention won the Best Paper Award at DISC 2021.
- Together with Torsten Hoefler, I gave a tutorial at ICML 2021 on Sparsity in Deep Learning. The recordings are available here, and the JMLR survey on which the tutorial is based is available here.
- Our paper on communication compression for distributed second-order optimization was accepted to ICML 2021.
- Our work on analyzing competitive dynamics in population protocols was accepted to PODC 2021, while our work proposing the first concurrent algorithm for dynamic connectivity was accepted to SPAA 2021.
- We have new work appearing at ICLR 2021 on an optimal compression scheme for distributed variance reduction, and on a state-of-the-art Byzantine-resilient defense for distributed SGD.
- Two new papers were accepted to AAAI 2021, on a new consistency condition for distributed SGD, and on the first convergence guarantees for asynchronous SGD over non-smooth, non-convex objectives.
- Three new papers were presented at NeurIPS 2020, on model compression, scheduling for concurrent belief propagation, and adaptive gradient quantization for SGD.
- Our work describing the Splay-List (a distributionally adaptive concurrent skip-list) was presented at DISC 2020, and was invited to the Special Issue of Distributed Computing dedicated to the conference.
- Our work on Concurrent Search Trees with Doubly-Logarithmic Running Time won the Best Paper Award at PPoPP 2020. Our work on handling load imbalance in deep learning workloads was also presented at the conference.
- Our work on searching for the fastest concurrent Union-Find algorithm was chosen as Best Paper at OPODIS 2019.
- I gave a Keynote Talk at DISC 2019 on Distributed Machine Learning. The slides are here.
I am extremely lucky to be able to work with the following students and postdocs:
- Joel Rybicki (PostDoc @ IST)
- Janne Korhonen (PostDoc @ IST)
- Giorgi Nadiradze (PhD Student @ IST)
- Alexandra Peste (PhD @ IST, co-advised with Christoph Lampert)
- Aleksandr Shevchenko (PhD @ IST, co-advised with Marco Mondelli)
- Ilya Markov (PhD Student @ IST)
- Jen Iofinova (PhD Student @ IST)
- Elias Frantar (PhD Student @ IST)
- Nikita Koval (PhD Student @ ITMO, Researcher at JetBrains)
Visitors / Collaborators / Friends of the Lab:
- Prof. Faith Ellen (visiting February-May 2020)
- Prof. Nir Shavit (visiting November 2019)
- Prof. Thomas Sauerwald (visiting Oct 2019)
- Prof. Gregory Valiant (visiting Nov 2018)
- Prof. Robert Tarjan (visiting May 2018)
- Dr. Frank McSherry (visiting May 2018)
- Dr. Aleksandar Prokopec (visiting April 2018)
- Prof. Ce Zhang (ETH)
- Prof. Markus Pueschel (ETH)
- Prof. Torsten Hoefler (ETH)
Alumni:
- Peter Davies (PostDoc @ IST -> Lecturer at University of Surrey)
- Bapi Chatterjee (PostDoc @ IST -> Faculty at IIIT Delhi)
- Vitaly Aksenov (PostDoc @ IST -> Lecturer @ ITMO)
- Trevor Brown (PostDoc @ IST -> Assistant Professor @ Waterloo)
- Amirmojtaba Sabour (Intern @ IST -> PhD Student @ UofT)
- Vijaykrishna Gurunanthan (Intern @ IST -> PhD Student @ Stanford)
- Saleh Ashkboos (Intern @ IST -> PhD @ ETH Zurich)
- Antonio Polino (MSc @ ETH -> Google Search)
- Rati Gelashvili (PhD @ MIT, now Algorithms Lead at Neural Magic)
- Justin Kopinsky (PhD @ MIT, now Engineering Lead at Neural Magic)
- Cédric Renggli (MSc Thesis (ETH Medal), with Torsten Hoefler, now PhD @ ETH)
- Nandini Singhal (Intern @ IST -> Microsoft Research)
- Martin Thoresen (Intern -> PhD @ IST)
- Amirkeivan Mohtashami (Intern @ IST)
- Faezeh Ebrahimianghazani (Intern @ IST)
- Dasha Voronkova (Intern @ IST)
- Aditya Sharma (Intern @ IST)
- Arnab Kar (MSc Thesis @ IST -> PhD @ Duke University)
- Ekaterina Goltsova (ISTern @ IST -> Master’s @ EPFL)
- Demjan Grubić (MSc, now at Google)
- Aline Abler (BSc, with Torsten Hoefler)
- Raphael Kuebler (BSc)
- Jerry Z. Li (Intern, PhD @ MIT -> Microsoft Research)
- Jenny Iglesias (Intern, PhD @ CMU)
- Syed Kamran Haider (Intern @ MSR, now Researcher @ Qualcomm)
- Hyunjik Kim (Intern @ MSR -> PhD @ Oxford -> Researcher at DeepMind)