I’m an Assistant Professor at IST Austria. My research focuses on concurrent data structures and distributed algorithms, and spans the range from algorithms and lower bounds to practical implementations.
Before IST, I was a researcher at ETH Zurich and at Microsoft Research, Cambridge, UK. Prior to that, I was a Postdoctoral Associate at MIT CSAIL, working with Prof. Nir Shavit. I received my PhD from EPFL, under the guidance of Prof. Rachid Guerraoui.
My research is supported by a 2018 ERC Starting Grant and the Austrian FWF, with additional generous support from Amazon and Google.
I am also the Machine Learning Research Lead at Neural Magic.
- We have open intern, PhD, and postdoc positions as part of the ScaleML ERC Starting Grant project, whose goal is to develop new theory, algorithms, and systems for scalable machine learning. The ideal candidate has a strong background in CS or Math; for the postdoc positions, a PhD in CS or a related field is required. Applicants with experience in optimization theory or distributed systems implementation are particularly encouraged, although all strong applicants will receive full consideration.
For questions about the positions and the application process, please contact me by email at email@example.com. Applications should include a CV, a publication list, a one-page statement describing your motivation and research interests, and, if applicable, links to your publications.
- NeurIPS 2021, DCC 2021 (Program Chair), DISC 2021, PPoPP 2021, MLSys 2021, FOCS 2020, PODC 2020, PPoPP 2020, AAAI 2020, MLSys 2020, NeurIPS 2019, DISC 2019, ICML 2019, ISCA 2019 (external), SysML 2019, PPoPP 2019, NeurIPS 2018, DISC 2018, PODC 2018, ICDCS 2018, DISC 2017, ALGOCLOUD 2017 (Program Chair)
- I am organizing the 2021 Workshop on Distributed Cloud Computing. Check out the great list of speakers!
- Our paper on quantization for distributed second-order optimization was accepted to ICML 2021.
- Our work on analyzing competitive dynamics in population protocols was accepted to PODC 2021, while our work proposing the first concurrent algorithm for dynamic connectivity was accepted to SPAA 2021.
- We have new work appearing at ICLR 2021 on an optimal compression scheme for distributed variance reduction, and on a state-of-the-art Byzantine-resilient defense for distributed SGD.
- Two new papers accepted to AAAI 2021, on a new consistency condition for distributed SGD, and on the first convergence guarantees for asynchronous SGD over non-smooth, non-convex objectives.
- Three new papers presented at NeurIPS 2020, on model compression, scheduling for concurrent belief propagation, and adaptive gradient quantization for SGD.
- Our work describing the Splay-List (a distributionally adaptive concurrent Skip-List) was presented at DISC 2020, and was invited to the Special Issue of Distributed Computing dedicated to the conference.
- Two new papers at ICML 2020, on Leveraging Activation Sparsity in DNN Inference, and on the Sample Complexity of Adversarial Multi-Source PAC Learning.
- Our work on Concurrent Search Trees with Doubly-Logarithmic Running Time won the Best Paper Award at PPoPP 2020. Our work on handling load imbalance in deep learning workloads was also presented at the conference.
- Our work on searching for the fastest concurrent Union-Find algorithm won the Best Paper Award at OPODIS 2019.
- I gave a Keynote Talk at DISC 2019 on Distributed Machine Learning. The slides are here.
- Our work on Powerset Convolutional Nets was accepted to NeurIPS 2019. Congratulations to Chris Wendler on his first paper!
- The paper describing SparCML, our communication library for machine learning applications, has been accepted to SC19.
- Nikita gave a talk describing our work on designing and implementing scalable channels at Euro-Par 2019.
- Our paper on the limitations of extension-based impossibility proofs was accepted to STOC 2019.
- I gave a tutorial on Distributed and Concurrent Machine Learning at PODC 2018. Slides are here.
- Our work on neural network compression using distilled quantization was accepted to ICLR 2018. The implementation is here.
- QSGD was accepted as a Spotlight paper at NIPS 2017. Both CNTK and TensorFlow implementations are available.
I am extremely lucky to be able to work with the following students and postdocs:
- Joel Rybicki (PostDoc @ IST)
- Janne Korhonen (PostDoc @ IST)
- Bapi Chatterjee (PostDoc @ IST)
- Peter Davies (PostDoc @ IST)
- Giorgi Nadiradze (PhD @ IST)
- Alexandra Peste (PhD @ IST)
- Aleksandr Shevchenko (PhD @ IST)
- Ilya Markov (PhD @ IST)
- Ahad Baig (PhD @ IST)
- Nikita Koval (PhD @ ITMO, Researcher at JetBrains)
- Chris Wendler (PhD @ ETH)
Visitors / Collaborators / Friends of the Lab:
- Prof. Faith Ellen (visiting February-May 2020)
- Prof. Nir Shavit (visiting November 2019)
- Prof. Thomas Sauerwald (visiting Oct 2019)
- Prof. Gregory Valiant (visiting Nov 2018)
- Prof. Robert Tarjan (visiting May 2018)
- Dr. Frank McSherry (visiting May 2018)
- Dr. Aleksandar Prokopec (visiting April 2018)
- Prof. Ce Zhang (ETH)
- Prof. Markus Pueschel (ETH)
- Prof. Torsten Hoefler (ETH)
- Vitaly Aksenov (PostDoc @ IST -> Lecturer @ ITMO)
- Saleh Ashkboos (Intern @ IST -> PhD @ ETH Zurich)
- Trevor Brown (PostDoc @ IST -> Assistant Professor @ Waterloo)
- Antonio Polino (MSc @ ETH -> Google Search)
- Rati Gelashvili (PhD @ MIT, now Algorithms Lead at Neural Magic)
- Justin Kopinsky (PhD @ MIT, now Engineering Lead at Neural Magic)
- Cédric Renggli (MSc Thesis (ETH Medal), with Torsten Hoefler, now PhD @ ETH)
- Nandini Singhal (Intern @ IST -> Microsoft Research)
- Martin Thoresen (Intern -> PhD @ IST)
- Amirmojtaba Sabour (Intern @ IST)
- Amirkeivan Mohtashami (Intern @ IST)
- Faezeh Ebrahimianghazani (Intern @ IST)
- Dasha Voronkova (Intern @ IST)
- Aditya Sharma (Intern @ IST)
- Arnab Kar (MSc Thesis @ IST -> PhD @ Duke University)
- Ekaterina Goltsova (ISTern @ IST -> Master’s @ EPFL)
- Demjan Grubić (MSc, now at Google)
- Aline Abler (BSc, with Torsten Hoefler)
- Raphael Kuebler (BSc)
- Jerry Z. Li (Intern, PhD @ MIT -> Microsoft Research)
- Jenny Iglesias (Intern, PhD @ CMU)
- Syed Kamran Haider (Intern @ MSR, now Researcher @ Qualcomm)
- Hyunjik Kim (Intern @ MSR -> PhD @ Oxford -> Researcher at DeepMind)