I’m an Assistant Professor at IST Austria. My research focuses on concurrent data structures and distributed algorithms, and spans from algorithms and lower bounds to practical implementations.
Before IST, I was affiliated with ETH Zurich and Microsoft Research, Cambridge, UK. Prior to that, I was a Postdoctoral Associate at MIT CSAIL, working with Prof. Nir Shavit. I received my PhD from EPFL, under the guidance of Prof. Rachid Guerraoui.
My research is supported by a 2018 ERC Starting Grant.
- We have open PhD and postdoc positions as part of the ScaleML ERC Starting Grant project, whose goal is to develop new theory, algorithms, and systems for scalable machine learning.
The ideal candidate would have a strong background in computer science, with (for postdoc positions) a PhD in distributed computing or a related field. Applicants with experience in optimization theory or distributed systems implementation are particularly encouraged, although all strong applications will receive full consideration.
For questions about the positions and the application process, please contact me via email at firstname.lastname@example.org. The application should contain a CV, a publication list, a one-page statement describing your motivation and research interests, and, if applicable, links to your publications.
- ISCA 2019, ICML 2019, SysML 2019, PPoPP 2019, NIPS 2018, DISC 2018, PODC 2018, ICDCS 2018, DISC 2017, ALGOCLOUD 2017
- Two new papers on communication-efficient machine learning and Byzantine-resilient SGD to appear in NeurIPS 2018.
- Our paper describing the Clover framework for low-bitwidth computation was accepted to IEEE SIPS 2018, and invited to the journal Special Issue. The code is here.
- A paper analyzing the convergence of gradient descent with sparsified gradients will appear in the IEEE Conference on Decision and Control.
- I gave a tutorial on Distributed and Concurrent Machine Learning at PODC 2018. Slides are here.
- Our work on relaxed scheduling and asynchronous SGD was recently presented at PODC 2018.
- Our work on randomized relaxed data structures and on transaction scheduling was accepted to SPAA 2018.
- I gave an invited talk at NVIDIA GTC 2018 on sparsification and quantization for scalable machine learning.
- Our work on neural network compression using distilled quantization was accepted to ICLR 2018. The implementation is here.
- QSGD was accepted as a Spotlight paper in NIPS 2017. Both CNTK and TensorFlow implementations are available.
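To give a flavor of the quantization idea behind QSGD: each gradient coordinate is stochastically rounded to one of a small number of levels, in a way that keeps the compressed vector an unbiased estimate of the original. A minimal NumPy sketch of this stochastic quantization step (the function name and parameters are illustrative, not taken from the released CNTK/TensorFlow implementations):

```python
import numpy as np

def qsgd_quantize(v, s=4, rng=None):
    """Stochastically quantize v to s levels per coordinate.

    Each |v_i| / ||v|| is rounded to an adjacent multiple of 1/s,
    with probabilities chosen so that E[Q(v)] = v (unbiasedness).
    """
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    ratio = np.abs(v) / norm           # normalized magnitudes in [0, 1]
    level = np.floor(ratio * s)        # lower quantization level
    prob = ratio * s - level           # probability of rounding up
    level += rng.random(v.shape) < prob
    return np.sign(v) * norm * (level / s)

# Example: quantize a small gradient vector to 4 levels.
g = np.array([0.3, -0.7, 0.1])
q = qsgd_quantize(g, s=4)
```

Since only the norm, signs, and small integer levels need to be communicated, this reduces bandwidth per gradient exchange at the cost of added (zero-mean) variance.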
I am extremely fortunate to work with the following students and postdocs:
- Joel Rybicki (PostDoc @ IST)
- Bapi Chatterjee (PostDoc @ IST)
- Giorgi Nadiradze (PhD @ IST)
- Martin Thoresen (Intern -> PhD @ IST)
- Nikita Koval (Intern -> PhD @ IST)
Visitors / Collaborators / Friends of the Lab:
- Prof. Gregory Valiant (visiting Nov 2018)
- Prof. Robert Tarjan (visiting May 2018)
- Dr. Frank McSherry (visiting May 2018)
- Dr. Aleksandar Prokopec (visiting April 2018)
- Prof. Ce Zhang (ETH)
- Prof. Markus Pueschel (ETH)
- Prof. Torsten Hoefler (ETH)
- Trevor Brown (PostDoc @ IST -> Assistant Professor @ Waterloo)
- Antonio Polino (MSc @ ETH -> Google Search)
- Rati Gelashvili (PhD @ MIT, now Algorithms Lead at NeuralMagic)
- Justin Kopinsky (PhD @ MIT, now Engineering Lead at NeuralMagic)
- Nandini Singhal (Intern @ IST -> Microsoft Research)
- Arnab Kar (MSc Thesis @ IST -> PhD @ Duke University)
- Ekaterina Goltsova (ISTern @ IST)
- Demjan Grubić (MSc, now at Google)
- Cédric Renggli (MSc Thesis (ETH Medal), with Torsten Hoefler, now PhD @ ETH)
- Aline Abler (BSc, with Torsten Hoefler)
- Raphael Kuebler (BSc)
- Jerry Z. Li (Intern, PhD @ MIT)
- Jenny Iglesias (Intern, PhD @ CMU)
- Syed Kamran Haider (Intern @ MSR, now Researcher @ Qualcomm)
- Hyunjik Kim (Intern @ MSR -> PhD @ Oxford)