Dan Alistarh is currently an Assistant Professor at IST Austria, and the Machine Learning research lead at Neural Magic, Inc. Previously, he was affiliated with ETH Zurich, Microsoft Research, and MIT. He received his PhD from EPFL, under the guidance of Prof. Rachid Guerraoui. His research focuses on distributed algorithms and concurrent data structures, and spans from algorithms and lower bounds to practical implementations. Recently, his focus has been on efficient algorithms for machine learning, for which he has been awarded an ERC Starting Grant. He was a co-recipient of best paper awards at OPODIS 2019 and ACM PPoPP 2020, and a keynote speaker at DISC 2019.
To request a single lecture/event, click on the desired lecture and complete the Request Lecture Form.
Compressing Deep Neural Networks for Fun and Profit
Deep learning continues to make significant advances, solving tasks from image classification to translation or reinforcement learning. One aspect of the field receiving considerable attention is...
Data Structures of the Future: Concurrent, Optimistic, and Relaxed
This talk is about a relatively new family of "relaxed" ordered data structures, in the vein of stacks, queues, and priority queues, but which provide weaker semantics than their...
Large-Scale Distributed Optimization for Machine Learning
Machine learning has made considerable progress over the past decade, matching and even surpassing human performance on a varied set of narrow computational tasks. This progress has been...
To request a tour with this speaker, please complete this online form.
If you are not requesting a tour, click on the desired lecture and complete the Request this Lecture form.
All requests will be sent to ACM headquarters for review.