What's new

December 4–9, I’ll be at NIPS 2017 to give a “spotlight” presentation of my paper with Le Song, Santosh Vempala, and Bo Xie, “On the Complexity of Learning Neural Networks.” The paper was also featured on a recent episode of the Data Skeptic podcast.



I do research in machine learning, the theory of computing, and discrete math.

I am currently focused on provable guarantees for neural networks, defenses against adversarial noise in classification tasks, and algorithmic problems in matching theory. My previous work includes faster algorithms for bottleneck cases of the graph isomorphism problem. My most-cited papers concern the abelian sandpile model.


I am currently a postdoc at the Algorithms and Randomness Center at Georgia Tech. Previously, I completed my Ph.D. at the University of Chicago under the supervision of László Babai.


My time spent on Argentine tango and Euro-style board games has in recent years yielded to the more toddler-friendly pursuits of hiking and biking.