I do research in machine learning, the theory of computing, and discrete mathematics.
I apply mathematical tools to hard algorithmic problems. My current areas of focus are new training algorithms and provable guarantees for neural networks, and the structure and symmetries of highly regular combinatorial objects.
Previously, I was a postdoc at the Algorithms and Randomness Center at Georgia Tech, where I worked with Santosh Vempala and Eric Vigoda on provable guarantees for training neural networks and on algorithmic problems in matching theory.
I completed my Ph.D. at the University of Chicago under the supervision of László Babai. My work focused on faster algorithms for bottleneck cases of the graph isomorphism problem.
My most cited papers concern the abelian sandpile model.