About Me

I am a PhD candidate in physics at Harvard, where I am fortunate to be advised by Cengiz Pehlevan. I did my undergraduate work in physics at Yale, where I worked with Damon Clark on visual motion detection and locomotor coordination in fruit flies.

My research interests lie in some neighborhood of the intersection of theoretical neuroscience, statistical physics, and machine learning. For more about my work, see my recent publications and my teaching experience.

I can also be found on Mastodon.

News
  • [May 2023] I’ve uploaded the notes for the tutorial on the NNGP correspondence that I gave on 11 November 2022 at the HHMI Janelia Junior Scientist Workshop on Theoretical Neuroscience.

  • [April 2023] Our note on how the replica method can be used to study the eigenvalues of Wishart product matrices was just published in SciPost Physics Core.

  • [December 2022] I’m presenting our work with Paul Masset on fast sampling in spiking networks at NeurIPS 2022. I’m also presenting our work on generalization in deep Bayesian linear neural networks at the Machine Learning and the Physical Sciences Workshop, and our work on the geometry of deep network representations at the NeurReps Workshop.

  • [November 2022] Our paper on the asymptotics of representation learning in Bayesian neural networks was re-published as an invited contribution to the Machine Learning 2022 special issue of JSTAT.

  • [September 2022] I’ve written a set of lecture notes introducing the replica method through the example of Wishart matrix eigenvalues for Harvard Applied Math 226, which are available here.

  • [September 2022] Our paper with Paul Masset on algorithms for sampling using spiking neural networks was accepted to NeurIPS 2022.

  • [June 2022] Our paper on generalization in deep linear Bayesian neural networks and random feature models was just published in PRE.