
About Me
I am a PhD candidate in physics at Harvard, where I am fortunate to be advised by Cengiz Pehlevan. I did my undergraduate work in physics at Yale, where I worked with Damon Clark on visual motion detection and locomotor coordination in fruit flies.
My research interests lie in some neighborhood of the intersection between theoretical neuroscience, statistical physics, and machine learning. For more information about my work, see some of my recent publications, as well as my teaching experience and expository notes.
I can also be found on Mastodon.
I am on the market for academic postdoc and research scientist positions, targeting a fall 2024 start.
News
- [September 2023] Three papers accepted to NeurIPS 2023! Check out our work with Paul Masset on models for fast inference in the olfactory bulb, our work with Hamza Chaudhry and Dima Krotov on associative memory for long sequences, and our work on learning curves for models with many layers of structured features.
- [September 2023] I had the pleasure of presenting our work on the geometry of distributed neural codes for fast inference and in deep networks at the Washington University in St. Louis Computational Neuroscience Next Generation (CNNG) Symposium.
- [August 2023] I spoke about our work on deep linear Bayesian neural networks at the Statistical Physics and Machine Learning Back Together Again workshop in Cargèse. Slides from my talk have been posted on the workshop website.
- [May 2023] I've uploaded the notes for the tutorial on the NNGP correspondence I gave on 11 November 2022 at the HHMI Janelia Junior Scientist Workshop on Theoretical Neuroscience.
- [April 2023] Our note on how the replica method can be used to study the eigenvalues of Wishart product matrices was just published in SciPost Physics Core.
- [December 2022] I'm presenting our work with Paul Masset on fast sampling in spiking networks at NeurIPS 2022. I'm also presenting our work on generalization in deep Bayesian linear neural networks at the Machine Learning and the Physical Sciences Workshop, and our work on the geometry of deep network representations at the NeurReps Workshop.
- [November 2022] Our paper on the asymptotics of representation learning in Bayesian neural networks was republished as an invited contribution to the Machine Learning 2022 special issue of JSTAT.