Here is some old news:
- [March 2024] We presented three posters on our work at Cosyne 2024! I also had the pleasure of speaking at the “Comparative approaches to advance neural circuits and computation” workshop organized by Anqi Zhang and Siddharth Jayakumar.
- [December 2023] I’m presenting three papers at NeurIPS 2023! Check out our work with Paul Masset on models for fast inference in the olfactory bulb, our work with Hamza Chaudhry and Dima Krotov on associative memory for long sequences, and our work on learning curves for models with many layers of structured features. Also check out Hamza’s talk at the AMHN workshop.
- [September 2023] I had the pleasure of presenting our work on the geometry of distributed neural codes for fast inference and in deep networks at the Washington University in St. Louis Computational Neuroscience Next Generation (CNNG) Symposium.
- [August 2023] I spoke about our work on deep linear Bayesian neural networks at the Statistical Physics and Machine Learning Back Together Again workshop in Cargèse. Slides from my talk have been posted on the workshop website.
- [May 2023] I’ve uploaded the notes for the tutorial on the NNGP (neural network Gaussian process) correspondence I gave on 11 November 2022 at the HHMI Janelia Junior Scientist Workshop on Theoretical Neuroscience.
- [April 2023] Our note on how the replica method can be used to study the eigenvalues of Wishart product matrices was just published in SciPost Physics Core.
- [December 2022] I’m presenting our work with Paul Masset on fast sampling in spiking networks at NeurIPS 2022. I’m also presenting our work on generalization in deep Bayesian linear neural networks at the Machine Learning and the Physical Sciences Workshop, and our work on the geometry of deep network representations at the NeurReps Workshop.
- [November 2022] Our paper on the asymptotics of representation learning in Bayesian neural networks was re-published as an invited contribution to the Machine Learning 2022 special issue of JSTAT.
- [September 2022] I’ve written a set of lecture notes introducing the replica method through the example of Wishart matrix eigenvalues for Harvard Applied Math 226, which are available here.
- [September 2022] Our paper with Paul Masset on algorithms for sampling using spiking neural networks was accepted to NeurIPS 2022.
- [June 2022] Our paper on generalization in deep linear Bayesian neural networks and random feature models was just published in PRE.
- [May 2022] Our note on connections between neural network kernels and pattern storage capacity was just published in Neural Computation.
- [February 2022] Our review on locomotor kinematics in mice and flies (with Ana Gonçalves, Megan Carey, and Damon Clark) was just published in Current Opinion in Neurobiology.
- [January 2022] Our review on representational drift (with Paul Masset and Shanshan Qin) was just published in Biological Cybernetics.
- [September 2021] Our papers on representation learning and function-space prior distributions in Bayesian neural networks were accepted to NeurIPS 2021.