Theodor Misiakiewicz

PhD,
Statistics Department,
Stanford University
390 Serra Mall, Sequoia Hall,
Stanford, CA 94305-9510
Phone: +* 390 7662
Email: misiakie [@] stanford [DOT] edu

About me

From September 2023 to June 2024, you can find me either at the Toyota Technological Institute at Chicago or at the University of California, Berkeley.

I will start as an Assistant Professor in the Department of Statistics and Data Science at Yale University in July 2024! I will be looking for PhD students and postdocs to join my group. Please email me if you are interested in working with me!

My interests lie broadly at the intersection of statistics, machine learning, probability, and computer science. Lately, I have been focusing on the statistical and computational aspects of deep learning, and on the performance of kernel and random feature methods in high dimensions. Previous research projects led me to work on the inverse Ising problem, non-convex optimization (max-cut, community detection, synchronization), LDPC codes, the CMS experiment at the LHC (detection of the Higgs boson in the 2015 data), the D-Wave 2X (a quantum annealer), and massive gravity.

I received my PhD in Statistics from Stanford University, where I was fortunate to be advised by Prof. Andrea Montanari. My thesis focused on various statistical and computational aspects of learning with neural networks in high dimensions, and can be found here. Before my PhD, I completed my undergraduate studies in France at the Ecole Normale Superieure de Paris, with a focus on pure mathematics and physics. I received a B.Sc. in Mathematics and a B.Sc. in Physics in 2014, and an M.Sc. in Theoretical Physics at ICFP in 2016.

Here is a link to my Google Scholar profile.

Research

My research interests include:

  • Theory of Deep Learning: mean-field description and neural tangent kernel

  • Kernel and random feature methods in high dimensions (benign overfitting, multiple descent, structured kernels)

  • Non-convex optimization, implicit regularization, landscape analysis

  • Computational limits of learning with neural networks

  • Random matrix theory, high-dimensional probability

Full list of publications.
CV (last updated 2022).