Theodor Misiakiewicz
PhD candidate - Statistics
Stanford University

I am a fourth-year PhD candidate in Statistics at Stanford University, working with Andrea Montanari. Before my PhD, I completed my undergraduate studies in France at the Ecole Normale Superieure de Paris, with a focus on pure mathematics and physics. I received a B.Sc. in Mathematics and a B.Sc. in Physics in 2014, and an M.Sc. in Theoretical Physics at ICFP in 2016.

My interests lie broadly at the intersection of information theory, statistical physics, statistics, and probability. Lately, I have been focusing on the theory of deep learning. Specifically, I am interested in the mean-field description of neural networks and, in certain regimes, the connection between neural networks trained by gradient descent and kernel methods.

Previous research projects led me to work on high-dimensional statistics (the inverse Ising problem), non-convex optimization (max-cut, community detection, synchronization), LDPC codes, the CMS experiment at the LHC (detection of the Higgs boson in the 2015 data), the D-Wave 2X quantum annealer, and massive gravity.

Here is a link to my Google Scholar profile.

News:
- 02/2021: we posted our preprint Learning with invariances in random features and kernel models on arXiv.
- 12/2020: our discussion of “Nonparametric regression using deep neural networks with ReLU activation function” is out!
- 12/2020: we presented When Do Neural Networks Outperform Kernel Methods? at NeurIPS 2020. Here is the poster summarizing the paper.
- 09/2019: our paper Limitations of Lazy Training of Two-layers Neural Networks was accepted at NeurIPS 2019. Here are the slides from the spotlight presentation, and here is the poster summarizing the paper.


Talks:
- ML lunch, Stanford, 04/28/2021.
- NSF-Simons Journal Club, 04/21/2021.
- G-Research, 02/10/2021.
- ML Foundations seminar, Microsoft Research, 11/05/2020.
- MoDL (NSF collaboration), 09/01/2020.
- Spotlight, NeurIPS 2019, 12/2019.


Papers

  • Minimum complexity interpolation in random features models
    Michael Celentano, Theodor Misiakiewicz, Andrea Montanari
    arXiv preprint (2021).

  • Learning with invariances in random features and kernel models
    Song Mei, Theodor Misiakiewicz, Andrea Montanari
    arXiv preprint (2021).

  • Generalization error of random features and kernel methods: hypercontractivity and kernel matrix concentration
    Song Mei, Theodor Misiakiewicz, Andrea Montanari
    arXiv preprint (2021).

  • Discussion of “Nonparametric regression using deep neural networks with ReLU activation function”
    Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari
    Annals of Statistics, Vol. 48, 1898-1901 (2020).

  • When Do Neural Networks Outperform Kernel Methods?
    Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari
    NeurIPS 2020.

  • Limitations of Lazy Training of Two-layers Neural Networks
    Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari
    NeurIPS 2019 (spotlight).

  • Linearized two-layers neural networks in high dimension
    Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Andrea Montanari
    Annals of Statistics, Vol. 49, 1029-1054 (2021).

  • Mean-field theory of two-layers neural networks: dimension-free bounds and kernel limit
    Song Mei, Theodor Misiakiewicz, Andrea Montanari
    Proceedings of COLT 2019.

  • Solving SDPs for synchronization and MaxCut problems via the Grothendieck inequality
    Song Mei, Theodor Misiakiewicz, Andrea Montanari, Roberto I. Oliveira
    Proceedings of COLT 2017.

  • Concentration to zero bit-error probability for regular LDPC codes on the binary symmetric channel: Proof by loop calculus
    Marc Vuffray, Theodor Misiakiewicz
    Proceedings of the 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton, 2015).

  • Efficient reconstruction of transmission probabilities in a spreading process from partial observations
    Andrey Lokhov, Theodor Misiakiewicz
    arXiv preprint (2015).

  • Estimation of the reducible background with the SS method in the Higgs boson decay channel into 4 leptons
    Theodor Misiakiewicz
    Master's thesis. Laboratoire Leprince-Ringuet (LLR), CMS experiment at the Large Hadron Collider (LHC), Geneva.

  • Application of Graphical Models to Decoding and Machine Learning
    Theodor Misiakiewicz
    Work done at the Center for Nonlinear Studies (CNLS), Los Alamos National Laboratory (LANL), New Mexico.

  • Gravitational waves in massive gravity theory
    Theodor Misiakiewicz
    Astroparticle and Cosmology Laboratory (APC), Cosmology and Gravitation theory group. Advisor: Prof. Daniele Steer.