Ilias Zadik

Massachusetts Institute of Technology, Mathematics Department

Contact: izadik at mit.edu
Links: CV (last updated 1/14/21), Google Scholar, arXiv.

About me

I am a postdoctoral researcher working with Prof. Elchanan Mossel and Prof. Nike Sun, as a member of the NSF/Simons Collaboration on the Theoretical Foundations of Deep Learning.
During Fall 2021, I am also a long-term participant in the Simons Institute program on "Computational Complexity of Statistical Inference".

Research Interests
My research lies broadly at the interface of high-dimensional statistics, the theory of machine learning, and applied probability. Much of my work aims to build and use mathematical tools that bring insight into the computational and statistical challenges of modern machine learning tasks. Three directions I have recently been focusing on are:

  • Geometric barriers for computationally efficient inference (The Overlap Gap Property) (e.g. see papers 4, 12, 16, 18 below).
  • (Sharp) information theoretic phase transitions (the All-or-Nothing phenomenon) (e.g. see papers 10, 11, 13, 17 below).
  • The cost of (differential) privacy in statistics (e.g. see papers 6, 14 below).

Short Bio
From September 2019 to August 2021 I was a CDS Moore-Sloan (postdoctoral) fellow at the Center for Data Science of New York University and a member of its Math and Data (MaD) group.
I received my PhD in September 2019 from the Operations Research Center of the Massachusetts Institute of Technology (MIT), where I was very fortunate to be advised by Prof. David Gamarnik. A copy of my PhD thesis can be found here.
From June 2017 to August 2017 I was an intern at the Microsoft Research Lab in New England, mentored by Jennifer Chayes and Christian Borgs. Prior to joining MIT, I completed a Master of Advanced Studies in Mathematics (Part III of the Mathematical Tripos) at the University of Cambridge and a BA in Mathematics from the Mathematics Department of the University of Athens.
Recent recorded talks

Research papers (published or under review)
    2021+
  21. Stationary Points of Shallow Neural Networks with Quadratic Activation Function (30mins video by Eren - MIT MLTea)
    Submitted
    with David Gamarnik, Eren C. Kızıldağ.
  20. Shapes and recession cones in mixed-integer convex representability
    Submitted
    with Miles Lubin, Juan Pablo Vielma
  19. The Landscape of the Planted Clique Problem: Dense Subgraphs and the Overlap Gap Property (1hr video - NYU Probability Seminar)
    Annals of Applied Probability (Major Revisions)
    with David Gamarnik
  18. On the Cryptographic Hardness of Learning Single Periodic Neurons
    Advances in Neural Information Processing Systems (NeurIPS), 2021
    with Joan Bruna, Min Jae Song.
  17. It was “all” for “nothing”: sharp phase transitions for noiseless discrete channels (18mins video - COLT)
    Proceedings of the Conference on Learning Theory (COLT), 2021
    with Jonathan Niles-Weed.
  16. Group testing and local search: is there a computational-statistical gap? (2hrs video by Fotis - IAS)
    Proceedings of the Conference on Learning Theory (COLT), 2021
    with Fotis Iliopoulos.
  15. Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks
    IEEE Transactions on Signal Processing (Major Revisions)
    Conference version in Proceedings of the International Symposium on Information Theory (ISIT), 2021
    with David Gamarnik, Eren C. Kızıldağ.
    2020
  14. Optimal Private Median Estimation under Minimal Distributional Assumptions (10mins video by Manolis - NeurIPS spotlight)
    Advances in Neural Information Processing Systems (NeurIPS), 2020
    Selected for a Spotlight Presentation (~5% of submitted papers).
    with Manolis Vlatakis, Christos Tzamos
  13. The All-or-Nothing Phenomenon in Sparse Tensor PCA (Poster, 25mins video - BIRS workshop)
    Advances in Neural Information Processing Systems (NeurIPS), 2020
    with Jonathan Niles-Weed
  12. Free Energy Wells and the Overlap Gap Property in Sparse PCA (25mins video - Simons workshop)
    Communications on Pure and Applied Mathematics (Major Revisions)
    Conference version in Proceedings of the Conference on Learning Theory (COLT), 2020
    with Gérard Ben Arous, Alex Wein
    2019
  11. The All-or-Nothing Phenomenon in Sparse Linear Regression (Slides, Poster)
    Mathematics of Statistics and Learning, 2021+ (To appear)
    Conference version in the Proceedings of the Conference on Learning Theory (COLT), 2019
    with Galen Reeves, Jiaming Xu
  10. All-or-Nothing Phenomena: From Single-Letter to High Dimensions
    Proceedings of the International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2019
    with Galen Reeves, Jiaming Xu
  9. Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities
    Proceedings of the International Symposium on Information Theory (ISIT), 2019
    with Christos Thrampoulidis, Yury Polyanskiy
  8. A simple bound on the BER of the MAP decoder for massive MIMO systems
    Proceedings of the International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
    with Christos Thrampoulidis, Yury Polyanskiy
    2018
  7. Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
    IEEE Transactions on Information Theory, 2021+ (to appear)
    Conference version with David Gamarnik in Advances in Neural Information Processing Systems (NeurIPS), 2018
    with David Gamarnik, Eren C. Kızıldağ.
  6. Revealing Network Structure, Confidentially: Improved Rates for Node-Private Graphon Estimation (Slides, 1hr video by Adam - Simons)
    Proceedings of the Symposium on Foundations of Computer Science (FOCS), 2018
    with Christian Borgs, Jennifer Chayes, Adam Smith
  5. Orthogonal Machine Learning: Power and Limitations (Slides, Poster, Code)
    Proceedings of the International Conference on Machine Learning (ICML), 2018 (20-minute presentation)
    with Lester Mackey, Vasilis Syrgkanis
    2017
  4. Sparse High-Dimensional Linear Regression. Estimating Squared Error and a Phase Transition.
    Annals of Statistics, 2021+ (To appear)
    with David Gamarnik.
    This paper merges:
    (a) High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition (Slides, Poster, 20mins video)
    Proceedings of the Conference on Learning Theory (COLT), 2017 (20-minute presentation)
    (b) Sparse High-Dimensional Linear Regression. Algorithmic Barriers and a Local Search Algorithm
    arXiv preprint, 2017
  3. Mixed integer convex representability (Slides)
    Mathematics of Operations Research, 2021
    Conference version in Proceedings of the International Conference on Integer Programming and Combinatorial Optimization (IPCO), 2017
    with Miles Lubin, Juan Pablo Vielma
    Pre-2017 (complex analysis)
  2. Universal Padé approximants and their behaviour on the boundary
    Monatshefte für Mathematik, Vol. 182, pp. 173–193, 2017
  1. Padé approximants, density of rational functions in A^∞(V) and smoothness of the integration operator
    Journal of Mathematical Analysis and Applications, Vol. 423, pp. 1514–1539, 2015
    with Vassili Nestoridis
Thesis/Notes/Survey Articles
Teaching
  • Fall 2020, DS-GA 1005: Inference and Representation. (Co-instructor with Joan Bruna)
    Advanced graduate-level class on modern theoretical aspects of statistics and machine learning.
    More information can be found on the course's website.
  • Fall 2019, DS-GA 1002: Probability and Statistics for Data Science. (Co-instructor with Carlos Fernandez-Granda)
    Introductory graduate-level class on probability and statistics.
Service
  • From Spring 2020 to Spring 2021 I had the pleasure of being among the organizers of the MaD+ seminar.
  • I have served as a reviewer for the Annals of Statistics, Operations Research, SIAM Journal on Discrete Mathematics, SIAM Journal on Optimization, Combinatorica, IEEE Journal on Selected Areas in Information Theory, and for the conferences COLT, NeurIPS, ITCS, ISIT, ICALP, and SODA.
  • I served on the Program Committee for COLT 2021.
Awards
  • NeurIPS Top 400 Reviewers Award, 2019.
  • Honorable Mention for MIT Operations Research Center Best Student Paper Award, 2017
    Paper: High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition.
  • Senior Scholarship from Trinity College, University of Cambridge, 2014.
  • The Onassis Foundation Scholarship for Master's Studies, 2013-2014.
  • The Cambridge Home and European Scholarship Scheme (CHESS) award, 2013-2014.
  • International Mathematics Competition for University Students (IMC): First Prize, 2011, Second Prize, 2010.
  • South Eastern European Mathematics Olympiad for University Students (SEEMOUS): Gold Medal (first place), 2011, Silver Medal, 2010.