Ilias Zadik

New York University, Center for Data Science (CDS)
Math and Data (MaD) group

Contact: zadik at nyu.edu
Links to: CV (last update 1/14/21), Google Scholar, arXiv.

About me

I am a CDS Moore-Sloan (postdoctoral) fellow at the Center for Data Science of NYU and a member of its Math and Data (MaD) group.

Starting in September 2021, I will be a postdoctoral researcher at the MIT Mathematics Department, working with Prof. Elchanan Mossel and Prof. Nike Sun as part of the NSF/Simons Collaboration on the Theoretical Foundations of Deep Learning.

Research Interests
My research lies broadly at the interface of high-dimensional statistics, the theory of machine learning, and applied probability. Much of my work aims to build and use mathematical tools that shed light on the computational and statistical challenges of modern machine learning tasks. Three directions I have been focusing on recently are:

  • Geometric barriers for computationally efficient inference (The Overlap Gap Property) (e.g. see papers 4, 12, 16, 18 below).
  • (Sharp) information theoretic phase transitions (the All-or-Nothing phenomenon) (e.g. see papers 10, 11, 13, 17 below).
  • The cost of (differential) privacy in statistics (e.g. see papers 6, 14 below).

Short Bio
I received my PhD in September 2019 from the Operations Research Center of the Massachusetts Institute of Technology (MIT), where I was very fortunate to be advised by Prof. David Gamarnik. A copy of my PhD thesis can be found here.
From June 2017 to August 2017 I was an intern at the Microsoft Research New England lab, mentored by Jennifer Chayes and Christian Borgs. Prior to joining MIT, I completed a Master of Advanced Studies in Mathematics (Part III of the Mathematical Tripos) at the University of Cambridge and a BA in Mathematics from the Mathematics Department of the University of Athens.
Recent recorded talks

Research papers (published or under review)
    2021+
  22. On the Cryptographic Hardness of Learning Single Periodic Neurons
    Submitted
    with Joan Bruna, Min Jae Song.
  21. Stationary Points of Shallow Neural Networks with Quadratic Activation Function (30mins video by Eren - MIT MLTea)
    Submitted
    with David Gamarnik, Eren C. Kızıldağ.
  20. Shapes and recession cones in mixed-integer convex representability
    Submitted
    with Miles Lubin, Juan Pablo Vielma.
  19. The Landscape of the Planted Clique Problem: Dense Subgraphs and the Overlap Gap Property (1hr video - NYU Probability Seminar)
    Annals of Applied Probability (Major Revisions)
    with David Gamarnik.
  18. Sparse High-Dimensional Linear Regression. Algorithmic Barriers and a Local Search Algorithm
    Annals of Statistics (Minor Revisions)
    with David Gamarnik.
  17. It was “all” for “nothing”: sharp phase transitions for noiseless discrete channels
    To appear in the Conference on Learning Theory (COLT), 2021
    with Jonathan Niles-Weed.
  16. Group testing and local search: is there a computational-statistical gap? (2hrs video by Fotis - IAS)
    To appear in the Conference on Learning Theory (COLT), 2021
    with Fotis Iliopoulos.
  15. Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks
    To appear in the International Symposium on Information Theory (ISIT), 2021
    with David Gamarnik, Eren C. Kızıldağ.
    2020
  14. Optimal Private Median Estimation under Minimal Distributional Assumptions (10mins NeurIPS video by Manolis)
    Advances in Neural Information Processing Systems (NeurIPS), 2020
    Selected for a Spotlight Presentation (~5% of submitted papers).
    with Manolis Vlatakis, Christos Tzamos.
  13. The All-or-Nothing Phenomenon in Sparse Tensor PCA (Poster)
    Advances in Neural Information Processing Systems (NeurIPS), 2020
    with Jonathan Niles-Weed.
  12. Free Energy Wells and the Overlap Gap Property in Sparse PCA (25mins video - Simons workshop)
    Proceedings of the Conference on Learning Theory (COLT), 2020
    with Gérard Ben Arous, Alex Wein.
    2019
  11. The All-or-Nothing Phenomenon in Sparse Linear Regression (Slides, Poster)
    Mathematics of Statistics and Learning (Major Revisions)
    Conference version in the Proceedings of the Conference on Learning Theory (COLT), 2019
    with Galen Reeves, Jiaming Xu.
  10. All-or-Nothing Phenomena: From Single-Letter to High Dimensions
    Proceedings of the International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2019
    with Galen Reeves, Jiaming Xu.
  9. Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities
    Proceedings of the International Symposium on Information Theory (ISIT), 2019
    with Christos Thrampoulidis, Yury Polyanskiy.
  8. A simple bound on the BER of the MAP decoder for massive MIMO systems
    Proceedings of the International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
    with Christos Thrampoulidis, Yury Polyanskiy.
    2018
  7. Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
    IEEE Transactions on Information Theory (Major Revisions)
    Conference version (with David Gamarnik) in Advances in Neural Information Processing Systems (NeurIPS), 2018
    with David Gamarnik, Eren C. Kızıldağ.
  6. Revealing Network Structure, Confidentially: Improved Rates for Node-Private Graphon Estimation (Slides, 1h Simons talk by Adam)
    Symposium on Foundations of Computer Science (FOCS), 2018
    with Christian Borgs, Jennifer Chayes, Adam Smith.
  5. Orthogonal Machine Learning: Power and Limitations (Slides, Poster, Code)
    Proceedings of the International Conference on Machine Learning (ICML), 2018 (20-minute presentation)
    with Lester Mackey, Vasilis Syrgkanis.
    2017
  4. High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition (Slides, Poster, 20mins video)
    Proceedings of the Conference on Learning Theory (COLT), 2017 (20-minute presentation)
    with David Gamarnik.
  3. Mixed integer convex representability (Slides)
    Mathematics of Operations Research, 2020+ (To appear)
    Conference version in the Proceedings of the International Conference on Integer Programming and Combinatorial Optimization (IPCO), 2017
    with Miles Lubin, Juan Pablo Vielma.
    Pre-2017 (complex analysis):
  2. Universal Padé approximants and their behaviour on the boundary
    Monatshefte für Mathematik, Vol. 182, pp. 173–193, 2017
  1. Padé approximants, density of rational functions in A^∞(V) and smoothness of the integration operator
    Journal of Mathematical Analysis and Applications, Vol. 423, pp. 1514–1539, 2015
    with Vassili Nestoridis.
Thesis/Notes/Survey Articles
Teaching
  • Fall 2020, DS-GA 1005: Inference and Representation. (Co-instructor with Joan Bruna)
    Advanced graduate-level class on modern theoretical aspects of statistics and machine learning.
    More information can be found on the course's website.
  • Fall 2019, DS-GA 1002: Probability and Statistics for Data Science. (Co-instructor with Carlos Fernandez-Granda)
    Introductory graduate-level class on probability and statistics.
Service
  • I am among the organizers of the MaD+ seminar. Check the website for some great upcoming talks!
  • I have served as a reviewer for the Annals of Statistics, Operations Research, SIAM Journal on Discrete Mathematics, SIAM Journal on Optimization, Combinatorica, the IEEE Journal on Selected Areas in Information Theory, and for the conferences COLT, NeurIPS, ITCS, ISIT, ICALP and SODA.
  • I am serving on the Program Committee for COLT 2021.
Awards
  • Top 400 Reviewers Award, NeurIPS 2019.
  • Honorable Mention for the MIT Operations Research Center Best Student Paper Award, 2017
    Paper: High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition.
  • Senior Scholarship from Trinity College, University of Cambridge, 2014.
  • The Onassis Foundation Scholarship for Master's Studies, 2013-2014.
  • The Cambridge Home and European Scholarship Scheme (CHESS) award, 2013-2014.
  • International Mathematics Competition for University Students (IMC): First Prize, 2011, Second Prize, 2010.
  • South Eastern European Mathematics Olympiad for University Students (SEEMOUS): Gold Medal (first place), 2011, Silver Medal, 2010.