About me
I am a CDS Moore-Sloan (postdoctoral) fellow at the Center for Data Science of NYU and a member of its Math and Data (MaD) group.
Starting in September 2021, I will be a postdoctoral researcher in the Mathematics Department at MIT, working with Prof. Elchanan Mossel and Prof. Nike Sun as part of the NSF/Simons Collaboration on the Theoretical Foundations of Deep Learning.
Research Interests
My research lies broadly at the interface of high-dimensional statistics, the theory of machine learning, and applied probability. Much of my work aims to build and use mathematical tools that bring insight into the computational and statistical challenges of modern machine learning tasks.
Three directions that I have been recently focusing on are:
 Geometric barriers for computationally efficient inference (the Overlap Gap Property) (e.g. see papers 4, 12, 16, 18 below).
 (Sharp) information-theoretic phase transitions (the All-or-Nothing phenomenon) (e.g. see papers 10, 11, 13, 17 below).
 The cost of (differential) privacy in statistics (e.g. see papers 6, 14 below).
Short Bio
I received my PhD in September 2019 from the Operations Research Center of the Massachusetts Institute of Technology (MIT), where I was very fortunate to be advised by Prof. David Gamarnik. A copy of my PhD thesis can be found here.
From June 2017 to August 2017 I was an intern at the Microsoft Research Lab in New England, mentored by Jennifer Chayes and Christian Borgs. Prior to joining MIT, I completed a Master of Advanced Studies in Mathematics (Part III of the Mathematical Tripos) at the University of Cambridge and a BA in Mathematics from the Mathematics Department at the University of Athens.
Recent recorded talks
Research papers (published or under review)
2021+

On the Cryptographic Hardness of Learning Single Periodic Neurons
Submitted
with Joan Bruna, Min Jae Song.

Stationary Points of Shallow Neural Networks with Quadratic Activation Function (30-min video by Eren, MIT ML Tea)
Submitted
with David Gamarnik, Eren C. Kızıldağ.

Shapes and recession cones in mixed-integer convex representability
Submitted
with Miles Lubin, Juan Pablo Vielma

The Landscape of the Planted Clique Problem: Dense Subgraphs and the Overlap Gap Property (1-hour video, NYU Probability Seminar)
Annals of Applied Probability (Major Revisions)
with David Gamarnik

Sparse High-Dimensional Linear Regression. Algorithmic Barriers and a Local Search Algorithm
Annals of Statistics (Minor Revisions)
with David Gamarnik

It was “all” for “nothing”: sharp phase transitions for noiseless discrete channels
To appear in Conference on Learning Theory (COLT), 2021
with Jonathan Niles-Weed.

Group testing and local search: is there a computational-statistical gap? (2-hour video by Fotis, IAS)
To appear in Conference on Learning Theory (COLT), 2021
with Fotis Iliopoulos.

Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks
To appear in International Symposium on Information Theory (ISIT), 2021
with David Gamarnik, Eren C. Kızıldağ.
2020

Optimal Private Median Estimation under Minimal Distributional Assumptions (10-min NeurIPS video by Manolis)
Advances in Neural Information Processing Systems, (NeurIPS), 2020
Selected for a Spotlight Presentation (~5% of submitted papers).
with Manolis Vlatakis, Christos Tzamos

The All-or-Nothing Phenomenon in Sparse Tensor PCA (Poster)
Advances in Neural Information Processing Systems, (NeurIPS), 2020
with Jonathan NilesWeed

Free Energy Wells and the Overlap Gap Property in Sparse PCA (25-min video, Simons workshop)
Proceedings of the Conference on Learning Theory (COLT), 2020
with Gérard Ben Arous, Alex Wein
2019

The All-or-Nothing Phenomenon in Sparse Linear Regression (Slides, Poster)
Mathematics of Statistics and Learning (Major Revisions)
Conference version in the Proceedings of the Conference on Learning Theory (COLT), 2019
with Galen Reeves, Jiaming Xu

All-or-Nothing Phenomena: From Single-Letter to High Dimensions
Proceedings of the International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2019
with Galen Reeves, Jiaming Xu

Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities
Proceedings of the International Symposium on Information Theory (ISIT), 2019
with Christos Thrampoulidis, Yury Polyanskiy

A simple bound on the BER of the MAP decoder for massive MIMO systems
Proceedings of the International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
with Christos Thrampoulidis, Yury Polyanskiy
2018

Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
IEEE Transactions on Information Theory (Major Revisions)
Conference version with David Gamarnik, in Advances in Neural Information Processing Systems, (NeurIPS), 2018
with David Gamarnik, Eren C. Kızıldağ.

Revealing Network Structure, Confidentially: Improved Rates for Node-Private Graphon Estimation (Slides, 1-hour Simons talk by Adam)
Symposium on Foundations of Computer Science (FOCS), 2018
with Christian Borgs, Jennifer Chayes, Adam Smith

Orthogonal Machine Learning: Power and Limitations (Slides, Poster, Code)
Proceedings of the International Conference on Machine Learning (ICML), 2018 (20-minute presentation)
with Lester Mackey, Vasilis Syrgkanis
2017

High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition (Slides, Poster, 20-min video)
Proceedings of the Conference on Learning Theory (COLT), 2017 (20-minute presentation)
with David Gamarnik

Mixed-integer convex representability (Slides)
Mathematics of Operations Research, 2020+ (To appear)
Conference version in the Proceedings of the International Conference on Integer Programming and Combinatorial Optimization (IPCO), 2017
with Miles Lubin, Juan Pablo Vielma
Pre-2017 (complex analysis):

Universal Padé approximants and their behaviour on the boundary
Monatshefte für Mathematik, Vol. 182, pp. 173–193, 2017

Padé approximants, density of rational functions in A^∞(V) and smoothness of the integration operator
Journal of Mathematical Analysis and Applications, Vol. 423, pp. 1514–1539, 2015
with Vassili Nestoridis
Thesis/Notes/Survey Articles
Teaching
 Fall 2020, DS-GA 1005: Inference and Representation. (Co-instructor with Joan Bruna)
Advanced graduatelevel class on modern theoretical aspects of statistics and machine learning.
More information can be found on the course's website.
 Fall 2019, DS-GA 1002: Probability and Statistics for Data Science. (Co-instructor with Carlos Fernandez-Granda)
Introductory graduatelevel class on probability and statistics.
Service
 I am among the organizers of the MaD+ seminar. Check the website for some great upcoming talks!
 I have served as a reviewer for the Annals of Statistics, Operations Research, SIAM Journal on Discrete Mathematics, SIAM Journal on Optimization, Combinatorica, IEEE Journal on Selected Areas in Information Theory, and for the conferences COLT, NeurIPS, ITCS, ISIT, ICALP, and SODA.
 I am serving on the Program Committee for COLT 2021.
Awards

Top 400 Reviewers Award, NeurIPS 2019.

Honorable Mention for the MIT Operations Research Center Best Student Paper Award, 2017.
Paper: HighDimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition.

Senior Scholarship from Trinity College, Cambridge University, 2014.

The Onassis Foundation Scholarship for Master's Studies, 2013–2014.

The Cambridge Home and European Scholarship Scheme (CHESS) award, 2013–2014.

International Mathematics Competition for University Students (IMC): First Prize, 2011, Second Prize, 2010.

South Eastern European Mathematics Olympiad for University Students (SEEMOUS): Gold Medal (first place), 2011, Silver Medal, 2010.