I am currently a Visiting Member at the Courant Institute of Mathematical Sciences at New York University and a Postdoctoral Associate at New York University Abu Dhabi, working with Prof. David W. McLaughlin in mathematical and computational neuroscience. I obtained my B.S. in Physics (Zhiyuan College) and my Ph.D. in Mathematics from Shanghai Jiao Tong University in China under the supervision of Profs. David Cai and Douglas Zhou.
My research interests lie in computational neuroscience, ranging from theoretical study and simulation to data analysis, and I collaborate actively with both theoretical and experimental neuroscientists. Directly from experimental data, we have identified a common dynamical state (the Probability Polling state, or p-polling state) underlying neuronal coding, which reveals a mechanism behind the success of the low-order Maximum Entropy Principle (MEP) in modeling neuronal networks. Our study of the effective interactions in the MEP model also shows how a sparse coupling structure can lead to a sparse coding scheme.
Another of my interests is the theoretical study of deep learning. Empirically, we found a Frequency Principle (F-Principle): deep neural networks (DNNs) often capture target functions from low frequency to high frequency during training. We then developed a theoretical framework based on Fourier analysis to understand the F-Principle, which also explains when and how the F-Principle holds or fails. Our theory provides an understanding of why DNNs can have a large capacity to memorize randomly labeled datasets, yet still generalize well on real data.
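The low-to-high-frequency behavior can be reproduced in a toy setting. Below is a minimal numpy sketch, not taken from our papers: the network size, learning rate, target function, and step count are all illustrative choices. It trains a one-hidden-layer tanh network by full-batch gradient descent on a target with one low (k=1) and one high (k=5) frequency component, then compares the relative error of the two Fourier coefficients of the prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target with a low (k=1) and a high (k=5) frequency component.
x = np.linspace(-np.pi, np.pi, 256, endpoint=False)
y = np.sin(x) + np.sin(5 * x)

# One-hidden-layer tanh network (illustrative hyperparameters).
n_hidden = 200
W1 = rng.normal(0.0, 1.0, n_hidden)
b1 = rng.normal(0.0, 1.0, n_hidden)
W2 = rng.normal(0.0, 0.1, n_hidden)
b2 = 0.0
lr = 0.01

def forward(x):
    h = np.tanh(x[:, None] * W1[None, :] + b1)  # (N, n_hidden)
    return h @ W2 + b2, h

def fourier_err(pred, k):
    # Relative error of the k-th Fourier coefficient of the prediction.
    return abs(np.fft.rfft(pred - y)[k]) / abs(np.fft.rfft(y)[k])

# Full-batch gradient descent on the mean squared error.
for step in range(3000):
    pred, h = forward(x)
    r = pred - y                                   # residual, (N,)
    gW2 = h.T @ r / len(x)
    gb2 = r.mean()
    dh = (1.0 - h**2) * (r[:, None] * W2[None, :])  # grad w.r.t. pre-activation
    gW1 = (dh * x[:, None]).mean(axis=0)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

pred, _ = forward(x)
print(fourier_err(pred, 1), fourier_err(pred, 5))
```

In this setup the low-frequency coefficient is typically fit well before the high-frequency one, so the first printed error is much smaller than the second, consistent with the F-Principle.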