New York University

251 Mercer Street

New York, NY 10012-1185

I am currently a Visiting Member at the Courant Institute of Mathematical Sciences, New York University, and a Postdoctoral Associate at New York University Abu Dhabi, working with Prof. David W. McLaughlin on mathematical and computational neuroscience. I obtained my B.S. in Physics (Zhiyuan College) and my Ph.D. in Mathematics from Shanghai Jiao Tong University in China, under the supervision of Profs. David Cai and Douglas Zhou.

My research interests lie in computational neuroscience, ranging from theoretical study and simulation to data analysis, and I collaborate actively with both theoretical and experimental neuroscientists. Directly from experimental data, we identified a common dynamical state (the Probability Polling, or p-polling, state) underlying neuronal coding, which reveals a mechanism behind the success of the low-order Maximum Entropy Principle (MEP) in neuronal networks. Our study of the effective interactions in the MEP model also shows how a sparse coupling structure can lead to a sparse coding scheme.
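To make the MEP idea concrete, here is a minimal sketch (not the analysis pipeline of the papers): the first-order MEP model constrains only each neuron's mean firing rate, and the resulting maximum-entropy distribution factorizes into independent Bernoulli marginals; higher-order (e.g., pairwise) models add correlation constraints on top of this. The spike data and all names below are hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary spike data: 10000 time bins for 3 neurons,
# a synthetic stand-in for experimental recordings.
spikes = (rng.random((10000, 3)) < np.array([0.2, 0.4, 0.1])).astype(int)

# First-order (independent) MEP model: constrained only by each
# neuron's mean firing rate, the maximum-entropy distribution is a
# product of Bernoulli marginals.
rates = spikes.mean(axis=0)

def p_independent(pattern, rates=rates):
    """Probability of a binary spike pattern under the first-order MEP model."""
    pattern = np.asarray(pattern)
    return float(np.prod(rates**pattern * (1 - rates)**(1 - pattern)))

# Compare model probabilities with empirical pattern frequencies.
for pattern in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    empirical = np.mean(np.all(spikes == pattern, axis=1))
    print(pattern, round(p_independent(pattern), 4), round(empirical, 4))
```

For truly independent synthetic data the first-order model already matches the empirical frequencies; correlated recordings are where second-order (pairwise) MEP models become necessary.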

Another interest of mine is the theoretical study of deep learning. Empirically, we found a Frequency Principle (F-Principle): deep neural networks (DNNs) often fit target functions from low to high frequencies during training. We then developed a theoretical framework based on Fourier analysis to understand the F-Principle. Our theory explains why DNNs can have a large enough capacity to memorize randomly labeled datasets yet still generalize well on real datasets. This series of works on the F-Principle opens a new direction for quantitatively understanding deep learning. A summary of the F-Principle can be found here (PDF).
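The phenomenon can be reproduced in a toy setting (a minimal sketch, not the experiments of the papers): train a small network on a 1D target with one low- and one high-frequency component, and track the relative error of each Fourier mode of the output over training. The network size, learning rate, and frequencies below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D target with a low (k=1) and a high (k=10) frequency.
n = 128
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
y = np.sin(x) + 0.5 * np.sin(10 * x)

# Tiny two-layer tanh network trained by full-batch gradient descent.
width = 40
W1 = rng.normal(0.0, 1.0, (1, width)); b1 = np.zeros(width)
W2 = rng.normal(0.0, 0.1, (width, 1)); b2 = np.zeros(1)

def forward(xs):
    h = np.tanh(xs[:, None] @ W1 + b1)
    return (h @ W2 + b2).ravel(), h

losses, snapshots = [], {}
lr = 0.01
for step in range(10001):
    out, h = forward(x)
    err = out - y
    losses.append(0.5 * np.mean(err ** 2))
    if step in (200, 10000):                # early vs. late snapshot
        snapshots[step] = out.copy()
    g = err[:, None] / n                    # d(loss)/d(out)
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = (g @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
    gW1 = x[None, :] @ gh; gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Relative error at frequency k: |FFT(out)_k - FFT(y)_k| / |FFT(y)_k|.
# Plotting this for k = 1 and k = 10 over training typically shows the
# low-frequency mode converging first (the F-Principle).
def rel_freq_err(f, k):
    F, Y = np.fft.rfft(f), np.fft.rfft(y)
    return float(np.abs(F[k] - Y[k]) / np.abs(Y[k]))

for step, out_s in sorted(snapshots.items()):
    print(step, rel_freq_err(out_s, 1), rel_freq_err(out_s, 10))
```

Comparing the early and late snapshots in frequency space is the same measurement used throughout the F-Principle papers, here applied to a toy target.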

- 2016 Ph.D. in Mathematics, School of Mathematical Sciences, Shanghai Jiao Tong University, China.
- 2012 B.S. in Physics (major) and Mathematics (minor), Zhiyuan College, Shanghai Jiao Tong University, China.

- New York University Abu Dhabi, United Arab Emirates:
- Aug. 01-Sep. 01, 2013; Jan. 19-Feb. 19, 2014; Jan. 27-Feb. 27, 2015
- Courant Institute, New York University, U.S.:
- Sep. 1-Dec. 20, 2015; Sep. 15-Dec. 20, 2016; Sep. 10, 2017-Sep. 30, 2019.

- F-Principle: Fourier analysis in Deep Learning
- A summary of the F-Principle in deep learning [PDF] [Note: briefly describes each paper.]
- An implicit training bias of DNNs, the F-Principle: DNNs often fit target functions from low to high frequencies. (Each frame is one training step. Red: FFT of the target function; blue: FFT of the DNN output. Abscissa: frequency; ordinate: amplitude.)
- [7] (Alphabetical order) Tao Luo, Zheng Ma, Zhi-Qin John Xu, Yaoyu Zhang, *"Theory of the frequency principle for general deep neural networks"*, arXiv preprint 1906.09235 (2019).
- [6] Yaoyu Zhang, Zhi-Qin John Xu*, Tao Luo, Zheng Ma, *"Explicitizing an Implicit Bias of the Frequency Principle in Two-layer Neural Networks"*, arXiv preprint 1905.10264 (2019). Submitted to NeurIPS 2019.
- [5] Yaoyu Zhang, Zhi-Qin John Xu*, Tao Luo, Zheng Ma, *"A type of generalization error induced by initialization in deep neural networks"*, arXiv preprint 1905.07777 (2019). Submitted to NeurIPS 2019.
- [4] Zhi-Qin John Xu*, Yaoyu Zhang, Tao Luo, Yanyang Xiao, Zheng Ma, *"Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks"*, arXiv preprint 1901.06523 (2019). Submitted to NeurIPS 2019. (Note: most of [2] and [3] are combined into this paper.)
- [3] Zhi-Qin John Xu*, *"Frequency Principle in Deep Learning with General Loss Functions and Its Potential Application"*, arXiv preprint 1811.10146 (2018).
- [2] Zhi-Qin John Xu*, *"Understanding training and generalization in deep learning by Fourier analysis"*, arXiv preprint 1808.04295 (2018). [arXiv][CODE][Note: theoretical framework]
- [1] Zhi-Qin John Xu*, Yaoyu Zhang, and Yanyang Xiao, *"Training behavior of deep neural network in frequency domain"*, arXiv preprint 1807.01251 (2018). [arXiv][CODE][Note: F-Principle] Submitted to IEEE Transactions on Neural Networks and Learning Systems.
- Popular-science article (in Chinese): Linear Frequency Principle dynamics: an effective model for quantitatively understanding deep learning
- Popular-science article (in Chinese): F-Principle: a first look at applications of deep learning in computational mathematics
- Popular-science article (in Chinese): F-Principle: a first look at understanding what deep learning cannot do
- Popular-science article (in Chinese): Interpreting the generalization ability of deep learning from the perspective of Fourier analysis
- Computational Neuroscience
- [8] Zhi-Qin John Xu, Xiaowei Gu, Chengyu Li, David Cai, Douglas Zhou, David W. McLaughlin, *"A Common Cortical State Underlying Neuronal Population Coding"*, (2018), under proofreading.
- [7] Zhi-Qin John Xu, Douglas Zhou, and David Cai, *"Swift Two-sample Test on High-dimensional Neural Spiking Data"*, arXiv preprint (2018).
- [6] Zhi-Qin John Xu, Fang Xu, Guoqiang Bi, Douglas Zhou, David Cai, *"A Cautionary Tale of Entropic Criteria in Assessing the Validity of Maximum Entropy Principle"*, (2018). Accepted by Europhysics Letters.
- [5] Zhi-Qin John Xu, Jennifer Crodelle, Douglas Zhou, and David Cai, *"Maximum Entropy Principle Analysis in Network Systems with Short-time Recordings"*, Physical Review E, DOI: 10.1103/PhysRevE.99.022409 (2019). [.pdf][PRE]
- [4] Zhi-Qin John Xu, Douglas Zhou, and David Cai, *"Dynamical and Coupling Structure of Pulse-Coupled Networks in Maximum Entropy Analysis"*, Entropy, 21(1) (2019). [.pdf][Entropy]
- [3] Zhi-Qin John Xu, Guoqiang Bi, Douglas Zhou, and David Cai, *"A dynamical state underlying the second order maximum entropy principle in neuronal networks"*, Communications in Mathematical Sciences, 15 (2017), pp. 665–692. [.pdf]
- [2] Douglas Zhou, Yanyang Xiao, Yaoyu Zhang, Zhiqin Xu, and David Cai, *"Granger causality network reconstruction of conductance-based integrate-and-fire neuronal systems"*, PLoS ONE, 9 (2014). [.pdf]
- [1] Douglas Zhou, Yanyang Xiao, Yaoyu Zhang, Zhiqin Xu, and David Cai, *"Causal and structural connectivity of pulse-coupled nonlinear networks"*, Physical Review Letters, 111 (2013). [.pdf]
- (∗ indicates the corresponding author)

- Fall 2014 and Spring 2015 Recitation Instructor: Mathematical Analysis.
- Spring 2014 Teaching Assistant: Probability.
- Fall 2013 Teaching Assistant: Analysis.
- Spring 2013 Teaching Assistant: Asymptotic Analysis.
- Fall 2012 Teaching Assistant: Mathematical Physics.

- Mar 2, 2019: *Workshop on Learning and Modeling, Princeton University*
- Jan 8, 2019: *Seminar at Southern University of Science and Technology*
- Nov 19, 2018: *Seminar at Center for Data Science, New York University*
- Nov 13, 2018: *Seminar at Rensselaer Polytechnic Institute*
- Nov 3-7, 2018: *Poster, Society for Neuroscience, San Diego, CA*
- Oct 30, 2018: *Seminar at Purdue University*
- Oct 26, 2018: *Seminar at Illinois Institute of Technology*
- Sep 20, 2018: *Clements Scientific Computing Seminar Series, Southern Methodist University*
- Aug 7, 2018: *In memory of David Cai, SIAM Conference on Life Sciences, Minneapolis, Minnesota*
- Aug 3, 2018: *Seminar, Zhejiang University*
- Jun 12, 2018: *INS Colloquia, Shanghai Jiao Tong University*
- May 9, 2018: *Data Science Seminar at NYU Shanghai*
- May 7, 2018: *Math-Neuroscience Seminar at NYU Shanghai*
- Apr 17, 2018: *Seminar at Rensselaer Polytechnic Institute*
- Nov 28, 2017: *Seminar at Rensselaer Polytechnic Institute*
- Nov 11-15, 2017: *Society for Neuroscience, Washington, DC*
- Jun 27, 2017: *Seminar at Wuhan University*
- Jun 26, 2017: *Seminar at Wuhan University*
- Jun 16, 2017: *Workshop on Data Analysis and Nonlinear Dynamic System, Shanghai Lixin University of Accounting and Finance*
- Jun 7, 2017: *Workshop on Data Assimilation and Information Theory, Fudan University*
- May 25, 2017: *SIAM Conference on Applications of Dynamical Systems, Snowbird, Utah*
- Feb 10, 2017: *Colloquium at the University of North Carolina at Chapel Hill*
- Nov 15, 2016: *Seminar at Rensselaer Polytechnic Institute*
- Nov 8, 2016: *Colloquium at Courant Institute, New York University*
- Aug 8-11, 2016: *SIAM Conference on Nonlinear Waves and Coherent Structures, Philadelphia, Pennsylvania*
- Nov 5, 2015: *Seminar at Rensselaer Polytechnic Institute*
- 2015-2018: *Organizer, The INS-ZY Student Workshop on Natural Sciences, Shanghai Jiao Tong University*