Grants (by year)

External grants:

  1. PI: Shanghai Eastern Scholar Program (Youth Track), Jan 2020 - Dec 2022

  2. PI: Shanghai Overseas Talent Program, 2020

  3. PI: NSFC Grant No. 12001372, Jan 2021 - Dec 2023

  4. PI: National Young Talent Program, Jan 2022 - Dec 2024

  5. National Key R&D Program of China, Grant No. 2021YFA1002800, Dec 2021 - Nov 2026 (jointly held with Lei Li, SJTU, PI, and Zhenfu Wang, PKU)

Institutional grants:

  1. PI: NYU Shanghai Boost Funds, Jan 2021 - Dec 2026

  2. PI: NYU Shanghai Boost Funds, Jan 2022 - Aug 2027

Publications (by year)

Submitted

  1. Neural collapse for unconstrained feature model under cross-entropy loss with imbalanced data.
    W. Hong, S. Ling, submitted, 2023. (arXiv version)

  2. Improved theoretical guarantee for rank aggregation via spectral method.
    Z. S. Zhong, S. Ling, submitted, 2023. (arXiv version)

  3. A metric and its derived protein network for evaluation of ortholog database inconsistency.
    W. Yang, J. Ji, S. Ling, G. Fang, submitted, 2022. (bioRxiv version)

  4. Generalized power method for generalized orthogonal Procrustes problem: global convergence and optimization landscape analysis.
    S. Ling, submitted, 2021. (arXiv version)

  5. On the critical coupling of the finite Kuramoto model on dense networks.
    S. Ling, submitted, 2020. (arXiv version)

Journal Publications

  1. Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method.
    S. Ling, Applied and Computational Harmonic Analysis, 66, 62-100, 2023. (arXiv version)(Final)(Talk Recording on YouTube)(Code demo)

  2. Solving orthogonal group synchronization via convex and low-rank optimization: tightness and landscape analysis.
    S. Ling, Mathematical Programming, Series A, to appear, 2022. (arXiv version)(Final)

  3. Near-optimal performance bounds for orthogonal and permutation group synchronization via spectral methods.
    S. Ling, Applied and Computational Harmonic Analysis 60, 20-52, 2022. (arXiv version)(Final)

  4. Improved performance guarantees for orthogonal group synchronization via generalized power method.
    S. Ling, SIAM Journal on Optimization, 32(2):1018-1048, 2022. (arXiv version)(Final)

  5. Strong consistency, graph Laplacians, and the stochastic block model.
    S. Deng, S. Ling, T. Strohmer, Journal of Machine Learning Research, 22(117):1-44, 2021. (arXiv version)(Final)

  6. Certifying global optimality of graph cuts via semidefinite relaxation: a performance guarantee for spectral clustering.
    S. Ling, T. Strohmer, Foundations of Computational Mathematics, 20(3):368-421, 2020. (arXiv version)(Final)(Slides)

  7. When do birds of a feather flock together? k-means, proximity, and conic programming.
    X. Li, Y. Li, S. Ling, T. Strohmer, K. Wei, Mathematical Programming, Series A, 179(1):295-341, 2020. (arXiv version)(Final)(Slides)

  8. On the landscape of synchronization networks: a perspective from nonconvex optimization.
    S. Ling, R. Xu, A. S. Bandeira, SIAM Journal on Optimization, 29(3):1879-1907, 2019. (arXiv version)(Final)(Talk Recording at the CMO)

  9. Learning from their mistakes: self-calibrating sensors.
    B. Friedlander, S. Ling, T. Strohmer, SIAM News, 52(2), 2019. (Final)

  10. Rapid, robust, and reliable blind deconvolution via nonconvex optimization.
    X. Li, S. Ling, T. Strohmer, K. Wei, Applied and Computational Harmonic Analysis, 47(3):893-934, 2019. (arXiv version)(Final)(Slides)(Talk Recording at the CMO)

  11. Regularized gradient descent: a nonconvex recipe for fast joint blind deconvolution and demixing.
    S. Ling, T. Strohmer, Information and Inference: A Journal of the IMA, 8(1):1-49, 2019. (arXiv version)(Final)(Slides)

  12. Self-calibration and bilinear inverse problems via linear least squares.
    S. Ling, T. Strohmer, SIAM Journal on Imaging Sciences, 11(1):252-292, 2018. (arXiv version)(Final)

  13. Blind deconvolution meets blind demixing: algorithms and performance bounds.
    S. Ling, T. Strohmer, IEEE Transactions on Information Theory, 63(7):4497-4520, 2017. (arXiv version)(Final)(Slides)

  14. Self-calibration and biconvex compressive sensing.
    S. Ling, T. Strohmer, Inverse Problems, 31(11):115002, 2015. (arXiv version)(Final)(Slides)
    (SIAM Student Paper Award 2017)

  15. Backward error and perturbation bounds for high order Sylvester tensor equation.
    X. Shi, Y. Wei, S. Ling, Linear and Multilinear Algebra, 61(10):1436-1446, 2013. (Final)

Conference Proceedings

  1. Fast blind deconvolution and blind demixing via nonconvex optimization.
    S. Ling, T. Strohmer, International Conference on Sampling Theory and Applications (SampTA), pp. 114-118, 2017. (Final)

  2. You can have it all – Fast algorithms for blind deconvolution, self-calibration, and demixing.
    S. Ling, T. Strohmer, Mathematics in Imaging, MW1C.1, 2017. (Final)

  3. Simultaneous blind deconvolution and blind demixing via convex programming.
    S. Ling, T. Strohmer, 50th Asilomar Conference on Signals, Systems and Computers, pp. 1223-1227, 2016. (Final)

Dissertation

  • Bilinear Inverse Problems: Theory, Algorithms, and Applications.
    S. Ling, University of California, Davis, 2017. (Manuscript)(Slides)