My primary area of research is Dynamical Systems, a branch of modern mathematics concerned with the time evolution of natural and iterative processes. Earlier in my career, my work revolved around the mathematical theory of chaos, focusing on topics such as Lyapunov exponents, entropy, fractal dimension, rates of mixing and strange attractors. A recurring theme was that deterministic, chaotic systems produce statistics much like those produced by random processes. I have also worked with concrete models, such as particle systems (billiards) and kicked oscillators.

In the last 15 years, my interests shifted to dynamical systems that are large, random, and out of equilibrium. Two examples of large systems are networks of smaller systems and semiflows on function spaces defined by PDEs. I work with systems with both deterministic and stochastic components, as well as problems from Nonequilibrium Statistical Mechanics.

I have also developed a serious interest in a particular complex dynamical system, the brain. A significant part of my current research is in Computational Neuroscience.

Below I discuss in more detail my research highlights, in roughly chronological order.

Theory of Chaotic Dynamical Systems

Entropy, Lyapunov exponents, and fractal dimension

Entropy and Lyapunov exponents are two different ways to capture dynamical complexity: entropy measures randomness in the sense of information theory, while positive Lyapunov exponents measure the rates at which nearby orbits diverge. It was known (shortly before I entered the field) that entropy is bounded above by the sum of positive Lyapunov exponents, and that these two quantities are equal for conservative systems. Ledrappier and I completed and clarified the picture: (A) We gave a necessary and sufficient condition for these two quantities to be equal, namely when the measure is SRB (see paragraph on Natural invariant measures); and (B) we proved that in general, the gap between them can be expressed in terms of the dimensions of the invariant measure [1]. These relations are very general; they hold for all diffeomorphisms and flows on finite dimensional manifolds.
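In symbols, for an ergodic invariant measure mu of a diffeomorphism f with Lyapunov exponents lambda_i of multiplicities m_i, these relations can be summarized as follows (a standard formulation; the notation here is generic):

```latex
% Ruelle's inequality: entropy is at most the sum of positive exponents
h_\mu(f) \;\le\; \sum_{i \,:\, \lambda_i > 0} \lambda_i \, m_i .
% (A) Equality holds if and only if \mu is an SRB measure (Pesin's formula).
% (B) In general (Ledrappier--Young), the entropy is given by
h_\mu(f) \;=\; \sum_{i \,:\, \lambda_i > 0} \lambda_i \, \gamma_i \,,
\qquad 0 \le \gamma_i \le m_i \,,
% where \gamma_i is the partial dimension of \mu in the direction of the
% i-th exponent; the gap below \sum \lambda_i m_i thus reflects the
% (possibly fractal) dimensions of the invariant measure.
```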

The results above are part of a subject called nonuniform hyperbolicity theory. I have continued to work on extending this theory to stochastic and infinite dimensional systems. The averaging effects of random noise should simplify the dynamical picture [2]. The extension to infinite dimensions will expand the scope of this theory to include semiflows defined by, e.g., dissipative parabolic PDEs; see [3] for a first step.

[1] (with F. Ledrappier) Annals of Math., 122 (1985), 509-574. [ PDF Part I | PDF Part II ]
[2] (with F. Ledrappier) Prob. Th. Rel. Fields, 80, (1988), 217-240. [ PDF ]
[3] (with Z. Lian) J. Amer. Math. Soc., 25 (2012), 637-665. [ PDF ]

Natural invariant measures

For Hamiltonian systems, Liouville measure is clearly the natural invariant measure. What plays the role of Liouville measure for dissipative systems, such as those with attractors more complicated than periodic sinks? The short answer is: SRB measures, which are observable (also called physically relevant) in the sense that they reflect the properties of positive Lebesgue measure sets of initial conditions -- except that things are a bit more complicated. In the 1970s, Sinai, Ruelle and Bowen discovered these measures for Axiom A attractors, a class of chaotic attractors satisfying strong geometric conditions. This body of ideas, minus the assertion of existence, was extended to general attractors by Ledrappier and myself in [1]. Though SRB measures are believed to be prevalent, proving their existence is, unfortunately, very difficult in the absence of strong geometric conditions. The main examples to date are those discussed in [7] below.
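The sense in which an SRB measure mu is observable can be stated via time averages: under mild conditions (e.g., no zero Lyapunov exponents), there is a set of initial conditions of positive Lebesgue measure -- even though mu itself may be singular -- on which

```latex
\frac{1}{n} \sum_{k=0}^{n-1} \varphi\!\left(f^k x\right)
\;\xrightarrow[\;n \to \infty\;]{}\; \int \varphi \, d\mu
\qquad \text{for every continuous observable } \varphi .
```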

Natural invariant measures inspired by the ideas above go beyond the setting of a single map or flow in finite dimensions. I have extended these ideas to infinite dimensions, to random dynamical systems, and even to time-dependent and leaky dynamical systems. A review of this topic is [4].

[4] J Stat. Phys. (2017) 166:494-515 [ Online version ]

Statistical properties and geometry

By definition, deterministic dynamical systems have memory. The more chaotic a system is, the more rapidly it mixes up its phase space geometry; equivalently, the faster its time correlations decay (with respect to smooth test functions). My main contributions to this topic are as follows [5]: (A) Via a so-called "tower" construction, I connected sufficiently hyperbolic (or chaotic) systems to countable state Markov chains. Leveraging these Markov-like structures, I showed that many statistical properties of the system are determined by the tail properties of return times to certain reference sets. For example, exponentially decaying tails lead to exponential correlation decay, central limit theorems, large deviation principles, etc. (B) I proposed to connect these tail properties directly to the geometry of the map, specifically to the feature that poses the strongest obstruction to mixing. Together, (A) and (B) offered a unified way to get a handle on the statistical properties of large classes of dynamical systems, which I demonstrated on several examples, including the 2D periodic Lorentz gas.
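The correspondence between return-time tails and correlation decay can be seen numerically. The sketch below (an illustration, not one of the systems treated in [5]) uses the logistic map x -> 4x(1-x), a standard chaotic, exponentially mixing example conjugate to the tent map, and estimates the autocorrelation of a smooth test function along a long orbit:

```python
import numpy as np

def autocorrelation(series, lags):
    """Sample autocorrelation of a time series at the given lags."""
    v = series - series.mean()
    c0 = np.mean(v * v)
    return [np.mean(v[: len(v) - k] * v[k:]) / c0 if k else 1.0 for k in lags]

# Generate a long orbit of the logistic map x -> 4x(1-x).
rng = np.random.default_rng(0)
n = 200_000
x = np.empty(n)
x[0] = rng.random()
for i in range(1, n):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# Correlations of the smooth observable cos(x) decay rapidly with the lag,
# consistent with an exponential return-time tail in a tower construction.
corr = autocorrelation(np.cos(x), [0, 1, 2, 5, 10])
```

For a system with polynomial return-time tails, the same experiment would show correlations decaying only polynomially in the lag.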

There is ample similarity between chaotic dynamical systems and genuinely stochastic processes, even though the former are fully deterministic. That being the case, it is natural to borrow techniques from probability. In addition to [5], which relies on spectral gaps, I also "imported" coupling ideas into dynamical systems [6]. Both sets of ideas have been used a great deal and further developed by others.

[5] Annals of Math., (1998), 585-650. [ PDF | Postscript ]
[6] Israel J Math. 110 (1999), 153-188. [ PDF | Postscript ]

Strange attractors and shear-induced chaos

Even though there is no formal definition of a "strange attractor", everyone agrees that its dynamics cannot be simple. Wang and I undertook a systematic study of what can be thought of as "strange attractors of the simplest kind". We called them rank-one attractors, referring to the fact that while these attractors can live in phase spaces of any dimension, they have only one direction of instability, with strong contraction in all other directions. Such attractors tend to occur following a system's loss of stability. We showed that they occur naturally in periodically kicked oscillators, in periodically forced systems undergoing Hopf bifurcations, and in certain slow-fast systems. The phenomenon is one of shear-induced chaos, meaning the chaotic behavior comes about as a result of underlying shear being magnified by the forcing, in a system that is otherwise nonchaotic. Shear-induced chaos can occur in systems defined by ODEs as well as PDEs; see e.g. [8].
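A caricature of shear-induced chaos can be simulated directly. The map below is an illustrative kick-relax-shear model on a cylinder (the parameter names and values are my own, not those of [7] or [8]): a kick A*sin(2*pi*theta) displaces the y-variable, dissipation contracts it by a factor b, and shear sigma converts the displacement into stretching in theta:

```python
import numpy as np

def kick_map_exponents(sigma=50.0, b=0.5, A=1.0, omega=0.37, n=50_000):
    """Both Lyapunov exponents of the kick-relax-shear map
        y' = b * (y + A*sin(2*pi*theta)),
        theta' = theta + omega + sigma * y'   (mod 1),
    computed by transporting a frame with QR renormalization.
    The Jacobian determinant is identically b (constant area contraction)."""
    theta, y = 0.1, 0.0
    Q = np.eye(2)
    sums = np.zeros(2)
    for _ in range(n):
        c = 2.0 * np.pi * A * np.cos(2.0 * np.pi * theta)
        # Jacobian of (theta', y') with respect to (theta, y):
        J = np.array([[1.0 + sigma * b * c, sigma * b],
                      [b * c,               b       ]])
        y = b * (y + A * np.sin(2.0 * np.pi * theta))
        theta = (theta + omega + sigma * y) % 1.0
        Q, R = np.linalg.qr(J @ Q)
        sums += np.log(np.abs(np.diag(R)))
    return sums / n

l1, l2 = kick_map_exponents()
```

With strong shear the top exponent is positive (chaos), while without the kick (A = 0) the map is a simple contraction onto an invariant circle; the two exponents always sum to log b, reflecting the constant rate of volume contraction.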

This project consisted, in fact, of two separate parts, the second of which is reported above. The first part was a 130-page paper in which Wang and I identified a set of geometric conditions and proved that they imply the existence of rank-one attractors [7]. This part of our work benefited from the techniques of Benedicks and Carleson in their analysis of the Hénon maps. We extracted certain ideas from this one example and developed them into a theory for a general class of dynamical systems that includes all of the examples above. We also gave a full description of the geometric and statistical properties of these attractors, proving in particular the existence of SRB measures.

[7] (with Q.D. Wang) Annals of Math., 167, No.2, (2008), 349-480. [ PDF | Postscript ]
[8] (with K. Lu and Q. D. Wang) AMS Memoirs, (2013) [ PDF ]

Miscellaneous topics

A number of topics did not fit the major themes above. I mention a few of them here: [9], which studies coupled map networks, i.e., large dynamical systems made up of many smaller constituent subsystems coupled together, and [10], which studies escape dynamics in open systems, exemplified in this case by billiard systems with a hole in the table. The purpose of [11] was to demonstrate what small random perturbations can do: the standard map is notorious in that it has extremely large expansion on nearly all of its phase space, yet no one has been able to show rigorously that it has a positive Lyapunov exponent. We did that in [11] with the addition of a tiny amount of noise.
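The effect proved in [11] is easy to observe numerically. The sketch below (with illustrative parameter values of my own choosing) iterates the standard map with a small additive noise and estimates the top Lyapunov exponent from the tangent dynamics; for large K the estimate comes out close to log(K/2):

```python
import numpy as np

def noisy_standard_map_le(K=20.0, eps=0.01, n=100_000, seed=1):
    """Top Lyapunov exponent of the standard map
        I' = I + K*sin(theta) + noise,   theta' = theta + I'   (mod 2*pi),
    estimated by transporting a tangent vector along a random orbit."""
    rng = np.random.default_rng(seed)
    theta, I = 1.0, 0.5
    v = np.array([1.0, 0.0])
    total = 0.0
    for _ in range(n):
        c = K * np.cos(theta)
        J = np.array([[1.0 + c, 1.0],   # d(theta', I') / d(theta, I)
                      [c,       1.0]])
        I = I + K * np.sin(theta) + eps * rng.normal()
        theta = (theta + I) % (2.0 * np.pi)
        v = J @ v
        norm = np.linalg.norm(v)
        total += np.log(norm)
        v /= norm
    return total / n

le = noisy_standard_map_le()
```

What [11] establishes rigorously is that an arbitrarily small amount of noise makes the top exponent provably positive, on the order of log(K/2); the numerics merely illustrate this.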

[9] (with J. Koiller) Nonlinearity, 23, 5 (2010), 1121-1141. [ Online version ]
[10] (with M. Demers and P. Wright) Commun. Math. Phys., Vol 294, 2, (2010), 353-388. [ PDF ]
[11] (with A. Blumenthal and J. Xue) Annals of Math, 185 (2017) 1-26 [ Journal PDF version | Online version ]

Applications of Dynamical Systems Ideas

Nonequilibrium statistical mechanics

In [12], Eckmann and I investigated the nonequilibrium steady states of a class of mechanical chains connected to two unequal heat baths. In this system, moving particles do not interact directly with one another; instead they interact via an array of rotating disks, which mark the "temperature" of the system at various locations in the chain. This simplification allowed us to determine explicitly -- modulo a number of technical considerations -- macroscopic quantities such as mean energy and particle density profiles. As a dynamical system, the mechanical chain above is far more complicated than, e.g., billiards, because of the many particles and the energy transfer among them. Understanding that rigorous analysis is out of reach, I turned to stochastics. Replacing some of the chaotic behavior by randomness while retaining the physical characteristics of the model, my collaborators and I were able to fill in some of the missing analysis. Simplifying further, we proved (i) the Fourier Law and (ii) local thermal equilibrium (LTE) for boundary-driven heat conduction systems in domains of any dimension [13]. LTE is a fundamental property: for systems driven out of equilibrium, the very definition of local temperature relies on it.
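As an illustration of the kind of stochastic model involved, here is a minimal energy-exchange chain in the spirit of (but not identical to) the models analyzed in [13]; it is essentially the Kipnis-Marchioro-Presutti (KMP) model, in which pooled energies are redistributed at random and the boundary sites exchange energy with exponential baths. Its time-averaged energy profile interpolates linearly between the two bath temperatures:

```python
import numpy as np

def kmp_chain(n_sites=9, TL=1.0, TR=2.0, steps=300_000, seed=2):
    """Stochastic energy-exchange chain between two unequal heat baths.

    At each step a random bond is chosen; the two energies on it are pooled
    and redistributed uniformly at random.  The end sites exchange energy
    with fresh exponential (thermal) variables at the bath temperatures.
    Returns the time-averaged energy (temperature) profile."""
    rng = np.random.default_rng(seed)
    e = np.ones(n_sites)
    profile = np.zeros(n_sites)
    for _ in range(steps):
        j = rng.integers(0, n_sites + 1)   # bonds 0..n_sites; ends are baths
        if j == 0:
            e[0] = rng.random() * (e[0] + rng.exponential(TL))
        elif j == n_sites:
            e[-1] = rng.random() * (e[-1] + rng.exponential(TR))
        else:
            pool, u = e[j - 1] + e[j], rng.random()
            e[j - 1], e[j] = u * pool, (1 - u) * pool
        profile += e
    return profile / steps

T = kmp_chain()   # mean energies: approximately linear from TL to TR
```

The linear profile is the discrete analogue of the Fourier Law statement in the text: in the steady state, energy flows down a linear temperature gradient between the two baths.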

[12] (with J.-P. Eckmann) Commun. Math. Phys., 262 (2006) 237-267. [ PDF | Postscript ]
[13] (with Y. Li and P. Nandori) J Stat Phys. 163 (2016) 1: 61-91. [ Online version ]

Computational neuroscience

Few will disagree that the brain is a large and complex dynamical system: a vast network of smaller dynamical systems, namely individual neurons, coupled together. Without downplaying the myriad biochemical reactions at the cellular and molecular levels, it is generally recognized that dynamical interactions among neurons play an important role in shaping function in the cerebral cortex.

With the goal of understanding the dynamical mechanisms behind visual information processing, I embarked a few years ago on a computational modeling project whose objective is to build a next-generation model of the primary visual cortex (V1) of the macaque monkey, whose visual cortex is quite similar to that of humans. Working with a team that includes a neurophysiologist (R. Shapley) and a postdoc with excellent computing skills (L. Chariker), we now have a working model of the input layer (4Cα) of the magnocellular pathway [14], with other publications to follow. The model is realistic in the sense that it incorporates as much neuroanatomy and neurophysiology as possible, and it is benchmarked to reproduce results from dozens of sets of experimental data. So far we have been able to reproduce basic V1 features such as orientation and spatial frequency selectivity, contrast response, behaviors of simple and complex cells, and gamma-band rhythms, as well as the diversity of responses among neurons.

I want to stress that, unlike many existing models in neuroscience, all of this was achieved in a single network model, using a single set of parameters, and the number of V1 phenomena reproduced is comparable to or larger than the number of parameters used. This is important, because it is only by forcing the model to reproduce multiple aspects of cortical behavior simultaneously, in a way that mimics how the cortex processes information, that one can gain insight into how the real cortex works. Moreover, this exercise has forced the modelers to identify and delve into the mechanisms behind functions. What one does not understand, one cannot build.
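The network in [14] is built from conductance-based integrate-and-fire neurons. As a minimal illustration of the basic building block only (the parameter values below are made up, not those of the V1 model), here is a leaky integrate-and-fire unit: the voltage relaxes toward its input and emits a spike whenever it crosses threshold:

```python
def lif_spikes(I_ext, T=1.0, dt=1e-4, tau=0.02, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron, tau * dv/dt = -v + I_ext,
    integrated by forward Euler; returns the spike count on [0, T].
    A spike is recorded, and v reset, whenever v reaches v_th."""
    v, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        v += (dt / tau) * (-v + I_ext)
        if v >= v_th:
            v = v_reset
            spikes += 1
    return spikes

quiet = lif_spikes(0.8)   # subthreshold drive: v settles below threshold
active = lif_spikes(2.0)  # suprathreshold drive: repetitive firing
```

Coupling many such units through excitatory and inhibitory conductances, with parameters constrained by anatomy and physiology, is what turns this simple mechanism into a network model of cortex.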

[14] (with L. Chariker and R. Shapley) J Neurosci (2016) 36(49):12368 -12384 [ Online version ]

For more information on this project, see Toward a dynamical theory of visual cortex via data-driven modeling


Epidemic control

In the face of an unexpected outbreak of an infectious disease, isolation of infected hosts is one of the most commonly used methods of containment. In practice, however, the effectiveness of isolation is diminished by delays and imperfect implementation. My collaborators and I studied mathematical models in which we sought to estimate the minimum response required to prevent an incipient infection from developing into a full-blown epidemic, and to predict the extent of the infection in eventual endemic states should containment fail [15]. We modeled the disease process using a system of delay differential equations, which gives rise to an infinite dimensional dynamical system; its analysis led to answers to some of the questions above.
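The role of the delay can be sketched in a toy version of such a model (my own illustrative equations, not the exact system of [15]): susceptibles are infected at rate beta*S*I, and every infected host is isolated exactly tau time units after infection. Keeping the incidence history supplies the delayed term in a simple Euler scheme:

```python
import numpy as np

def siq(beta=2.0, tau=1.0, dt=0.001, T=20.0, I0=1e-3):
    """Toy SIQ-type delay model: S' = -beta*S*I, with each new infection
    moved from I to the isolated class Q after a fixed delay tau.
    Returns the S, I, Q trajectories (fractions of the population)."""
    n, d = int(T / dt), int(tau / dt)
    S = np.empty(n + 1); I = np.empty(n + 1); Q = np.empty(n + 1)
    S[0], I[0], Q[0] = 1.0 - I0, I0, 0.0
    new_inf = np.zeros(n + 1)          # incidence history for the delay term
    for k in range(n):
        new_inf[k] = beta * S[k] * I[k]
        removed = new_inf[k - d] if k >= d else 0.0
        S[k + 1] = S[k] - dt * new_inf[k]
        I[k + 1] = I[k] + dt * (new_inf[k] - removed)
        Q[k + 1] = Q[k] + dt * removed
    return S, I, Q

S, I, Q = siq()
```

Because the state at time t involves the entire incidence history over [t - tau, t], the phase space is a function space, which is the sense in which the system is infinite dimensional; shrinking tau (faster isolation) weakens the outbreak, illustrating the minimum-response question in the text.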

[15] (with S. Ruschel, T. Pereira and S. Yanchuk) arXiv preprint arXiv:1804.0269


  1. (with F. Ledrappier) The metric entropy of diffeomorphisms, Part I: Characterization of measures satisfying Pesin's entropy formula, Part II: Relations between entropy, exponents and dimension, Annals of Math., 122, (1985), 509-574.
    [ PDF Part I | PDF Part II ]

  2. (with F. Ledrappier) Entropy formula for random transformations, Prob. Th. Rel. Fields, 80, (1988), 217-240.
    [ PDF ]

  3. (with Z. Lian) Lyapunov exponents, periodic orbits and horseshoes for semiflows on Hilbert spaces, J. Amer. Math. Soc., 25 (2012), 637-665.
    [ PDF ]

  4. Generalizations of SRB measures to nonautonomous, random, and infinite dimensional systems, (review article), J Stat. Phys. (2017) 166:494-515
    [ Online version ]

  5. Statistical properties of dynamical systems with some hyperbolicity, Annals of Math., (1998), 585-650.
    [ PDF | Postscript ]

  6. Recurrence times and rates of mixing, Israel J Math. 110, (1999), 153-188
    [ PDF | Postscript ]

  7. (with Q.D. Wang) Toward a theory of rank one attractors, Annals of Math., 167, No.2, (2008), 349-480.
    [ PDF | Postscript ]

  8. (with K. Lu and Q. D. Wang) Strange attractors for periodically forced parabolic equations, AMS Memoirs, (2013).
    [ PDF ]

  9. (with J. Koiller) Coupled map networks, Nonlinearity, 23, 5 (2010), 1121-1141.
    [ Online version ]

  10. (with M. Demers and P. Wright) Escape rates and physically relevant measures for billiards with small holes, Commun. Math. Phys., Vol 294, 2, (2010), 353-388.
    [ PDF ]

  11. (with A. Blumenthal and J. Xue) Lyapunov exponents for random perturbations of some area-preserving maps including the standard map, Annals of Math, 185 (2017) 1-26
    [ Journal PDF version | Online version ]

  12. (with J.-P. Eckmann) Nonequilibrium Energy Profiles for a Class of 1-D Models, Commun. Math. Phys., 262, (2006), 237-267.
    [ PDF | Postscript ]

  13. (with Y. Li and P. Nandori) Local thermal equilibrium for certain stochastic models of heat transport, J Stat Phys. 163 (2016) 1: 61-91
    [ Online version ]

  14. (with L. Chariker and R. Shapley) Orientation selectivity from very sparse LGN inputs in a comprehensive model of macaque V1 cortex, J Neurosci (2016) 36(49):12368 -12384
    [ Online version ]

  15. (with S. Ruschel, T. Pereira and S. Yanchuk) SIQ: A delay differential model for disease control via isolation, arXiv preprint arXiv:1804.0269


Updated: September 25, 2018