Hausdorff Kolloquium 2022/23
Dates: October 19, 2022, and January 18, 2023
Organizers: Ursula Hamenstädt and Christian Brennecke
Venue: Lipschitzsaal, Mathezentrum, Endenicher Allee 60, 53115 Bonn
Wednesday, October 19
in cooperation with the Global Math Network
14:15  Weinan E (Peking University & Princeton University): Towards a Mathematical Theory of Machine Learning (online)
16:00  François Charles (École Normale Supérieure, Paris): Geometry of numbers in finite and infinite dimension
Wednesday, January 18
Abstracts
François Charles (École Normale Supérieure, Paris): Geometry of numbers in finite and infinite dimension
I will discuss the appearance of Euclidean lattices and their associated Gaussian measures in various questions of Diophantine and arithmetic geometry. I will explain how some aspects of the classical study of Euclidean lattices can be extended to infinite dimension, and how such infinite-dimensional objects come up naturally in various arithmetic problems and provide one more facet of the analogy between number fields and algebraic curves. Joint work with Jean-Benoît Bost.
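For orientation, the Gaussian measure attached to a Euclidean lattice enters through its theta series; a central invariant in the Bost–Charles approach is the theta analogue of the number of effective sections (the notation below is a sketch for readers, not taken from the abstract):

```latex
% Theta series of a Euclidean lattice (L, \|\cdot\|):
\theta_L(t) \;=\; \sum_{v \in L} e^{-\pi t \|v\|^2}, \qquad t > 0.

% Associated theta invariant, an analytic substitute for the
% logarithm of the number of effective sections in Arakelov geometry:
h^0_\theta(L) \;=\; \log \sum_{v \in L} e^{-\pi \|v\|^2}.
```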
Marcello Porta (SISSA, Trieste, Italy): Collective Behaviour of Mean-Field Fermi Gases
Systems of interest in physics are typically formed by an enormous number of particles, making their analysis from the microscopic laws of motion extremely challenging. For this reason, concrete studies of physical systems are often based on effective theories, involving drastically fewer degrees of freedom, which are expected to correctly capture the collective behaviour of the system in suitable regimes. Famous examples are the Boltzmann equation, for the dynamics of rarefied gases, or the Thomas-Fermi model, for the description of large atoms and molecules. In this talk I will consider the problem of rigorously justifying effective theories for many-body quantum systems of fermionic type. Specifically, I will focus on a high-density/mean-field regime; here, the quantum dynamics of sufficiently uncorrelated fermionic states is well approximated by a non-linear Schrödinger equation, called the Hartree-Fock equation. The main limitation of this widely used approximation is that it completely neglects the correlations between the particles, beyond those due to the antisymmetry of the many-body wave function. In this talk, I will discuss a rigorous bosonization technique that allows one to study quantum correlations in high-density Fermi gases efficiently, in terms of an emergent, non-interacting Bose gas. As an application of this method, we put on rigorous grounds the prediction of the random-phase approximation, introduced by Bohm and Pines in the 1950s, for the ground state energy of mean-field Fermi gases. Based on joint works with N. Benedikter, P. T. Nam, B. Schlein and R. Seiringer.
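The Hartree-Fock equation referred to above can be written, in one standard formulation for the one-particle reduced density matrix (included here for orientation; conventions and scalings vary across the literature):

```latex
% Time-dependent Hartree-Fock equation for the one-particle
% reduced density matrix \omega_t of the Fermi gas:
i\,\partial_t \omega_t \;=\; \bigl[\, -\Delta + (V * \rho_t) - X_t \,,\; \omega_t \,\bigr],
\qquad \rho_t(x) = \omega_t(x;x),
```

where $V$ is the pair interaction, $V * \rho_t$ is the direct (Hartree) potential, and $X_t$ is the exchange operator with integral kernel $X_t(x;y) = V(x-y)\,\omega_t(x;y)$; neglecting $X_t$ gives the Hartree equation.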
Peter Topping (University of Warwick): How is Ricci flow used to solve problems in Differential Geometry and Topology?
Ricci flow is essentially a parabolic partial differential equation that can evolve the shape of a manifold. It has a well-known and stunning record of solving open problems in completely different areas of mathematics. What is perhaps less well appreciated is how different applications use Ricci flow in completely different ways, particularly over the past five years or so. I will try to give an overview of many of these, making time to cover one or two applications in more detail. I will not assume that the audience knows anything about Ricci flow. I will also make an effort to discuss the Differential Geometry in a way that can be understood by non-experts.
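For readers who have not seen it, the parabolic equation in question evolves a Riemannian metric $g(t)$ by its Ricci curvature:

```latex
% Hamilton's Ricci flow: the metric moves in the direction of
% minus twice its Ricci curvature, a (weakly) parabolic PDE
% that tends to smooth out irregularities of the geometry.
\partial_t\, g(t) \;=\; -2\,\operatorname{Ric}\!\bigl(g(t)\bigr).
```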
Weinan E (Peking University & Princeton University): Towards a Mathematical Theory of Machine Learning
Given a machine learning model, what is the class of functions that can be approximated by this particular model efficiently, in the sense that the convergence rates for the approximation, estimation and optimization errors do not deteriorate as the dimensionality goes up? We address this question for three classes of machine learning models: the random feature model, two-layer neural networks, and the residual neural network model. Along the way, we will also summarize the current status of the theoretical foundation of deep learning, and discuss some of the key open questions.
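The first of the three model classes can be illustrated concretely. In a random feature model, the nonlinear features are drawn at random and frozen; only the outer linear coefficients are trained. The sketch below is a minimal illustration with choices of this sketch alone (ReLU features, 1D input, ridge regression), not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: a smooth target function on [-1, 1].
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

# Random feature model: phi(x) = relu(W x + b), with W and b sampled
# once and then FIXED; only the outer coefficients `a` are trained.
m = 500                                   # number of random features
W = rng.normal(size=(1, m))
b = rng.uniform(-1, 1, size=m)
Phi = np.maximum(X @ W + b, 0.0)          # (200, m) feature matrix

# Train the linear output layer by ridge regression (small regularizer
# lam keeps the normal equations well conditioned when m > n).
lam = 1e-6
a = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)

pred = Phi @ a
train_mse = np.mean((pred - y) ** 2)
```

With many more random features than samples, the trained linear layer can fit the data almost exactly; the theoretical question in the abstract is for which target functions such rates survive in high dimension.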