**Our Cantrell Lectures scheduled for the week of April 15 have been postponed.**

**We hope to reschedule them for next academic year.**

The 2020 Cantrell Lectures will be given by **Professor Weinan E of Princeton University**. Professor Weinan E is a distinguished applied mathematician noted for his work on a wide range of topics. He has received numerous awards, most recently the 2019 Peter Henrici Prize from SIAM. The SIAM announcement of this prize cited "breakthrough contributions in various fields of applied mathematics and scientific computing, particularly nonlinear stochastic (partial) differential equations (PDEs), computational fluid dynamics, computational chemistry, and machine learning. E’s scientific work has led to the resolution of many long-standing scientific problems. His signature achievements include novel mathematical and computational results in stochastic differential equations; design of efficient algorithms to compute multiscale and multiphysics problems, particularly those arising in fluid dynamics and chemistry; and his recent pioneering work on the application of deep learning techniques to scientific computing."

**April 15, 2020** - Physics Bldg., Room 202, 3:30pm (Refreshments will be served at 3:00pm outside of Room 202)

Title: Machine Learning: Mathematical Theory and Scientific Applications

Abstract: Modern machine learning has had remarkable successes in all kinds of AI applications, and is also poised to fundamentally change the way we do science. In this talk, I will give an overview of some of the most exciting applications of machine learning in science. I will discuss representative examples of using machine learning to attack problems in physics, chemistry, and other scientific and engineering disciplines. Emphasis will be put on the most important issues in using machine learning to build reliable physical models and algorithms. Towards the end of the talk I will discuss the basic issues and ideas involved in establishing a solid mathematical theory of machine learning.

**April 16, 2020** - Boyd Graduate Studies Research Bldg., Room 328, 3:30pm (Refreshments will be served at 3:00pm outside of Room 328)

Title: A Mathematical Perspective of Machine Learning

Abstract: The heart of modern machine learning is the approximation of high dimensional functions. Traditional approaches, such as approximation by piecewise polynomials, wavelets, or other linear combinations of fixed basis functions, suffer from the curse of dimensionality. We will discuss representations and approximations that overcome this difficulty, as well as gradient flows that can be used to find the optimal approximation. We will see that at the continuous level, machine learning consists of a series of reasonably nice variational and PDE-like problems. Modern machine learning models/algorithms, such as the random feature and shallow/deep neural network models, are all special discretizations of these continuous problems. We will also discuss how to construct new models/algorithms using the same philosophy. At the theoretical level, we will present a framework that is suited for analyzing machine learning models and algorithms in high dimension, and present results that are free of the curse of dimensionality. Finally, we will discuss the fundamental reasons that are responsible for the success of modern machine learning, as well as the subtleties and mysteries that still remain to be understood.

**April 17, 2020** - Boyd Graduate Studies Research Bldg., Room 328, 3:30pm (Refreshments will be served at 3:00pm outside of Room 328)

Title: High Dimensional PDEs: Theory and Numerical Algorithms

Abstract: In physics, economics, and control theory, we often encounter PDEs in very high dimensions. This has been a notoriously difficult problem due to the curse of dimensionality. In recent years, two classes of algorithms have emerged for solving nonlinear parabolic PDEs in high dimension, with a complexity that scales algebraically (linear or quadratic) in the dimension: the multi-level Picard method and the deep learning based methods. These algorithms have opened up new possibilities for attacking control and many other problems in hundreds and thousands of dimensions. They have also triggered questions about understanding PDEs in high dimensions. In this talk, I will discuss what we have achieved and understood so far about these problems.
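The multi-level Picard and deep learning based solvers described in the abstract are too involved for a short sketch, but the basic reason sampling-based methods can sidestep the curse of dimensionality is already visible in the linear case via the Feynman-Kac formula: the heat equation $u_t = \Delta u$, $u(0,\cdot) = g$ has the solution $u(t,x) = \mathbb{E}[g(x + \sqrt{2t}\,Z)]$ with $Z \sim N(0, I_d)$, and a Monte Carlo estimate of this expectation has a cost per sample that grows only linearly in $d$. A small illustrative sketch (the choice of $g$, $d$, and $t$ is arbitrary; the Gaussian $g$ is picked only so a closed-form answer is available to check against):

```python
import numpy as np

rng = np.random.default_rng(1)

def heat_mc(g, x, t, n_samples=100_000):
    """Monte Carlo estimate of u(t, x) = E[g(x + sqrt(2t) Z)], Z ~ N(0, I_d),
    which solves u_t = Laplacian(u) with u(0, .) = g.
    The cost per sample is only O(d), even for d in the hundreds."""
    d = x.shape[0]
    Z = rng.standard_normal((n_samples, d))
    return np.mean(g(x + np.sqrt(2.0 * t) * Z))

d, t = 50, 0.05
x = np.zeros(d)
g = lambda y: np.exp(-np.sum(y**2, axis=-1))  # Gaussian initial data

estimate = heat_mc(g, x, t)
exact = (1.0 + 4.0 * t) ** (-d / 2)  # closed form for this particular g
print("MC estimate:", estimate, " exact:", exact)
```

A grid-based method in d = 50 would need a number of grid points exponential in d; the nonlinear parabolic case, where this simple averaging no longer applies, is exactly where the methods discussed in the lecture come in.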

The lectures will be held April 15, 16, and 17. The format of the lectures is as follows: the first lecture is for undergraduate students and a general public audience, the second is a general colloquium-level talk, and the third is for faculty and graduate students with similar research interests.