| 9/18 | Week 1: Course Introduction [slides] [recording]
- Course syllabus and requirements
- AI in Science and Engineering
- Mathematical modeling with PDEs
- Computational challenges
- Motivation for AI approaches
| 9/25 | Week 2: Introduction to Deep Learning [slides] [recording]
- Introduction to using deep learning to model physical systems governed by PDEs
- Structure of MLPs: layers, weights, biases, and activation functions (sigmoid, tanh, ReLU, etc.)
- Gradient descent, stochastic gradient descent (SGD), and mini-batch SGD (a minimal training-loop sketch follows this list)
- Motivation for convolutional neural networks (CNNs) to handle high-dimensional inputs efficiently
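A minimal sketch tying the Week 2 vocabulary together, assuming PyTorch (the course's framework is not stated here); the layer sizes, toy data, and learning rate are arbitrary illustrations, not course code:

```python
# Minimal sketch: a small MLP trained with mini-batch SGD on a toy
# regression target. Sizes and hyperparameters are illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(            # MLP: affine maps + nonlinear activations
    nn.Linear(2, 64), nn.Tanh(),  # weights W1, biases b1, tanh activation
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),             # linear output layer
)

x = torch.rand(1024, 2)                  # toy inputs
y = torch.sin(x[:, :1]) * x[:, 1:]       # toy targets
opt = torch.optim.SGD(model.parameters(), lr=1e-2)

for epoch in range(100):
    perm = torch.randperm(x.shape[0])    # reshuffle each epoch
    for i in range(0, x.shape[0], 64):   # mini-batches of size 64
        idx = perm[i:i + 64]
        loss = ((model(x[idx]) - y[idx]) ** 2).mean()  # MSE loss
        opt.zero_grad()
        loss.backward()                  # backpropagate gradients
        opt.step()                       # SGD update: w <- w - lr * grad
```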
| 10/2 | Week 3: Introduction to Physics-Informed Neural Networks [slides] [recording]
- Introduction to Physics-Informed Neural Networks (PINNs), which train a network to satisfy a PDE by penalizing its residual (a minimal example follows this list)
- Extending PINNs to inverse problems: reconstructing unknown solutions or parameters from partial measurements
- PINNs unify data-driven learning and physics-based modeling, offering flexible, mesh-free solvers for forward and inverse PDE problems
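A hedged, self-contained sketch of the forward-PINN idea on a 1D Poisson problem u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0; the network size, optimizer, and manufactured solution are illustrative assumptions, not the lecture's example:

```python
# PINN sketch: penalize the PDE residual at random collocation points
# plus the boundary conditions. Exact solution here is u(x) = sin(pi x).
import torch
import torch.nn as nn

u = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                  nn.Linear(32, 32), nn.Tanh(),
                  nn.Linear(32, 1))

def f(x):  # manufactured source term for u(x) = sin(pi x)
    return -(torch.pi ** 2) * torch.sin(torch.pi * x)

opt = torch.optim.Adam(u.parameters(), lr=1e-3)
xb = torch.tensor([[0.0], [1.0]])                # boundary points

for step in range(2000):
    x = torch.rand(128, 1, requires_grad=True)   # interior collocation points
    ux = torch.autograd.grad(u(x).sum(), x, create_graph=True)[0]
    uxx = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]
    loss_pde = ((uxx - f(x)) ** 2).mean()        # PDE residual loss
    loss_bc = (u(xb) ** 2).mean()                # boundary-condition loss
    loss = loss_pde + loss_bc
    opt.zero_grad()
    loss.backward()
    opt.step()
```

For the inverse setting mentioned above, an unknown PDE coefficient would simply become an extra trainable parameter, with a data-misfit term on the partial measurements added to the loss.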
| 10/9 | Week 4: PINNs - Theoretical Insights [slides] [recording]
- Review of Physics-Informed Neural Networks (PINNs) for solving PDEs
- Theoretical analysis of PINN error: the relation between training error, PDE residuals, and total approximation error (a schematic form of the bound follows this list)
- Conditions ensuring convergence: coercivity, quadrature approximation, and DNN expressivity
- Rigorous error bounds for linear and nonlinear PDEs (Kolmogorov, Black-Scholes, Navier-Stokes, Burgers' equations)
- Gradient descent dynamics and conditioning in PINN training; interpretation via the NTK and preconditioning
- Practical performance and challenges: successes on smooth PDEs, difficulties on shocks or ill-conditioned problems
- Overview of acceleration and stabilization techniques (causal learning, hard boundary conditions, multi-stage networks)
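The bullets above fit a common schematic template from the PINN error-analysis literature; the following is a generic form with constants and rates suppressed, not the exact statement from the slides:

```latex
% Schematic PINN error decomposition (generic form; constants suppressed).
% E_G: total error, R[.]: PDE residual, E_T: training error on the sampled
% residuals, N: number of collocation points, alpha: quadrature rate.
\[
  \mathcal{E}_G = \lVert u - u_\theta \rVert
  \;\lesssim\; C_{\mathrm{stab}} \,\lVert \mathcal{R}[u_\theta] \rVert
  \;\lesssim\; C_{\mathrm{stab}} \left( \mathcal{E}_T + C_{\mathrm{quad}}\, N^{-\alpha} \right)
\]
```

The first inequality is where coercivity/stability of the PDE enters; the second compares the continuous residual norm with the discrete training loss via a quadrature estimate.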
| 10/16 | Week 5: Operator Learning - Introduction [slides] [recording]
- Transition from physics-informed learning to data-driven approaches
- Introduction to operator learning
- Parametric PDE learning: deep networks approximating observables for low-dimensional parameterizations
- Operator learning: approximating infinite-dimensional mappings from data distributions
- Neural operators: generalizing DNNs to function spaces
- Fourier Neural Operators (FNOs): convolution in Fourier space, efficient and translation-invariant (a toy Fourier layer is sketched after this list)
- Theoretical foundation: universal approximation theorems for FNOs
- Practical challenges: bridging continuous operators and discrete numerical data (continuous-discrete equivalence)
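A toy 1D Fourier layer in the spirit of FNO, assuming PyTorch; this is a simplified illustration (channel counts and initialization are arbitrary, and the pointwise linear/skip path of a full FNO block is elided), not a reference implementation:

```python
# Toy spectral convolution: multiply the lowest `modes` Fourier
# coefficients by learned complex weights, zero the rest.
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                 # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)          # to Fourier space
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bik,iok->bok", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.shape[-1])  # back to the grid

layer = SpectralConv1d(channels=8, modes=16)
v = torch.randn(4, 8, 128)                # functions sampled on a 128-point grid
print(layer(v).shape)                     # torch.Size([4, 8, 128])
```

Multiplying the retained Fourier modes by learned weights is a convolution in physical space, which is where the efficiency (via the FFT) and translation invariance come from.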
| 10/23 | Week 6: Operator Learning - FNO [slides] [recording]
- Continuation of operator learning with FNOs
- Continuous-discrete equivalence and representation-equivalent neural operators (ReNOs)
- Why CNNs and FNOs are not ReNOs (a small aliasing demo follows this list)
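A small numerical illustration of the aliasing argument behind the last bullet, using NumPy; the specific frequency and grid sizes are arbitrary:

```python
# A pointwise nonlinearity (here ReLU) applied to a bandlimited signal
# creates harmonics above the coarse grid's Nyquist limit, so a low
# Fourier mode estimated on a coarse grid disagrees with the fine-grid
# (near-continuous) answer.
import numpy as np

k = 6
g = lambda x: np.maximum(np.sin(2 * np.pi * k * x), 0.0)  # ReLU(sin), not bandlimited

def coeff(n, m):                 # magnitude of mode m on an n-point grid
    x = np.arange(n) / n
    return abs(np.fft.rfft(g(x))[m]) / n

print(coeff(16, 4))   # = 0.125: the harmonic at frequency 12 aliases onto mode 4
print(coeff(256, 4))  # ~ 0:     the continuous signal has no mode-4 content
```

Because the nonlinearity creates frequencies above the coarse grid's Nyquist limit, the same operations extract different low-mode content at different resolutions: this is exactly the failure of continuous-discrete equivalence.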
| 10/30 | Week 7: Operator Learning - ReNO [slides] [recording]
- Continuation of operator learning with ReNOs
- The Convolutional Neural Operator (CNO), which is constructed to be a ReNO (a rough sketch of its alias-aware activation follows this list)
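A rough sketch of the alias-aware activation idea behind CNO, assuming PyTorch; the interpolation scheme and filter here are simplified stand-ins for CNO's carefully designed up/downsampling and filtering, so treat this as a cartoon of the mechanism rather than the reference code:

```python
# Alias-aware activation sketch: upsample, apply the pointwise
# nonlinearity on the finer grid, low-pass filter back to the original
# band, then downsample. This keeps the operation (approximately)
# consistent with a fixed bandlimited function space.
import torch
import torch.nn.functional as F

def alias_aware_relu(x, up=2):            # x: (batch, channels, grid)
    n = x.shape[-1]
    x_up = F.interpolate(x, size=up * n, mode="linear", align_corners=False)
    y = F.relu(x_up)                      # nonlinearity on the finer grid
    y_ft = torch.fft.rfft(y)
    y_ft[..., n // 2 + 1:] = 0            # low-pass: keep the original band
    y = torch.fft.irfft(y_ft, n=up * n)
    return y[..., ::up]                   # downsample to the original grid
```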
| 11/6 |
| 11/13 |
| 11/20 |
| 11/27 |
| 12/4 |
| 12/11 |
| 12/18 |