Scientific Machine Learning: Neural Networks and Neural Differential Equations
This is the GitHub repository for the MATH50002 Group Research Project (M2R) by Group 29.
You can find all relevant code, figures, and LaTeX files here.
Group Members:
- Jiaru (Eric) Li
- James Tay
- Xinyan Wang
- Jiankuan Liu
- Tianshi Liu
Group Supervisor:
The outline of our report, along with the contributors for each section, is detailed below.
1. Introduction (Jiaru, James)
2. Neural Networks
2.1 History (Jiaru)
2.2 Concepts
- Structure (Jiaru)
- Activation Function (Jiaru)
- Loss Function (James)
2.3 Optimisation Techniques (James)
- Gradient Descent
- Automatic Differentiation and Discrete Backpropagation
2.4 Example: Regression (Jiaru)
2.5 Example: Classification (Jiaru)
3. Neural ODEs
3.1 Motivation (Xinyan)
3.2 Comparing Neural ODEs with ResNet (Xinyan)
3.3 Various Neural ODE Models (Xinyan)
- Augmented Neural ODEs
- Neural ODEs with Scientific Machine Learning
3.4 Solving ODEs Numerically (James)
3.5 Backpropagation in Neural ODEs (James, Xinyan)
- Discretise-then-Optimise (DO)
- Optimise-then-Discretise (OD)
3.6 Adjoint Sensitivity Method (Xinyan)
- Continuous Backpropagation
- Gradients with respect to $\theta$ and $t$
4. Application: Digit Classifier with Neural ODEs
4.1 Implementation (Jiankuan, Jiaru)
4.2 Comparison (Jiankuan, Xinyan)
5. Extension: Neural CDEs (Tianshi)
5.1 Motivation
5.2 Controlled Differential Equations
5.3 Neural CDEs
5.4 Solving Neural CDEs
5.5 Application: Time Series Modelling
- Cubic Splines
- Implementation
6. Conclusion (James)
Acknowledgements (James)
References
Appendices
A Proof of Convergence of Gradient Descent Algorithm (James)
B Comparison of ODE Solvers (James)
C A Simple Example of the Adjoint Method (Xinyan)
D Data Preprocessing of Financial Time Series (Tianshi)