I am interested in stochastic numerics, differential equations and their applications to machine learning.
Since my time as a graduate student, I have particularly enjoyed the numerical analysis of Brownian motion and Stochastic Differential Equations (SDEs). My research in this area has focused on developing numerical methods and applying them to prominent SDEs in data science, such as Langevin dynamics and Neural SDEs.
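As an illustrative sketch of the kind of SDE simulation involved (not code from any of the papers below), the snippet here applies the standard Euler–Maruyama method to overdamped Langevin dynamics dX_t = −∇U(X_t) dt + √2 dW_t; the potential U(x) = |x|²/2 is a hypothetical choice, giving a standard Gaussian stationary distribution.

```python
import numpy as np

def euler_maruyama_langevin(grad_U, x0, step_size, n_steps, rng):
    """Simulate overdamped Langevin dynamics
        dX_t = -grad U(X_t) dt + sqrt(2) dW_t
    with the Euler-Maruyama discretisation."""
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        # Brownian increment over one step has variance step_size
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_U(x) + np.sqrt(2.0 * step_size) * noise
        path.append(x.copy())
    return np.array(path)

# Hypothetical example: quadratic potential U(x) = |x|^2 / 2,
# whose Langevin diffusion is an Ornstein-Uhlenbeck process.
rng = np.random.default_rng(0)
path = euler_maruyama_langevin(lambda x: x, np.zeros(2), 0.01, 10_000, rng)
```

Higher-order schemes for Langevin dynamics (and differentiable solvers for Neural SDEs) refine this basic recursion, for example by using additional information about the Brownian path on each step.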
Alongside my interest in SDEs, I have worked on machine learning projects in collaboration with members of the DataSig team, where we introduced new differential equation models and algorithms, inspired by rough path theory, for multivariate time series problems.
James Foster Preprint (contains new theoretical results; numerical experiments are ongoing)
Andraž Jelinčič, Jiajie Tao, William F. Turner, Thomas Cass, James Foster and Hao Ni Preprint [Slides]
James Foster, Gonçalo dos Reis and Calum Strange Preprint [Slides]
James Foster and Karen Habermann Combinatorics, Probability and Computing (2023)
James Foster, Terry Lyons and Vlad Margarint Journal of Statistical Physics (2022)
James Foster, Terry Lyons and Harald Oberhauser Preprint [Slides]
Patrick Kidger, James Foster, Xuechen Li and Terry Lyons Neural Information Processing Systems (2021)
Cristopher Salvi, Thomas Cass, James Foster, Terry Lyons and Weixin Yang SIAM Journal on Mathematics of Data Science (2021)
Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser and Terry Lyons International Conference on Machine Learning (2021) [Slides]
James Morrill, Cristopher Salvi, Patrick Kidger, James Foster and Terry Lyons International Conference on Machine Learning (2021) [Slides]
Patrick Kidger, James Morrill, James Foster and Terry Lyons Neural Information Processing Systems, Spotlight (2020) [Slides]
MA20222: Numerical Analysis (Semester 1)
MA50251: Applied Stochastic Differential Equations (Semester 2)