Accelerating spectral simulations with ML


Hugo Frezat, IPGP, Univ. Paris Cité. 7 June 2024, 11:30, TLR edp
Abstract:

When discretizing partial differential equations, one can choose local (finite differences, volumes, elements) or global (spectral) methods. The most common spectral basis is built on trigonometric polynomials, i.e. Fourier series. It constrains the boundary conditions to be periodic and has been an important tool in physics, used for instance to study theoretical scalings of turbulence. While spectral methods show "exponential convergence" for smooth functions, large direct numerical simulations (DNS) become too expensive, e.g. when reaching very large Reynolds numbers. In practice, it is possible to solve a coarser version of the DNS by removing the largest wavenumbers in spectral space (cut-off) and modeling the transfers at the smallest (sub-grid) scales instead. The definition of such a model has been an open problem for a long time, and classical ones are either too diffusive or unstable. Machine learning emerged as an interesting alternative a few years ago, and it was quickly found that learning a model that performs better on a priori (instantaneous) metrics is possible. We have shown that, in order to obtain stable simulations in a posteriori tests, the temporal dimension must be taken into account during the learning process. This problem has now been largely explored with periodic boundary conditions, but when it comes to spectral methods with orthogonal polynomials and fixed boundaries, new challenges appear.
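As a minimal illustration of the spectral cut-off mentioned in the abstract, the sketch below (function names and parameters are illustrative, not taken from the talk) coarse-grains a periodic 1D field by zeroing all Fourier modes above a cut-off wavenumber, which is the filtering step that separates resolved scales from the sub-grid scales a model must then represent:

```python
import numpy as np

def spectral_cutoff(u, k_max):
    """Coarse-grain a periodic 1D field by zeroing wavenumbers above k_max."""
    u_hat = np.fft.rfft(u)                 # forward transform to spectral space
    k = np.arange(u_hat.size)              # wavenumber index of each rfft mode
    u_hat[k > k_max] = 0.0                 # sharp spectral cut-off filter
    return np.fft.irfft(u_hat, n=u.size)   # back to physical space

# Example: a field with modes k=1 and k=20; a cut-off at k_max=10 keeps only k=1.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(20 * x)
u_bar = spectral_cutoff(u, k_max=10)
# u_bar is (to machine precision) sin(x): the k=20 component has been removed
```

The difference between the true evolution of the filtered field and the evolution computed on the truncated grid is exactly what a sub-grid model, learned or classical, has to account for.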