
Point vortex library

  • Autodifferentiable
  • Tested on CPU and GPU, with jax[cpu] and jax[cuda12]
  • Runs on an NVIDIA cluster and tested locally on different macOS machines

[Image: final solution]

References

This code is a usable subset of the code developed for the paper:

Stochastic fluids with transport noise: Approximating diffusion from data using SVD and ensemble forecast back-propagation

@article{woodfield2024stochastic,
  title={Stochastic fluids with transport noise: Approximating diffusion from data using SVD and ensemble forecast back-propagation},
  author={Woodfield, James},
  journal={arXiv preprint arXiv:2405.00640},
  year={2024}
}

If you use this code or find it helpful, please cite the above paper and the additional references (Chorin; Majda and Bertozzi; Rümelin).

It solves inviscid point vortex models, with or without stochastic noise, and is differentiable. Available under an MIT License.

To run:

Obtain a copy of the code and run it, e.g.:

  • python3 example_0.py

The timesteppers below are differentiable, so solution trajectories can be optimised (ensemble 4D-Var); a minimal sketch of back-propagating through a forecast follows this list.

  • python3 example_1.py
  • python3 example_2.py (note that the entire solution trajectory is saved to local memory)
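
As a rough illustration (not the repository's API; the toy drift, loss, and parameter here are our own), back-propagating a forecast mismatch through a jax.lax.scan time loop to calibrate a scalar parameter looks like this:

```python
# Minimal sketch: ensemble-forecast back-propagation with jax.grad through a
# differentiable time stepper. All names and the toy model are illustrative.
import jax
import jax.numpy as jnp

def forecast(theta, x0, dt=1e-2, steps=50):
    """Toy differentiable solver: rigid rotation plus a theta-scaled extra drift."""
    def step(x, _):
        drift = jnp.stack([-x[..., 1], x[..., 0]], axis=-1)  # rotation
        return x + dt * (drift + theta * x), None
    x_final, _ = jax.lax.scan(step, x0, None, length=steps)
    return x_final

def loss(theta, x0, target):
    return jnp.mean((forecast(theta, x0) - target) ** 2)

key = jax.random.PRNGKey(0)
x0 = jax.random.normal(key, (16, 2))   # ensemble of 16 initial conditions
target = forecast(0.3, x0)             # synthetic "observations"
theta = 0.0
for _ in range(200):                   # plain gradient descent on theta
    theta = theta - 0.1 * jax.grad(loss)(theta, x0, target)
print(theta)                           # approaches the true value 0.3
```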

The timesteppers below return only the final condition.

  • python3 example_3.py
  • python3 example_4.py

Dependencies:

jax, matplotlib, and a few other libraries.

Example zero solves:

$$\boldsymbol{x}(\boldsymbol{X}, t)=\boldsymbol{x}(\boldsymbol{X}, 0)+\int_0^t \boldsymbol{u}(\boldsymbol{x}(\boldsymbol{X}, s), s) d s; \quad \boldsymbol{x}(\boldsymbol{X}, 0)=\boldsymbol{X} $$ where the velocity kernel is the Euler kernel.
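
For orientation, a minimal sketch of evaluating a delta-regularised Biot-Savart velocity at the vortex positions and advancing them with forward Euler in JAX (the function names, regularisation, and simple time loop are illustrative, not the repository's implementation, which uses the high-order kernels and integrator described below):

```python
# Minimal sketch (illustrative only): deterministic point-vortex velocity with a
# delta-regularised Biot-Savart kernel, plus a forward-Euler time loop in JAX.
import jax
import jax.numpy as jnp

def biot_savart_velocity(x, gamma, delta=1e-2):
    """Velocity induced at each vortex position x (N, 2) by circulations gamma (N,)."""
    dx = x[:, None, :] - x[None, :, :]        # pairwise separations (N, N, 2)
    r2 = jnp.sum(dx**2, axis=-1) + delta**2   # regularised squared distance
    # 2D Biot-Savart kernel: K(r) = (1 / 2 pi) * (-r_y, r_x) / |r|^2
    k = jnp.stack([-dx[..., 1], dx[..., 0]], axis=-1) / (2.0 * jnp.pi * r2[..., None])
    return jnp.sum(gamma[None, :, None] * k, axis=1)

def integrate(x0, gamma, dt=1e-2, steps=100):
    """Forward-Euler loop written with lax.scan so it stays jit- and grad-friendly."""
    def step(x, _):
        x_new = x + dt * biot_savart_velocity(x, gamma)
        return x_new, x_new
    _, traj = jax.lax.scan(step, x0, None, length=steps)
    return traj

x0 = jax.random.normal(jax.random.PRNGKey(0), (8, 2))
trajectory = integrate(x0, jnp.ones(8))   # (steps, 8, 2)
```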

Example one solves:

$$\boldsymbol{x}(\boldsymbol{X}, t)=\boldsymbol{x}(\boldsymbol{X}, 0)+\int_0^t \boldsymbol{u}(\boldsymbol{x}(\boldsymbol{X}, s), s) d s+\sum_{p=1}^P \int_0^t \theta_p \boldsymbol{\xi}_p(\boldsymbol{x}(\boldsymbol{X}, s)) \circ d B^p(s) ; \quad \boldsymbol{x}(\boldsymbol{X}, 0)=\boldsymbol{X} $$

Example two solves:

$$\boldsymbol{x}(\boldsymbol{X}, t)=\boldsymbol{x}(\boldsymbol{X}, 0)+\int_0^t \boldsymbol{u}(\boldsymbol{x}(\boldsymbol{X}, s), s) d s+\sum_{p=1}^P \int_0^t \theta_p \boldsymbol{\xi}_p(\boldsymbol{x}(\boldsymbol{X}, s)) \circ d B^p(s) + \int_0^t \nu d W(s) ; \quad \boldsymbol{x}(\boldsymbol{X}, 0)=\boldsymbol{X} $$

Example three solves:

$$\boldsymbol{x}(\boldsymbol{X}, t)=\boldsymbol{x}(\boldsymbol{X}, 0)+\int_0^t \boldsymbol{u}(\boldsymbol{x}(\boldsymbol{X}, s), s) d s+\sum_{p=1}^P \int_0^t \theta_p \boldsymbol{\xi}_p(\boldsymbol{x}(\boldsymbol{X}, s)) \circ d B^p(s) ; \quad \boldsymbol{x}(\boldsymbol{X}, 0)=\boldsymbol{X} $$

Example four solves:

$$\boldsymbol{x}(\boldsymbol{X}, t)=\boldsymbol{x}(\boldsymbol{X}, 0)+\int_0^t \boldsymbol{u}(\boldsymbol{x}(\boldsymbol{X}, s), s) d s+\sum_{p=1}^P \int_0^t \theta_p \boldsymbol{\xi}_p(\boldsymbol{x}(\boldsymbol{X}, s)) \circ d B^p(s) + \int_0^t \nu d W(s) ; \quad \boldsymbol{x}(\boldsymbol{X}, 0)=\boldsymbol{X} $$
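
To show how the terms of the stochastic models above assemble, here is a single Euler-Maruyama-type step with placeholder drift, noise basis, and amplitudes (taken alone, such a step treats the noise in the Itô sense; the Stratonovich interpretation is handled by the multi-stage scheme described under Numerical Approach below):

```python
# Illustrative single Euler-Maruyama-type step for the models above.
# u_fn, xi_fns, theta and nu are placeholders, not the repository's kernels.
import jax
import jax.numpy as jnp

def em_step(key, x, dt, u_fn, xi_fns, theta, nu):
    """x: (N, 2) positions; theta: (P,) amplitudes; xi_fns: list of P basis fields."""
    key_b, key_w = jax.random.split(key)
    dB = jnp.sqrt(dt) * jax.random.normal(key_b, (len(xi_fns),))  # one increment per xi_p
    dW = jnp.sqrt(dt) * jax.random.normal(key_w, x.shape)         # additive nu dW term
    transport = sum(theta[p] * xi_fns[p](x) * dB[p] for p in range(len(xi_fns)))
    return x + dt * u_fn(x) + transport + nu * dW

# Toy fields: u rotates, xi_1 and xi_2 translate in x and y respectively.
u_fn = lambda x: jnp.stack([-x[:, 1], x[:, 0]], axis=-1)
xi_fns = [lambda x: jnp.broadcast_to(jnp.array([1.0, 0.0]), x.shape),
          lambda x: jnp.broadcast_to(jnp.array([0.0, 1.0]), x.shape)]
x = jax.random.normal(jax.random.PRNGKey(0), (8, 2))
x_next = em_step(jax.random.PRNGKey(1), x, 1e-2, u_fn, xi_fns, jnp.array([0.1, 0.1]), 0.05)
```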

Numerical Approach:

Kernel convergence in the deterministic setting:

[7] J. Thomas Beale and Andrew Majda. High order accurate vortex methods with explicit velocity kernels. Journal of Computational Physics, 58(2):188–208, 1985.

In the deterministic setting, this scheme has been shown to have the property that if $\delta=h^q$ for $q \in(0,1)$, the order of convergence to the solution of the Euler equations is $O\left(h^{(2 p+2) q}\right)$; see [7].
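
For orientation (our notation, following the vortex-blob literature rather than the repository's source), the regularised kernel can be written as a mollification of the singular Biot-Savart kernel $K$,

$$K_\delta = K * \varphi_\delta, \qquad \varphi_\delta(\boldsymbol{x}) = \delta^{-2} \varphi(\boldsymbol{x}/\delta),$$

where $\varphi$ is a smooth, rapidly decaying approximation of the identity; the quoted rate corresponds to a cutoff of order $2p+2$ in the sense of [7], with the smoothing length $\delta$ tied to the inter-vortex spacing $h$ through $\delta = h^q$.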

Stochastic scheme temporal consistency:

To deal with the stochastic Stratonovich term, we discretise in time with a stochastic generalisation of the SSP33 scheme of Shu and Osher, in which the forward Euler step is replaced by an Euler-Maruyama step in the Shu-Osher representation. This time stepping has weak order 1 and strong order 0.5, as can be shown by Taylor expansion or obtained as a special case of the analysis of Rümelin [40].
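
A minimal sketch of what such a step can look like (the drift and diffusion here are toy placeholders and the names are ours; the repository's implementation may be organised differently):

```python
# Sketch of a stochastic SSP33 (Shu-Osher) step: each forward-Euler substep is
# replaced by an Euler-Maruyama substep reusing the same Brownian increments.
import jax
import jax.numpy as jnp

def ssp33_em_step(key, x, dt, drift, diffusion):
    """One stochastic SSP33 step, following the description above."""
    dB = jnp.sqrt(dt) * jax.random.normal(key, x.shape)

    def euler_maruyama(y):
        return y + dt * drift(y) + diffusion(y) * dB   # same dB in every stage

    x1 = euler_maruyama(x)
    x2 = 0.75 * x + 0.25 * euler_maruyama(x1)
    return x / 3.0 + (2.0 / 3.0) * euler_maruyama(x2)

# Toy usage: rigid-rotation drift with multiplicative noise.
drift = lambda x: jnp.stack([-x[..., 1], x[..., 0]], axis=-1)
diffusion = lambda x: 0.1 * x
x0 = jax.random.normal(jax.random.PRNGKey(0), (8, 2))
keys = jax.random.split(jax.random.PRNGKey(1), 100)
x_final, _ = jax.lax.scan(lambda x, k: (ssp33_em_step(k, x, 1e-2, drift, diffusion), None), x0, keys)
```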

[40] W. Rümelin. Numerical treatment of stochastic differential equations. SIAM Journal on Numerical Analysis, 19(3):604–613, 1982.

Code History:

This code is based on lecture notes and example code provided by John Methven for the NCAR summer school in Cambridge. It was subsequently extended to different notions of stochastic integration (in PyTorch) for the paper "Lévy areas, Wong-Zakai anomalies in diffusive limits of Deterministic Lagrangian Multi-Time Dynamics", and later ported to JAX, with several performance improvements and a different temporal integrator, for the paper "Approximating diffusion from data using SVD and ensemble forecast back-propagation".
