TORAX: Tokamak transport simulation in JAX
https://torax.readthedocs.io

What is TORAX?

TORAX is a differentiable tokamak core transport simulator aimed at fast and accurate forward modelling, pulse design, trajectory optimization, and controller design workflows. TORAX is written in Python using JAX, chosen for its just-in-time compilation, automatic differentiation, and the ease with which Python code couples to ML surrogates.
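
As a minimal, purely illustrative sketch (not TORAX code), the snippet below shows the two JAX properties TORAX builds on: functions can be JIT-compiled for speed and differentiated automatically. The names loss, grad_loss, and x are placeholders.

import jax
import jax.numpy as jnp

# Toy scalar objective, standing in for e.g. a mismatch between a simulated
# and a target profile.
@jax.jit
def loss(x):
  return jnp.sum((jnp.sin(x) - 0.5) ** 2)

# Automatic differentiation of the compiled function.
grad_loss = jax.jit(jax.grad(loss))

x = jnp.linspace(0.0, 1.0, 8)
print(loss(x))       # scalar objective value
print(grad_loss(x))  # gradient with respect to x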

For more comprehensive documentation, see our readthedocs page.

TORAX now has the following physics feature set:

Additional heating and current drive sources can be provided by prescribed formulas, user-provided analytical models, or user-provided prescribed data.
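For illustration, a user-provided analytical heating model might be a simple function of normalized radius. The sketch below is an assumption only: the function name, signature, and the way such a model would be registered with TORAX are not the actual TORAX interface (see the configuration documentation for that).

import numpy as np

# Hypothetical sketch of a user-provided analytical heating profile: a Gaussian
# deposition centred at rho = 0.3, normalized so that it integrates to p_total.
# All names and the calling convention here are illustrative assumptions, not
# the TORAX source API.
def gaussian_heat_profile(rho: np.ndarray, p_total: float = 10e6,
                          rho_0: float = 0.3, width: float = 0.1) -> np.ndarray:
  profile = np.exp(-((rho - rho_0) ** 2) / (2 * width ** 2))
  return p_total * profile / np.trapz(profile, rho)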

Model implementation was verified through direct comparison of simulation outputs to the RAPTOR [Felici et al, Plasma Phys. Control. Fusion 2012] tokamak transport simulator.

This is not an officially supported Google product.

Feature roadmap

Short term development plans include:

Longer term desired features include:

Contributions in line with the roadmap are welcome. In particular, TORAX is envisaged as a natural framework for coupling various ML surrogates of physics models. These could include surrogates for turbulent transport, neoclassical transport, heat and particle sources, line radiation, pedestal physics, core-edge integration, and MHD, among others.

Installation guide

Requirements

Install Python 3.10 or greater.

Make sure that tkinter is installed:

sudo apt-get install python3-tk
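
Optionally, you can confirm the install from the Python interpreter you intend to use:

# Verifies that tkinter is importable.
import tkinter
print(tkinter.TkVersion)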

How to install

Install virtualenv (if not already installed):

pip install --upgrade pip
pip install virtualenv

Create a code directory where you will install the virtual env and other TORAX dependencies.

mkdir /path/to/torax_dir && cd "$_"

Replace /path/to/torax_dir with a path of your choice.

Create a TORAX virtual env:

python3 -m venv toraxvenv

Activate the virtual env:

source toraxvenv/bin/activate

Download QLKNN dependencies:

git clone https://gitlab.com/qualikiz-group/qlknn-hyper.git
export TORAX_QLKNN_MODEL_PATH="$PWD"/qlknn-hyper

It is recommended to automate the environment variable export. For example, if using bash, run:

echo export TORAX_QLKNN_MODEL_PATH="$PWD"/qlknn-hyper >> ~/.bashrc

The above command only needs to be run once on a given system.

Download the TORAX codebase via HTTPS:

git clone https://github.com/google-deepmind/torax.git

or via SSH (ensure that you have the appropriate SSH key uploaded to GitHub):

git clone git@github.com:google-deepmind/torax.git

Enter the TORAX directory and pip install the dependencies.

cd torax; pip install -e .

If you also want the dev dependencies (useful for running pytest and for lint checking with pyink), install with the [dev] extra:

cd torax; pip install -e .[dev]

Optional: Install additional GPU support for JAX if your machine has a GPU: https://jax.readthedocs.io/en/latest/installation.html#supported-platforms
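
After installation, you can check which backend JAX picked up:

# Prints the backend and devices JAX can see; expect a GPU device if GPU
# support is installed, otherwise CPU.
import jax
print(jax.default_backend())  # 'gpu' or 'cpu'
print(jax.devices())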

Running an example

The following command will run TORAX using the default configuration file examples/basic_config.py.

python3 run_simulation_main.py \
   --config='torax.examples.basic_config' --log_progress

To run more involved, ITER-inspired simulations, run:

python3 run_simulation_main.py \
   --config='torax.examples.iterhybrid_rampup' --log_progress

and

python3 run_simulation_main.py \
   --config='torax.examples.iterhybrid_predictor_corrector' --log_progress

Additional configuration is provided through environment variables and through flags appended to the run command:

Set environment variables

Path to the QuaLiKiz-neural-network parameters. Note: if installation instructions above were followed, this may already be set.

$ export TORAX_QLKNN_MODEL_PATH="<myqlknnmodelpath>"

Path to the geometry file directory. This prefixes the path and filename provided in the geometry_file geometry constructor argument in the run config file. If not set, TORAX_GEOMETRY_DIR defaults to the relative path torax/data/third_party/geo.

$ export TORAX_GEOMETRY_DIR="<mygeodir>"

If true, error checking is enabled in internal routines. Used for debugging. Default is false since it is incompatible with the persistent compilation cache.

$ export TORAX_ERRORS_ENABLED=<True/False>

If false, JAX does not compile internal TORAX functions. Used for debugging. Default is true.

$ export TORAX_COMPILATION_ENABLED=<True/False>

The following enables the JAX persistent compilation cache, causing JAX to store compiled programs on the filesystem and reducing recompilation time in some cases:

$ export JAX_COMPILATION_CACHE_DIR=<path of your choice, such as ~/jax_cache>
$ export JAX_PERSISTENT_CACHE_MIN_ENTRY_SIZE_BYTES=-1
$ export JAX_PERSISTENT_CACHE_MIN_COMPILE_TIME_SECS=0.0
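
The same settings can also be applied in-process, assuming a recent JAX version that exposes these configuration options; setting the environment variables above works equally well:

# Equivalent in-process configuration of the JAX persistent compilation cache.
import jax

jax.config.update("jax_compilation_cache_dir", "/path/of/your/choice")
jax.config.update("jax_persistent_cache_min_entry_size_bytes", -1)
jax.config.update("jax_persistent_cache_min_compile_time_secs", 0.0)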

Set flags

Output simulation time, dt, and number of stepper iterations (dt backtracking with nonlinear solver) carried out at each timestep.

python3 run_simulation_main.py \
   --config='torax.examples.iterhybrid_predictor_corrector' \
   --log_progress

Live plotting of simulation state and derived quantities.

python3 run_simulation_main.py \
   --config='torax.examples.iterhybrid_predictor_corrector' \
   --plot_progress

Combination of the above.

python3 run_simulation_main.py \
   --config='torax.examples.iterhybrid_predictor_corrector' \
   --log_progress --plot_progress

Post-simulation

Once complete, the time history of a simulation state and derived quantities is written to state_history.nc. The output path is written to stdout.
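
As a minimal sketch, the NetCDF output can be inspected with xarray; the path below is a placeholder for the path printed to stdout, and the variable names inside the dataset depend on the TORAX output schema:

# Opens the simulation output and lists its dimensions and data variables.
import xarray as xr

ds = xr.open_dataset("/path/to/output/state_history.nc")
print(ds)
print(list(ds.data_vars))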

To take advantage of the in-memory (non-persistent) cache, the process does not end upon simulation termination. It is possible to modify the runtime_params, toggle the log_progress and plot_progress flags, and rerun the simulation. Only the following modifications will then trigger a recompilation:

Cleaning up

You can get out of the Python virtual env by deactivating it:

deactivate

Simulation tutorials

Under construction

Citing TORAX

A TORAX paper is available on arXiv. Cite this paper to cite TORAX:

@article{torax2024arxiv,
  title={{TORAX: A Fast and Differentiable Tokamak Transport Simulator in JAX}},
  author={Citrin, Jonathan and Goodfellow, Ian and Raju, Akhil and Chen, Jeremy and Degrave, Jonas and Donner, Craig and Felici, Federico and Hamel, Philippe and Huber, Andrea and Nikulin, Dmitry and Pfau, David and Tracey, Brendan and Riedmiller, Martin and Kohli, Pushmeet},
  journal={arXiv preprint arXiv:2406.06718},
  year={2024}
}