Closed: matt-graham closed this issue 11 months ago
The abstraction in the FabNeso.py plugin appears to work for this, but I haven't been able to get H3LAPD to actually run. I keep getting segfaults with no associated error codes, so I don't know what the problem is. Have you managed to successfully run H3LAPD @matt-graham ?
Just tried running the H3LAPD solver with a couple of the example configurations in the examples/H3LAPD directory, both with and without mpirun, and I also seem to be getting segfaults in all cases 😞.
@dleggat - just in case you also missed this, I realised that I had not spotted the README at https://github.com/ExCALIBUR-NEPTUNE/NESO/tree/main/examples/H3LAPD, which indicates we need to use NekMesh to convert the .geo geometry files in the examples directory to XML files that Nektar++ can use. After installing gmsh and following the instructions to convert the geometry file using the provided script, I've been able to run the examples/H3LAPD/2Din3D-hw_fluid-only example with my locally built NESO without any segfaults.
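For reference, the two-step conversion could be wrapped in a small helper that builds the command pair. This is a sketch only: the `-3` and `-o` flags for gmsh and the positional `NekMesh in.msh out.xml` form are my reading of the standard Nektar++ workflow, not copied from the repository's provided script, so treat them as assumptions.

```python
from pathlib import Path

def conversion_commands(geo_file):
    """Build the commands to turn a .geo geometry into Nektar++ XML.

    Sketch under assumptions: gmsh's `-3` (generate 3D mesh) and `-o`
    (output file) flags, and NekMesh's positional in/out arguments, are
    taken from the usual Nektar++ workflow, not the NESO script itself.
    """
    geo = Path(geo_file)
    msh = geo.with_suffix(".msh")  # intermediate gmsh mesh
    xml = geo.with_suffix(".xml")  # Nektar++ session geometry
    return [
        ["gmsh", "-3", str(geo), "-o", str(msh)],  # mesh the geometry
        ["NekMesh", str(msh), str(xml)],           # convert mesh to XML
    ]

# Hypothetical path for illustration; use the actual .geo file in the repo.
cmds = conversion_commands("cube.geo")
```

Each returned list can then be passed to `subprocess.run` (or the FabNeso job template) in order, stopping if the first step fails.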
Owen Parry indicated in an email that a reasonable choice of initial parameters to expose for doing sweeps over / calibrating would be:

- HW_alpha over the range 0.1 – 2.0
- HW_kappa over the range 0.1 – 3.5
- Te_eV over the range 2.0 – 20.0

He also indicated that some configurations in this parameter space may cause the solver to fail, so we may need some way of handling this.
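Given that some points in this box may crash the solver, a sweep driver probably wants to treat every run as fallible rather than assuming success. A minimal sketch of that shape (the solver command is a hypothetical stand-in, not NESO's real invocation):

```python
import random
import subprocess

# Parameter box suggested above; values are the ranges from Owen Parry's email.
PARAM_RANGES = {
    "HW_alpha": (0.1, 2.0),
    "HW_kappa": (0.1, 3.5),
    "Te_eV": (2.0, 20.0),
}

def sample_params(rng):
    """Draw one point uniformly from the parameter box."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def run_case(params, cmd):
    """Run one solver case; report failure instead of raising.

    A segfault shows up as a negative return code on POSIX (death by
    signal), so checking `returncode == 0` catches both ordinary errors
    and crashes. `cmd` would be the real NESO/H3LAPD invocation built
    from `params`; here it is left generic.
    """
    try:
        result = subprocess.run(cmd, capture_output=True, timeout=3600)
    except subprocess.TimeoutExpired:
        return False
    return result.returncode == 0

rng = random.Random(0)
params = sample_params(rng)
ok = run_case(params, ["true"])  # "true" is a stand-in for the solver command
```

Failed points can then simply be recorded and skipped (or flagged for the calibration step) rather than aborting the whole sweep.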
As quantities of interest, the energy and enstrophy are written to a CSV file as described here.
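Pulling those quantities of interest back out for a sweep could look like the following; note the column names `energy` and `enstrophy` are assumptions about the CSV header, which should be checked against what the solver actually writes:

```python
import csv
import io

def read_qois(csv_text):
    """Parse energy/enstrophy pairs from the solver's diagnostics CSV.

    Assumption: the file has a header row containing columns named
    "energy" and "enstrophy"; adjust the keys to match the real output.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    return [(float(r["energy"]), float(r["enstrophy"])) for r in rows]

# Illustrative contents only, not real solver output.
sample = "step,energy,enstrophy\n0,1.5,0.2\n1,1.4,0.25\n"
qois = read_qois(sample)  # → [(1.5, 0.2), (1.4, 0.25)]
```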
Ah hah right - I missed the README and went straight to the run_cmd_template.txt - woops!
Closing this as completed by #7
From discussions with the NESO team, it seems H3LAPD should be a more computationally challenging example, with both relatively complex continuum / Nektar++ and particle-in-cell components.