EPIC-model / epic

Elliptical Parcel-in-Cell model for fluid dynamics
https://epic-model.github.io/epic/
BSD 3-Clause "New" or "Revised" License

Future ideas #97

Open · sjboeing opened this issue 3 years ago

sjboeing commented 3 years ago

List of future ideas:

Efficiency hacks:

Enhancements:

May need looking into:

sjboeing commented 3 years ago

List of future ideas, currently closed as issues (but they can be re-opened):

Efficiency hacks:

#60

#58

May need looking into:

#45

matt-frey commented 3 years ago

Steef via e-mail:

Just sending this for reference, in case we ever want to have a fully flexible version in terms of precision: https://www.cosmo-model.org/content/consortium/generalMeetings/general2013/wg6/20130903_waa_GM13_single-prec.pdf

matt-frey commented 3 years ago

David's idea for setting up simulations: pass gridded fields, which EPIC parses to initialize parcels. Users should not have to worry about parcels at all. The fields can be provided at any resolution, independent of the resolution of the EPIC simulation. Input format: HDF5?
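A minimal sketch of what such a gridded-field input file could look like, using h5py; the dataset names, attribute names, and grid layout are illustrative assumptions, not an agreed EPIC convention:

```python
# Hypothetical gridded-field input for parcel initialization; all names
# and shapes here are assumptions for illustration.
import h5py
import numpy as np

nx, ny, nz = 64, 64, 32  # input-grid resolution, independent of EPIC's grid

with h5py.File("initial_fields.h5", "w") as f:
    f.attrs["origin"] = (0.0, 0.0, 0.0)
    f.attrs["extent"] = (6.28, 6.28, 3.14)  # domain size
    for name in ("x_vorticity", "y_vorticity", "z_vorticity", "buoyancy"):
        f.create_dataset(name, data=np.zeros((nz, ny, nx)))
```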

sjboeing commented 3 years ago

Some hack to make the parcels merge in regions of low vorticity. The following reference may be useful: https://www.sciencedirect.com/science/article/pii/S1270963821003060
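One possible form such a hack could take (purely a sketch; the criterion, threshold, and relaxation factor below are made-up illustration parameters):

```python
# Sketch: relax the minimal-volume merge threshold where the local
# vorticity magnitude is small, so parcels in quiet regions merge sooner.
import numpy as np

def merge_candidates(volume, vorticity_magnitude, v_min,
                     zeta_quiet=1e-3, relax_factor=4.0):
    """Return a boolean mask of parcels flagged for merging."""
    quiet = vorticity_magnitude < zeta_quiet            # low-vorticity regions
    threshold = np.where(quiet, relax_factor * v_min, v_min)
    return volume < threshold
```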

daviddritschel commented 3 years ago

If the input data resolution differs from the resolution in EPIC, should we generate the parcel attributes on the input-data grid but nevertheless run on the (different) EPIC grid? That would seem the easiest option. Otherwise we would need to interpolate the input data to the EPIC grid and then initialise the parcel attributes. Which is best?
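For reference, the second option amounts to a regridding step along these lines (a SciPy-based sketch; function and variable names are illustrative):

```python
# Trilinear regridding of an input field onto the EPIC grid before
# parcel initialisation (illustrative sketch).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def regrid(field, src_axes, dst_axes):
    """Interpolate field sampled on src_axes = (x, y, z) onto dst_axes."""
    interp = RegularGridInterpolator(src_axes, field)
    X, Y, Z = np.meshgrid(*dst_axes, indexing="ij")
    pts = np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=-1)
    return interp(pts).reshape(X.shape)
```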

sjboeing commented 3 years ago

I have been wondering if we could improve the initialisation by estimating how the "remainder" of a grid box changes in each iteration. Since the initialisation only needs to be done once, it may not be so important, though. It would be more crucial if we wanted to use this to "correct" grid2par.
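A rough sketch of what such an iterative remainder correction could look like, assuming par2grid/grid2par stand in for EPIC's interpolation routines (they are not shown here):

```python
# Iteratively nudge parcel values by the gridded remainder; the
# relaxation factor and iteration count are illustrative.
def initialise_with_remainder(parcel_values, target_field,
                              par2grid, grid2par,
                              n_iter=10, relax=0.8):
    for _ in range(n_iter):
        remainder = target_field - par2grid(parcel_values)  # per grid box
        parcel_values = parcel_values + relax * grid2par(remainder)
    return parcel_values
```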

sjboeing commented 3 years ago

Using a single array of scalar properties?
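For illustration, this would be a single structure-of-arrays container, so loops over scalar attributes (par2grid, merging, I/O) need no per-field special cases; the attribute names below are assumptions:

```python
# One contiguous (n_attrs, n_parcels) array instead of a separate array
# per scalar property.
import numpy as np

ATTRS = ("buoyancy", "humidity", "x_vorticity", "y_vorticity", "z_vorticity")
IDX = {name: i for i, name in enumerate(ATTRS)}

n_parcels = 100_000
scalars = np.zeros((len(ATTRS), n_parcels))

scalars[IDX["buoyancy"]] = 0.01   # address one attribute by name
scalars[:, :50] *= 0.5            # or operate on all attributes at once
```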

sjboeing commented 2 years ago

On adaptive minimal volume: I think there may be a more affordable way to do this using all prognostics.

We could calculate weighted variances for all prognostics in each grid box (making use of some of the code we have for nearest-neighbours); see the sketch at the end of this comment. This is much cheaper than computing deviations with respect to an interpolated field, as it avoids a full grid2par or par2grid. We could then compare these against the variances expected from a "smooth" field between the grid-box corners and/or variances at larger scales. This may still seem a bit expensive, but it is probably affordable compared to the other things we are doing.

We then need to translate this into a minimal volume, where we can maybe borrow ideas from the AMR literature (e.g. https://link.springer.com/article/10.1007/s10546-018-0335-9, which is all about downsampling and upsampling).

A starting point could be to plot sigma/sigma_smooth for each grid box and prognostic in a simulation.
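A sketch of the weighted per-grid-box variance, assuming parcels have already been assigned to grid boxes (cell_index) and that parcel volumes act as weights; all names are illustrative:

```python
# Volume-weighted variance of one prognostic per grid box.
import numpy as np

def weighted_cell_variance(values, volumes, cell_index, n_cells):
    w = np.bincount(cell_index, weights=volumes, minlength=n_cells)
    wx = np.bincount(cell_index, weights=volumes * values, minlength=n_cells)
    wx2 = np.bincount(cell_index, weights=volumes * values**2, minlength=n_cells)
    w = np.maximum(w, 1e-300)        # guard empty grid boxes
    mean = wx / w
    return wx2 / w - mean**2         # sigma^2 per grid box

# The ratio sigma / sigma_smooth (smooth reference variance not shown)
# would then be mapped to a minimal parcel volume per grid box.
```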

sjboeing commented 2 years ago

Possibly: explicitly diagnose TKE destruction during merges.
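For a single merge event this could be diagnosed along the following lines (a sketch that assumes parcel velocities are available, e.g. sampled at parcel positions; a volume-weighted merge conserves momentum, so resolved kinetic energy can only decrease):

```python
# Kinetic energy destroyed when a set of parcels merges into one.
import numpy as np

def merge_ke_loss(volumes, velocities):
    """volumes: (n,), velocities: (n, 3) for the parcels being merged."""
    v_tot = volumes.sum()
    u_merged = (volumes[:, None] * velocities).sum(axis=0) / v_tot
    ke_before = 0.5 * (volumes * (velocities ** 2).sum(axis=1)).sum()
    ke_after = 0.5 * v_tot * (u_merged ** 2).sum()
    return ke_before - ke_after      # >= 0 by construction
```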

daviddritschel commented 2 years ago

This would have to be done specially, but what do we expect to find? We know there is energy dissipation, as is expected when mixing is permitted. It may be more useful to diagnose an effective eddy diffusivity based on the rate of decay of energy, which I believe is proportional to the enstrophy (vorticity variance). If we had uniform viscosity, then the dissipation of energy is something like 2\*nu\*enstrophy, where 2\*enstrophy = <zeta^2>, the domain-integrated squared vorticity. I doubt we would get a pretty result that -dE/dt / <zeta^2> is a constant (the eddy diffusivity), but it is easy enough to compute. On this note, we don't currently monitor <zeta^2>. This quantity, when it reaches a peak, largely determines the limits of predictability for a flow (I worked on this years ago with a colleague, Ali Mohebalhojeh).
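The proposed diagnostic is then just a ratio of two time series (a sketch, assuming the domain-integrated energy E(t) and <zeta^2>(t) are written out as diagnostics):

```python
# Effective eddy diffusivity estimate: nu_eff(t) = -(dE/dt) / <zeta^2>.
import numpy as np

def effective_diffusivity(t, energy, zeta_sq):
    dEdt = np.gradient(energy, t)    # finite-difference time derivative
    return -dEdt / zeta_sq
```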

daviddritschel commented 2 years ago

For the current paper, it might be worth computing a PDF of the parcel aspect ratio immediately following merger (this could be done in a time-integrated way over the entire simulation). We would then be able to quantify the probability of a merger resulting in lambda > lambda_max. I would do this only once and not retain it as a feature in EPIC.
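Time-integrated, this amounts to accumulating one histogram over the run (a sketch; the bin range and the lambda_max handling are illustrative):

```python
# Accumulate post-merge aspect ratios over the whole simulation, then
# normalise to a PDF and read off the tail mass above lambda_max.
import numpy as np

bins = np.linspace(1.0, 10.0, 91)    # aspect-ratio bin edges
counts = np.zeros(bins.size - 1)

# after each merge step:
#     counts += np.histogram(post_merge_aspect_ratios, bins=bins)[0]

def pdf(counts, bins):
    return counts / (counts.sum() * np.diff(bins))

def prob_exceeds(counts, bins, lambda_max):
    """P(lambda > lambda_max), using the accumulated counts."""
    tail = bins[:-1] >= lambda_max
    return counts[tail].sum() / counts.sum()
```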