Closed. rui-coelho closed this issue 11 months ago.
It could also be that something's wrong in mom1d = a5.data.active.getdist_moments(dist, "density").
Do you also get zeros with:
dist=a5.data.SDNBI.getdist('5d')
mom2d = a5.data.active.getdist_moments(dist, "density")
mom2d.ordinate("density", toravg=True);
mom2d.plot("density")
and is np.sum(a5.data.SDNBI.getdist("rho5d").distribution().ravel()) == 0 ?
In [252]: np.sum(a5.data.SDNBI.getdist("rho5d").distribution().ravel()) == 0
Out[252]: array(True)
)-:
Is this only for the "large" distributions? I wonder if there is some place that I missed in the fix...
In this case it was a "small/medium" case, since I used grids of size 60 except for NPHI, which is 1. When I was struggling I had NPHI at 60 as well, but decided to go with just 1 since the plasma is effectively axisymmetric in the case of interest. Basically the same settings that Pietro used in ASCOT4 a couple of years back. I currently have a higher-resolution case running (750k markers and 100+ points on all grids except NPHI, which has 1). It takes roughly 12.6 h to run at the GW.
And the 200k-marker, grid-60 case ends up as a ~360 MB HDF5 file, whereas the bbnbi5 output is just ~154 MB. So data was definitely written and the markers seem OK (even if for 738 markers I got "738 markers were aborted with an error message: Input evaluation failed (marker could be outside input data grid)").
The high-resolution case (~100 grid points and 750k markers) also ended up with:
In [258]: np.sum(a5.data.SDNBI.getdist("5d").distribution().ravel())
Out[258]: unyt_quantity(0., 'particles*s/(degree*e*kg**2*m**4)')
I will try the tutorials to check if I get the same result...
In the tutorial "Dealing with distributions" I did manage to get a 5D distribution not packed with 0's. I then tried a new run of bbnbi5 with a 30-point grid and 1k markers, and ascot5 with much lower mileage and CPU time, and got the same list packed with 0's... so it's clearly not related to the mesh size or anything of the sort. Here are the options used:
BBNBI5
opt = Opt.get_default()
opt.update({
    "ENABLE_DIST_5D": 1,
    "DIST_MIN_R": min_R, "DIST_MAX_R": max_R, "DIST_NBIN_R": 30,
    "DIST_MIN_PHI": 0, "DIST_MAX_PHI": 360, "DIST_NBIN_PHI": 1,
    "DIST_MIN_Z": min_Z, "DIST_MAX_Z": max_Z, "DIST_NBIN_Z": 30,
    "DIST_MIN_PPA": -max_p, "DIST_MAX_PPA": max_p, "DIST_NBIN_PPA": 30,
    "DIST_MIN_PPE": 0, "DIST_MAX_PPE": max_p, "DIST_NBIN_PPE": 30,
    "DIST_MIN_TIME": 0, "DIST_MAX_TIME": 1.0, "DIST_NBIN_TIME": 1,
    "DIST_MIN_CHARGE": 0.5, "DIST_MAX_CHARGE": 1.5, "DIST_NBIN_CHARGE": 1
})
a5.data.create_input("opt", **opt, desc="BEAMDEPOSITION")
ASCOT5
a5 = Ascot(ascot_filename)
ids = a5.data.active.getstate("ids", endcond="IONIZED")
mrk = a5.data.active.getstate_markers("gc", ids=ids)
a5.data.create_input('gc',**mrk,desc="PNBI4ASCOT",activate=True)
a5.data.create_input('E_TC',desc="DUMMY_EF",activate=True)
a5.data.create_input('Boozer',desc="DUMMY_BOOZER",activate=True)
a5.data.create_input('MHD_STAT',desc="DUMMY_MHD",activate=True)
#-----Update options for ASCOT simulation-------
#
opt = a5.data.options.active.read()
opt.update({
    "ENABLE_DIST_RHO5D": 1,
    "DIST_MIN_RHO": 0, "DIST_MAX_RHO": 1, "DIST_NBIN_RHO": 30,
    "DIST_MIN_THETA": 0, "DIST_MAX_THETA": 360, "DIST_NBIN_THETA": 20,
    # Simulation mode
    "SIM_MODE": 2, "ENABLE_ADAPTIVE": 1,
    # Setting max mileage above the slowing-down time is a good safeguard to ensure
    # the simulation finishes even with faulty inputs. Same with the CPU time limit.
    "ENDCOND_SIMTIMELIM": 1, "ENDCOND_MAX_MILEAGE": 0.01,
    "ENDCOND_CPUTIMELIM": 1, "ENDCOND_MAX_CPUTIME": 0.1e3,
    # The energy limit which separates a fast ion from the thermal bulk is not well
    # defined, but we usually use Emin = 2 x Tion as the limit. Setting a fixed
    # minimum energy as well is advised since the plasma temperature at the pedestal
    # can be low.
    "ENDCOND_ENERGYLIM": 1, "ENDCOND_MIN_ENERGY": 2.0e3, "ENDCOND_MIN_THERMAL": 2.0,
    # Physics
    "ENABLE_ORBIT_FOLLOWING": 1, "ENABLE_COULOMB_COLLISIONS": 1
})
a5.data.create_input("opt", **opt, desc="ASCOT_OPTIONS",activate=True)
Continuing my detective work, I now used a prescribed set of markers from the distributions tutorial, i.e.
from a5py.ascot5io.marker import Marker
nmrk = 5000
mrk = Marker.generate("gc", n=nmrk, species="alpha")
mrk["energy"][:] = 85.0e3
mrk["pitch"][:] = 0.99 - 1.98 * np.random.rand(nmrk,)
mrk["r"][:] = 3.0 + 0.8 * np.random.rand(nmrk,)
a5.data.create_input("gc", **mrk)
merging it with the rest of the dataset instead of my bbnbi5 results. The SIM settings for CPU and MILEAGE were minimal, to run for 80 s only, and I got "4086 markers had end condition Sim time limit" and "914 markers had end condition Thermalization".
Still.......
In [6]: np.sum(a5.data.active.getdist("5d").distribution().ravel())
Out[6]: unyt_quantity(0., 'particles*s/(degree*e*kg**2*m**4)')
/&%$#"#$%......there must be something stupidly wrong
I now also merged the plasma 1D and the rudimentary 2D wall box... same &%$#"!"#$....
a5.data.create_input("plasma flat", density=1e21,activate=True)
a5.data.create_input("wall_2D",activate=True)
from a5py.ascot5io.marker import Marker
nmrk = 500
mrk = Marker.generate("gc", n=nmrk, species="alpha")
mrk["energy"][:] = 80.0e3
mrk["pitch"][:] = 0.99 - 1.98 * np.random.rand(nmrk,)
mrk["r"][:] = 2.5 + 1 * np.random.rand(nmrk,)
a5.data.create_input("gc", **mrk,activate=True)
The only thing left to change is the equilibrium... which should be the least of the problems...
And with your a5.data.create_input("bfield analytical iter circular", activate=True) it works...
This is very, very suspicious... the EQDSK I imported is a perfectly valid one as far as the mapping of the plasma 1D to (R,Z) is concerned; otherwise I would get no decent beam deposition with BBNBI5, nor marker evolution towards thermalisation with ASCOT5. There is likely some limitation/constraint on generating the 5D distribution and its variants...
Revising my steps: I need to be consistent with "DIST_MIN_CHARGE":0.5, "DIST_MAX_CHARGE":1.5 only if using deuterium, and I was using your alpha markers, which is why I couldn't get any distribution. Changing the range to 1.5, 2.5 I could get non-zero values. This was with your mock-up markers.
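The mismatch above boils down to a simple membership test: the species' charge number has to fall inside the [DIST_MIN_CHARGE, DIST_MAX_CHARGE) window, or every marker lands outside the charge grid and nothing is binned. A minimal sketch (the in_charge_range helper and the species table are illustrative, not part of a5py; charge numbers assume deuterium with Z=1 and He-4 alphas with Z=2):

```python
# Charge numbers of the species involved, in units of the elementary charge
CHARGE_Z = {"deuterium": 1, "alpha": 2}

def in_charge_range(species, min_q, max_q):
    """True when the species' charge falls inside the [min_q, max_q) bin range."""
    return min_q <= CHARGE_Z[species] < max_q

# (0.5, 1.5) brackets deuterium (Z=1) but misses alphas (Z=2),
# so an alpha run bins nothing and the distribution is all zeros.
assert in_charge_range("deuterium", 0.5, 1.5)
assert not in_charge_range("alpha", 0.5, 1.5)
# Shifting the range to (1.5, 2.5) captures the alphas instead.
assert in_charge_range("alpha", 1.5, 2.5)
```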
OK, I tried many options to see what the heck was going on. This is what triggers the difference between the "packed with 0's" and the "regular" distributions: if I do NOT include the options DIST_MIN_CHARGE and DIST_MAX_CHARGE (0.5 and 1.5) in the settings for ascot5, it returns 0's in the distribution. This is all a bit absurd, for two reasons:
1. Having set "DIST_MIN_CHARGE":0.5, "DIST_MAX_CHARGE":1.5 in the BBNBI5 options, when accessing these same options in the object it actually returns 0 and 1 respectively. If the type is INT, then the user should get clearer instructions on how/what to set when we wish to feature energetic particles of charge Z=1 (and what to do for another Z).
2. The options object will update options that were previously set and written in the object of the HDF5 file that is read. All options are actually set by the constructor, so effectively we only update the options (and do not add new ones). Therefore, it is absurd that I should need to update "DIST_MIN_CHARGE":0.5, "DIST_MAX_CHARGE":1.5 again, since these values should already have been set (and remained as such).
The solution I found: rather than setting (updating) "DIST_MIN_CHARGE":0.5, "DIST_MAX_CHARGE":1.5 AGAIN in the options of the ASCOT5 object, I managed to trick the "stupid code" by instead using "DIST_MIN_CHARGE":1, "DIST_MAX_CHARGE":2 in the options of BBNBI5. These were then inherited by ASCOT5 as expected, and there was NO need to update them in the options of the object.
A request I make: the tutorials all contain the 0.5 and 1.5 values for the MIN and MAX charges. Unless there is a valid reason to keep them (they are very misleading, as I showed above), please check whether my solution is actually the correct way to proceed, and in that case change the tutorials accordingly.
Thanks for the detective work! Using fractional values "DIST_MIN_CHARGE":0.5, "DIST_MAX_CHARGE":1.5 should be supported. The reason they weren't is that those parameters were erroneously typecast as integers before they were written to the HDF5 file. So in the code the interval would be open, (1,2), and no physical particle could fit there.
I'll commit the fix and close this issue once it has been merged.
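A minimal sketch of how that typecast could zero out the distribution, assuming C-style truncation of the limits (0.5 -> 0, 1.5 -> 1) and the floor-based binning quoted later in this thread; charge_bin is an illustrative helper, not the actual ASCOT5 code:

```python
import math

def charge_bin(charge, min_q, max_q, n_q):
    """Bin index following the floor-based formula from dist_5d.c quoted below.
    Returns None when the charge falls outside the grid."""
    i = math.floor((charge - min_q) / ((max_q - min_q) / n_q))
    return i if 0 <= i < n_q else None

# Intended settings: one charge bin centred on Z = 1.
assert charge_bin(1.0, 0.5, 1.5, 1) == 0

# After truncating the limits to integers (0.5 -> 0, 1.5 -> 1), a Z = 1
# particle sits exactly on the upper edge of the semi-open bin and is
# rejected, so every marker is dropped and the distribution is all zeros.
assert charge_bin(1.0, int(0.5), int(1.5), 1) is None
```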
So should one expect, in the future release, to get the same result using either (0.5 and 1.5) or (1 and 2) as limits? Right now it does work if I use 1 and 2, since the lower value is retained. Will this change?
Yeah, (1,2) should work even after the fix, but note that this is not really supported. The reason is that in the code, the correct bin is determined like this:
i_q[i] = floor((p_f->charge[i]/CONST_E - dist->min_q) / ((dist->max_q - dist->min_q) / dist->n_q));
So while in principle the interval is semi-open, [1,2), floating point arithmetic could lead to p_f->charge[i]/CONST_E being either X.000...1 or X.999...9. I guess this could be platform-dependent or depend on the test particle charge. This is why I'd advise treating (min_q, max_q) as an open interval.
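The boundary fragility can be sketched in Python with the same floor-based formula (charge_bin is an illustrative stand-in for the C code, with the charge already expressed in units of e):

```python
import math

def charge_bin(charge, min_q, max_q, n_q):
    """Floor-based bin index, as in the C snippet above; None means off-grid."""
    i = math.floor((charge - min_q) / ((max_q - min_q) / n_q))
    return i if 0 <= i < n_q else None

# With half-integer limits, Z = 1 sits mid-bin: safe against rounding noise.
assert charge_bin(1.0, 0.5, 1.5, 1) == 0
assert charge_bin(1.0 - 1e-12, 0.5, 1.5, 1) == 0

# With integer limits, Z = 1 sits exactly on the lower edge: if
# p_f->charge[i]/CONST_E comes out as 0.999...9 instead of 1.000...0,
# the marker falls below min_q and is silently lost.
assert charge_bin(1.0, 1, 2, 1) == 0
assert charge_bin(1.0 - 1e-12, 1, 2, 1) is None
```

This is why the maintainer advises keeping the particle charge strictly inside (min_q, max_q) rather than on a bin edge.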
Fair enough. Once fixed, please mind updating the documentation to highlight this "feature" with simple examples, e.g. just D, just alphas, and multi-species.
Using the repo branch hotfix/48-bbnbi-beam-divergence-and-verification, I ran both BBNBI5 and then ASCOT5. The BBNBI5 run, as I mentioned before, is now free of the beam "bending", but I'm facing something different: dist%5d and dist%rho5d just have zeros inside.