Have you tried to optimize the grid? Or is it that big after optimization?
it's the size of the ekos (which actually is not clear from the command line)
> it's the size of the ekos (which actually is not clear from the command line)

But if that is the full EKO for `bin00`, remember that there is the dynamic scale inside the bin. So, with the new output each individual operator (for each $\mu^2$ value) might be reasonable enough :)
If you tell me how I should change these lines, I can try it!
> But if that is the full EKO for `bin00`, remember that there is the dynamic scale inside the bin. So, with the new output each individual operator (for each $\mu^2$ value) might be reasonable enough :)

exactly! they are ~2MB each - but this improved data structure can not be exposed to `Grid::evolve`
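To make concrete what I mean by the new layout (purely illustrative, not eko's actual types or field names): one modest operator per $\mu^2$ value instead of a single big tensor.

```rust
use ndarray::Array4;
use std::collections::BTreeMap;

/// Illustration only: operators keyed by the target scale mu^2
/// (stored via `to_bits()` so the key is orderable and comparable).
struct EvolutionOperators {
    /// mu^2 -> operator with shape (pids, x, pids, x), ~2 MB each.
    operators: BTreeMap<u64, Array4<f64>>,
}

impl EvolutionOperators {
    /// Fetch the single operator needed for one mu^2 value, if present.
    fn operator(&self, mu2: f64) -> Option<&Array4<f64>> {
        self.operators.get(&mu2.to_bits())
    }
}
```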
> If you tell me how I should change these lines, I can try it!

actually, how big is that eko? The problem I can see is passing around 500MB of data (and at some point we may have double that, because Rust will copy from Python, I guess?)
> exactly! they are ~2MB each - but this improved data structure can not be exposed to `Grid::evolve`

Why not? Maybe not yet.
The final step of the new format was providing a Rust API to load it, https://github.com/NNPDF/eko/issues/97, and it shouldn't take too much (i.e. I believe I can take care of it in a short time).
Since @cschwan already successfully loaded a NumPy array, I only need to unpack the tarball and do the same for the relevant operator.
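A minimal sketch of what I have in mind, assuming the operators end up as plain `.npy` members inside the tarball (the crates `tar`, `tempfile` and `ndarray-npy`, and the `member` path, are my assumptions, not the actual eko layout):

```rust
use ndarray::Array4;
use ndarray_npy::read_npy;
use std::error::Error;
use std::fs::File;
use std::path::Path;
use tar::Archive;

/// Unpack the EKO tarball and load one operator as an ndarray.
fn load_operator(eko_tar: &Path, member: &str) -> Result<Array4<f64>, Box<dyn Error>> {
    // Unpack into a scratch directory; a streaming reader would avoid
    // touching the disk, but this keeps the sketch short.
    let mut archive = Archive::new(File::open(eko_tar)?);
    let tmp = tempfile::tempdir()?;
    archive.unpack(tmp.path())?;

    // Read the requested operator, e.g. the one for a single mu^2 value.
    let operator: Array4<f64> = read_npy(tmp.path().join(member))?;
    Ok(operator)
}
```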
> exactly! they are ~2MB each - but this improved data structure can not be exposed to `Grid::evolve`

just to say that at some point we were actually thinking about exposing this data structure (https://github.com/NNPDF/eko/issues/97), but this is for sure mid-/long-term and not a short-term solution for https://github.com/NNPDF/pineko/issues/62
Actually, I consider that to be the short-term one: the bottleneck is the computation of the EKO in the new format, not reading it :)
Could you provide me with a jet-EKO? I'd really like to test it.
You can also try it yourself now, @felixhekhorn: in #184 I added a new subcommand `evolve` which should understand the EKO format:

```
pineappl evolve grid.pineappl.lz4 eko.tar fktable.lz4 PDFset
```

which evolves `grid.pineappl.lz4` using `eko.tar` to produce `fktable.lz4` and performs a check using `PDFset`.
> Could you provide me with a jet-EKO? I'd really like to test it.

Sorry for not replying earlier: the problem is that we don't have one. In order to evolve them, @felixhekhorn did the trick of splitting the grid by bin, evolving individually, and finally merging the FkTable.
This will be solved by the usual NNPDF/eko#138...
> You can also try it yourself now, @felixhekhorn: in #184 I added a new subcommand `evolve`

I saw: I believe we will always use the Python API through pineko. But it is useful for quick checks, and it is also useful for external users: they may want FkTables, but pineko can still be too rigid for them :)
> [..] But it is useful for quick checks, [..]

That's exactly the idea. Pineko does much more than just evolve, of course.
I don't see any PineAPPL-related problem open; if there is one, please open a new Issue.
In order to convolute jets we need to adjust the EKO side since we're limited by memory there (discussed in https://github.com/NNPDF/eko/issues/138 and related issues/PRs)
I wonder whether we're also limited by memory here, even with the new layout developed in #184. The actual size of the ekos is
An easy solution, if we're limited, would be just to add a `bin_mask` argument to `Grid::evolve` (and one to `Grid::axis`, since we need to provide the correct ekos in the first place). In practice this was faked to compute theory 200 via the workaround of splitting the bins before computing the eko and rejoining them after the FK table computation.
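Just to illustrate the idea (the `bin_mask` argument does not exist yet, and everything below beyond the concept of a per-bin mask is hypothetical):

```rust
/// Build a mask selecting which bins of the grid take part in the evolution;
/// `mask[i] == true` means bin `i` is kept, so only the corresponding
/// (much smaller) EKO has to be held in memory.
fn select_bins(n_bins: usize, keep: &[usize]) -> Vec<bool> {
    let mut mask = vec![false; n_bins];
    for &bin in keep {
        mask[bin] = true;
    }
    mask
}

fn main() {
    // e.g. evolve only bin 0 of a 10-bin grid, mimicking the "split by bin,
    // evolve, merge" workaround used for theory 200.
    let bin_mask = select_bins(10, &[0]);
    assert_eq!(bin_mask.iter().filter(|&&keep| keep).count(), 1);
    // A hypothetical `Grid::evolve(..., bin_mask: &[bool])` could then skip
    // the masked-out bins, and the same mask passed to `Grid::axis` would
    // request only the needed x and mu^2 points from eko.
}
```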