Open cschwan opened 2 years ago
@cschwan You're way too efficient. The idea is to have a dedicated meeting to discuss this in detail.
The implementation might take a while :smile:. But I anticipate introducing a new file format soon to accommodate a number of changes/simplifications (see #118), and it would be ideal to know roughly which changes are required to support them in said file format. I think the meeting would indeed be perfect to discuss that point.
@enocera: which of the following scenarios do you see as relevant in the future:
Basically my question is: how many 'convolutions' (with PDFs and/or fragmentation functions) will we need to support in PineAPPL?
I guess there is also the SIA case, i.e. "lepton-lepton collisions with *".
What does SIA stand for?
single-inclusive annihilation
@enocera: which of the following scenarios do you see as relevant in the future:
- lepton-hadron collisions with one fragmentation function,
Yes. This is called semi-inclusive deep-inelastic scattering (SIDIS)
- lepton-hadron collisions with two fragmentation functions,
No. This is a process that occurs, but it cannot be described within collinear factorisation. The object that is introduced is called di-hadron fragmentation function. Let's forget about it.
- hadron-hadron collisions with one fragmentation function,
Yes. This is relevant for the LHC.
- hadron-hadron collisions with two fragmentation functions?
No. For the same reason as in 2.
On top of these processes, there is hadron production in electron-positron annihilation (SIA=single-inclusive annihilation), so one FF (and no PDF).
Basically my question is: how many 'convolutions' (with PDFs and/or fragmentation functions) will we need to support in PineAPPL?
Here's a summary:

- SIA: one FF (it's like DIS, but with a FF instead of a PDF);
- SIDIS: one PDF and one FF (it's like DY, but with a FF in lieu of one PDF);
- PP: two PDFs and one FF (this has no counterpart).

In principle, for SIDIS the PDF can be either unpolarised or polarised; for PP the PDFs can be both unpolarised, both polarised, or one unpolarised and the other polarised. The FF is always unpolarised.
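The counting above can be made concrete with a toy sketch (plain Python, all names and numbers hypothetical, not PineAPPL API): each convolution contributes one flavor index and one momentum-fraction index, so SIA, SIDIS and pp-with-a-FF need one, two and three non-perturbative functions in the sum.

```python
N_FLAV, N_X = 2, 3  # toy numbers of flavor combinations and x-grid points

# hypothetical sampled functions f[flavor][x_index]
pdf = [[0.5, 0.3, 0.1], [0.4, 0.2, 0.1]]
ff = [[0.6, 0.4, 0.2], [0.3, 0.2, 0.1]]

def convolute_one(grid, f):
    """SIA: one FF (like DIS, but with a FF instead of a PDF)."""
    return sum(grid[a][i] * f[a][i]
               for a in range(N_FLAV) for i in range(N_X))

def convolute_two(grid, f1, f2):
    """SIDIS: one PDF and one FF (like DY, with a FF in lieu of one PDF)."""
    return sum(grid[a][i][b][j] * f1[a][i] * f2[b][j]
               for a in range(N_FLAV) for i in range(N_X)
               for b in range(N_FLAV) for j in range(N_X))

def convolute_three(grid, f1, f2, f3):
    """pp with a FF: two PDFs and one FF -- three convolutions."""
    return sum(grid[a][i][b][j][c][k] * f1[a][i] * f2[b][j] * f3[c][k]
               for a in range(N_FLAV) for i in range(N_X)
               for b in range(N_FLAV) for j in range(N_X)
               for c in range(N_FLAV) for k in range(N_X))
```

At the end of the day it is always just a sum over flavor combinations and grid points; only the number of summed axes grows with the number of convolutions.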
I had a brief chat with @t7phy yesterday, and there are a couple of points that require answering:
I had a brief chat with @t7phy yesterday, and there are a couple of points that require answering:
1. In which (file-)format are the fragmentation functions available? Is there a library like LHAPDF that supports it? Or can one (ab)use LHAPDF?
Sets of fragmentation functions are available through LHAPDF in the very same format as PDFs.
2. If the fragmentation function is like a PDF, it will have an associated scale, μFrag. How do you set this scale, and specifically, do you ever set it differently, i.e. not μR = μF = μFrag?
In general muR = muF = muFrag, but we may want to do scale variations (in order to estimate MHOUs), exactly as we do with PDFs.
3. Would polarized FFs/PDFs make a difference in the convolution? If yes, where can I read the details?
The convolution is the same, though of course different objects evolve differently. Specifically, there are the following cases:
4. How do FK tables with FF look like? Can we use the same EKOs as we do for PDFs?
We must use different EKOs (time-like evolution), but these are already available in EKO, in the same format as space-like EKOs.
Hi @cschwan,
1. In which (file-)format are the fragmentation functions available? Is there a library like LHAPDF that supports it? Or can one (ab)use LHAPDF?
The FFs are delivered in the LHAPDF format (see for example the NNFFXX sets on the LHAPDF webpage). The structure is exactly the same as for the (un)polarised and nuclear PDFs.
3. Would polarized FFs/PDFs make a difference in the convolution? If yes, where can I read the details?
The convolution is also done in the exact same way. At the end of the day, it is always just a sum over different flavor combinations.
4. How do FK tables with FF look like? Can we use the same EKOs as we do for PDFs?
The structure of the FK tables should also be the same as for the (un)polarised (n)PDFs. EKO can be used in the same way as for PDFs, but using time-like evolution. This has already been implemented in EKO (https://github.com/NNPDF/eko/pull/232, https://github.com/NNPDF/eko/pull/245); for this to fully work in the pipeline, the only missing piece is the link in pineko that recognizes FF grids.
I'd start from the simplest case (SIA), and then possibly move to SIDIS. I think that SIDIS may be similar to the case of proton-ion collisions, in which one has to convolve a proton PDF and a nuclear PDF; in SIDIS, one has to convolve a proton PDF and a FF (the difference being that proton and nuclear PDFs evolve with the same EKOs, while proton PDFs and FFs evolve with different EKOs).
Ah, @enocera has already answered all of your questions in the meantime.
@enocera was 1 min faster than me :upside_down_face: and @Radonirinaunimi half a minute
In general muR = muF = muFrag, but we may want to do scale variations (in order to estimate MHOUs), exactly as we do with PDFs.
just to add: for SIDIS we will still have 2 collinear objects (one PDF and one FF) but 3 scales: µF, µR and µFrag
Which we may want to vary independently.
Which we may want to vary independently.
If we do a 9-pt scale variation, we'd probably want to do a 27-pt ($3^3$) variation with three scales; what replaces a 7-pt scale variation?
If we do a 9-pt scale variation, we'd probably want to do a 27-pt variation with three scales;
Correct.
what replaces a 7-pt scale variation?
A 16-pt variation, in which we exclude the outermost variations in opposite directions; see e.g. Eq. (2) in https://arxiv.org/pdf/1311.1415.pdf and the discussion in https://arxiv.org/pdf/1001.4082.pdf.
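The point counting can be illustrated with a short enumeration (illustrative only; the naive opposite-direction exclusion below yields 15 points for three scales, whereas the exact 16-pt set mentioned above is defined in the cited references):

```python
from itertools import product

# the usual scale factors: down by 2, central, up by 2
FACTORS = (0.5, 1.0, 2.0)

def full_variation(n_scales):
    """All 3^n combinations: 9-pt for two scales, 27-pt for three."""
    return list(product(FACTORS, repeat=n_scales))

def naive_restricted_variation(n_scales):
    """Drop every combination in which two scales are varied in opposite
    directions (one by 2, another by 1/2). For two scales this reproduces
    the usual 7-pt set; for three scales this naive rule keeps 15 points,
    slightly fewer than the 16-pt prescription from the references above."""
    return [c for c in full_variation(n_scales) if not (0.5 in c and 2.0 in c)]
```

For two scales, `naive_restricted_variation` removes exactly `(0.5, 2.0)` and `(2.0, 0.5)` from the 9-point set.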
Apart from the scale-variations bit, it should be just `convolute_with_one` and `convolute_with_two` (the new one will be `convolute_with_three`, which will require even one more variable in the subgrid...).
Moreover, SIA should not even require any other scale, since you could abuse $\mu_F$ for $\mu_{\mathrm{Frag}}$ (though you might want to go back to $\mu_{\mathrm{Frag}}$ the moment you implement it for SIDIS).
@cschwan I use this issue here instead of #135 to ask a more technical question - feel free to correct me
Do you already have a strategy for how to support more convolutions? As discussed above we need up to 3; broadly speaking I can see two strategies:

1. introduce different subgrids for the different dimensions, e.g.
   - `SubGrid1Di.new(tensor_2d, in_scales_1d, in_grid_1d)` for DIS
   - `SubGrid1Df.new(tensor_2d, out_scales_1d, out_grid_1d)` for SIA
   - `SubGrid2Dii.new(tensor_3d, in_scales_1d, in_grid_1d, in_grid_1d)` for pp collisions
   - `SubGrid2Dif.new(tensor_4d, in_scales_1d, in_grid_1d, out_scales_1d, out_grid_1d)` for SIDIS
2. follow the current approach and always blow up to the highest dimension - see e.g. in yadism https://github.com/NNPDF/yadism/blob/99d35349be5b57c78fa60733cd8d8521a3a68675/src/yadbox/export.py#L75; that would be 5 dimensions: 1 initial scale + 2 initial grids + 1 final scale + 1 final grid, e.g. `SubGrid.new(blown_up_tensor_5d, [Q2], xgrid, [1.], [random_scale], [1.])` for DIS
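Strategy 2 can be sketched in a few lines of Python (the `SubGrid` class and axis ordering are hypothetical stand-ins following the example above): a 2-d DIS subgrid is embedded into the maximal 5-d layout by padding the three unused axes with trivial length-1 entries.

```python
class SubGrid:
    """Hypothetical stand-in for a single maximal-dimension subgrid type."""
    def __init__(self, tensor_5d, in_scales, in_grid_1, in_grid_2, out_scales, out_grid):
        self.tensor = tensor_5d
        self.axes = (in_scales, in_grid_1, in_grid_2, out_scales, out_grid)

def blow_up_dis(tensor_2d, q2_values, xgrid):
    """Embed a 2-d DIS subgrid (scale x x-grid) into the 5-d layout
    (1 initial scale + 2 initial grids + 1 final scale + 1 final grid)
    by wrapping each value in three trivial length-1 axes; the unused
    axes carry a single dummy entry, mirroring the [1.] placeholders above."""
    tensor_5d = [[[[[v]]] for v in row] for row in tensor_2d]
    return SubGrid(tensor_5d, q2_values, xgrid, [1.0], [1.0], [1.0])
```

The trade-off is memory and index bookkeeping against having a single subgrid type instead of one type per process class.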
trivial statement: a single grid will have a fixed number of collinear distributions
I thought about all of the related problems long and hard, and right now I'm thinking of doing the following:

1. support arbitrarily many convolutions, which will require a new structure instead of `SparseMatrix3`. This in turn will require a [new file format](https://github.com/NNPDF/pineappl/issues/118), for which I'm currently preparing the code. I will probably generate a `struct SparseMatrix<T, const D: usize>` and then everything depending on this structure will also be generic to support `D` dimensions for the numerical type `T`
2. the number of scales will have to be generalized as well. In general a `Grid` should have an arbitrary number of scales, and each convolution is associated to exactly one scale. For instance, PDFs are associated to the factorization scale, fragmentation functions to a fragmentation scale. In principle I could also offer more coupling functions, which also depend on exactly one (?) scale, for instance to support a running electromagnetic coupling
3. the type of convolutions will no longer be metadata in the new format but rather built-in properties of the grid, and I suppose for each convolution there should be an identifying particle type, a scale (discussed in 2) and a type to differentiate between polarized and unpolarized protons, for instance. I wonder whether we should drop encoding the `A` and `Z` factors here entirely, because this could be done by a PDF library
4. Evolutions for different convolutions would be supported in an iterative fashion, in the following sense: if you have a `Grid` with two PDFs (the same) and a FF, you'd, with the scales `muR`, `muF` and `muFF`, first 'remove' the fragmentation scale by evolving it to the fitting scale of the fragmentation function with one call to `evolve`. The result wouldn't be an FK-table yet, but rather a `Grid` with the scales `muR`, `muF` and `muFF0`. Finally, you'd `evolve` the renormalization and factorization scales, replacing the three previous scales with the three scales `muR0`, `muF0` and `muFF0`. Having only fitting scales, we finally have an FK-table. The advantage is that `evolve` can operate with one type of EKOs at a time, and mixing arbitrarily many can be accomplished by just calling as many `evolve` functions as we need.

1. support arbitrarily many convolutions, which will require a new structure instead of `SparseMatrix3`. This in turn will require a [new file format](https://github.com/NNPDF/pineappl/issues/118), for which I'm currently preparing the code. I will probably generate a `struct SparseMatrix<T, const D: usize>` and then everything depending on this structure will also be generic to support `D` dimensions for the numerical type `T`
as discussed above, it is unlikely that we will need more than 3 collinear dimensions in the mid term, so I'm not sure you want to opt for the most general case, given the generated complexity ((very-)long term there are always more options à la double parton scattering)
2. the number of scales will have to be generalized as well. In general a `Grid` should have an arbitrary number of scales, and each convolution is associated to exactly one scale. For instance, PDFs are associated to the factorization scale, fragmentation functions to a fragmentation scale. In principle I could also offer more coupling functions, which also depend on exactly one (?) scale, for instance to support a running electromagnetic coupling
$\alpha_{em}(\mu)$ is maybe still fine, but then again: is the most general case worth the trouble?
3. the type of convolutions will no longer be metadata in the new format but rather built-in properties of the grid, and I suppose for each convolution there should be an identifying particle type, a scale (discussed in 2) and a type to differentiate between polarized and unpolarized protons, for instance. I wonder whether we should drop encoding the `A` and `Z` factors here entirely, because this could be done by a PDF library
so that would be basically my strategy 1, right? as for the nuclear stuff: we still need to state what kind of PDF we expect, i.e. what is accounted for inside the grid and what should the PDF account for (but maybe @Radonirinaunimi knows better)
4. Evolutions for different convolutions would be supported in an iterative fashion, in the following sense: if you have a `Grid` with two PDFs (the same) and a FF, you'd, with the scales `muR`, `muF` and `muFF`, first 'remove' the fragmentation scale by evolving it to the fitting scale of the fragmentation function with one call to `evolve`. The result wouldn't be an FK-table yet, but rather a `Grid` with the scales `muR`, `muF` and `muFF0`. Finally, you'd `evolve` the renormalization and factorization scales, replacing the three previous scales with the three scales `muR0`, `muF0` and `muFF0`. Having only fitting scales, we finally have an FK-table. The advantage is that `evolve` can operate with one type of EKOs at a time, and mixing arbitrarily many can be accomplished by just calling as many `evolve` functions as we need.
going iteratively is a good idea, I think - also from an EKO point of view ... each of the intermediate objects would still be a grid, as an FK table is also a grid (with special properties, but still a grid)
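The iterative strategy can be sketched with a toy two-convolution grid (plain Python; the matrices below merely stand in for real EKOs, which in practice come from the eko package): each `evolve`-like call contracts exactly one convolution axis with one operator, so time-like and space-like kernels never mix within a single call.

```python
def evolve_axis(grid, op, axis):
    """Contract one convolution axis of a toy 2-axis grid with one
    evolution operator (a plain matrix standing in for an EKO),
    leaving the other axis untouched."""
    n0, n1 = len(grid), len(grid[0])
    if axis == 0:
        return [[sum(op[i][a] * grid[a][j] for a in range(n0))
                 for j in range(n1)] for i in range(len(op))]
    return [[sum(op[j][b] * grid[i][b] for b in range(n1))
             for j in range(len(op))] for i in range(n0)]

# Iteratively 'remove' one scale at a time: first the fragmentation axis
# with a time-like operator, then the PDF axis with a space-like one.
grid = [[1.0, 2.0], [3.0, 4.0]]
timelike_op = [[1.0, 0.0], [0.0, 1.0]]    # toy values only
spacelike_op = [[2.0, 0.0], [0.0, 2.0]]   # toy values only
fk_like = evolve_axis(evolve_axis(grid, timelike_op, 1), spacelike_op, 0)
```

Every intermediate result has the same shape of object as the input, matching the observation that each intermediate object is still a grid.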
as discussed above, it is unlikely that we will need more than 3 collinear dimensions in the mid term, so I'm not sure you want to opt for the most general case, given the generated complexity ((very-)long term there are always more options à la double parton scattering)
My point is: for three dimensions I basically already have to think about the general case, so the step from 3 to D is trivial (I think, but let's see)
$\alpha_{em}(\mu)$ is maybe still fine, but then again: is the most general case worth the trouble?
It's probably not much more work, and solving a more general case will hopefully give us better abstractions.
so that would be basically my strategy 1, right? as for the nuclear stuff: we still need to state what kind of PDF we expect, i.e. what is accounted for inside the grid and what should the PDF account for (but maybe @Radonirinaunimi knows better)
Yes!
Let's start with the following:
initial_state_1: 2212
convolution_particle_1: 2212
convolution_type_1: PDF/FF/polPDF/polFF
The first line is for backwards compatibility (and should be dropped in the future), and the following lines are what we need to distinguish between different convolution functions that may nevertheless describe the same particle: for instance, the (unpolarized) proton PDF, the polarized proton PDF and the proton fragmentation function.
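For illustration, the proposed keys could be consumed like this (a hypothetical Python helper, not part of PineAPPL's actual API):

```python
from dataclasses import dataclass

ALLOWED_TYPES = {"PDF", "FF", "polPDF", "polFF"}

@dataclass
class Convolution:
    particle: int    # PDG MC ID, e.g. 2212 for a proton
    conv_type: str   # one of ALLOWED_TYPES

def parse_convolutions(metadata):
    """Collect convolution_particle_N / convolution_type_N pairs from a
    flat key-value metadata mapping, starting at N = 1."""
    convs = []
    n = 1
    while f"convolution_particle_{n}" in metadata:
        conv_type = metadata[f"convolution_type_{n}"]
        if conv_type not in ALLOWED_TYPES:
            raise ValueError(f"unknown convolution type: {conv_type}")
        convs.append(Convolution(int(metadata[f"convolution_particle_{n}"]), conv_type))
        n += 1
    return convs
```

This is how a polarized proton PDF and a proton FF stay distinguishable even though both carry `2212` as the particle ID.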
@t7phy you need to change the lines here: https://github.com/NNPDF/pineappl/blob/1954e5c1fb076cd12dac2f04081335c12f22275d/pineappl/src/grid.rs#L1325-L1329
the proton fragmentation function
the hadron fragmentation function (it might by chance still be a proton, but the interesting cases are pions, kaons, Ds, ...)
It was an example :smiley:.
@Radonirinaunimi I added you here, because your pull request addresses one of the TODOs listed above.
@enocera if we produce FK-tables for predictions involving both PDFs and FFs, are we interested in different fitting scales for the PDFs and FFs?
@cschwan that's a good question. In general I'd say that we want to keep the parametrisation scale uniform across different objects. But I see cases in which we may want to have different parametrisation scales for different processes (e.g. unpolarised PDFs and FFs). I would however see the second option as a sophistication of the first one, therefore I'd go for the first one to start with, unless the implementation of the second is straightforward.
This generalization will require the following changes:

- the `SparseArray3` structure: we need `SparseArray4`, and then it's possibly better to even think about `SparseArrayN`; this is done with #275.
- the evolution (`Grid::evolve*`) needs to be changed in order to support time-like EKOs. For that to work we should discuss the possible interfaces and probably first merge https://github.com/NNPDF/pineappl/pull/244; partly addressed in #289. The remaining bits will be addressed in #299.