NNPDF / pineappl

PineAPPL is not an extension of APPLgrid
https://nnpdf.github.io/pineappl/
GNU General Public License v3.0

Generalize convolutions for both initial and final states #172

Open cschwan opened 1 year ago

cschwan commented 1 year ago

This generalization will require the following changes:

enocera commented 1 year ago

@cschwan You're way too efficient. The idea is to have a dedicated meeting to discuss this in detail.

cschwan commented 1 year ago

The implementation might take a while :smile:. But I anticipate introducing a new file format soon to accommodate a number of changes/simplifications (see #118), and it would be ideal to know roughly what changes are required to support them in said file format. I think the meeting would indeed be perfect to discuss that point.

cschwan commented 1 year ago

@enocera: which of the following scenarios do you see to be relevant in the future:

  1. lepton-hadron collisions with one fragmentation function,
  2. lepton-hadron collisions with two fragmentation functions,
  3. hadron-hadron collisions with one fragmentation function,
  4. hadron-hadron collisions with two fragmentation functions?

Basically my question is: how many 'convolutions' (with PDFs and/or fragmentation functions) will we need to support in PineAPPL?

alecandido commented 1 year ago

I guess there is also the SIA case, i.e. "lepton-lepton collisions with *".

cschwan commented 1 year ago

What does SIA stand for?

felixhekhorn commented 1 year ago

semi-inclusive annihilation

enocera commented 1 year ago

@enocera: which of the following scenarios do you see to be relevant in the future:

  1. lepton-hadron collisions with one fragmentation function,

Yes. This is called semi-inclusive deep-inelastic scattering (SIDIS)

  2. lepton-hadron collisions with two fragmentation functions,

No. This is a process that occurs, but it cannot be described within collinear factorisation. The object that is introduced is called di-hadron fragmentation function. Let's forget about it.

  3. hadron-hadron collisions with one fragmentation function,

Yes. This is relevant for the LHC.

  4. hadron-hadron collisions with two fragmentation functions?

No. For the same reason as in 2.

On top of these processes, there is hadron production in electron-positron annihilation (SIA=single-inclusive annihilation), so one FF (and no PDF).

Basically my question is: how many 'convolutions' (with PDFs and/or fragmentation functions) will we need to support in PineAPPL?

Here's a summary:

  • SIA: one FF (it's like DIS, but with a FF instead of a PDF);
  • SIDIS: one PDF and one FF (it's like DY, but with a FF in lieu of one PDF);
  • PP: two PDFs and one FF (this has no counterpart).

In principle, for SIDIS the PDF can be either unpolarised or polarised; for PP the PDFs can be both unpolarised, both polarised, or one unpolarised and the other polarised. The FF is always unpolarised.
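As a quick cross-check of the summary above, the process/convolution mapping can be written down directly. This is only an illustrative sketch (the dictionary and names are not PineAPPL API):

```python
# Number and type of convolutions per process, as summarized above.
# "PDF" may be unpolarised or polarised; the FF is always unpolarised.
CONVOLUTIONS = {
    "DIS":   ["PDF"],
    "DY":    ["PDF", "PDF"],
    "SIA":   ["FF"],                # like DIS, with a FF instead of a PDF
    "SIDIS": ["PDF", "FF"],         # like DY, with a FF in lieu of one PDF
    "PP":    ["PDF", "PDF", "FF"],  # no counterpart
}

# PineAPPL therefore needs to support up to three convolutions.
max_convolutions = max(len(v) for v in CONVOLUTIONS.values())
print(max_convolutions)  # → 3
```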

cschwan commented 8 months ago

I had a brief chat with @t7phy yesterday, and there are a couple of points that require answering:

  1. In which (file-)format are the fragmentation functions available? Is there a library like LHAPDF that supports it? Or can one (ab)use LHAPDF?
  2. If the fragmentation function is like a PDF it will have an associated scale, $\mu_\mathrm{Frag}$. How do you set this scale, and specifically do you set it differently from $\mu_\mathrm{R} = \mu_\mathrm{F} = \mu_\mathrm{Frag}$?
  3. Would polarized FFs/PDFs make a difference in the convolution? If yes, where can I read the details?
  4. What do FK tables with FFs look like? Can we use the same EKOs as we do for PDFs?

enocera commented 8 months ago

I had a brief chat with @t7phy yesterday, and there are a couple of points that require answering:

1. In which (file-)format are the fragmentation functions available? Is there a library like LHAPDF that supports it? Or can one (ab)use LHAPDF?

Sets of fragmentation functions are available through LHAPDF in the very same format as PDFs.

2. If the fragmentation function is like a PDF it will have an associated scale, μFrag. How do you set this scale, and specifically do you set it different than μR=μF=μFrag?

In general muR=muF=muFrag, but we may want to do scale variations (in order to estimate MHOUs) exactly as we do with PDFs.

3. Would polarized FFs/PDFs make a difference in the convolution? If yes, where can I read the details?

The convolution is the same, though of course different objects evolve differently. Specifically, there are the following cases:

4. What do FK tables with FFs look like? Can we use the same EKOs as we do for PDFs?

We must use different EKOs (time-like evolution), but these are already available in EKO, in the same format as space-like EKOs.

Radonirinaunimi commented 8 months ago

Hi @cschwan,

1. In which (file-)format are the fragmentation functions available? Is there a library like LHAPDF that supports it? Or can one (ab)use LHAPDF?

The FFs are delivered in the LHAPDF format (you can see for example the NNFFXX sets from the LHAPDF webpage). The structure is exactly the same as in the (un)polarised and nuclear PDFs.

3. Would polarized FFs/PDFs make a difference in the convolution? If yes, where can I read the details?

The convolution is also done in the exact same way. At the end of the day, it is always just a sum over different flavor combinations.

4. What do FK tables with FFs look like? Can we use the same EKOs as we do for PDFs?

The structure of the FK tables should also be the same as for the (un)polarised (n)PDFs. EKO can be used in the same way as for PDFs, but using time-like evolution. This has already been implemented in EKO (https://github.com/NNPDF/eko/pull/232, https://github.com/NNPDF/eko/pull/245); for this to fully work in the pipeline, the only missing piece is the link in pineko that recognizes FF grids.

enocera commented 8 months ago

I'd start from the simplest case (SIA), and then possibly move to SIDIS. I think that SIDIS may be similar to the case of proton-ion collisions, in which one has to convolve a proton PDF and a nuclear PDF; in SIDIS, one has to convolve a proton PDF and a FF (the difference being that proton and nuclear PDFs evolve with the same EKOs, while proton PDFs and FFs evolve with different EKOs).

Radonirinaunimi commented 8 months ago

Ah, @enocera has already answered all of your questions in the meantime.

felixhekhorn commented 8 months ago

@enocera was 1 min faster than me :upside_down_face: and @Radonirinaunimi half a minute

In general muR=muF=muFrag, but we may want to do scale variations (in order to estimate MHOUs) exactly as we do with PDFs.

just to add: for SIDIS we will still have 2 collinear objects (one PDF and one FF) but 3 scales: µF + µR + µFrag

enocera commented 8 months ago

Which we may want to vary independently.

cschwan commented 8 months ago

Which we may want to vary independently.

If we do a 9-pt scale variation, we'd probably want to do a 27-pt ($3^3$) variation with three scales; what replaces a 7-pt scale variation?

enocera commented 8 months ago

If we do a 9-pt scale variation, we'd probably want to do a 27-pt variation with three scales;

Correct.

what replaces a 7-pt scale variation?

A 16-pt variation, in which we exclude the outermost variations in opposite directions, see e.g. Eq. (2) in https://arxiv.org/pdf/1311.1415.pdf and the discussion in https://arxiv.org/pdf/1001.4082.pdf.
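The point counts can be cross-checked with a few lines of Python. The exclusion rule sketched here (drop any point where two scale factors differ by a factor of 4) reproduces the 7-point set for two scales; the precise three-scale prescription is defined in the references above:

```python
from itertools import product

factors = (0.5, 1.0, 2.0)

# Full variation grids: 3^2 = 9 points for two scales, 3^3 = 27 for three.
nine_pt = list(product(factors, repeat=2))
twentyseven_pt = list(product(factors, repeat=3))

# 7-point set: drop the two points where muR and muF are varied in
# opposite directions, i.e. (2, 1/2) and (1/2, 2).
seven_pt = [p for p in nine_pt if not max(p) / min(p) > 2.0]

print(len(nine_pt), len(seven_pt), len(twentyseven_pt))  # → 9 7 27
```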

alecandido commented 8 months ago

Apart from the scale-variation bit, it should be just `convolute_with_one` and `convolute_with_two` (the new one will be `convolute_with_three`, which will require yet another variable in the subgrid...).

Moreover, SIA should not even require any other scale, since you could abuse $\mu_\mathrm{F}$ for $\mu_\mathrm{Frag}$ (though you might want to go back to $\mu_\mathrm{Frag}$ the moment you implement it for SIDIS).

felixhekhorn commented 4 months ago

@cschwan I use this issue here instead of #135 to ask a more technical question - feel free to correct me

Do you already have a strategy for how to support more convolutions? As discussed above we need up to 3; broadly speaking I can see two strategies:

  1. introduce different subgrids for the different dimensions: e.g.

    • SubGrid1Di.new(tensor_2d, in_scales_1d, in_grid_1d) for DIS
    • SubGrid1Df.new(tensor_2d, out_scales_1d, out_grid_1d) for SIA
    • SubGrid2Dii.new(tensor_3d, in_scales_1d, in_grid_1d, in_grid_1d) for pp collisions
    • SubGrid2Dif.new(tensor_4d, in_scales_1d, in_grid_1d, out_scales_1d, out_grid_1d) for SIDIS
    • etc
  2. follow the current approach and always blow up to the highest dimension - see e.g. in yadism https://github.com/NNPDF/yadism/blob/99d35349be5b57c78fa60733cd8d8521a3a68675/src/yadbox/export.py#L75 ; that would be 5 dimensions: 1 initial scale + 2 initial grids + 1 final scale + 1 final grid; e.g. SubGrid.new(blown_up_tensor_5d, [Q2], xgrid, [1.], [random_scale], [1.]) for DIS

trivial statement: a single grid will have a fixed number of collinear distributions
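Strategy 2 ('always blow up to the highest dimension') can be illustrated with plain shapes: a lower-dimensional subgrid is padded with trivial axes of length 1 until it has all five dimensions. This is only a sketch of the idea; the axis names and the `blow_up` helper are hypothetical, not the PineAPPL API:

```python
# Hypothetical axis layout for the 'always blow up' strategy (strategy 2):
# 1 initial scale + 2 initial grids + 1 final scale + 1 final grid.
FULL_AXES = ("in_scales", "in_grid_1", "in_grid_2", "out_scales", "out_grid")

def blow_up(shape, axes):
    """Pad a subgrid shape (whose entries correspond to `axes`, in order)
    with singleton axes until it covers all of FULL_AXES."""
    return tuple(shape[axes.index(a)] if a in axes else 1 for a in FULL_AXES)

# A DIS subgrid has one scale axis and one momentum-fraction axis ...
dis = blow_up((10, 50), ("in_scales", "in_grid_1"))
# ... and becomes 5-dimensional with trivial axes of length 1:
print(dis)  # → (10, 50, 1, 1, 1)
```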

cschwan commented 4 months ago

I thought about all of the related problems long and hard, and right now I'm thinking of doing the following:

  1. support arbitrarily many convolutions, which will require a new structure instead of SparseMatrix3. This in turn will require a new file format, for which I'm currently preparing the code. I will probably generate a struct SparseMatrix<T, const D: usize> and then everything depending on this structure will also be generic to support D dimensions for the numerical type T
  2. the number of scales will have to be generalized as well. In general a Grid should have an arbitrary number of scales, and each convolution is associated to exactly one scale. For instance, PDFs are associated to the factorization scale, fragmentation functions to a fragmentation scale. In principle I could also offer more coupling functions, which also depend on exactly one (?) scale, for instance to support a running electromagnetic coupling
  3. the type of convolutions will no longer be metadata in the new format but rather built-in properties of the grid, and I suppose for each convolution there should be an identifying particle type, a scale (discussed in 2) and a type to differentiate between polarized and unpolarized protons, for instance. I wonder whether we should drop encoding the A and Z factors here entirely, because this could be done by a PDF library
  4. Evolutions for different convolutions would be supported in an iterative fashion, in the following sense: if you have a Grid with two (identical) PDFs and a FF, with the scales muR, muF and muFF, you'd first 'remove' the fragmentation scale by evolving it to the fitting scale of the fragmentation function with one call to evolve. The result wouldn't be an FK-table yet, but rather a Grid with scales muR, muF and muFF0. Finally, you'd evolve the renormalization and factorization scales, replacing the three previous scales with the three scales muR0, muF0 and muFF0. Having only fitting scales, we finally have an FK-table. The advantage is that evolve can operate with one type of EKOs at a time, and mixing arbitrarily many can be accomplished by just calling as many evolve functions as we need.
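The iterative scheme in point 4 can be sketched abstractly: each call to a stand-in evolve contracts exactly one convolution axis of a sparse grid with one evolution operator, so a grid with three convolutions needs at most three calls. Everything here (the sparse representation and the evolve helper) is a hypothetical illustration, not the PineAPPL API:

```python
def evolve(grid, operator, axis):
    """One 'evolve' call: contract a single convolution axis of a sparse
    grid {index_tuple: value} with an operator {(new, old): weight}."""
    out = {}
    for idx, val in grid.items():
        old = idx[axis]
        for (new, o), w in operator.items():
            if o == old:
                key = idx[:axis] + (new,) + idx[axis + 1:]
                out[key] = out.get(key, 0.0) + w * val
    return out

# A toy grid with three convolution axes (PDF, PDF, FF), two points each.
grid = {(0, 0, 0): 1.0, (1, 0, 1): 2.0}
identity = {(0, 0): 1.0, (1, 1): 1.0}

# 'Remove' the fragmentation axis first, then the two PDF axes:
fk = evolve(evolve(evolve(grid, identity, axis=2), identity, 1), identity, 0)
print(fk == grid)  # → True (identity operators leave the grid unchanged)
```

With identity operators the grid is unchanged; in the real pipeline each operator would be a space-like or time-like EKO, applied one at a time.
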
felixhekhorn commented 4 months ago

1. support arbitrarily many convolutions, which will require a new structure instead of `SparseMatrix3`. This in turn will require a [new file format](https://github.com/NNPDF/pineappl/issues/118), for which I'm currently preparing the code. I will probably generate a `struct SparseMatrix<T, const D: usize>` and then everything depending on this structure will also be generic to support `D` dimensions for the numerical type `T`

as discussed above, it is unlikely we will need more than 3 collinear dimensions in the mid term, so I'm not sure you want to opt for the most general case, given the generated complexity ((very) long term there are always more options à la double parton scattering)

2. the number of scales will have to be generalized as well. In general a `Grid` should have an arbitrary number of scales, and each convolution is associated to exactly one scale. For instance, PDFs are associated to the factorization scale, fragmentation functions to a fragmentation scale. In principle I could also offer more coupling functions, which also depend on exactly one (?) scale, for instance to support a running electromagnetic coupling

$\alpha_{em}(\mu)$ is maybe still fine, but then again: is the most general case worth the trouble?

3. the type of convolutions will no longer be metadata in the new format but rather built-in properties of the grid, and I suppose for each convolution there should be an identifying particle type, a scale (discussed in 2) and a type to differentiate between polarized and unpolarized protons, for instance. I wonder whether we should drop encoding the `A` and `Z` factors here entirely, because this could be done by a PDF library

so that would be basically my strategy 1, right? as for the nuclear stuff: we still need to state what kind of PDF we expect, i.e. what is accounted for inside the grid and what should the PDF account for (but maybe @Radonirinaunimi knows better)

4. Evolutions for different convolutions would be supported in an iterative fashion, in the following sense: if you have a Grid with two (identical) PDFs and a FF, with the scales `muR`, `muF` and `muFF`, you'd first 'remove' the fragmentation scale by evolving it to the fitting scale of the fragmentation function with one call to `evolve`. The result wouldn't be an FK-table yet, but rather a `Grid` with scales `muR`, `muF` and `muFF0`. Finally, you'd `evolve` the renormalization and factorization scales, replacing the three previous scales with the three scales `muR0`, `muF0` and `muFF0`. Having only fitting scales, we finally have an FK-table. The advantage is that `evolve` can operate with one type of EKOs at a time, and mixing arbitrarily many can be accomplished by just calling as many `evolve` functions as we need.

going iteratively is a good idea, I think - also from an EKO point of view ...each of the intermediate objects would still be a grid, as also an FK table is a grid (with special properties, but still a grid)

cschwan commented 4 months ago

as discussed above, it is unlikely we will need more than 3 collinear dimensions in the mid term, so I'm not sure you want to opt for the most general case, given the generated complexity ((very) long term there are always more options à la double parton scattering)

My point is: for three dimensions I basically already have to think about the general case, so the step from 3 to D is trivial (I think, but let's see)
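In Python terms, the dimension-generic container is indeed a small step: a mapping keyed by length-D index tuples works for any D. This is only a sketch of the idea behind `SparseMatrix<T, const D: usize>`, not the actual Rust implementation:

```python
class SparseTensor:
    """Dimension-generic sparse tensor: non-zero entries keyed by
    length-D index tuples; one type covers every number of convolutions."""

    def __init__(self, dim):
        self.dim = dim
        self.entries = {}

    def __setitem__(self, index, value):
        assert len(index) == self.dim, "index must have D components"
        self.entries[index] = value

    def __getitem__(self, index):
        # missing entries are implicit zeros
        return self.entries.get(index, 0.0)

# The same type covers DIS (D=2), pp (D=3) or SIDIS (D=4) subgrids:
dis, sidis = SparseTensor(2), SparseTensor(4)
dis[(3, 7)] = 1.5
sidis[(0, 1, 2, 3)] = 2.5
print(dis[(3, 7)], dis[(0, 0)])  # → 1.5 0.0
```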

αem(μ) is maybe still fine, but then again: is the most general case worth the trouble?

It's probably not much more work, and solving a more general case will hopefully give us better abstractions.

so that would be basically my strategy 1, right? as for the nuclear stuff: we still need to state what kind of PDF we expect, i.e. what is accounted for inside the grid and what should the PDF account for (but maybe @Radonirinaunimi knows better)

Yes!

cschwan commented 3 months ago

Let's start with the following:

initial_state_1: 2212
convolution_particle_1: 2212
convolution_type_1: PDF/FF/polPDF/polFF

The first line is for backwards compatibility (and should be dropped in the future), and the following lines are what we need to distinguish between different convolution functions that may nevertheless describe the same particle: for instance, the unpolarized proton PDF, the polarized proton PDF and the proton fragmentation function.
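A consumer of these keys could reconstruct the convolutions roughly as follows (the key names are taken from the proposal above; the parsing helper itself is hypothetical):

```python
# Metadata as proposed above: one particle id and one type per convolution.
metadata = {
    "initial_state_1": "2212",       # kept only for backwards-compatibility
    "convolution_particle_1": "2212",
    "convolution_type_1": "polPDF",
}

ALLOWED_TYPES = {"PDF", "FF", "polPDF", "polFF"}

def read_convolutions(meta):
    """Collect (particle, type) pairs for convolution 1, 2, ... until a
    key is missing; validate each type against the allowed values."""
    convolutions, index = [], 1
    while f"convolution_particle_{index}" in meta:
        ctype = meta[f"convolution_type_{index}"]
        assert ctype in ALLOWED_TYPES, f"unknown convolution type: {ctype}"
        convolutions.append((int(meta[f"convolution_particle_{index}"]), ctype))
        index += 1
    return convolutions

print(read_convolutions(metadata))  # → [(2212, 'polPDF')]
```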

@t7phy you need to change the lines here: https://github.com/NNPDF/pineappl/blob/1954e5c1fb076cd12dac2f04081335c12f22275d/pineappl/src/grid.rs#L1325-L1329

felixhekhorn commented 3 months ago

the proton fragmentation function

the hadron fragmentation function (it might by chance still be a proton, but the interesting cases are pions, kaons, Ds, ...)

cschwan commented 3 months ago

It was an example :smiley:.

cschwan commented 2 months ago

@Radonirinaunimi I added you here, because your pull request addresses one of the TODOs listed above.