kosack opened this issue 5 years ago
see also #1035
Regarding your point on #1035:
In any case, we need to write down the calibration and integration steps clearly somewhere as well, so that it is clear in what order we should compute and apply calibrations, gain-channel selection, and trace integration + denoising.
This is my current understanding of the order:
So perhaps the `calib` module can be broken into smaller sub-modules (either under `calib/`, or at the top level). Again, we need to make a dependency diagram or something, since a few of those steps can be iterative and use parts of other modules.
Although the charge extraction / trace integration is considered somewhat separate from the calibration... this separation is potentially a problem for some calibrations.
The more commonly known calibration is the intensity calibration, which includes pedestal subtraction, flat-field and gain-correction (i.e. conversion into p.e.). This can be applied to the charge after it has been extracted from the waveform. Therefore there is no problem here with charge extraction being decoupled from calibration.
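As a rough sketch of what this means in practice (the function and array names here are hypothetical illustrations, not the actual ctapipe API), the intensity calibration can operate purely on the already-extracted charge:

```python
import numpy as np

def calibrate_intensity(charge, pedestal, gain, flat_field):
    """Hypothetical intensity calibration of an already-extracted charge:
    pedestal subtraction, gain correction into p.e., then flat-fielding.
    All inputs are per-pixel arrays of shape (n_pixels,)."""
    return (charge - pedestal) / gain * flat_field

# Toy 3-pixel example:
charge_pe = calibrate_intensity(
    charge=np.array([120.0, 95.0, 210.0]),    # extracted charge (ADC counts)
    pedestal=np.array([20.0, 18.0, 22.0]),    # pedestal charge (ADC counts)
    gain=np.array([5.0, 5.1, 4.9]),           # ADC counts per p.e.
    flat_field=np.array([1.00, 1.03, 0.97]),  # relative flat-field coefficients
)
```

Nothing here needs the waveform, so it can live downstream of any extractor.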
However, there is the problem of time calibration. For example, the peak arrival time may not be aligned across pixels (as is seen with a uniform illumination). This can occur when the signal path differs per pixel, causing the pulse to appear a few ns later in the waveform. Therefore a timing flat-field needs to be applied per pixel to the `peakpos` extracted in the charge-extraction step. This closely couples the charge extraction to the DL1 calibration.
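In contrast to the intensity case, this correction acts on a quantity that only exists once an extractor has run. A minimal sketch (again with hypothetical names, not the ctapipe API):

```python
import numpy as np

def correct_peak_time(peakpos, time_flat_field):
    """Hypothetical per-pixel timing flat-field: subtract the known
    per-pixel signal-path delay (in ns) from the extracted peak position.
    Both inputs are per-pixel arrays of shape (n_pixels,)."""
    return peakpos - time_flat_field
```

Because `peakpos` is produced inside the charge-extraction step, this correction cannot be applied independently of it, which is the coupling described above.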
Since this was brought up again in #1395: I think the main idea here was that "image" was meant to be general, so that anything that turns 3D or 2D image data into either a new 2D/3D image or into parameters belongs in this module. This was anticipating that we might in the future do things differently than the "standard" pipeline of reducing the waveforms to an image and then parameterizing... Right now there are two things:

- waveform (3D) -> image (2D), i.e. charge extraction
- image (2D) -> parameters

but we could also imagine:

- waveform (3D) -> waveform (3D), e.g. denoising
- waveform (3D) -> parameters directly
So we could split this into 3D vs 2D algorithms (waveforms vs images), or perhaps image (ND) to image algorithms vs image to parameters algorithms?
I think I am for the latter. One submodule for all "image->image" algorithms (3D -> 2D, 3D -> 3D), another for all parameterization (2D -> parameters, 3D -> parameters). I would also suggest that the CameraCalibrator is moved into the "image->image" module, as technically this is the operation it is performing.
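To make the proposed split concrete, here is a sketch of the grouping (the module names and example algorithms are placeholders for illustration, not settled names):

```python
# A sketch of the proposed grouping; keys and entries are placeholders,
# see the naming options discussed below.
proposed_split = {
    "image -> image": [
        "image extraction / charge extraction (3D -> 2D)",
        "waveform denoising (3D -> 3D)",
        "CameraCalibrator (3D -> 2D, since that is the operation it performs)",
    ],
    "image -> parameters": [
        "image parameterization, e.g. Hillas (2D -> parameters)",
        "waveform parameterization (3D -> parameters)",
    ],
}
```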
I think this would solve the two-way dependencies @MaxNoe pointed out in #1395
I think we should keep the calib module and move the image extractors there as a first step.
We may as well decide on a system at this stage, instead of making a bunch of staggered API changes. My suggestion also addresses #1394. Do you agree with the proposal? And what should the two modules be called?
Some possibilities:
"image->image": calib
/calibration
/image
/ndimage
"image->parameters": parameterization
/image_parameterization
Should the `geometry_converter` modules be moved from `image` to `instrument.camera`?
Sounds like the correct place, yes.
Right now there are some dependency loops between `calib` and `image`. This means we might need to rethink the structure. Both of these may be too broad.