Open arunkannawadi opened 3 days ago
I thought there was a metadetect task that was supposed to do this. Why do you want that code here?
The `PipelineTask` that you're referring to calls `run_metadetect` in this codebase at some point, and doesn't have access to all the various `Task`s that are called from within it. The issue is expressing the need to expose them to the `PipelineTask`.
We developed this code specifically to be separate from the tasks.
It is by design that it does not provide a Task.
This is similar to how Piff is developed independently from DM and instead there is a wrapper task for Piff, which is developed and maintained separately.
This is so that we can develop this code as independently as possible from the DM processing framework, and so that DM can develop their wrappers as independently from us as possible.
If there is something about the structure of our code that could be adapted to better facilitate the wrapper Task we should definitely work on that.
I think you are conflating `Task` with `PipelineTask`, a distinction I had hoped I clarified during the shear commissioning call two weeks ago. Refactoring parts of it into `Task`s would not limit this to running only within the DM framework. Instead, it would enable running it smoothly within that framework while maintaining the capability of running it on on-the-fly simulations and images without a butler. Not doing such a refactor upstream (i.e., here) but only in the DM fork would make pulling bug fixes and improvements (such as those from Conghao recently) into the DM fork much, much harder.
Also, `metadetect` isn't comparable to `piff`, since we use `piff` as a third-party library but treat `metadetect` (https://github.com/lsst-dm/metadetect) as part of the Science Pipelines itself.
I specifically said from the beginning, back in 2019, that metadetect is a third-party library and should be treated as such. This has always been the intention. metadetect is not specific to LSST; we use it for DES as well as other research projects.
In any case, if you want to create a PR adding a `Task` to `metadetect.lsst` as a convenience for DM, that would be fine. But it should be built upon the functions and data structures we already have, which are in daily use and don't need any refactoring.
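For concreteness, a thin convenience wrapper along these lines might look roughly as follows. This is only a hypothetical sketch: the real wrapper would subclass `lsst.pipe.base.Task` and use `lsst.pex.config`, but plain-Python stand-ins are used here so the example is self-contained, and the `run_metadetect` signature and config fields shown are illustrative assumptions, not the actual API.

```python
# Hypothetical sketch: a thin convenience Task built on top of the
# existing run_metadetect entry point, with no internal refactoring.
# Plain-Python stand-ins replace lsst.pipe.base.Task / lsst.pex.config.
from dataclasses import dataclass, field


def run_metadetect(config, mbobs, rng):
    """Stand-in for the existing metadetect.lsst driver function.

    The real function runs detection/measurement on the observations;
    here we only echo enough structure to show the call pattern.
    """
    return {"config": config, "n_obs": len(mbobs), "seed_used": rng}


@dataclass
class MetadetectWrapperConfig:
    """Stand-in for a Task ConfigClass; fields are illustrative only."""
    meas_type: str = "wmom"
    extra: dict = field(default_factory=dict)


class MetadetectWrapperTask:
    """Thin wrapper: translates Task-style config into the dict-style
    config the existing function expects, then delegates to it."""
    ConfigClass = MetadetectWrapperConfig

    def __init__(self, config=None):
        self.config = config or self.ConfigClass()

    def run(self, mbobs, rng):
        func_config = {"meas_type": self.config.meas_type, **self.config.extra}
        return run_metadetect(func_config, mbobs, rng)


task = MetadetectWrapperTask()
result = task.run(mbobs=["obs1", "obs2"], rng=42)
print(result["n_obs"])  # 2
```

The point of this shape is that all algorithmic code stays in the existing functions and data structures; the wrapper only adapts the configuration interface.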
It's tricky to treat this as third-party, since changes to Science Pipelines would affect this code, specifically `metadetect.lsst`. For the same reason, the `PipelineTask` wouldn't just be a thin wrapper around this code. I'll get a PR up soon, since it's easier to talk about concrete changes than in the abstract.
The changes would leave the generic parts of the algorithm, such as the DES aspects, untouched.
While none of this blocks a metadetection `PipelineTask`, these changes need to happen before we can run this in production for Rubin.
I don't follow this statement precisely. Can you expand on this?
> The issue is expressing the need to expose them to the `PipelineTask`.
Maybe you can expand on this too?
Could you have a look (maybe another look?) at the WL SciUnit call (slides/recording) before I expand on them?
Ah I did not realize this had been discussed. I looked over the slides but did not yet get a chance to watch the recording. Will do.
Well, TBH, it was not specific to metadetection, but the material lays the groundwork for what we are proposing here.
In the near-to-medium term, we (Jim Bosch and I) would like to refactor the lsst modules of this package to use the Task and Config framework. This is entirely a modification to the interfaces and will still be compatible with simulations. Because the modification is not an algorithmic one, we should be able to ensure that the results are identical before and after the refactoring.
In the end, we want to have achieved the following:

- converting the top-level driver (the `run_metadetect` function) into a `run` method of a new `MetadetectTask`
- moving the configuration into the `ConfigClass` of the above-mentioned `MetadetectTask`
- turning the internal steps into `Task`s called internally as sub tasks of `MetadetectTask`
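The proposed structure could be sketched roughly as below. This is a hypothetical illustration only: the real classes would derive from `lsst.pipe.base.Task` with `lsst.pex.config` config classes, but plain-Python stand-ins are used here so the example runs standalone, and every name, field, and step shown (e.g. the detection sub task and its threshold) is an assumption for illustration, not the actual design.

```python
# Hypothetical sketch of the proposed Task/Config structure, using
# plain-Python stand-ins for lsst.pipe.base.Task / lsst.pex.config.
from dataclasses import dataclass, field


@dataclass
class DetectConfig:
    """Illustrative config for one internal step."""
    threshold: float = 5.0


class DetectSubTask:
    """Illustrative sub task: one internal step of the former driver."""
    ConfigClass = DetectConfig

    def __init__(self, config=None):
        self.config = config or self.ConfigClass()

    def run(self, image):
        # Placeholder for the real detection step.
        return [pix for pix in image if pix > self.config.threshold]


@dataclass
class MetadetectConfig:
    """Stand-in ConfigClass for MetadetectTask, holding sub-configs."""
    detect: DetectConfig = field(default_factory=DetectConfig)


class MetadetectTask:
    """The former run_metadetect driver becomes this Task's run method;
    internal steps become sub tasks constructed from the config."""
    ConfigClass = MetadetectConfig

    def __init__(self, config=None):
        self.config = config or self.ConfigClass()
        self.detect = DetectSubTask(self.config.detect)

    def run(self, image):
        detections = self.detect.run(image)
        return {"n_detections": len(detections)}


out = MetadetectTask().run([1.0, 6.0, 12.0])
print(out["n_detections"])  # 2
```

Because the algorithmic code would move unchanged into the `run` methods, results before and after such a refactor could be compared directly, as noted above.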
While none of this blocks a metadetection `PipelineTask`, these changes need to happen before we can run this in production for Rubin. I am happy to start with some of these conversions but will need help from others to complete them. The upcoming DESC sprint week would be a great time to finish this refactoring.