Thanks for the overview, but I think there's room for some more practical comments about the requirements ;) Before getting into complex cases like the third one, let's start from the basics: is it a new `create_ROIs_from_labels` task that takes a label image and writes the bounding-box ROIs into a new AnnData table in `tables`? Or is it a code block that will be run at the end of any label-producing task (e.g. `cellpose_segmentation` or `napari_workflows_wrapper`)?

With this answer, I think covering use case 1 shouldn't be hard, if we assume no bounding-box overlaps. But I guess it's impossible to guarantee that same-FOV organoids have non-overlapping ROIs, so use case 1 in principle also includes use case 3. I would aim for a first implementation for use case 1, in which we add an explicit check that ROIs do not overlap. Then we can look more carefully at how to handle ROI overlaps (as in use case 3), while still checking that there is no actual organoid overlap.
As for use case 2, my understanding is that its implementation is on hold until we implement some stitching - is that correct? I'm not sure which issue we need to close before we can move on with this. Is it #116, or should there be an even simpler one for non-overlapping FOVs?
> let's start from the basics

Agreed. We'll hit use case 3 very fast though, as soon as we enable that. But let's start with case 1.
> Is it a new `create_ROIs_from_labels` task that takes a label image and writes the bounding-box ROIs into a new AnnData table in `tables`? Or is it a code block that will be run at the end of any label-producing task (e.g. `cellpose_segmentation` or `napari_workflows_wrapper`)?

Both are valid options, but from a user-flow perspective I like the second option better. If we can have a helper function for that and a flag in the cellpose & napari-workflows tasks, that makes for a very easy flow. The flag could also be a named string giving the name of the ROI table; if it's empty, the ROIs aren't saved.
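A minimal sketch of what such a helper could look like, assuming a 3D label image and pixel sizes passed in by the caller; the function name, the `roi_table_name` flag, and the column names are illustrative only, not the actual fractal-tasks-core API:

```python
from typing import Optional, Tuple

import anndata as ad
import numpy as np
import pandas as pd
from scipy import ndimage


def bounding_boxes_from_labels(
    label_img: np.ndarray,                      # 3D label image, axes (z, y, x)
    pixel_sizes_zyx: Tuple[float, float, float],
    roi_table_name: Optional[str] = None,       # the "named string" flag: empty -> don't save
) -> Optional[ad.AnnData]:
    if not roi_table_name:
        return None
    labels, rows = [], []
    # find_objects returns one (z, y, x) slice tuple per label value (or None)
    for label, slc in enumerate(ndimage.find_objects(label_img), start=1):
        if slc is None:
            continue
        z, y, x = slc
        labels.append(str(label))
        rows.append({
            "x_micrometer": x.start * pixel_sizes_zyx[2],
            "y_micrometer": y.start * pixel_sizes_zyx[1],
            "z_micrometer": z.start * pixel_sizes_zyx[0],
            "len_x_micrometer": (x.stop - x.start) * pixel_sizes_zyx[2],
            "len_y_micrometer": (y.stop - y.start) * pixel_sizes_zyx[1],
            "len_z_micrometer": (z.stop - z.start) * pixel_sizes_zyx[0],
        })
    df = pd.DataFrame(rows, index=labels)
    # The calling task (cellpose / napari-workflows) would then write this
    # AnnData object into the OME-Zarr "tables" group under `roi_table_name`.
    return ad.AnnData(
        X=df.to_numpy(dtype="float32"),
        obs=pd.DataFrame(index=df.index),
        var=pd.DataFrame(index=df.columns),
    )
```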
> I would aim for a first implementation for use case 1, in which we add an explicit check that ROIs do not overlap.

Sounds good for a first implementation! And yes, it's very hard to guarantee (or even know beforehand) that ROIs won't overlap.
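For reference, a rough sketch of such a check (helper names are hypothetical): two axis-aligned bounding boxes overlap only if their intervals overlap along every axis, so a simple pairwise scan is enough for a first implementation.

```python
from itertools import combinations
from typing import Sequence, Tuple

Interval = Tuple[float, float]                # (start, end) along one axis
Box = Tuple[Interval, Interval, Interval]     # (z, y, x) intervals of one bounding box


def boxes_overlap(a: Box, b: Box) -> bool:
    # Overlap along every axis is required for the boxes to overlap
    return all(a0 < b1 and b0 < a1 for (a0, a1), (b0, b1) in zip(a, b))


def check_no_roi_overlap(boxes: Sequence[Box]) -> None:
    for (i, box_i), (j, box_j) in combinations(enumerate(boxes), 2):
        if boxes_overlap(box_i, box_j):
            raise ValueError(f"Bounding boxes of ROIs {i} and {j} overlap")
```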
> As for use case 2, my understanding is that its implementation is on hold until we implement some stitching - is that correct?

Actually, it runs fine with our current tiling approach for organoids: even though tiling based on Yokogawa metadata isn't super accurate, it's often good enough for organoid-level measurements and organoid ROIs. We just save images based on their metadata coordinates, and that is a (very rough) version of placing things correctly. We'll also want to use it in cases where we did fancier stitching, though the ROI generation shouldn't concern itself with how the well array is being generated :)
In https://github.com/fractal-analytics-platform/fractal-tasks-core/issues/132#issuecomment-1305322494 we have a basic plan for how to proceed with creating bounding boxes. Let's keep this issue to discuss the specific organoid use cases.
Most of this is addressed by #306 - to be released as 0.9.0
Cases 1-3 should now be supported, as part of the cellpose task. Work on the napari-workflows task has not started yet.
@jluethi Is something still missing from this "overview" issue? If not, let's close it and open more specific ones.
That's great, thanks @tcompa! I think this covers the main part of this issue. The two major things that remain to enable the full scientific use are:

1. Doing it in napari-workflows: https://github.com/fractal-analytics-platform/fractal-tasks-core/issues/349
2. 2D-to-3D workflows: https://github.com/fractal-analytics-platform/fractal-tasks-core/issues/342
Thus, I'm closing this overview issue; let's continue the work in those other two issues (they're not the current top priority, though).
There are four cases for how organoids can map to FOVs and how their ROIs can relate to each other. Here is an overview of the four; Fractal should cover cases 1-3, while case 4 is out of scope.
1) No ROI overlap, no FOVs crossing
The simplest case. ROIs are all within the FOVs; potentially, some FOVs do not contain any ROIs. Easy to handle: some task segments the organoids and we create new bounding-box ROIs based on those segmentations. When processing those ROIs, no special care needs to be taken, and just looping over ROIs makes the processing more efficient because the non-organoid regions are not processed (see the sketch below). (This will cover most use cases of @MaksHess and many people in the Pelkmans lab.)
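A minimal sketch of what that per-ROI loop could look like, assuming the image data is a single 3D zarr array and the ROI table has already been converted into (z, y, x) slice tuples; `process_rois` and `run_measurement` are placeholder names, not the actual task code:

```python
import zarr


def process_rois(zarr_array_path: str, roi_slices, run_measurement):
    """`roi_slices` is a list of (z, y, x) slice tuples derived from the ROI table."""
    img = zarr.open(zarr_array_path, mode="r")
    for slc in roi_slices:
        region = img[slc]        # load only this organoid's bounding box
        run_measurement(region)  # per-ROI task logic (segmentation, measurement, ...)
```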
2) No ROI overlap, but ROIs cross FOVs
A slightly more complex case where the ROIs cross boundaries of the fields of view. As long as we cover the stitching of ROIs (i.e. we are in modality 1 or modality 2 [when precision is not critical] of this overview: https://github.com/fractal-analytics-platform/fractal-tasks-core/issues/11), we can process them like in case 1.
3) ROI overlap, but not organoid overlap
This is the trickiest case we will need to handle. Realistically, we will be using bounding-box ROIs for a while (it's a separate discussion whether more complex ROIs could eventually be defined, but that would make index-based processing with dask, see https://github.com/fractal-analytics-platform/fractal-tasks-core/issues/27, even more complex). Thus, bounding boxes can overlap even though the objects within them do not. This is quite likely to happen for some use cases in the Liberali lab when organoids grow densely. How do we tackle this? Let's look into masked arrays in dask: https://docs.dask.org/en/stable/generated/dask.array.ma.masked_where.html. Maybe one would specify the label image & label value (e.g. path to the label image, integer value of the relevant label) as additional data for the ROI and, if a ROI contains `mask_label_img` and `mask_label_value`, apply such masking before reading and writing data to/from that ROI (see the sketch below).
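A minimal sketch of that masking idea, assuming the hypothetical `mask_label_img` / `mask_label_value` fields resolve to a label array and an integer; this is not existing fractal-tasks-core code:

```python
import dask.array as da
import dask.array.ma  # masked-array helpers, incl. masked_where


def load_masked_roi(
    intensity: da.Array,     # full intensity image
    labels: da.Array,        # label image referenced by mask_label_img
    roi_slices,              # (z, y, x) slices of this ROI's bounding box
    mask_label_value: int,   # label value of this ROI's organoid
) -> da.Array:
    img = intensity[roi_slices]
    lbl = labels[roi_slices]
    # Mask out pixels belonging to other labels, so overlapping bounding
    # boxes do not contaminate each other's reads/writes
    return da.ma.masked_where(lbl != mask_label_value, img)
```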
4) Organoid overlap
This is out of scope. It could happen when the MIP of organoids that are close in 3D is processed. In that case, our segmentation networks would assign each MIP pixel to one organoid only. Thus, if a user wants to process this in more detail, organoid segmentation should be done in 3D. Other cases where a 3D voxel should actually belong to multiple objects are also out of scope for Fractal.