fractal-analytics-platform / fractal-tasks-core

Main tasks for the Fractal analytics platform
https://fractal-analytics-platform.github.io/fractal-tasks-core/
BSD 3-Clause "New" or "Revised" License

[napari workflows] Add napari-apoc to core task #352

Open jluethi opened 1 year ago

jluethi commented 1 year ago

napari-apoc is (among other things) a very useful pixel classifier. @adrtsc is currently testing it via the napari workflows wrapper.

Once those tests pass, would be good to add it to the core tasks.

One thing that will certainly be required is a PR to napari-apoc to make the Qt dependency optional. Otherwise, we run into this type of error:

  File "/data/homes/atschan/.conda/envs/fractal-client-1.1.0a2/lib/python3.9/site-packages/napari_accelerated_pixel_and_object_classification/_object_merger.py", line 6, in <module>
    from qtpy.QtWidgets import QTableWidget
  File "/data/homes/atschan/.conda/envs/fractal-client-1.1.0a2/lib/python3.9/site-packages/qtpy/__init__.py", line 259, in <module>
    raise QtBindingsNotFoundError()
qtpy.QtBindingsNotFoundError: No Qt bindings could be found
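
For illustration, a minimal sketch of the kind of guarded import that would make the Qt dependency optional (hypothetical code, not the actual napari-apoc implementation):

    # Hypothetical sketch: keep the classifier logic importable without Qt bindings.
    try:
        from qtpy.QtWidgets import QTableWidget
        QT_AVAILABLE = True
    except Exception:  # qtpy raises QtBindingsNotFoundError when no bindings exist
        QTableWidget = None
        QT_AVAILABLE = False

    def make_object_merger_widget():
        """Only construct the GUI widget when Qt bindings are present."""
        if not QT_AVAILABLE:
            raise RuntimeError("Qt bindings are required for the GUI widget.")
        return QTableWidget()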

@adrtsc If you find additional issues, let's collect them here :)

adrtsc commented 1 year ago

Hi!

As @jluethi mentioned, I was trying to get napari apoc running via a custom version of the napari workflow wrapper task. The following packages were installed in the environment for the custom task:

napari-accelerated-pixel-and-object-classification==0.12.3
fractal-client==1.1.0a2
fractal-tasks-core==0.8.1
PyQt5==5.15.9
PySide6==6.2.4

I additionally had to install pocl via conda:

conda install -c conda-forge pocl

Running the custom task resulted in the following error:

  File "/data/homes/atschan/PhD/Code/Python/fractal/custom_napari_workflow_wrapper/napari_workflows_wrapper.py", line 606, in <module>
    run_fractal_task(
  File "/data/homes/atschan/.conda/envs/fractal-client-1.1.0a2/lib/python3.9/site-packages/fractal_tasks_core/_utils.py", line 91, in run_fractal_task
    metadata_update = task_function(**task_args.dict(exclude_unset=True))
  File "/data/homes/atschan/PhD/Code/Python/fractal/custom_napari_workflow_wrapper/napari_workflows_wrapper.py", line 535, in napari_workflows_wrapper
    mask[mask > 0] += max_label_for_relabeling
  File "/data/homes/atschan/.conda/envs/fractal-client-1.1.0a2/lib/python3.9/site-packages/pyclesperanto_prototype/_tier0/_array_operators.py", line 358, in __getitem__
    result = super().__getitem__(index)
  File "/data/homes/atschan/.conda/envs/fractal-client-1.1.0a2/lib/python3.9/site-packages/pyopencl/array.py", line 2130, in __getitem__
    raise NotImplementedError(
NotImplementedError: multidimensional fancy indexing is not supported

@jluethi suggested checking the output type of napari-apoc and casting it to a numpy array in case the type was the issue. The output of napari-apoc turned out to be of type pyclesperanto_prototype._tier0._pycl.OCLArray. After casting the array to a numpy array, the workflow finished without problems. To make the output a label image with a unique label for each segmented object, I then added a labeling step on top.
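
For reference, a minimal sketch of that cast, assuming pyclesperanto_prototype's pull helper (variable names are illustrative):

    import pyclesperanto_prototype as cle

    # `classifier_output` stands for the OCLArray returned by the apoc step.
    # cle.pull copies the GPU array back into a plain numpy array, which
    # supports the fancy indexing used in the wrapper's relabeling code.
    mask = cle.pull(classifier_output)
    mask[mask > 0] += max_label_for_relabeling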

I added a user-specific example (21_Adrian_apoc_napari_workflow_wrapper) to my fork of the fractal-demos repository (https://github.com/adrtsc/fractal-demos).

jluethi commented 1 year ago

Ok, so the main steps to get apoc support into the core repo:

@adrtsc Why was a labeling step necessary? Isn't the result already a label image? Or does it just produce a binary mask?

Thanks for sharing a link to your demo-example!

adrtsc commented 1 year ago

For now, I was using the pixel classifier approach from apoc. This yields a label image, but with a unique label per pixel class: in my case, label 1 for background and label 2 for nuclear speckles. For my personal use case I prefer a unique label for each nuclear speckle object, so I added connected-component labeling on top.
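
A minimal sketch of such a labeling step, using skimage for illustration (class value 2 for nuclear speckles as in the example above; variable names are illustrative):

    import numpy as np
    from skimage.measure import label

    # semantic_mask: pixel-classifier output with label 1 = background,
    # label 2 = nuclear speckles
    speckle_mask = semantic_mask == 2
    # connected-component labeling assigns each speckle object a unique label
    instance_labels = label(speckle_mask).astype(np.uint32)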

jluethi commented 1 year ago

Ah, interesting point. Can a connected-component relabeling be done as part of a napari workflow? That would be the most elegant solution. I'm hesitant to add logic to the core task that starts changing the label images we get from napari workflows, because that's not really a wrapper's job.

Maybe that would also already convert it into a numpy array. Though the type checking & casting would be useful in the wrapper anyway :)

adrtsc commented 1 year ago

Small update from my side.

> Ah, interesting point. Can a connected-component relabeling be done as part of a napari workflow? That would be the most elegant solution. I'm hesitant to add logic to the core task that starts changing the label images we get from napari workflows, because that's not really a wrapper's job.

Good point. I looked a bit more into napari workflows and changed my approach. Integrating the relabeling into the napari workflow turned out to be quite easy, so this can absolutely be done in the napari workflow instead of the core task.
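
A sketch of how such a relabeling step might look inside a napari-workflows Workflow (step names are illustrative, not the actual workflow from the demo; the "semantic_labels" step is assumed to be produced by an earlier apoc step in the same workflow):

    from napari_workflows import Workflow
    import pyclesperanto_prototype as cle

    workflow = Workflow()
    # select the nuclear-speckle class (label 2) from the pixel-classifier output;
    # "semantic_labels" is assumed to come from an earlier apoc step in the workflow
    workflow.set("speckle_mask", cle.equal_constant, "semantic_labels", constant=2)
    # connected-component labeling gives every speckle object its own unique label
    workflow.set("instance_labels", cle.connected_components_labeling_box, "speckle_mask")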