Mohinta2892 opened 6 months ago
On the image side, probably no issues - it's just about turning the image stack into an xarray over dask (probably over zarr) with the correct orientation and resolution. Pymaid may be the best place for that functionality, as CATMAID can reference precomputed stacks either directly or via cloudvolume. If it's a precomputed volume that is fetchable from CATMAID, you can get all the metadata in the same way as for the N5 volumes (see the implementation for tile source 11 here https://github.com/navis-org/pymaid/blob/04b5559eafaf5ee82db22be7ee75e2239de6600b/pymaid/stack.py#L285 ). I don't know whether a zarr store for precomputed data already exists, but you could probably wrap a thin one over cloudvolume. That Python module already contains some zarr store shims over JPEG tile stacks.
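Roughly, something along these lines might work as a starting point (untested sketch: the gs:// path is a placeholder, the wrapper class is just there to give dask a numpy-like object to index lazily, and this is not pymaid's actual implementation):

```python
import dask.array as da
import numpy as np
import xarray as xr
from cloudvolume import CloudVolume

# Placeholder path -- swap in the real precomputed layer
vol = CloudVolume(
    "precomputed://gs://example-bucket/image", mip=0, use_https=True, progress=False
)


class CloudVolumeArray:
    """Thin numpy-like facade so dask can index the CloudVolume lazily."""

    def __init__(self, cv):
        self._cv = cv
        self.shape = tuple(int(s) for s in cv.shape)  # (x, y, z, channels)
        self.dtype = np.dtype(cv.dtype)
        self.ndim = len(self.shape)

    def __getitem__(self, slices):
        # CloudVolume slicing returns a numpy-compatible cutout
        return np.asarray(self._cv[slices])


wrapped = CloudVolumeArray(vol)

# Align dask chunks with the stored chunk size (plus the channel axis)
chunks = tuple(int(c) for c in vol.chunk_size) + (wrapped.shape[-1],)
darr = da.from_array(
    wrapped,
    chunks=chunks,
    meta=np.empty((0,) * wrapped.ndim, dtype=wrapped.dtype),
)

# Physical coordinates from the volume's resolution (nm per voxel at this mip)
coords = {
    dim: np.arange(size) * float(res)
    for dim, size, res in zip("xyz", wrapped.shape[:3], vol.resolution)
}
xarr = xr.DataArray(darr, dims=("x", "y", "z", "c"), coords=coords)
```

Chunking on the stored chunk size keeps each dask task aligned with whole chunks; for remote reads you'd probably want to coarsen that.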
I'm not so sure about the morphology data. Most of the functionality is just built on top of navis neurons, so if you can convert the morphology data coming out of cloudvolume into navis structures you'd probably be fine, although that conversion happens at such a low level that you might have to re-implement some logic on top of it.
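For example, something like this for the mesh/skeleton route (untested; the path and segment ID are made up, and it assumes the segmentation layer exposes precomputed meshes and skeletons):

```python
import io

import navis
from cloudvolume import CloudVolume

# Placeholder segmentation layer and segment ID
seg = CloudVolume("precomputed://gs://example-bucket/segmentation", use_https=True)
seg_id = 12345

# Meshes: cloudvolume returns {segid: Mesh} with .vertices/.faces,
# which navis.MeshNeuron should accept as a mesh-like object
mesh = seg.mesh.get(seg_id)[seg_id]
mesh_neuron = navis.MeshNeuron(mesh, units="nm")
mesh_neuron.id = seg_id

# Skeletons: round-trip through SWC, which navis reads into a TreeNeuron
skel = seg.skeleton.get(seg_id)
tree_neuron = navis.read_swc(io.StringIO(skel.to_swc()))
tree_neuron.id = seg_id
```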
CloudVolume was recently made compatible with Python 3.12, IIRC.
If you want a drop-in xarray over dask over zarr without going through pymaid, you could also look at https://github.com/clbarnes/multiscale_read/ , which has such implementations over OME-zarr (possibly outdated by now) and BDV-N5.
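The general pattern those implementations follow is fairly small anyway; a generic sketch (not multiscale_read's actual API), assuming OME-zarr 0.4-style metadata and a placeholder URL:

```python
import dask.array as da
import numpy as np
import xarray as xr
import zarr

url = "https://example.org/data.ome.zarr"  # placeholder store

group = zarr.open(url, mode="r")
ms = group.attrs["multiscales"][0]  # OME-zarr 0.4 multiscales metadata
dataset = ms["datasets"][0]         # highest-resolution scale level
axes = [ax["name"] for ax in ms["axes"]]

# The scale transformation gives the physical voxel size along each axis
scale = next(
    t["scale"] for t in dataset["coordinateTransformations"] if t["type"] == "scale"
)

darr = da.from_zarr(url, component=dataset["path"])
coords = {ax: np.arange(n) * s for ax, n, s in zip(axes, darr.shape, scale)}
xarr = xr.DataArray(darr, dims=axes, coords=coords)
```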
Hi Chris,
I am wondering if this could in any way be extended to pull data from publicly available gs:// precomputed volumes (e.g., HemiBrain).
It would be super helpful if we could:
Do you foresee any problems with adapting your script for any of the above requirements?
Best, Samia