Open droumis opened 8 months ago
Quoted from https://github.com/holoviz-topics/neuro/issues/87#issuecomment-1908762800:
I have a lot of data like this, and I would love to be able to browse it from a jupyter notebook. In fact, there is (to my knowledge) no python solution for browsing this data in an acceptable way. I'd be super excited to try anything you make on some of our datasets.
Hi @d-v-b, we are looking into your EM/large-volume use case and considering what might be possible given constraints. We'd love to hear more about what an 'acceptable' solution entails. Briefly, in your opinion, what are the necessary features of a viewer? What might be missing from neuroglancer? Would something like neuroglancer, but viewable inside a Jupyter notebook, be sufficient?
Also, assuming the FIB-SEM fly dataset that you linked is a good/representative place to start, what other datasets would you recommend looking into? For data that you are working with now, what is the typical size/resolution range of a single 2D slice? 3D volume?
The group I work for routinely releases 3D ~isotropic images with ~10k samples in each dimension (so an image size like (15000, 15000, 15000) would not be weird). In addition to the grayscale EM data, we also release segmentation images that have the same dimensions but use dtypes like uint32 or uint64. The images have grid spacing (resolution) in the range of 2 - 8 nanometers. We publish these datasets on www.openorganelle.org, in case you want to look at more of them.
[…] conda or pip to install some python program, and then use the CLI, to look at an image. I can see how this would be hard if you are developing a jupyter-embeddable solution, but ideally that solution would use a JS-only component that could be embedded in a stand-alone site. Honestly, I would start by copying the design decisions neuroglancer made, and deviate from that when necessary. It's a really good tool, and I wish more tools in bioimaging copied it!
These are really great points; thanks a lot for the response! Based on your suggestions, we will next evaluate what approach might work best given our constraints and go from there.
There's plenty left to figure out (notably the bidirectional link with the served viewer state), but in principle, we should be able to leverage Panel to use Neuroglancer in a Jupyter Notebook. Here is a POC:
https://github.com/holoviz-topics/neuro/assets/6613202/1cc481ad-3f41-4448-80a2-06b68b13ed54
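At its core, embedding a served Neuroglancer instance comes down to an iframe pointing at its URL, which a Panel HTML pane can then render inline in Jupyter. A minimal sketch of just the iframe piece (the URL and sizes are placeholders, and the Panel wrapping via `pn.pane.HTML` is assumed rather than shown):

```python
def neuroglancer_iframe(url: str, width: int = 800, height: int = 600) -> str:
    """Build the iframe snippet that an HTML pane could display in a notebook."""
    return (
        f'<iframe src="{url}" width="{width}" height="{height}" '
        'frameborder="0"></iframe>'
    )

snippet = neuroglancer_iframe("https://neuroglancer-demo.appspot.com/")
print(snippet)
```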
That's super cool! Is the source code for that demo available? I think lots of people would use this.
Nice, yep the code for this quick demo is in the dropdown above the video.
We welcome any and all feedback. I think getting the JSON panel on the right to stay synchronized with the neuroglancer iframe state is my next priority; right now it's just parsing the original URL.
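For reference, Neuroglancer encodes the full viewer state as URL-encoded JSON after `#!` in its URLs, so "parsing the original URL" amounts to decoding that fragment. A small stdlib-only sketch (the example URL and its tiny state are made up):

```python
import json
from urllib.parse import unquote

def parse_ng_state(url: str) -> dict:
    """Decode the JSON viewer state that Neuroglancer stores after '#!' in its URLs."""
    _, sep, fragment = url.partition("#!")
    if not sep:
        return {}  # no state fragment present
    return json.loads(unquote(fragment))

# Made-up URL carrying the encoded state {"position":[1,2,3]}:
url = "https://neuroglancer-demo.appspot.com/#!%7B%22position%22%3A%5B1%2C2%2C3%5D%7D"
print(parse_ng_state(url))  # {'position': [1, 2, 3]}
```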
Regarding your comment about being 'web-based': if the use case were solely limited to someone visiting a website and interacting with a web app, we could probably make things work without the user having to install anything in Python, via Pyodide. However, neuroglancer seems to have addressed that use case itself. Given that the unaddressed use case is in a Jupyter notebook, I'm thinking it makes sense to expect our users to be comfortable in Python installation land.
Made some updates. I'm now starting a new viewer instance from Python, so the state of the embedded neuroglancer app can now be kept in sync with other components! For instance, you can see the properties on the right stay updated as I pan the viewer position.
I think this is a pretty promising approach since it allows for two primary workflows. First, it allows anyone with an existing neuroglancer URL to just plop it into the input field and voila, you have your own viewer based on that URL. Alternatively, someone could start by creating an empty viewer with this app and then programmatically build it up however they want using the app's viewer (e.g. app.viewer).
https://github.com/holoviz-topics/neuro/assets/6613202/31851dad-ab9d-40db-bf6f-470f7279889e
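The sync mechanism described above can be illustrated generically: a shared state object notifies registered callbacks on every change, and the JSON panel is just one such listener. A toy stdlib-only sketch (the class and variable names are mine; the real app uses Neuroglancer's Python API and Panel components):

```python
import json

class SharedState:
    """Toy stand-in for a viewer's shared state: notifies listeners on change."""

    def __init__(self):
        self._state = {"position": [0, 0, 0]}
        self._listeners = []

    def add_changed_callback(self, fn):
        self._listeners.append(fn)

    def update(self, **changes):
        self._state.update(changes)
        for fn in self._listeners:
            fn(dict(self._state))

json_panel = {}  # stands in for the JSON pane on the right of the app

def sync_panel(state):
    json_panel["text"] = json.dumps(state)

viewer_state = SharedState()
viewer_state.add_changed_callback(sync_panel)
viewer_state.update(position=[10, 20, 30])  # simulate panning the viewer
print(json_panel["text"])  # {"position": [10, 20, 30]}
```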
Next steps:
Here is the revised class with the following updates:
https://github.com/holoviz-topics/neuro/assets/6613202/2ab2c2be-cc75-4e44-a5bd-f960ca765909
Next steps:
[…] viewer.app or viewer.display and get this app in the notebook without running anything else.
I haven't tried this yet but I'm super excited to, and thanks for putting this demo together! When I have feedback I will post it here.
⚠️ This issue is related to https://github.com/holoviz-topics/neuro/issues/87. There, the focus was on handling multi-scale data in the time dimension. In contrast, this issue focuses on multi-scale volumetric images (x, y, z).
Problem:
See https://github.com/holoviz-topics/neuro/issues/87
Description/Solution/Goals:
See https://github.com/holoviz-topics/neuro/issues/87 for general motivation. In contrast, the goal of this issue is multi-scale large image volumes, rather than downscaling in the time dimension.
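To make the multi-scale goal concrete, here is a minimal sketch (assuming NumPy, and my own function names) of building an image pyramid by repeated 2x mean-downsampling, which is the basic precomputation a multi-scale volume viewer selects between:

```python
import numpy as np

def downsample2x(vol: np.ndarray) -> np.ndarray:
    """Mean-pool a 3D volume by 2x along each axis (assumes even dimensions)."""
    z, y, x = vol.shape
    return vol.reshape(z // 2, 2, y // 2, 2, x // 2, 2).mean(axis=(1, 3, 5))

def pyramid(vol: np.ndarray, num_levels: int) -> list:
    """Level 0 is full resolution; each subsequent level halves every axis."""
    out = [vol]
    for _ in range(num_levels - 1):
        out.append(downsample2x(out[-1]))
    return out

vol = np.arange(64, dtype=np.float64).reshape(4, 4, 4)
levels = pyramid(vol, 3)
print([lv.shape for lv in levels])  # [(4, 4, 4), (2, 2, 2), (1, 1, 1)]
```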
Potential Methods and Tools to Leverage:
See https://github.com/holoviz-topics/neuro/issues/87. Also:
Tasks:
Use-Cases, Starter Viz Code, and Datasets:
Electron Microscopy (EM):