Inspired to test the proposal structure of #239 (and replacing #230), I refactored the `ophys` submodule to follow the proposed workflow. This shows how much more readable the code becomes, and how well it generalizes to allow easy creation of subwidgets that differ only slightly in how they set up different parts of the total visualization.
The TwoPhotonSeries visualizations are a good test case for this because they use essentially the same types of plots for the planar view of both single-plane instances (3D data) and volumetric cases (4D data). Without some kind of inheritance structure, any improvement made to one plot would have to be duplicated in code for the other. We also have several projects going on right now and coming up (looking at MICrONS) that use these modalities, so further enhancements are no doubt to come.
Several such improvements have been made here:

- The addition of a rotation feature allows the user to choose a different axis of orientation.
- Contrast levels can be sensitive to different source data; `plotly.express.imshow` allows either automated methods (these are just 'OK', I'd say) or manual specification of ranges.
- The contrast options were starting to add clutter to the window, so for users who aren't looking to control them (and of course, by default) these are all hidden behind a 'Simplified' view toggle, which can then be expanded. This will be especially useful if we eventually add faceting, downsampling, filtering, or coloration controllers to allow maximum user control over the configuration of the call to `plotly.express.imshow`.
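A minimal sketch of the rotation and contrast ideas above, using NumPy only (the helper names here are hypothetical illustrations, not the actual widget code): rotation is just `np.rot90` over the viewing plane, and the 'automated' contrast method is approximated as a percentile estimate, while a manual range passes through unchanged. In the real widgets these limits would end up as the `zmin`/`zmax` arguments to `plotly.express.imshow`.

```python
import numpy as np

def rotate_plane(frame: np.ndarray, rotations: int = 1) -> np.ndarray:
    """Hypothetical helper: rotate a 2D plane by 90-degree increments."""
    return np.rot90(frame, k=rotations)

def contrast_range(frame: np.ndarray, manual=None, lower_pct=2.0, upper_pct=98.0):
    """Hypothetical helper: pick (zmin, zmax) for plotly.express.imshow.

    A user-specified manual range wins; otherwise fall back to a simple
    percentile-based estimate (one of the 'just OK' automated methods).
    """
    if manual is not None:
        return tuple(manual)
    return (
        float(np.percentile(frame, lower_pct)),
        float(np.percentile(frame, upper_pct)),
    )

frame = np.arange(100.0).reshape(10, 10)
rotated = rotate_plane(frame)              # same shape, rotated orientation
zmin, zmax = contrast_range(frame)         # automated percentile estimate
zmin, zmax = contrast_range(frame, manual=(10.0, 90.0))  # manual override
```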
Both the SinglePlaneVisualization and the PlaneSliceVisualization share these new features; the only difference between them is that the latter has a single extra controller for specifying the plane index. As such, the subclass does as little as possible to produce that key difference.
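An illustrative sketch of that relationship (plain Python stand-ins, not the actual widget code): the base class sets up all the shared controllers, and the subclass only appends the one extra plane-index piece.

```python
class SinglePlaneVisualization:
    """Sketch: base visualization for single-plane (3D) data."""

    def __init__(self):
        self.controllers = {}
        self.setup_controllers()

    def setup_controllers(self):
        # Shared controllers: rotation, contrast, the 'Simplified' toggle, etc.
        self.controllers["rotation"] = "rotation controller"
        self.controllers["contrast"] = "contrast controller"

class PlaneSliceVisualization(SinglePlaneVisualization):
    """Sketch: volumetric (4D) variant — inherits everything, adds one controller."""

    def setup_controllers(self):
        super().setup_controllers()
        self.controllers["plane_index"] = "plane-index slider"
```

With this structure, any future improvement to the shared controllers lands in both visualizations for free.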
I also tried to keep all controllers here as general as possible; only the final assembly-level ones live in `ophys_controllers.py`, since those are specifically designed for use by the other `ophys` classes. This includes the controllers that have multiple components and their own internally interacting observers, which is one of the design proposals from @h-mayorquin that wound up helping a lot to reduce complexity.
Another really helpful suggestion from @h-mayorquin was the MultiController class for assembling these components and exposing their attributes at the outer level, so we don't have to meticulously (and sometimes painfully) recurse down through the levels of `ipywidgets.Box.children` to find an object we want to reference. It also makes these references more explicit, and thus the code more readable.
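A rough sketch of that idea, using plain Python objects instead of ipywidgets (so the actual MultiController will differ): the assembler promotes each component's public attributes to the top level, so you can write `multi.rotation` instead of digging through nested `children` tuples.

```python
class MultiController:
    """Sketch: flatten component attributes onto the assembled controller."""

    def __init__(self, components):
        self.components = list(components)
        for component in self.components:
            for name, value in vars(component).items():
                # Promote public attributes, first-come-first-served.
                if not name.startswith("_") and not hasattr(self, name):
                    setattr(self, name, value)

class RotationController:
    def __init__(self):
        self.rotation = 0

class ContrastController:
    def __init__(self):
        self.contrast_range = (0.0, 1.0)

multi = MultiController([RotationController(), ContrastController()])
# multi.rotation and multi.contrast_range are now directly accessible,
# with no recursion through nested container children.
```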
I also reorganized the file structure to be more readable and navigable, instead of keeping everything in a single file. When you start piling on more visualizations for neurodata types from a single modality like that, it gets hard to scroll through looking for items you want to find; even Ctrl-F or GitHub link navigation doesn't always help much.
If this is too much to review all at once, I'd be happy to break the individual controllers into modular PRs accompanied by their own tests and even documentation. I figured it might be nice to see an explicit example of the proposal structure from #239 in practice, and how it can help us keep this project manageable going forward.
Video demonstration:
https://user-images.githubusercontent.com/51133164/205493069-e894bb0e-07d9-4003-9172-c7dd877c8ac1.mp4
TODO: