vigji closed this issue 9 months ago
awesome, I'll add that in for now, then when the updated bg-space
comes out I'll update
should work in bg-space now
Where do we want to have this note? I will write something in the readme atm, let me know if it should go somewhere else as well.
I suppose on the docs too?
an additional page in gitbook you mean?
Closing this issue, as I think the issues are either addressed or out of date. If anyone is having further trouble, feel free to create a new issue about the specific problem.
As per the slack discussion:
There is some ambiguity when working on displaying numpy arrays, coming from the fact that image-first applications (e.g. matplotlib's imshow, napari) follow the convention that, as the numpy array indices increase in a slice, you move downward and rightward in the image. In contrast, cartesian plotting assumes that with increasing values you move upward and rightward in the plot. As a result, inconsistencies can arise. Napari takes care of these inconsistencies by adopting the image-first convention for points as well.
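A minimal illustration of the two conventions (a sketch with made-up pixel/point values, not code from the discussion):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

img = np.zeros((100, 100))
img[10, 80] = 1  # row 10, column 80 in image-index convention

fig, (ax_im, ax_sc) = plt.subplots(1, 2)

# Image-first convention (imshow, napari): the row index grows DOWNWARD,
# so the bright pixel appears near the TOP of the panel.
ax_im.imshow(img)

# Cartesian convention: y grows UPWARD, so plotting the same coordinates
# as (x=80, y=10) puts the point near the BOTTOM of the panel.
ax_sc.scatter([80], [10])

# imshow inverts the y axis; scatter does not:
print(ax_im.get_ylim())  # decreasing limits (inverted axis)
print(ax_sc.get_ylim())  # increasing limits
```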
This has a series of consequences:
1. If we want to describe the raw ARA origin as we get it, in the bg-space convention it is actually “asr” and not “asl”. To be sure about this, we can do the following:
and we can confirm that what gets dimmed is the right hemisphere and not the left one. Conclusion: document that the ARA description is confusing, and move the standard BG orientation to “asr”.
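The check referenced above was elided from the thread; a hedged reconstruction (using a stand-in random stack in place of the ARA template, with a made-up shape) might look like:

```python
import numpy as np

# Stand-in for the ARA template (the real stack's shape differs); the claim
# under test is that axis 2 starts at the RIGHT side ("asr"), not the left.
stack = np.random.rand(132, 80, 114)

# Dim the low-index half of axis 2.
dimmed = stack.copy()
dimmed[:, :, : dimmed.shape[2] // 2] *= 0.2

# Inspect in napari (not run here):
# import napari
# napari.view_image(dimmed)
#
# If the dimmed half turns out to be the RIGHT hemisphere, axis 2 starts
# at the right, i.e. the orientation is "asr".
```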
2. This introduces confusion for applications using cartesian indexing. As a simple example, this
produces a correct napari view (with 2 dots in the left hemisphere and 1 in the right one), but a flipped scatterplot, as I guess brainrender would do. The easiest solution, ugly as it might sound, would be to just invert one axis (it does not really matter which one) in cartesian-indexed applications such as brainrender. This:
produces a plot with the correct orientation. For 2D applications (we don't have any), one would have to flip the y axis when displaying.
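Both snippets in this item were elided from the thread; a hedged sketch of the problem and of the one-axis flip (with made-up point coordinates) could be:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

# Points in napari's image-index convention, as (row, col) pairs.
points = np.array([[20, 30], [60, 35], [40, 70]])

# napari displays them consistently with the stack (not run here):
# viewer = napari.view_image(stack)
# viewer.add_points(points)

# A naive cartesian scatter of the same coordinates is vertically flipped:
fig, ax = plt.subplots()
ax.scatter(points[:, 1], points[:, 0])

# The proposed fix: invert one axis in the cartesian-indexed tool.
ax.invert_yaxis()  # "down" in the plot now matches "down" in the image
```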
3. Finally, we always need to consider that when looking at sliced data (both with matplotlib and with the napari viewer in 2D mode), we are looking from the perspective of the slicing coordinate at point 0: for asr orientation, this means looking from the front, which always inverts left and right hemispheres (the left side of the image is the right hemisphere).
This is 100% a display problem, as it will arise even with matching underlying stack and probe coordinates, just by using different kinds of views. So I would not solve it by messing with the data (e.g. exporting “brainrender compatible” coordinates), but by making one application (I would suggest brainrender, flipping an axis) compatible with the other.
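To make the left/right swap concrete, a toy sketch (assuming bg-space's origin-letter convention, i.e. “asr” means axis 2 starts at the subject's right):

```python
import numpy as np

# Toy "asr" stack: axis order (anterior->posterior, superior->inferior,
# right->left). Index 0 along axis 2 is the subject's RIGHT side.
stack = np.zeros((10, 12, 14))
stack[:, :, 0] = 1  # mark the rightmost sagittal plane

# A frontal slice at slicing coordinate 0:
frontal = stack[0]  # shape (12, 14): (superior->inferior, right->left)

# imshow/napari draw column 0 at the LEFT edge of the screen, so the
# marked plane (the subject's right side) appears on the image's left.
print(frontal[:, 0])  # the marked column sits at screen-left
```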
What needs to be done: move the standard BG orientation to “asr”, or better, address this concurrent description when instantiating the SpaceConvention from bg-space, in a way that resolves the ambiguity (see https://github.com/brainglobe/bg-space/issues/6). Such a solution would be the neatest, because then visualisation tools could reference some SpaceConvention methods to figure out the correct flips;
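As an illustration of the kind of flip such SpaceConvention methods could encode, a hand-rolled numpy sketch (not bg-space's actual API, which handles this properly, including axis transpositions):

```python
import numpy as np

def flip_axis_map(source: str, target: str, stack: np.ndarray) -> np.ndarray:
    """Map a stack between two origin descriptions (e.g. "asl" -> "asr")
    by reversing every axis whose origin letters disagree.

    Hand-rolled illustration only; bg-space also handles axis reordering.
    """
    # Opposite anatomical directions: a/p, s/i, l/r.
    opposite = {"a": "p", "p": "a", "s": "i", "i": "s", "l": "r", "r": "l"}
    assert len(source) == len(target) == stack.ndim
    slices = []
    for s, t in zip(source, target):
        if s == t:
            slices.append(slice(None))        # axis unchanged
        elif opposite[s] == t:
            slices.append(slice(None, None, -1))  # axis flipped
        else:
            raise ValueError("axes must match or be opposite in this sketch")
    return stack[tuple(slices)]

stack = np.arange(24).reshape(2, 3, 4)
flipped = flip_axis_map("asl", "asr", stack)
# Only the last axis is reversed; the others are untouched.
```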