manoaman opened this issue 2 years ago:

Hi,
Does the current version of Neuroglancer support the NIfTI file format? If so, do you have a working example of how to specify the source URL and associated template JSON file(s) to use? e.g.) info
Thank you, m
It supports Nifti as long as the files aren't too large (the entire file is loaded at once).
There are no JSON files involved for the nifti format.
You can use:
nifti://https://host/path/to/data.nii
gzipped nifti files are also supported:
nifti://https://host/path/to/data.nii.gz
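For a quick test from the neuroglancer Python package, a source URL like the above can be attached to an image layer roughly as follows (a minimal sketch; the host and path are placeholders):
import neuroglancer

viewer = neuroglancer.Viewer()
with viewer.txn() as s:
    # Point an image layer directly at a (gzipped) NIfTI file served over HTTPS.
    s.layers['nifti'] = neuroglancer.ImageLayer(
        source='nifti://https://host/path/to/data.nii.gz')

print(viewer)  # prints a URL that opens the viewer with this layer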
Nice, thank you @jbms . Could I use a segmentation metadata JSON file to load a segment properties mapping? Is there a way to associate it with the source .nii.gz file? e.g.) a segment_properties/info JSON file
Hi @jbms
It supports Nifti as long as the files aren't too large (the entire file is loaded at once).
What would be the appropriate size for Nifti files? I'm also seeing an "Unsupported data type: FLOAT64." error, so I wonder if .nii files need to be converted to the precomputed format instead.
Neuroglancer can't use float64 directly since it is not available in webgl, but you could add some code to the nifti datasource to automatically convert float64 to float32 --- that would be pretty easy.
The limit on the size of the files is determined by the limits of your GPU on 3d and 2d textures, which you can find here: https://webglreport.com/?v=2
If only 2 dimensions have a size > 1 it will be stored as a 2-d texture, otherwise as a 3-d texture.
On my system 2-d textures are limited to 16384x16384 and 3-d textures are limited to 2048x2048x2048.
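Until such a conversion exists in the nifti data source, one workaround is to rewrite the file as float32 offline; a minimal nibabel sketch (filenames are placeholders):
import nibabel as nib
import numpy as np

img = nib.load('data_float64.nii.gz')
# Read the voxel data, cast to float32, and write a new file whose header
# records a float32 on-disk dtype.
data = np.asanyarray(img.dataobj).astype(np.float32)
out = nib.Nifti1Image(data, img.affine, img.header)
out.header.set_data_dtype(np.float32)
nib.save(out, 'data_float32.nii.gz')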
Hi again, with some Nifti files I am seeing toNormalized: no matching overloaded function found in the shader control UI. Could I be missing something here? Thanks.
#uicontrol vec3 color color(default="white")
#uicontrol float min slider(default=0, min=0, max=1, step=0.01)
#uicontrol float max slider(default=1, min=0, max=1, step=0.01)
#uicontrol float brightness slider(default=0, min=-1, max=1, step=0.1)
#uicontrol float contrast slider(default=0, min=-3, max=3, step=0.1)
float scale(float x) {
return (x - min) / (max - min);
}
void main() {
  emitRGB(color *
          vec3(scale(toNormalized(getDataValue(0))) + brightness,
               scale(toNormalized(getDataValue(1))) + brightness,
               scale(toNormalized(getDataValue(2))) + brightness) *
          exp(contrast));
}
Is your data type a signed integer type? toNormalized is not defined for signed integers at the moment.
However, there is now also a better alternative to using toNormalized and separate min/max sliders or brightness/contrast sliders: you can use an invlerp control, which combines those functions into a single control and also displays the data distribution:
#uicontrol vec3 color color(default="white")
#uicontrol invlerp normalized0(channel=0)
#uicontrol invlerp normalized1(channel=1)
#uicontrol invlerp normalized2(channel=2)
void main () {
emitRGB(color * vec3(normalized0(), normalized1(), normalized2()));
}
hi @jbms , yes, from what I can see the failing Nifti files have datatype int16. Should they be converted to float16 or another datatype?
Thank you for the suggestion on the shader control. I see an invalid UI control specification error. Do I need to update the Neuroglancer code to the latest version?
Neuroglancer does not support float16 currently.
The invlerp control was added over a year ago, but if you are using a very old version then you will have to update.
Ok, maybe I'm mistaken about which data types are supported, but please correct me if I'm wrong @jbms . I have Nifti files for float32 and uint32 and they seem to load okay. Are these the supported data types at the moment?
int16 should work with the invlerp control, but you may need to use a more recent version of Neuroglancer -- perhaps try with the version we host at https://neuroglancer-demo.appspot.com
Actually, my mistake --- you need to specify the invlerp controls as:
#uicontrol vec3 color color(default="white")
#uicontrol invlerp normalized0(channel=[0])
#uicontrol invlerp normalized1(channel=[1])
#uicontrol invlerp normalized2(channel=[2])
void main () {
emitRGB(color * vec3(normalized0(), normalized1(), normalized2()));
}
Hmm... I thought I tried the latest code but I still get the Invalid UI control error for whatever reason. Unfortunately, I don't have a Nifti file that I can share publicly. Do you happen to know of an open-source Nifti file I can try with the neuroglancer-demo app?
I think there is a bug whereby the generic "Invalid UI control specification" error is shown on the first line.
You can leave the first line of the shader blank, and then perhaps can see a more informative error message by hovering over the errors shown on later lines.
These are the errors I see.
In order to reference one of the dimensions of your dataset as a "channel" dimension from the shader, you need to rename it to end in ^. So for example if it is currently named c, you need to rename it c^, which you can do from the source tab under "output dimensions" by selecting the dimension name and typing in a different name.
Okay, I'm seeing the updated UI but there are still errors. Could I be missing anything here?
Only the single dimension that you want to be treated as a channel dimension should be renamed to end in ^ --- any dimension that you want to be treated as a spatial dimension should be left alone.
However, it looks like this dataset does not have more than one channel, though there could be a bug in Neuroglancer's nifti support.
If you think that is incorrect, and are able to share the data file with me, or a similar file but with the actual data values replaced with zero, then I can try to investigate it.
Also I noticed that the m source dimension is listed as having a scale of 0 --- you should change that to some non-zero value.
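If you are driving Neuroglancer from Python rather than through the UI, the same renaming and scale override can be expressed on the data source transform. A rough sketch only: the URL, dimension names, and 50um scale are assumptions for illustration, and the exact keyword names may differ between neuroglancer versions:
import neuroglancer

viewer = neuroglancer.Viewer()
with viewer.txn() as s:
    s.layers['image'] = neuroglancer.ImageLayer(
        source=neuroglancer.LayerDataSource(
            url='nifti://https://host/path/to/data.nii.gz',
            transform=neuroglancer.CoordinateSpaceTransform(
                # Name the last dimension with a trailing ^ so it is treated as a
                # channel dimension; give the spatial dimensions a non-zero 50um scale.
                output_dimensions=neuroglancer.CoordinateSpace(
                    names=['x', 'y', 'z', 'c^'],
                    units=['m', 'm', 'm', ''],
                    scales=[50e-6, 50e-6, 50e-6, 1]))))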
Yes, @jbms , thank you for pointing that out; this is definitely my lack of knowledge of the shader control UI and the source tab. Indeed the Nifti file only has one channel. After playing around with the default (grayscale) shader control and adjusting the normalization bar, the image appears as expected for the int16 datatype.
As you pointed out, fixing the ^ on t in the screenshot and setting a non-zero scale on the m source dimension resolved the errors shown in the GUI.
Thank you very much for offering your help! -m
Hi @jbms ,
I am loading 50um resolution images and I cannot figure out why a loaded Nifti file does not appear at the correct scale, as shown with the annotation mesh example down below. "10um Lower Upper Scale" is not appearing in each row of the source matrix. I have doubts whether the Nifti file has a correct header. Do you know what attributes could be affecting this? Any suggestions on how to debug this would be appreciated. Thanks! -m
Nifti image (50um image resolution)
Annotation mesh (50um image resolution)
There may be a problem with the Nifti header, but in any case you can just override the resolution:
For the nifti image, under "source dimensions", just set the "scale" to 50um for all 3 --- and then you can also use the little arrow button to set the output dimension resolution to match.
If your mesh is supposed to be at 50um resolution why does your file have it at 10um?
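One way to check what the file's header actually claims, rather than guessing from the UI, is to print the relevant fields with nibabel; a small sketch, with a placeholder filename:
import nibabel as nib

img = nib.load('data.nii.gz')
hdr = img.header
print(hdr.get_zooms())       # voxel sizes, in the units reported below
print(hdr.get_xyzt_units())  # spatial and temporal units, e.g. ('mm', 'sec')
print(img.affine)            # qform/sform affine used for orientation and scale
print(hdr['dim'])            # dim[0] = number of dimensions, then the sizes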
For the nifti image, under "source dimensions", just set the "scale" to 50um for all 3 --- and then you can also use the little arrow button to set the output dimension resolution to match.
Setting the scale seems straightforward to me, but here is how the view looks. I cannot seem to get the view to the correct zoom level or position so that the image appears as in the second screenshot. (The second screenshot is simply the Nifti file loaded and zoomed in.) How can I fix this?
If your mesh is supposed to be at 50um resolution why does your file have it at 10um?
Thank you for mentioning this. Good point. After looking at the info file, I think the precomputed files were generated at 10um resolution. Instead, the files need to be reprocessed at 50um. That explains why the scale is not appearing as expected.
You can fix the position by clicking on each of the x, y, and z positions at the top and using the dropdown slider to set it to something in range. For the zoom level, there is unfortunately no "reset zoom" button at the moment.
It is odd that the scale bar is showing just "100" --- no units. That looks like a bug. You might try opening a new blank neuroglancer window, and then copying the layer (by dragging its layer tab) to the new window. That will bring the layer but reset other view parameters.
After a few adjustments, I think I reached the Neuroglancer state I was expecting in the first place. Thanks for the tips. Do you happen to know which Nifti header fields I can fix so that Neuroglancer picks up the values in the source matrix as shown in this screenshot?
It is odd that the scale bar is showing just "100" --- no units. That looks like a bug. You might try opening a new blank neuroglancer window, and then copying the layer (by dragging its layer tab) to the new window. That will bring the layer but reset other view parameters.
Yes, creating a new layer changed "100" to "100 m".
hmm, I suspect the xyzt_units needs to be set to mm, sec? (just a stab in the dark)
I personally have seen a lot of nifti's produced with the header field unset.
hmm, I suspect the xyzt_units needs to be set to mm, sec? (just a stab in the dark) I personally have seen a lot of nifti's produced with the header field unset.
Interesting. What about the orientation and the translation values? @xgui3783
Honestly, I have not paid much attention to the sform or qform. I only notice something is awry when the nifti is not displaying properly. For us, this is usually caused by one of the following:
- xyzt_units unset (for us, it needs to be mm, sec, rather than unknown, unknown)
- dims[4] and dims[5] need to be 1 (some of our nifti files set them to 0, for whatever reason)
You're right @xgui3783 . Once I set xyzt_units and qform, the source UI in Neuroglancer loads up as expected. Thanks for shedding light on this.
import nibabel as nib
import numpy as np

image = nib.load('test.nii.gz')
hdr = image.header

# qform: 50um isotropic voxels (units set to 'micron' below), with the
# standard affine bottom row of [0, 0, 0, 1]
aff = np.diag([50.0, 50.0, 50.0, 1.0])
hdr.set_qform(aff)

# spatial and temporal units
hdr.set_xyzt_units('micron', 'sec')

# save the file
nib.save(image, 'reoriented.nii.gz')
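As a quick sanity check on the rewritten file (including the dims[4]/dims[5] point above), one could reload it and print the header fields; a small, hedged addition to the same script:
# Reload the rewritten file and confirm the header fields took effect.
check = nib.load('reoriented.nii.gz')
print(check.header.get_xyzt_units())  # expect ('micron', 'sec')
print(check.header.get_qform())       # expect 50 on the spatial diagonal
print(check.header['dim'])            # dims[4]/dims[5] should be 1, not 0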