namautravail opened this issue 1 year ago
Hi @namautravail
Yes, I've tried this. For three.js, they use the NRRD image's pixel data as input (or you can handle any 3D image format, read its pixels as an ArrayBuffer, and pass that to three.js via new THREE.DataArrayTexture(arraybuffer, width, height, depth);).
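A minimal sketch of that step, assuming a flat single-channel pixel buffer (the names and sizes here are placeholders):

```js
import * as THREE from "three";

// Placeholder data: a flat array of width * height * depth single-channel intensities;
// in practice this would come from decoding your own images.
const width = 256, height = 256, depth = 64;
const pixels = new Uint8Array(width * height * depth);

const texture = new THREE.DataArrayTexture(pixels, width, height, depth);
texture.format = THREE.RedFormat;      // one channel per voxel
texture.type = THREE.UnsignedByteType; // must match the typed array passed in
texture.needsUpdate = true;
// `texture` can then be bound to a sampler2DArray uniform in a ShaderMaterial.
```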
For my package, I also use a three.js shader to display the images. The benefit is that it takes DICOM images as input, so if you have a series of DICOM files, you can put their URLs into a list and display them with my copperScene loadDicom function:
scene.loadDicom(urls, {
  gui,
  getMesh(mesh) {
    // Access the generated mesh once it is ready.
    console.log(mesh);
  },
  getCopperVolume(copperVolume, updateTexture) {
    // Adjust the window width/center, then refresh the texture.
    copperVolume.windowWidth = 424;
    copperVolume.windowCenter = 236;
    updateTexture(copperVolume);
  },
  setAnimation(currentValue, depth, depthStep, copperVolume) {
    // Step through the slices and wrap around after the last one.
    currentValue += depthStep;
    if (currentValue > depth) {
      currentValue = 0;
    }
    return currentValue;
  },
});
For more information, you can see:
Hope these can help you!
Thanks
Hi Linkun,
Correct me if I am wrong please.
For NRRD files in your package, is the slicing at depth (Z) no longer done in a shader (but with the Volume & VolumeSlice helpers of three.js)? If so, could you tell me why you stopped slicing in the shader? Slicing in the shader is interesting for me because it could be faster.
Your work on NRRD is interesting because we can slice and show in 2D while dealing with a 3D volume as input.
The DICOM format seems to support only 2D images.
Thanks for all.
Hi @namautravail,
Just want to confirm: do you want to render the NRRD as dynamic 2D or 3D?
In my package, to deal with NRRD files I use the Volume & VolumeSlice helpers of three.js. This is because we need them for segmentation purposes, and we want each slice (on the axial, sagittal, and coronal views) to be static. We also want to render at real size (mm), so VolumeSlice is suitable for displaying the images at the correct physical dimensions. Roughly, that path looks like the sketch below.
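A rough, untested sketch of the static-slice approach with the three.js helpers (not my package's exact code; the file path is a placeholder):

```js
import * as THREE from "three";
import { NRRDLoader } from "three/examples/jsm/loaders/NRRDLoader.js";

const scene = new THREE.Scene();

// One static mesh per view; each is repainted only when its slice index changes.
new NRRDLoader().load("models/nrrd/example.nrrd", (volume) => {
  const sliceZ = volume.extractSlice("z", Math.floor(volume.RASDimensions[2] / 2)); // axial
  const sliceY = volume.extractSlice("y", Math.floor(volume.RASDimensions[1] / 2)); // coronal
  const sliceX = volume.extractSlice("x", Math.floor(volume.RASDimensions[0] / 2)); // sagittal
  scene.add(sliceZ.mesh, sliceY.mesh, sliceX.mesh);

  // Changing a slice later:
  // sliceZ.index = newIndex; sliceZ.repaint();
});
```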
Then for the 2D dynamic images, we chose to render the DICOM images via the three.js DataArrayTexture and ShaderMaterial, and use the total number of DICOM images as the depth of the 2D dynamic (shader) image.
As for the NRRD file, if we want to render it as a 2D dynamic image, we can do the same thing: take its voxel data, send it into a DataArrayTexture, and you get the NRRD 2D dynamic image based on the z direction, roughly like the sketch below.
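A rough sketch of that NRRD-to-DataArrayTexture step (untested; the file path is a placeholder):

```js
import * as THREE from "three";
import { NRRDLoader } from "three/examples/jsm/loaders/NRRDLoader.js";

new NRRDLoader().load("models/nrrd/example.nrrd", (volume) => {
  // Normalize the Uint16 voxels to 0..1 floats so the texture type stays simple.
  let max = 0;
  for (let i = 0; i < volume.data.length; i++) if (volume.data[i] > max) max = volume.data[i];
  const data = new Float32Array(volume.data.length);
  for (let i = 0; i < volume.data.length; i++) data[i] = max > 0 ? volume.data[i] / max : 0;

  // Each z slice becomes one layer of the array texture.
  const texture = new THREE.DataArrayTexture(data, volume.xLength, volume.yLength, volume.zLength);
  texture.format = THREE.RedFormat;
  texture.type = THREE.FloatType;
  texture.needsUpdate = true;

  // `texture` can now drive the same sampler2DArray shader as the DICOM series,
  // with volume.zLength as the animation depth.
});
```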
Currently, my package doesn't support the 2D dynamic image function with NRRD files, but I can add it later! However, as for rendering NRRD as a 3D dynamic image, I have no idea about it yet. : (
The left panel is a normal NRRD render in three.js; the right panel is the NRRD file rendered as a 2D dynamic image.
@namautravail Switch slices by depth (z).
Hi Linkun,
For your question: I want to render a single NRRD file (3D) in 2D. Semi-dynamic, i.e. the user can visualize it along the 3 axes, transverse, sagittal, and coronal (not in the same view, because we want to show it in 2D).
Could we store all the voxels in a Data3DTexture, then render any of the 3 planes and do the slicing operations entirely in a shader (on the graphics card), so it is faster? A rough sketch of what I have in mind is below, after the links.
Something like that, but rendering in 2D: https://github.com/mrdoob/three.js/blob/297724e33e8c22b00a3b05a8042bbb8f874243aa/examples/webgl2_materials_texture3d.html#L95
https://threejs.org/examples/?q=Texture#webgl2_materials_texture3d
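Here is the sketch (untested; the volume data is only a placeholder, and the axis swizzling is illustrative): keep the whole volume in a Data3DTexture and let the fragment shader pick the slice, so changing depth is just a uniform update on the GPU.

```js
import * as THREE from "three";

// Placeholder volume so the sketch is self-contained; in practice `voxels`, `width`,
// `height` and `depth` would come from the loaded NRRD, normalized to 0..1 floats.
const width = 64, height = 64, depth = 64;
const voxels = new Float32Array(width * height * depth);

const volumeTexture = new THREE.Data3DTexture(voxels, width, height, depth);
volumeTexture.format = THREE.RedFormat;
volumeTexture.type = THREE.FloatType;
volumeTexture.needsUpdate = true;

const sliceMaterial = new THREE.ShaderMaterial({
  glslVersion: THREE.GLSL3,
  uniforms: {
    uVolume: { value: volumeTexture },
    uSlice: { value: 0.5 }, // normalized position along the slicing axis (0..1)
  },
  vertexShader: /* glsl */ `
    out vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    precision highp sampler3D;
    uniform sampler3D uVolume;
    uniform float uSlice;
    in vec2 vUv;
    out vec4 outColor;
    void main() {
      // Axial view: x/y come from the quad UVs, z from the slice uniform.
      // Swap the components to get sagittal or coronal views.
      float v = texture(uVolume, vec3(vUv, uSlice)).r;
      outColor = vec4(vec3(v), 1.0);
    }
  `,
});

// Render the slice on a simple quad; moving a GUI slider only updates uSlice.
const sliceQuad = new THREE.Mesh(new THREE.PlaneGeometry(1, 1), sliceMaterial);
```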
To slice a given plane of the three at a specific depth and render it, how would we do it? Some ideas here, though I am not sure whether it is a good approach: https://blog.promaton.com/three-steps-to-slice-a-mesh-with-threejs-a7879ed54664
And indeed, how to keep the original dimensions, as you pointed out, thank you! (I suppose you were talking about the space directions field of NRRD, and the canvas dimensions: https://github.com/mrdoob/three.js/blob/46754408eb02fa9b22210d391c7d45c053cbf063/examples/jsm/misc/Volume.js#L358).
Hi @namautravail,
Thanks for sharing this resource! I see what you mean.
The approach seems reasonably practical. I will dig deeper into it later! : )
Then for texture3d, I also tried it with our NRRD file, but it didn't work. In the three.js example, it seems the shader was developed only for that particular NRRD file; if we want to render our NRRD like the example, it seems we need to write a new shader for it (not sure). I also noticed that the NRRD file used in the three.js texture3d example is a Float32Array, while our NRRD file, after loading into three.js, is a Uint16Array, so they are using different data formats for the texture3d rendering. However, even after converting our NRRD pixel array to Float32Array, it still does not work; it looks like the images below, not fully correct!
I think if we can figure out this issue, things will be easier, and we may easily achieve the semi-dynamic display.
Cheers!
Hi Linkun,
You put me on the right track by pointing out the data type issue.
Have you tried specifying type = THREE.UnsignedShort4444Type (https://github.com/mrdoob/three.js/blob/297724e33e8c22b00a3b05a8042bbb8f874243aa/examples/webgl2_materials_texture3d.html#L103) and removing the format line (https://github.com/mrdoob/three.js/blob/297724e33e8c22b00a3b05a8042bbb8f874243aa/examples/webgl2_materials_texture3d.html#L102)?
With these 2 changes, the image below is rendered (it comes from an NRRD whose type field is unsigned short, so the voxel data is stored in a Uint16Array); the changes are sketched in code below.
It seems the image is rendered on the coronal plane by default.
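In code, the two changes to that example look roughly like this (just how I modified it, not a verified general recipe; `volume` is the object from the example's NRRDLoader callback):

```js
// Inside the NRRDLoader callback of the texture3d example, where volume.data is a Uint16Array:
const texture = new THREE.Data3DTexture(volume.data, volume.xLength, volume.yLength, volume.zLength);
// 1) the example's `texture.format = THREE.RedFormat;` line (L102) is removed,
//    so the default RGBAFormat is kept;
// 2) the type (L103) is set so each 16-bit voxel is uploaded as-is.
texture.type = THREE.UnsignedShort4444Type;
texture.needsUpdate = true;
```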
Hi @namautravail,
Awesome, happy to see it works for you! : )
For my NRRDs it is very strange: the header shows the type is unsigned short, but when I use THREE.UnsignedShort4444Type and also remove THREE.RedFormat, it does not work. It only works when I convert the Uint16 data to Float32; then, after I adjust clim1, clim2, and the isothreshold, the image shape shows up.
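For reference, the conversion I mean is just a normalization pass before building the texture (a small sketch; the names are illustrative):

```js
// Turn the Uint16 voxels into a 0..1 Float32Array before creating the Data3DTexture.
function toNormalizedFloat32(uint16Data) {
  let max = 0;
  for (let i = 0; i < uint16Data.length; i++) {
    if (uint16Data[i] > max) max = uint16Data[i];
  }
  const out = new Float32Array(uint16Data.length);
  const scale = max > 0 ? 1 / max : 1;
  for (let i = 0; i < uint16Data.length; i++) {
    out[i] = uint16Data[i] * scale;
  }
  return out;
}

// const texture = new THREE.Data3DTexture(toNormalizedFloat32(volume.data),
//   volume.xLength, volume.yLength, volume.zLength);
// texture.format = THREE.RedFormat;
// texture.type = THREE.FloatType;
// ...then tune clim1 / clim2 / isothreshold in the example's GUI.
```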
And one thing, I don't know whether you have noticed it yet: the Data3DTexture only uses the pixel dimensions, it does not take the image millimetres into account, which means the model shown here is not at real scale, but compressed or stretched.
You can check the model width/height/depth, which equals the NRRD dimensions. This is incorrect, because the NRRD dimensions store the pixel width/height/depth, whereas the real width/height/depth of the NRRD is width*space_x, height*space_y, depth*space_z.
So I think before we put the data into the Data3DTexture, we still need to pre-process it.
You probably mean that the "space directions" field of the NRRD format is currently not taken into account in the volume shader (VolumeShader1). There may be something to do with the texture repeat function (https://stackoverflow.com/questions/33803280/three-js-how-do-i-scale-and-offset-my-image-textures).
Or to adjust the BoxGeometry dimensions: not the voxel array size as currently done, but multiplied by the space directions field of the NRRD, as is done here (https://github.com/mrdoob/three.js/blob/297724e33e8c22b00a3b05a8042bbb8f874243aa/examples/jsm/misc/VolumeSlice.js#L210).
The first option seems OK, the dimensions look right, but I should test it more.
The texture.repeat function seems to work on 2D images; I am not sure it also works in 3D.
One way I think we can do that is to scale the 3D image by its spacing.
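A minimal sketch of that idea, assuming `volume` comes from the three.js NRRDLoader and `mesh` is the box the Data3DTexture is rendered on:

```js
// Scale a mesh so the voxel grid is shown at its physical (mm) size.
// `volume.spacing` holds the voxel size along x / y / z, taken from the
// NRRD "space directions" field by the loader.
function applyVoxelSpacing(mesh, volume) {
  const [spacingX, spacingY, spacingZ] = volume.spacing;
  mesh.scale.set(spacingX, spacingY, spacingZ);
}

// Alternative: build the geometry with physical dimensions instead of voxel counts, e.g.
// new THREE.BoxGeometry(volume.xLength * spacingX, volume.yLength * spacingY, volume.zLength * spacingZ);
```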
Hi Linkun,
Can you tell me whether you have tried to slice the depth of the volume using a shader, as follows?
https://github.com/mrdoob/three.js/blob/297724e33e8c22b00a3b05a8042bbb8f874243aa/examples/webgl2_materials_texture2darray.html#LL35C1-L43C3
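Roughly, what I mean is a material like this sketch (paraphrasing that example, not a verbatim copy; the placeholder texture stands in for the DataArrayTexture built from the volume):

```js
import * as THREE from "three";

// Placeholder texture so the sketch is self-contained; in practice this would be the
// DataArrayTexture built from the image stack, one layer per slice.
const layerCount = 4;
const arrayTexture = new THREE.DataArrayTexture(new Uint8Array(4 * 4 * layerCount), 4, 4, layerCount);
arrayTexture.format = THREE.RedFormat;
arrayTexture.needsUpdate = true;

const material = new THREE.ShaderMaterial({
  glslVersion: THREE.GLSL3,
  uniforms: {
    uDiffuse: { value: arrayTexture },
    uDepth: { value: 0 }, // index of the layer (slice) to display
  },
  vertexShader: /* glsl */ `
    out vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    precision highp sampler2DArray;
    uniform sampler2DArray uDiffuse;
    uniform int uDepth;
    in vec2 vUv;
    out vec4 outColor;
    void main() {
      // Switching slices is only a uniform update; the whole stack stays on the GPU.
      float v = texture(uDiffuse, vec3(vUv, uDepth)).r;
      outColor = vec4(vec3(v), 1.0);
    }
  `,
});

// e.g. per frame or from a slider:
// material.uniforms.uDepth.value = (material.uniforms.uDepth.value + 1) % layerCount;
```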
Thanks