Lauenburg closed this issue 1 year ago.
@william-silversmith I have the same issue. Do you have any ideas why this could happen? This is how we convert the NumPy array into a precomputed file:

```
igneous image create <source> <target> --resolution 8,8,30 --chunk-size 50,50,5 --compress none
```
Ah, I think I know what's going on here. I was trying to be clever: I read .npy files with np.memmap so that you can easily work with very large files. The problem is that np.memmap treats the whole file as raw array data, so the .npy header gets interpreted as data, which probably causes the weird shifts you are seeing. A second function, np.lib.format.open_memmap, parses the header and maps only the data. I'll check whether the file has a header and then use the appropriate function.
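The difference between the two functions can be reproduced with a small self-contained example (the array shape and temp file here are illustrative only):

```python
import numpy as np
import tempfile, os

# Save a small array as a .npy file (which includes a header).
arr = np.arange(12, dtype=np.uint8).reshape(3, 4)
path = os.path.join(tempfile.mkdtemp(), "example.npy")
np.save(path, arr)

# np.memmap maps the raw bytes, so the .npy header is read as data
# and every subsequent value is shifted.
raw = np.memmap(path, dtype=np.uint8, mode="r")
print(raw.size > arr.size)       # True: header bytes are counted as data

# np.lib.format.open_memmap parses the header and maps only the data.
mm = np.lib.format.open_memmap(path, mode="r")
print(np.array_equal(mm, arr))   # True
```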
Hi, I updated the create function and it should work a bit better now in 4.19.0. There is also support for HDF5 and crackle files. You'll need to install h5py manually. Give it a try and let me know how it goes.
Ack, I boofed it. I'll release a fixed version in a couple hours.
Check out the latest version!
Hi @william-silversmith, Now it works like a charm! Thank you so much for all your work!
I am trying to recover the original NumPy dataset from a Neuroglancer precomputed dataset.
For this, I am running through the following steps:
In code:
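(The snippet below is not the exact code, just a hedged sketch of such a round trip, assuming the precomputed volume is read back with CloudVolume; the paths and variable names are illustrative.)

```python
import numpy as np
from cloudvolume import CloudVolume

# Hypothetical paths; substitute your own dataset locations.
vol = CloudVolume("file:///path/to/precomputed", mip=0)
img_cv = np.asarray(vol[:, :, :])[..., 0]  # drop the channel axis

img_np = np.load("/path/to/original.npy")

# The round-trip check that fails below.
assert np.array_equal(img_cv, img_np)
```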
However, the assert fails... Checking the volumes, it seems that the final column of voxels (128×128×z) of the precomputed volume gets shifted to the front. Plotting part of the volumes confirms this:
When applying

```python
img_csubvol = np.roll(img_csubvol, -128, axis=2)
```

`img_cv_subvol` and `img_np` look the same, but the assert still fails. Creating a heatmap shows that the last column is not only shifted to the front but also altered, as if some filter were applied:
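The shift itself is easy to reproduce in isolation. A toy example with a column width of 4 instead of 128 (all sizes here are illustrative):

```python
import numpy as np

# A toy volume whose last "column" along axis 2 is distinctive.
vol = np.zeros((2, 2, 8), dtype=np.uint8)
vol[:, :, -4:] = 1  # last 4 slices stand in for the 128-voxel column

shifted = np.roll(vol, 4, axis=2)  # move the last 4 slices to the front
print(shifted[0, 0])               # [1 1 1 1 0 0 0 0]

# Rolling back by the same amount undoes the shift exactly,
# so a residual mismatch must come from something other than the shift.
print(np.array_equal(np.roll(shifted, -4, axis=2), vol))  # True
```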
For the assert to succeed, you have to cut the first column off the recovered dataset and the last column off the original dataset:
Could someone tell me what is going wrong here and how I can prevent it?