PreibischLab / BigStitcher

ImgLib2/BDV implementation of Stitching for large datasets
GNU General Public License v2.0

Pairwise shift calculation throws exception if pixel values outside 0-32768 range #60

Open VolkerH opened 4 years ago

VolkerH commented 4 years ago

I have recently been using https://github.com/nvladimus/npy2bdv from @nvladimus to create BigStitcher xml/h5 files directly from Python. I have been doing this with various datatypes (uint8, uint16, int32, etc.). All of these projects read fine in BigStitcher and display fine in BigDataViewer.

However, there is an issue with some data types during pairwise stitching. If any values in the dataset are outside the range 0...32768, a type conversion that appears to happen during the pairwise shift calculation throws an exception. Now that I have found it, I can work around it by clipping the values to the allowed range, but ideally this would be fixed in the BigStitcher code.
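For illustration, a minimal sketch of such a clipping workaround (hypothetical names; `tile` stands for the uint16 numpy array that is later handed to npy2bdv):

```python
import numpy as np

# Hypothetical uint16 tile as it would later be passed to npy2bdv.
tile = np.random.randint(0, 65536, size=(64, 640, 540), dtype=np.uint16)

# Clip to the signed 16-bit range (0 ... 32767) before writing,
# so the later ushort -> short conversion cannot overflow.
tile = np.clip(tile, 0, 32767).astype(np.uint16)
```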

(Thu Oct 10 22:09:03 AEDT 2019): Requesting Img from ImgLoader (tp=0, setup=70)
1: [0, 0] -> [539, 639], dimensions (540, 640)
1: [1782.489, 3651.7675] -> [2321.489, 4290.7675], dimensions (539.0, 639.0)
2: [0, 0] -> [539, 639], dimensions (540, 640)
2: [1336.773, 3651.7675] -> [1875.773, 4290.7675], dimensions (539.0, 639.0)
O: [1782.489, 3651.7675] -> [1875.773, 4290.7675], dimensions (93.28399999999988, 639.0)
1: [0.0, 0.0] -> [93.28399999999988, 639.0], dimensions (93.28399999999988, 639.0)
1: [1, 1] -> [92, 638], dimensions (92, 638)
2: [445.7160000000001, 0.0] -> [539.0, 639.0], dimensions (93.28399999999988, 639.0)
2: [447, 1] -> [538, 638], dimensions (92, 638)
FFT
PCM
expand
cross 14
sort
done
1.0,1.2839999999998781,1.0
1.0,1.0,-2.0
shift (pixel coordinates): (2.8640000000004875, -8.0, 0.0)
shift (global coordinates): (1.0, 0.0, 0.0, 2.8640000000004875, 0.0, 1.0, 0.0, -8.0, 0.0, 0.0, 1.0, 0.0)
cross-corr: -0.11947696403771602
java.util.concurrent.ExecutionException: ncsa.hdf.hdf5lib.exceptions.HDF5DatatypeInterfaceException: Datatype:Can't convert datatypes ["..\..\src\H5Tconv.c line 5377 in H5T__conv_ushort_short(): can't handle conversion exception
"]
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:192)
    at net.preibisch.stitcher.algorithm.globalopt.TransformationTools.computePairs(TransformationTools.java:622)
    at net.preibisch.stitcher.plugin.Calculate_Pairwise_Shifts.processPhaseCorrelation(Calculate_Pairwise_Shifts.java:179)
    at net.preibisch.stitcher.gui.popup.CalculatePCPopup$MyActionListener$1.run(CalculatePCPopup.java:180)
    at java.lang.Thread.run(Thread.java:748)
Caused by: ncsa.hdf.hdf5lib.exceptions.HDF5DatatypeInterfaceException: Datatype:Can't convert datatypes ["..\..\src\H5Tconv.c line 5377 in H5T__conv_ushort_short(): can't handle conversion exception
"]
    at ch.systemsx.cisd.hdf5.hdf5lib.H5.H5Dread(Native Method)
    at ch.systemsx.cisd.hdf5.hdf5lib.H5D.H5Dread(H5D.java:381)
    at bdv.img.hdf5.HDF5AccessHack.readShortMDArrayBlockWithOffset(HDF5AccessHack.java:198)
    at bdv.img.hdf5.HDF5AccessHack.readShortMDArrayBlockWithOffset(HDF5AccessHack.java:183)
    at bdv.img.hdf5.Hdf5ImageLoader$SetupImgLoader.loadImageCompletely(Hdf5ImageLoader.java:438)
    at bdv.img.hdf5.Hdf5ImageLoader$SetupImgLoader.getImage(Hdf5ImageLoader.java:506)
    at bdv.img.hdf5.Hdf5ImageLoader$SetupImgLoader.getFloatImage(Hdf5ImageLoader.java:566)
    at net.preibisch.stitcher.algorithm.DownsampleTools.openAndDownsample(DownsampleTools.java:148)
    at net.preibisch.stitcher.algorithm.RAIProxy.loadIfNecessary(RAIProxy.java:54)
    at net.preibisch.stitcher.algorithm.RAIProxy.dimension(RAIProxy.java:179)
    at net.preibisch.stitcher.algorithm.PairwiseStitching.getShift(PairwiseStitching.java:234)
    at net.preibisch.stitcher.algorithm.globalopt.TransformationTools.computeStitching(TransformationTools.java:281)
    at net.preibisch.stitcher.algorithm.globalopt.TransformationTools$2.call(TransformationTools.java:570)
    at net.preibisch.stitcher.algorithm.globalopt.TransformationTools$2.call(TransformationTools.java:553)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    ... 1 more
Exception in thread "Thread-475" java.lang.NullPointerException
    at net.preibisch.stitcher.plugin.Calculate_Pairwise_Shifts.processPhaseCorrelation(Calculate_Pairwise_Shifts.java:201)
    at net.preibisch.stitcher.gui.popup.CalculatePCPopup$MyActionListener$1.run(CalculatePCPopup.java:180)
    at java.lang.Thread.run(Thread.java:748)
nvladimus commented 4 years ago

I encountered a very similar bug today. Fiji/BigStitcher crashes while opening datasets with high pixel values: whenever any pixel intensity is above 32k, it crashes at file opening. Example dataset: 32 MB.

StephanPreibisch commented 4 years ago

Hi, what is the pixel type?

nvladimus commented 4 years ago

uint16 in my case.

StephanPreibisch commented 4 years ago

Then maybe you do not save it the "right" way, i.e. the way we save a uint16. Maybe make a ramp TIFF image [0 ... 65535], save it as HDF5 in BigStitcher, and inspect the result to understand the convention? Java only has a signed int16, which needs to encode the uint16 values.
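For illustration, a small numpy sketch of that signed encoding (my reading of the convention, not taken from the BigStitcher export code): the uint16 bit pattern is kept and merely reinterpreted as int16, so values above 32767 end up as negative shorts.

```python
import numpy as np

# Full uint16 ramp [0 ... 65535], as in the ramp image suggested above.
ramp = np.arange(65536, dtype=np.uint32).astype(np.uint16)

# Reinterpret the same bits as signed int16: 0..32767 stay unchanged,
# 32768..65535 wrap to -32768..-1 (only the type changes, not the bits).
ramp_signed = ramp.view(np.int16)

print(ramp_signed[0], ramp_signed[32767], ramp_signed[32768], ramp_signed[-1])
# prints: 0 32767 -32768 -1
```

Comparing such a ramp against what BigStitcher itself writes to HDF5 should show whether this is actually the convention used.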

VolkerH commented 4 years ago

I just had a look. It seems that @constantinpape has identified and addressed this issue of mapping types between numpy and Java in his code here, including value range adjustments.

https://github.com/constantinpape/pybdv/blob/master/pybdv/dtypes.py

What still confuses me is that the one tile I had in my projects is visualized correctly in BDV (so maybe there is some value range checking and clipping happening before visualization). It only crashes during pairwise shift computation (in the hdf5lib).

StephanPreibisch commented 4 years ago

Hi @tpietzsch, do you maybe have any idea what could cause the above exception? The HDF5 is written from Python and there are type conversion problems which should be fixed. However, I am puzzled by this concurrency exception too ... maybe it just vanishes once the type is fixed. Thank you so much!

constantinpape commented 4 years ago

> I just had a look. It seems that @constantinpape has identified and addressed this issue of mapping types between numpy and Java in his code here, including value range adjustments.

@VolkerH, yes, I implemented this a while ago and it should take care of shifting the value ranges from the unsigned representation [0...65535] to the signed representation [-32k...32k], or throw a ValueError if the input data does not fit the value range. This works for me, but I haven't checked it extensively for different data type combinations. If you use this code and find any issues, please let me know.
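Not the actual pybdv code (see dtypes.py linked above), but roughly the kind of check described, assuming the bit-reinterpretation convention sketched earlier in the thread:

```python
import numpy as np

def to_bdv_int16(data):
    """Illustrative sketch only, not the pybdv implementation."""
    if data.dtype == np.int16:
        return data
    if data.dtype == np.uint16:
        # Assumed BDV convention: keep the uint16 bit pattern as int16,
        # so 0..32767 stay unchanged and 32768..65535 map to -32768..-1.
        return data.view(np.int16)
    raise ValueError(f"Cannot map dtype {data.dtype} to BDV's 16-bit storage")
```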

nvladimus commented 4 years ago

Fixed the uint16 -> int16 conversion in my code, npy2bdv. Thanks everyone for the helpful discussion.

VolkerH commented 4 years ago

@nvladimus thanks, I will test it in the coming days and raise an issue in your repo if problems arise. I know how to work around it on the Python side when generating the file. It just confused me that BDV would interpret and display the generated file correctly, while the phase correlation code would throw an exception.