balintbalazs opened 4 years ago
How was your hdf5 file created?
There is a similar issue with tips on resolving it here: https://github.com/PreibischLab/BigStitcher/issues/60
The files are from the Luxendo software. We directly copy the uint16 camera images to a uint16 HDF5 dataset using the C API. When opening them with the HDF5 Fiji plugin they are okay.
It seems the BDV reader assumes the int16 datatype without checking the actual datatype of the dataset: https://github.com/bigdataviewer/bigdataviewer-core/blob/97983caee4fc8ae6c664e981e1e51d9668a01993/src/main/java/bdv/img/hdf5/HDF5AccessHack.java#L197
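A minimal Java sketch, added purely to illustrate what that assumption means for camera values above 32767 (this is not code from the linked reader): the same 16-bit pattern reinterpreted as a signed short comes out negative, and the unsigned value can only be recovered by masking with 0xffff.

```java
public class SignedUnsigned16Demo
{
	public static void main( final String[] args )
	{
		// a uint16 camera value above the signed 16-bit maximum (32767)
		final int cameraValue = 40000;

		// storing the same bit pattern in Java's signed short wraps it to a negative number
		final short asSignedInt16 = ( short ) cameraValue; // -25536

		// masking with 0xffff recovers the original unsigned value
		final int recovered = asSignedInt16 & 0xffff; // 40000

		System.out.println( asSignedInt16 + " -> " + recovered );
	}
}
```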
I will try the suggestions in the linked thread to see if that helps, thanks for the info.
I made a minimal example that shows this breakage (attached) and discussed it a bit at https://forum.image.sc/t/bigstitcher-image-fusion-produces-black-bars/85726/10 and https://github.com/PreibischLab/BigStitcher/issues/129. The int16 file opens perfectly, while the uint16 file crashes BDV.
Ultimately, I fixed my writer in the same fashion as described at https://github.com/PreibischLab/BigStitcher/issues/60: I explicitly cast everything to int16 before writing to file. Should we be able to pass things into BDV as int16? Or should only uint16 be supported?
The behaviour you describe (always assuming int16) was how BDV worked for a long time.
I added proper support for more datatypes a while ago in https://github.com/bigdataviewer/bigdataviewer-core/pull/157. However, for this to be picked up, the datatype must be added as an attribute to each setup group. E.g., the info (pyramid resolutions, etc.) for the first setup is under group "/s00" in the h5 file. If "/s00" has a "dataType" string attribute with value "uint8", that means that the datasets for this setup are UnsignedByteType. If this is missing, then BDV falls back to the legacy behaviour (to keep supporting the old files).
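To make the attribute-or-fallback logic concrete, here is a minimal read-side sketch. It assumes the jHDF5 reader calls (HDF5Factory.openForReading, object().hasAttribute, string().getAttr) that mirror the writer calls quoted further down, plus the test_uint16.h5 file name from this thread; it is an illustration, not the actual BDV reader code.

```java
import ch.systemsx.cisd.hdf5.HDF5Factory;
import ch.systemsx.cisd.hdf5.IHDF5Reader;

public class DataTypeAttributeCheck
{
	public static void main( final String[] args )
	{
		final IHDF5Reader reader = HDF5Factory.openForReading( "test_uint16.h5" );

		// per-setup "dataType" attribute introduced in bigdataviewer-core PR #157;
		// if it is absent, the legacy assumption (int16) applies
		final String dataType = reader.object().hasAttribute( "s00", "dataType" )
				? reader.string().getAttr( "s00", "dataType" )
				: "int16 (legacy fallback)";

		System.out.println( "setup s00 stores " + dataType );
		reader.close();
	}
}
```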
So, you can make your test_uint16.h5 work by adding the attribute to the s00 .. s05 groups.
> Ultimately, I fixed my writer in the same fashion as described at https://github.com/PreibischLab/BigStitcher/issues/60: I explicitly cast everything to int16 before writing to file. Should we be able to pass things into BDV as int16? Or should only uint16 be supported?

Please don't do this... Moving forward, it would be better to add the dataType = uint16 attributes.
Something like

    IHDF5Writer hdf5Writer = HDF5Factory.open( "test_uint16.h5" );
    hdf5Writer.string().setAttr( "s00", "dataType", "uint16" );

in Java, or

    hdf5_writer['s00'].attrs['dataType'] = 'uint16'

in Python.
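Since the file in this thread has setup groups s00 through s05, here is a sketch that applies the same jHDF5 call to all of them; the file name and group names are simply the ones mentioned in this issue.

```java
import ch.systemsx.cisd.hdf5.HDF5Factory;
import ch.systemsx.cisd.hdf5.IHDF5Writer;

public class AddDataTypeAttributes
{
	public static void main( final String[] args )
	{
		final IHDF5Writer hdf5Writer = HDF5Factory.open( "test_uint16.h5" );

		// tag every setup group (s00 .. s05) with the datatype it actually stores
		for ( int setup = 0; setup <= 5; ++setup )
			hdf5Writer.string().setAttr( String.format( "s%02d", setup ), "dataType", "uint16" );

		hdf5Writer.close();
	}
}
```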
This issue has been mentioned on Image.sc Forum. There might be relevant details there:
https://forum.image.sc/t/bigstitcher-image-fusion-produces-black-bars/85726/14
Yep--that attribute specification fixes it. Thanks!
This is great, thanks a lot! I think this issue can be closed, since the new spec seems to cover this use case.
Is there a place with an up-to-date specification for the BigDataViewer format? Until now we have just used the original Nature Methods paper, plus the export function in Fiji, but those don't cover all use cases. Thanks again!
When opening a uint16 dataset, BigDataViewer tries to convert it to int16, which can fail if a value is too large for an int16. It produces the error below for each pixel where the value is larger than 32767. Because the error happens for each pixel, this can make Fiji unresponsive and it can crash. The datasets can be loaded correctly with the HDF5 plugin.

Environment:
Fiji, updated on 2020-08-03