Open · kkyoda opened this issue 1 month ago
I am trying to use the ome2024-ngff-challenge tool to update metadata such as the license, organism, and modality in OME-Zarr filesets that were converted with bioformats2raw. I am now facing an issue where an error occurs when applying the tool to the following data:

https://ome.github.io/ome-ngff-validator/?source=https://ssbd.riken.jp/100118-dcacbb41/zarr/v0.4/Fig3a_FIB-SEM_synapse.zarr

I apologize if this is a basic question, but I would appreciate it if anyone could suggest a solution. For reference, Zarr filesets with updated metadata were successfully generated without any issues for the following data:

https://ome.github.io/ome-ngff-validator/?source=https://ssbd.riken.jp/100118-dcacbb41/zarr/v0.4/Fig5AC_Mitochondrial_MitoPB.zarr

will-moore commented:

Hi, the error "Incompatible chunk size constraints for dimension 3: read size of 1024, write size of 1538" indicates that the shard size (1538) is not compatible with the chunk size (1024): each shard dimension must be a whole multiple of the corresponding chunk dimension. You'll need to specify compatible shard and chunk sizes: see https://github.com/ome/ome2024-ngff-challenge?tab=readme-ov-file#optimizing-chunks-and-shards
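For concreteness, a hedged sketch of what a resave with explicit chunk and shard sizes could look like is below. The `--output-chunks` and `--output-shards` option names are taken from the README section linked above, but the numeric values are purely illustrative and would need to be adapted to the actual 5D (t, c, z, y, x) shape of this dataset; please double-check everything against `ome2024-ngff-challenge resave --help`.

```sh
# Illustrative sketch only, not a verified command line for this dataset.
# The constraint behind the "read size of 1024, write size of 1538" error is
# that every shard dimension must be a whole multiple of the matching chunk
# dimension (here 2048 = 2 x 1024). Any metadata options used for the other
# dataset (license, organism, modality) can be added to the same command.
ome2024-ngff-challenge resave \
    --output-chunks=1,1,1,1024,1024 \
    --output-shards=1,1,1,2048,2048 \
    Fig3a_FIB-SEM_synapse.zarr \
    Fig3a_FIB-SEM_synapse_v0.5.zarr
```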
kkyoda replied:

@will-moore Thank you for your quick response! I would like to try setting the shard and chunk sizes.
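As a possible starting point for choosing those values, the array shapes of the source data can be inspected first, so that chunk and shard sizes can be picked to fit them (with each shard a whole multiple of the chunk). A minimal sketch, assuming the ome-zarr-py package is installed and that its CLI can read the remote store:

```sh
# Hedged sketch: requires ome-zarr-py (pip install ome-zarr).
# "ome_zarr info" lists the multiscale resolution levels of an OME-Zarr image,
# which shows the axis sizes that the chunk and shard sizes need to fit.
ome_zarr info https://ssbd.riken.jp/100118-dcacbb41/zarr/v0.4/Fig3a_FIB-SEM_synapse.zarr
```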