tischi opened 3 years ago
Including all scale layers and segmentation:
$ du -sh em-raw.ome.zarr/
12G em-raw.ome.zarr/
Only the raw data at 100 nm^3 (to be precise, this is actually 80x80x100 nm):
$ du -sh em-raw.ome.zarr/s0/
8,2G em-raw.ome.zarr/s0/
Thanks!
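For reference, the total size of a scale pyramid can be roughly estimated from s0 alone: with isotropic factor-2 downsampling each level holds about 1/8 of the previous level's voxels, so the levels form a geometric series. This is only a sketch — the actual pyramid here uses a [1, 2, 2] first step, compression, and includes a segmentation layer, so the real numbers differ:

```python
# Rough estimate of what a factor-2 scale pyramid adds on top of s0.
# Each isotropic [2, 2, 2] level has ~1/8 the voxels of the previous one,
# so s1 + s2 + ... ~= s0 * (1/8 + 1/64 + ...) = s0 / 7.
s0_gb = 8.2  # size of s0 reported by `du -sh` above

overhead_gb = s0_gb / 7
total_gb = s0_gb + overhead_gb
print(f"estimated pyramid total: {total_gb:.1f} G (+{overhead_gb:.2f} G overhead)")
```

So the downscaled levels add only ~14% to the base layer; most of the 12 G vs 8.2 G gap above comes from the segmentation, not the pyramid.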
hi @constantinpape cc @joshmoore
cc @K-Meech (this is also relevant for your writer code, I think, because we should propose some good defaults).
- Could you also upload a version where the downsampling is based on powers of 3?
I can try to upload something later, but I would start from the same initial resolution, so we don't produce too much data.
- Do you already do something to account for initial anisotropy during downsampling?
Yes, I first downsample by [1, 2, 2] (in z, y, x axis convention).
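In numbers, a sketch of how the [1, 2, 2] first step restores isotropy, assuming the (10, 10, 25) nm full resolution quoted later in the thread and (z, y, x) axis order:

```python
# Voxel size after each downscaling step, in (z, y, x) nm.
# The first [1, 2, 2] step compensates the initial anisotropy in z.
voxel = (25, 10, 10)  # full resolution: 10 x 10 nm in-plane, 25 nm in z
factors = [(1, 2, 2), (2, 2, 2), (2, 2, 2)]

for f in factors:
    voxel = tuple(v * s for v, s in zip(voxel, f))
    print(voxel)
# (25, 20, 20) -> (50, 40, 40) -> (100, 80, 80): approximately isotropic
```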
- Do you do average binning or something fancier, like blurring and picking the central sample values?
Here, I just take the average. I tried a couple of different options a while back and in my experience average worked best for EM data.
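A minimal numpy sketch of average binning (not the actual implementation used here — just the reshape-and-mean idea, with factors in (z, y, x)):

```python
import numpy as np

def downsample_average(vol, factors):
    """Downsample a 3D volume by average binning; factors in (z, y, x)."""
    fz, fy, fx = factors
    z, y, x = vol.shape
    # crop so each axis is divisible by its factor
    vol = vol[: z - z % fz, : y - y % fy, : x - x % fx]
    return vol.reshape(
        vol.shape[0] // fz, fz,
        vol.shape[1] // fy, fy,
        vol.shape[2] // fx, fx,
    ).mean(axis=(1, 3, 5))

vol = np.arange(16, dtype=np.float64).reshape(1, 4, 4)
print(downsample_average(vol, (1, 2, 2)))
# [[[ 2.5  4.5]
#   [10.5 12.5]]]
```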
Maybe then start with [1, 3, 3], ok?
Sorry, I think what I wrote wasn't quite clear.
For the [3, 3, 3] downscaling I would start from (80, 80, 100) nm, which is (approximately) isotropic, and always downscale by [3, 3, 3].
To start with [1, 3, 3], I would need to start from the full resolution again, which is (10, 10, 25) nm.
I would rather avoid doing that, because it would take a while and produce a large amount of data.
> For the [3, 3, 3] downscaling I would start from (80, 80, 100) nm, which is (approximately) isotropic and always downscale by [3, 3, 3].
That's good!
@tischi I have uploaded the version downscaled with factors of 3 to embl/i2k-2020/em-raw3.ome.zarr/.
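The voxel sizes of such a factor-3 pyramid can be sketched as follows (assuming the (z, y, x) convention and the (80, 80, 100) nm start level from above; each [3, 3, 3] level holds 1/27 of the previous level's voxels, versus 1/8 for [2, 2, 2]):

```python
# Voxel sizes of a [3, 3, 3] pyramid starting from the isotropic level,
# in (z, y, x) nm. Two levels are shown; the real pyramid may have more.
voxel = (100, 80, 80)
levels = [voxel]
for _ in range(2):
    voxel = tuple(v * 3 for v in voxel)
    levels.append(voxel)
print(levels)
# [(100, 80, 80), (300, 240, 240), (900, 720, 720)]
```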
@constantinpape
How big is the em-raw (100 nm^3) data set in MB?
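For what it's worth, the uncompressed size follows directly from the array shape and dtype; the shape and dtype below are placeholders, not the actual dataset's:

```python
import numpy as np

shape = (1000, 1000, 1000)  # hypothetical (z, y, x) shape at 100 nm^3
dtype = np.dtype("uint8")   # typical for EM raw data
size_mb = int(np.prod(shape)) * dtype.itemsize / 1e6
print(f"{size_mb:.0f} MB uncompressed")
# 1000 MB uncompressed
```

The on-disk size of the zarr will be smaller than this if the chunks are compressed.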