Closed: keller-mark closed this issue 1 month ago
I've run into this in spades with a use case I'm implementing over in https://github.com/DOI-USGS/rnz
The pattern has to iterate through every array in a zarr group and read each array's metadata individually -- essentially building consolidated metadata by hand.
My toy example is:

```
> zdump(r)
zarr {
dimensions:
	t = 5 ;
	y = 3 ;
	x = 5 ;
variables:
	<f8 data(t, y, x) ;
	<f8 i(y, x) ;
	<f8 j(y, x) ;
	<f8 t(t) ;
// global attributes:
}
```
But when a group contains lots of variables, it's painfully slow: every array costs a separate metadata read.
How did you imagine this working? I can start working up an implementation.
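To make the pattern concrete, here is a minimal, library-free sketch of what that per-array walk produces, assuming a zarr v2 key-value store. The helper name and the toy store contents are illustrative, not rnz's actual implementation; the `.zmetadata` layout follows zarr's consolidated-metadata convention.

```python
import json

def consolidate_metadata(store: dict) -> dict:
    """Collect every metadata key (.zgroup/.zarray/.zattrs) from a
    zarr v2 key-value store into a single `.zmetadata` document, so
    later readers need only one fetch instead of one per array."""
    meta_suffixes = (".zgroup", ".zarray", ".zattrs")
    consolidated = {
        key: json.loads(store[key])
        for key in store
        if key.rsplit("/", 1)[-1] in meta_suffixes
    }
    store[".zmetadata"] = json.dumps(
        {"zarr_consolidated_format": 1, "metadata": consolidated}
    )
    return consolidated

# Toy store with two of the arrays from the zdump output above;
# chunk keys (e.g. "data/0.0.0") are skipped during consolidation.
store = {
    ".zgroup": json.dumps({"zarr_format": 2}),
    "data/.zarray": json.dumps({"shape": [5, 3, 5], "dtype": "<f8"}),
    "t/.zarray": json.dumps({"shape": [5], "dtype": "<f8"}),
    "data/0.0.0": b"chunk-bytes",
}
consolidate_metadata(store)
```

The expensive part over HTTP is the loop: one request per metadata key. Writing the combined document once, server-side or at publish time, turns N requests into one.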
I've worked up a proposed pattern to pull down and cache consolidated metadata on initialization of an HTTP store. Does this work for you @keller-mark ?
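A rough sketch of that pattern, in Python for brevity: the store fetches `.zmetadata` once at construction and answers all subsequent metadata reads from the cache, so only chunk reads touch the network. The class name and the `fetch` callable are hypothetical stand-ins for rnz's actual HTTP store, not its API.

```python
import json

class CachedMetadataStore:
    """On init, fetch the consolidated `.zmetadata` document once and
    serve every later metadata read from that cache; only chunk keys
    fall through to the (hypothetical) `fetch` transport."""

    def __init__(self, fetch):
        self._fetch = fetch
        doc = json.loads(fetch(".zmetadata"))
        # Re-encode each cached metadata document as bytes, matching
        # what a raw key-value GET would have returned.
        self._meta = {key: json.dumps(value).encode()
                      for key, value in doc["metadata"].items()}

    def __getitem__(self, key):
        if key in self._meta:       # metadata: served from the cache
            return self._meta[key]
        return self._fetch(key)     # chunk data: one request each

# Fake transport that records every request, for illustration.
requests = []
def fake_fetch(key):
    requests.append(key)
    if key == ".zmetadata":
        return json.dumps({"zarr_consolidated_format": 1,
                           "metadata": {"t/.zarray": {"shape": [5]}}})
    return b"chunk-bytes"

store = CachedMetadataStore(fake_fetch)
store["t/.zarray"]   # cache hit: no network request
store["t/0"]         # chunk read: goes to the transport
```

With this shape, iterating all arrays in a group costs exactly one HTTP round trip for metadata, regardless of how many variables the group has.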
I think with #82 closed, we could call this done?