Open brianthomas opened 10 years ago
Attributes are assigned to dataset objects in HDF5, and they are similar to the keywords in the headers of FITS files. Attributes can have arbitrary names and any data type that is supported for datasets in HDF5. I have now edited the use case to cover this.
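A minimal sketch of what that looks like in h5py (the file name, dataset name, and attribute names below are purely illustrative):

```python
import h5py
import numpy as np

# HDF5 attributes attach small metadata items to a dataset (or group),
# much like FITS header keywords, but with arbitrary names and any
# dtype that HDF5 datasets support.
with h5py.File("example.h5", "w") as f:
    dset = f.create_dataset("spectrum", data=np.arange(10.0))
    dset.attrs["TELESCOP"] = "hypothetical instrument"  # string attribute
    dset.attrs["EXPTIME"] = 300.0                       # float attribute
    dset.attrs["channels"] = np.array([1, 2, 3])        # array-valued attribute
```

Unlike FITS keywords, the attribute values here are not limited to scalars; arrays and compound types are allowed as well.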
Oops, accidentally opened a new discussion. I've just tried to extract requirements from the use case. Please take a look at requirement 1.4 and the related sub-requirements to see whether I've missed anything.
I think requirement 1.4 captures the point of the use case well. Perhaps you could also mention collections of interferometry data explicitly in section 1.4.1 because of their special characteristics (multi-dimensional datasets). In section 1.4.2 you could add that data products of varying types can also be arbitrarily nested (no maximum depth of nesting).
OK, added your refinements to requirement 1.4.2. As for 1.4.1, aren't interferometry data covered by the admixture of "images, data cubes, spectra, tables, etc."? Have I missed a fundamental data product type that we need to add to Requirement 1?
Would a dimension scale count as a different data product type? It is a pretty useful feature: http://docs.h5py.org/en/latest/high/dims.html particularly for something like interferometry data.
This is essentially a mapping of one array onto another as a scale, somewhere in between metadata and actual data. Could this go in this use case, or in case 4?
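To make the "mapping of one array onto another" concrete, here is a sketch using h5py's dimension scale API (dataset names and sizes are made up for illustration):

```python
import h5py
import numpy as np

with h5py.File("dims.h5", "w") as f:
    # A 2-D dataset, e.g. visibilities with a baseline axis and a frequency axis.
    data = f.create_dataset("vis", data=np.zeros((4, 8)))
    # A 1-D coordinate array for the second axis.
    freq = f.create_dataset("frequency", data=np.linspace(1.0, 2.0, 8))
    freq.make_scale("frequency")     # mark the 1-D array as a dimension scale
    data.dims[1].attach_scale(freq)  # map it onto axis 1 of the data
    data.dims[1].label = "frequency"
```

The scale is itself an ordinary dataset, which is what makes it sit "somewhere in between metadata and actual data": it carries real values, but its role is to describe an axis of another dataset.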
The updated version of the requirement looks good! I think dimension scales would also be useful for labelling interferometry data or other multi-dimensional datasets.
This use case is basically a request for some kind of organized container. Can the container itself also be held within a higher-level container? I'm not so familiar with HDF5; can you describe the metadata needs a bit more? What kinds of associations might you want between the data within these structures? Is that covered by HDF5, or could it be further illuminated?
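For reference, HDF5 groups do nest to arbitrary depth, and links can express associations between products. A sketch (all paths and names below are hypothetical):

```python
import h5py

with h5py.File("nested.h5", "w") as f:
    # Groups nest like directories, so a container can hold containers.
    obs = f.create_group("observation_1/night_2/exposure_3")
    obs.create_dataset("image", shape=(2, 2))
    # A soft link associates one location in the file with another,
    # e.g. a convenience alias pointing at the newest exposure.
    f["latest"] = h5py.SoftLink("/observation_1/night_2/exposure_3")
```

Whether these HDF5-level links are enough, or whether the use case needs richer, typed associations on top of them, seems to be exactly the open question here.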