[Open] tcompa opened this issue 11 months ago
Note: ome-zarr-py already does something more than just type 1 validation; see e.g. https://github.com/ome/ome-zarr-py/blob/master/ome_zarr/format.py#L220:
```python
def validate_well_dict(
    self, well: dict, rows: List[str], columns: List[str]
) -> None:
    super().validate_well_dict(well, rows, columns)
    # A well path must be "<row>/<column>", i.e. exactly two groups.
    if len(well["path"].split("/")) != 2:
        raise ValueError(f"{well} path must exactly be composed of 2 groups")
    row, column = well["path"].split("/")
    if row not in rows:
        raise ValueError(f"{row} is not defined in the plate rows")
    if well["rowIndex"] != rows.index(row):
        raise ValueError(f"Mismatching row index for {well}")
    if column not in columns:
        raise ValueError(f"{column} is not defined in the plate columns")
    if well["columnIndex"] != columns.index(column):
        raise ValueError(f"Mismatching column index for {well}")
```
A relevant example for case 2 is the following: image metadata includes `multiscales`, which include `datasets`, which have `path` properties. Do these match some actually-existing zarr subgroups? How can we check?
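One way to frame this check, as a minimal sketch independent of any particular zarr library: take the image attributes plus the set of subgroup/array paths that actually exist in the hierarchy, and report every declared dataset `path` that is missing. The function and variable names here are assumptions, not part of any existing API:

```python
def missing_dataset_paths(attrs: dict, existing_paths: set[str]) -> list[str]:
    """Return dataset paths declared in `multiscales` metadata that are
    absent from the set of paths actually present in the zarr group."""
    missing = []
    for multiscale in attrs.get("multiscales", []):
        for dataset in multiscale.get("datasets", []):
            path = dataset["path"]
            if path not in existing_paths:
                missing.append(path)
    return missing

# Toy example: two scales declared, but only array "0" exists on disk.
attrs = {"multiscales": [{"datasets": [{"path": "0"}, {"path": "1"}]}]}
print(missing_dataset_paths(attrs, {"0"}))  # ['1']
```

In a real validator, `existing_paths` would be obtained by listing the group's members (e.g. walking the store), which is exactly the part that makes this a "case 2" check: it needs storage access, not just the metadata.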
Definitely! See https://github.com/JaneliaSciComp/pydantic-ome-ngff/commit/90507a970b4f9264a212eb24792fe6143c1e6e17 for an example of this kind of check.
Validating a zarr group takes place at least at two levels: (1) validating the metadata itself against the spec, and (2) checking that the metadata is consistent with the actual contents of the zarr hierarchy.