Seems like there's no harm in allowing `dtype` to be anything; at worst we will just fail an `isinstance()` check or fail an encoding check later, but it should be possible to declare arbitrary dtypes if people want to implement their own serialization logic on top of that.
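As a sketch of the "fail later at `isinstance()`" behavior — the `check_dtype` helper here is purely illustrative, not part of the actual API:

```python
from pydantic import BaseModel


class Point(BaseModel):
    x: float
    y: float


def check_dtype(value: object, dtype: type) -> object:
    """Hypothetical validator: accept any declared dtype and only
    fail when a concrete value doesn't match it at validation time."""
    if not isinstance(value, dtype):
        raise TypeError(
            f"expected {dtype.__name__}, got {type(value).__name__}"
        )
    return value


# An arbitrary dtype (here, a pydantic model) just works:
check_dtype(Point(x=1.0, y=2.0), Point)
# check_dtype("nope", Point) would raise TypeError
```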
We just use the `Any` schema for the inner type if we can't generate a schema for it.
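A minimal sketch of that fallback using pydantic's `TypeAdapter` — `inner_schema_for` is an illustrative stand-in, not the actual implementation:

```python
from pydantic import PydanticSchemaGenerationError, TypeAdapter
from pydantic_core import core_schema


def inner_schema_for(dtype: type) -> core_schema.CoreSchema:
    """Use the dtype's own core schema when pydantic can build one,
    otherwise fall back to the Any schema."""
    try:
        return TypeAdapter(dtype).core_schema
    except PydanticSchemaGenerationError:
        return core_schema.any_schema()


class NotAModel:
    """Pydantic can't generate a schema for a plain class like this."""


assert inner_schema_for(int)["type"] == "int"
assert inner_schema_for(NotAModel)["type"] == "any"
```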
Tests are simple: just that we allow a pydantic model and reject an incorrect model. We also test for schema generation (though we trust pydantic to generate the JSON schema, we just test that it is in fact created).
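Roughly, the tests look like this (pytest-style sketch; the `TypeAdapter`-based validation here stands in for the real container API):

```python
from pydantic import BaseModel, TypeAdapter, ValidationError


class Inner(BaseModel):
    value: int


def test_allows_pydantic_model():
    # A valid instance of the declared dtype passes validation
    assert TypeAdapter(Inner).validate_python({"value": 1}).value == 1


def test_rejects_incorrect_model():
    # Data that doesn't match the dtype is rejected
    try:
        TypeAdapter(Inner).validate_python({"value": "not an int"})
        raise AssertionError("should have raised ValidationError")
    except ValidationError:
        pass


def test_json_schema_is_created():
    # We trust pydantic for schema correctness; just check one exists
    assert TypeAdapter(Inner).json_schema()
```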
Currently tested with numpy and dask; hdf5 won't support this, and zarr currently has a bug but otherwise should support it: https://github.com/zarr-developers/zarr-python/issues/2081