What I did
fixes: #20
First stab at this; it's made a little more complicated by the pydantic validation needed to use it in `BaseModel`.

When you write the validation method on the `Bytes32` class, it gets called in the `BaseModel` where it's being used, so `cls` is the implementing class, not `Bytes32`, and you don't have access to `bytes_length`. I'm open to alternative ways of making this a more flexible class, i.e. allowing the number of expected bytes to be passed in. All I ask is that we call the class `ByteSized` if we're able to get this dynamic validation working.
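As a rough illustration of the direction I mean, here is a minimal sketch (pydantic v1, using the `__get_validators__` hook) where the expected length lives on the type itself, so the validator's `cls` is the byte type rather than the model. The `Transfer` model and `tx_hash` field are made-up names just for the example:

```python
from pydantic import BaseModel


class ByteSized(bytes):
    # Subclasses set this to the number of bytes they expect.
    bytes_length: int = 32

    @classmethod
    def __get_validators__(cls):
        # pydantic v1 calls the validators yielded here with the raw value.
        # Because validate is a classmethod on the type (not a @validator on
        # the model), cls is ByteSized/Bytes32 and bytes_length is in scope.
        yield cls.validate

    @classmethod
    def validate(cls, value):
        if isinstance(value, str):
            # Accept "0x"-prefixed or bare hex strings.
            value = bytes.fromhex(value[2:] if value.startswith("0x") else value)
        if len(value) != cls.bytes_length:
            raise ValueError(f"expected {cls.bytes_length} bytes, got {len(value)}")
        return cls(value)

    def __serialize__(self) -> str:
        return "0x" + self.hex()


class Bytes32(ByteSized):
    bytes_length = 32


class Transfer(BaseModel):  # hypothetical model, only for illustration
    tx_hash: Bytes32


# The validator runs with cls == Bytes32, so the 32-byte check applies.
t = Transfer(tx_hash="0x" + "ab" * 32)
```

Subclassing keeps the length as a class attribute, which is one way around not being able to pass constructor arguments through a type annotation; whether this matches the layout in the branch is an open question.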
As for the JSON validation, it works well and we can pass the serialization for it in the `BaseModel` `Config` class by using the `json_encoders` dictionary and pointing it at the `__serialize__` method on `Bytes32`. The issue arises when you want `dict()` to also serialize differently; pydantic doesn't currently support this behavior, but it's being discussed:

https://github.com/samuelcolvin/pydantic/issues/1409
https://github.com/samuelcolvin/pydantic/issues/951
https://github.com/samuelcolvin/pydantic/issues/811
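For reference, a minimal self-contained sketch of the `json_encoders` wiring described above, and of the `dict()` gap (again pydantic v1; `Tx` and `tx_hash` are invented names for illustration):

```python
from pydantic import BaseModel


class Bytes32(bytes):
    @classmethod
    def __get_validators__(cls):
        yield cls.validate

    @classmethod
    def validate(cls, value):
        if isinstance(value, str):
            value = bytes.fromhex(value[2:] if value.startswith("0x") else value)
        if len(value) != 32:
            raise ValueError("expected 32 bytes")
        return cls(value)

    def __serialize__(self) -> str:
        return "0x" + self.hex()


class Tx(BaseModel):
    tx_hash: Bytes32

    class Config:
        # .json() routes Bytes32 values through __serialize__ ...
        json_encoders = {Bytes32: lambda v: v.__serialize__()}


t = Tx(tx_hash="0x" + "ab" * 32)
print(t.json())  # {"tx_hash": "0xabab...ab"}  -- hex string via json_encoders
print(t.dict())  # {'tx_hash': b'\xab...'}     -- raw bytes; dict() ignores json_encoders
```

The last line is the limitation the linked issues are about: `json_encoders` only affects `.json()`, so `.dict()` still returns the raw byte values.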
How I did it
How to verify it
Checklist