mlevkov opened this issue 4 years ago
This would be great to have included as proper documentation, otherwise it might get lost as an issue :S Please consider writing it up similar to this (WIP) instead of a GH issue: https://github.com/media-io/yaserde/issues/87
@brainstorm This should probably live in the wiki. I did not know where to place it, and I did not want to lose this information for other people to consider, so I placed it here for now. @MarcAntoine-Arnaud Any suggestions?
@mlevkov I've recently moved the examples repo (which should be archived/deleted, right @MarcAntoine-Arnaud?) into its own folder in the main yaserde repo (see PR #106).
I think it would be great if you moved this great writeup and its code snippets into that new examples folder. The chances of it getting lost are lower, and the examples are part of CI, so your code would definitely be used, seen, and maintained better across versions, I reckon.
I hope I've convinced you to open a pull request at this point? :)
@brainstorm I've not touched this crate for a while now. I will take a look at what has changed, as I'm starting to use the crate again. Hopefully I'll be able to make a commit that follows your suggestion and puts the example in the location you've indicated. Thank you for the note.
Hopefully, this issue acts as an example for anyone writing a custom serializer and deserializer for this crate.
The XML standard allows an attribute to hold a list of values of a specific type (i.e. a vector). The values are space-separated and can only be of one type. I'm not sure about mixed-type scenarios, but my case was specific to a single type signifier.
The YaSerde library provides the YaSerialize and YaDeserialize traits, which you can implement for your own types when the library does not already support them. In this case, attempting to serialize/deserialize an attribute holding values such as attributeX="1 2 3 6 5 0" into or from a Vec::<u32> would return an error, because the library has no built-in support for a vector of u32 in an attribute. The same goes for the list of values in attributeY="a b c d e f g"
. The way to resolve this issue is by constructing a separate struct that conveys the specific value intention. For example, if your list contains only u32 values, you would create a small wrapper struct, together with a custom de/serialization implementation for it.
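A minimal sketch of what such a wrapper can look like, with the de/serialization logic shown as plain helper functions (the helper names are mine; in the real code this logic lives inside the YaSerialize/YaDeserialize impls):

```rust
// Wrapper type for a space-separated list of u32 values held in one
// XML attribute, e.g. attributeX="1 2 3 6 5 0".
#[derive(Debug, Default, PartialEq)]
pub struct UintVector {
    pub items: Vec<u32>,
}

impl UintVector {
    // Parsing half: split the attribute text on whitespace and parse
    // each token, failing on the first non-numeric token.
    pub fn from_attribute(text: &str) -> Result<Self, String> {
        let items = text
            .split_whitespace()
            .map(|tok| tok.parse::<u32>().map_err(|e| e.to_string()))
            .collect::<Result<Vec<u32>, String>>()?;
        Ok(UintVector { items })
    }

    // Writing half: join the values back into one space-separated string.
    pub fn to_attribute(&self) -> String {
        self.items
            .iter()
            .map(|v| v.to_string())
            .collect::<Vec<_>>()
            .join(" ")
    }
}
```

Round-tripping `"1 2 3 6 5 0"` through `from_attribute` and `to_attribute` reproduces the original attribute string.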
However, you may have a list of alphanumeric values that can be defined as a Vec::<String>, such as attributeY="a b c d e f g". In that case you'd have to create a second struct and a respective, nearly identical implementation.
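A sketch of the string variant, again with plain helper functions standing in for the YaSerialize/YaDeserialize impls (helper names are mine):

```rust
// Second wrapper, this time for alphanumeric tokens, e.g.
// attributeY="a b c d e f g". Note how closely it mirrors the u32
// version -- this duplication is what the generic rewrite removes.
#[derive(Debug, Default, PartialEq)]
pub struct StringList {
    pub items: Vec<String>,
}

impl StringList {
    // Parsing: every whitespace-separated token becomes one entry.
    // Unlike the numeric case, this cannot fail.
    pub fn from_attribute(text: &str) -> Self {
        StringList {
            items: text.split_whitespace().map(str::to_string).collect(),
        }
    }

    // Writing: join the entries back with single spaces.
    pub fn to_attribute(&self) -> String {
        self.items.join(" ")
    }
}
```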
rustfmt and clippy will complain about the code duplication. Any additional type de/serialization would increase the duplication further, and you start to wonder whether this is a good approach. Since all I'm really doing here is implementing the same (similar) logic for each specific type, can't I make the type generic while keeping the logic in place? Generic code comes to the rescue. The two types referenced above can be implemented with a single struct, XMLVector, generic over a type parameter T.
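A sketch of such a generic wrapper, assuming bounds of FromStr for parsing and Display for writing (the helper names are mine; the real implementations would sit inside the yaserde traits):

```rust
use std::fmt::Display;
use std::str::FromStr;

// One generic wrapper replaces the per-type wrappers. T only needs to
// parse from a string token (FromStr) and print back to one (Display).
#[derive(Debug, Default, PartialEq)]
pub struct XMLVector<T> {
    pub items: Vec<T>,
}

impl<T> XMLVector<T>
where
    T: FromStr + Display,
    <T as FromStr>::Err: Display,
{
    // Parsing: split on whitespace and parse each token as a T,
    // converting any parse error into a plain String error.
    pub fn from_attribute(text: &str) -> Result<Self, String> {
        let items = text
            .split_whitespace()
            .map(|tok| tok.parse::<T>().map_err(|e| e.to_string()))
            .collect::<Result<Vec<T>, String>>()?;
        Ok(XMLVector { items })
    }

    // Writing: render each value and join with single spaces.
    pub fn to_attribute(&self) -> String {
        self.items
            .iter()
            .map(|v| v.to_string())
            .collect::<Vec<_>>()
            .join(" ")
    }
}
```

The same code now covers both earlier cases: `XMLVector::<u32>::from_attribute("1 2 3 6 5 0")` and `XMLVector::<String>::from_attribute("a b c d e f g")`.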
I can now replace my custom implementations for each type: UintVector becomes XMLVector::<u32> and StringList becomes XMLVector::<String>. The code duplication issue goes away, and this generic code can be applied to any type supported by the library. One caveat: rustfmt treats Vec::<String> and Vec<String> as the same and rewrites the former, while the YaSerde library supports the Vec::<String> notation, not the Vec<String> notation, at the time of this note. To avoid autoformat issues, you'd want to mark the struct field with #[rustfmt::skip]
to allow your type to be written the way the library expects.

I hope this note helps other folks with the creation of custom de/serializers. If I wrote anything inaccurately, or if you have a better suggestion, all comments and suggestions are welcome. I also want to credit @alechenthorne for helping with this effort. @MarcAntoine-Arnaud and team, you're welcome to comment.
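For reference, such an annotated field might look like the following sketch; the containing struct and field name are hypothetical, and the #[yaserde(attribute)] wiring is omitted:

```rust
// Stand-in for the generic wrapper discussed above.
pub struct XMLVector<T> {
    pub items: Vec<T>,
}

// Hypothetical container type; the real field would also carry a
// #[yaserde(attribute)] annotation so it maps to an XML attribute.
pub struct Delivery {
    // Skip rustfmt here so the turbofish form the library expects,
    // XMLVector::<u32>, is not rewritten to XMLVector<u32>.
    #[rustfmt::skip]
    pub attribute_x: XMLVector::<u32>,
}
```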