theli-ua opened this issue 2 weeks ago
Something like this should pass once this is implemented:
```rust
use std::net::IpAddr;

// `to_binary`, `to_string`, and `from_ion` refer to the crate's serde
// entry points and are assumed to be in scope.
#[test]
fn human_readable() {
    // IpAddr has a different repr depending on whether the codec is
    // considered human readable {true: string, false: byte array}
    let ip: IpAddr = "127.0.0.1".parse().unwrap();
    let expected_binary = [
        224, 1, 0, 234, 235, 129, 131, 216, 134, 113, 3, 135, 179, 130, 86, 52, 233, 129, 138,
        182, 33, 127, 32, 32, 33, 1,
    ];
    let expected_s = "\"127.0.0.1\" ";

    let binary = to_binary(&ip).unwrap();
    assert_eq!(&binary[..], &expected_binary[..]);

    let s = to_string(&ip).unwrap();
    assert_eq!(s, expected_s);

    assert_eq!(&from_ion::<IpAddr, _>(binary).unwrap(), &ip);
    assert_eq!(&from_ion::<IpAddr, _>(s).unwrap(), &ip);
}
```
Thank you for reporting this issue @theli-ua! I was able to reproduce it with your provided sample test. I will investigate the issue further and also look at your reference on `is_human_readable`.
`is_human_readable` is implemented as returning `true` by default. Ion's (de)serializer should return `false` for binary encodings. Serde's Serialize/Deserialize implementations (and/or custom Serialize/Deserialize implementations) use this flag to pick between encodings. One such example is `std::net::IpAddr`, which uses a string representation when the format is human readable and a byte array otherwise.
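For illustration, here is a minimal sketch (not ion-rs or standard library code) of the pattern `IpAddr` follows: the `Serialize` impl queries `serializer.is_human_readable()` and picks a textual or binary representation accordingly. The `Id` type and the use of `serde_json` to exercise the human-readable branch are assumptions made for this example.

```rust
use serde::{Serialize, Serializer};

// Illustrative type, standing in for std::net::IpAddr.
struct Id([u8; 4]);

impl Serialize for Id {
    fn serialize<S: Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
        if serializer.is_human_readable() {
            // Human-readable formats (Ion text, JSON, ...) get a dotted string.
            serializer.serialize_str(&format!(
                "{}.{}.{}.{}",
                self.0[0], self.0[1], self.0[2], self.0[3]
            ))
        } else {
            // Binary formats get the raw octets.
            serializer.serialize_bytes(&self.0)
        }
    }
}

fn main() {
    // serde_json leaves `is_human_readable` at serde's default of `true`,
    // so this takes the string branch.
    let json = serde_json::to_string(&Id([127, 0, 0, 1])).unwrap();
    assert_eq!(json, "\"127.0.0.1\"");
}
```

Because the decision lives inside the type's `Serialize` impl, the Ion serializers only need to report the correct flag; once the binary (de)serializer returns `false`, types like `IpAddr` should automatically switch to their byte representations.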