Closed: s-zeng closed this issue 4 years ago.
It is indeed intentional. Haskell has built-in support for numbers of arbitrary size, but Rust and the serde ecosystem don't. It would have been a lot of work for nothing, since serde would have rejected it anyway. Do you have a use for such huge numbers?
I personally don't, but I ran into this while writing tests for dhall-python, an implementation I maintain that depends on dhall-rust via FFI; since Python supports arbitrarily-sized integers, the tests exercise numbers this large. I guess it's infeasible to support arbitrarily large numbers while serde doesn't. Thanks for the info.
We could probably add support for deserializing it as a u128, no?
Probably not, actually, since it's currently being parsed as a u64:
https://github.com/Nadrieril/dhall-rust/blob/8be3891b1e30f61b1f38b96e1ed200f367066032/dhall/src/syntax/ast/expr.rs#L11
I could, yeah. But I'd rather wait until someone really needs to store such a huge number, so I can understand what they actually need.
I'm not sure whether unbounded naturals are within the scope of this project, but this is a bit of behaviour that bit me when using dhall-rust.
Consider the following program:
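(The program from the original report isn't reproduced here; the following is a minimal sketch of the kind of program that triggers the behaviour. It assumes serde_dhall is the entry point to dhall-rust, uses the builder-style `from_str(...).parse()` API, and picks 2^64 as a placeholder literal.)

```rust
fn main() {
    // 2^64, i.e. one more than u64::MAX: too large for dhall-rust's
    // internal representation of Natural literals.
    let expr = "18446744073709551616";

    // Assumed serde_dhall usage: parse and deserialize the Dhall expression.
    // The panic described below happens while the literal itself is parsed,
    // before the choice of target type even matters.
    let n: u64 = serde_dhall::from_str(expr)
        .parse()
        .expect("failed to evaluate Dhall expression");

    println!("{}", n);
}
```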
Running this with `cargo run` produces a panic due to exceeding the maximum size of a u64:
However, the Haskell implementation handles this perfectly fine:
Since this is a difference in behaviour between the implementations, I figured it should be reported as an issue. Let me know if this is intentional.