haf closed this issue 9 years ago
This one I'm torn on. I can definitely see what you mean, but I worry about breaking the 1:1 mapping between .NET and JSON types (despite the JSON number type being a horror story). Might sleep on this :)
For me, if I use decimal I've made a decision that I need the accuracy; otherwise I'd use a 64-bit type. Also, the PR still allows numeric input (which is what you might expect in API usage terms)...
Yeah, that's not unreasonable - the allowance for either input is probably enough to make it sane given a major version bump :)
(Replying by email on 20 April 2015 at 22:23 to Henrik Feldt's comment above.)
I just realised JSON numbers can have a potentially infinite number of decimal digits (according to the spec). Let's not use a string; instead, parse numbers to 128-bit accuracy if we're deserialising to a decimal? http://json.org/
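To sketch the idea (in Python rather than Chiron's F#, purely as an illustration, not the Chiron API): if the parser hands the raw numeric token to a high-precision decimal type instead of going through a 64-bit float, the extra digits survive. Python's `Decimal` stands in here for .NET's 128-bit `System.Decimal`; the JSON text is made up for the example.

```python
import json
from decimal import Decimal

text = '{"price": 0.1000000000000000000000000001}'

# Default path: JSON numbers become 64-bit floats, so the
# trailing digits are rounded away.
via_float = json.loads(text)["price"]

# Precision-preserving path: hand the raw numeric token
# straight to Decimal instead of float.
via_decimal = json.loads(text, parse_float=Decimal)["price"]

print(via_float)    # 0.1 -- the trailing digits are gone
print(via_decimal)  # 0.1000000000000000000000000001
```

The same shape applies to a deserialiser targeting `decimal`: keep the token's text until you know the target type, then parse it at full precision.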
I'm going to close this and create a new issue (I've reverted the string round-tripping in #30) with a more specific case of parsing to accuracy.
If we serialise a decimal as a JSON number, we lose the 128-bit accuracy; better to serialise it to a string by default and parse that string back.
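A minimal sketch of the two round-trips (again in Python as a stand-in for .NET's `decimal`; the value is made up): encoding the decimal as a JSON number forces it through a 64-bit float and drops digits beyond roughly 17 significant figures, while encoding its textual form as a JSON string round-trips exactly.

```python
import json
from decimal import Decimal

value = Decimal("1234567890.1234567890123456789")

# Number encoding: forcing the value through float loses precision.
as_number = json.dumps(float(value))
back_number = Decimal(str(json.loads(as_number)))

# String encoding: the textual form survives the round-trip intact.
as_string = json.dumps(str(value))
back_string = Decimal(json.loads(as_string))

assert back_string == value   # exact round-trip
assert back_number != value   # digits lost via the float path
```

The trade-off, as discussed above, is that consumers then see a JSON string where they might expect a number, which is why accepting either representation on input matters.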