Open toshke994 opened 2 years ago
I cannot confirm.
Doing a quick test on dotnetfiddle (https://dotnetfiddle.net/JfJMxt) using Newtonsoft.Json 13.0.1, token.ToObject<decimal>() preserves all twelve decimal places.
If you are using an older Newtonsoft.Json version, try updating to 13.0.1 or newer.
If you use the current version and the problem still appears, could you perhaps provide a compact, self-contained and runnable code example (for example on dotnetfiddle) that demonstrates the issue?
Here is an example: https://dotnetfiddle.net/qKexXD
Thanks for the demo.
The observed behavior is an unfortunate product of both Newtonsoft.Json's default behavior of reading floating-point numbers from the json data as double, and the behavior of .NET's built-in double-to-decimal conversion. The issue is not really clear-cut: is it a problem of Newtonsoft.Json, or is it a problem of .NET's built-in double-to-decimal conversion? (See also the end of this post...)
Fortunately, if you are interested in obtaining decimal numbers from the json data, you can instruct Newtonsoft.Json to read floating-point numbers from the json data as decimal instead of double, and thus avoid the observed issue altogether. You can do so by setting the FloatParseHandling property of either the JsonTextReader or the JsonSerializerSettings to FloatParseHandling.Decimal.
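To illustrate, here is a minimal sketch of that setting (FloatParseHandling and JsonSerializerSettings are from the library; the surrounding program is just for demonstration):

```csharp
using System;
using Newtonsoft.Json;

class FloatParseHandlingDemo
{
    static void Main()
    {
        // By default, JSON floating-point numbers are read as double,
        // which can lose precision before any conversion to decimal.
        var settings = new JsonSerializerSettings
        {
            // Read JSON floating-point numbers directly as decimal instead.
            FloatParseHandling = FloatParseHandling.Decimal
        };

        decimal value = JsonConvert.DeserializeObject<decimal>(
            "12345.123456789012", settings);

        Console.WriteLine(value); // 12345.123456789012 -- all twelve decimal places preserved
    }
}
```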
The rest of the post below is just a dive into the technical details underlying the observed behavior; you don't need to read it if you don't want to. The important takeaway is that you can likely solve your issue in a simple manner by utilizing FloatParseHandling.Decimal for the deserialization of your data.
The observed behavior essentially relies on Newtonsoft.Json's default behavior of reading floating-point numbers from the json data as double. It also depends on the actual numeric value (more specifically, on how many significant digits the value has). When choosing the right -- or perhaps "wrong" -- value, even token.ToString() will be affected by a rounding error due to the inherent precision limitation of double.
Choosing the number 12345.123456789012, for example, both token.ToObject<decimal>() and decimal.Parse(token.ToString()) suffer from rounding errors, albeit different ones:
token.ToObject<decimal>() = 12345.123456789
decimal.Parse(token.ToString()) = 12345.123456789011
Note that due to the JsonReader's default behavior of reading a floating-point number as a double, rounding errors can already be introduced even before attempting to convert such a number to decimal.
Dotnetfiddle: https://dotnetfiddle.net/bBmWVi. And for posterity, in case the dotnetfiddle should disappear, here is the respective demo code:
using System;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

using var sr = new StringReader("12345.123456789012");
using var reader = new JsonTextReader(sr);
var token = JToken.Load(reader);
var d1 = token.ToObject<decimal>();
Console.WriteLine($"token.ToObject<decimal>() = {d1}");
var d3 = token.ToObject<double>();
Console.WriteLine($"token.ToObject<double>() = {d3}");
var d2 = decimal.Parse(token.ToString());
Console.WriteLine($"decimal.Parse(token.ToString()) = {d2}");
The issue of seeing differing rounding errors for the same number is related to the use of System.Convert.ToDecimal(...) in the explicit decimal conversion operator of JToken:
https://github.com/JamesNK/Newtonsoft.Json/blob/52190a3a3de6ef9a556583cbcb2381073e7197bc/Src/Newtonsoft.Json/Linq/JToken.cs#L1108-L1124
as well as in JToken's explicit decimal? conversion operator.
Here it is worth knowing that a double can only represent 15 significant digits with any degree of reliability. Some numbers with more than 15 significant digits can be represented by a double, whereas other numbers with more than 15 significant digits cannot. Basically, whether any 16th or 17th digit in a double is correct or wrong is somewhat incidental. (In other words, roundtripping from decimal to double and back to decimal is not guaranteed to succeed for every number with more than 15 significant digits.) As far as I can tell, this is/was the reason the designers of .NET decided that the double-to-decimal conversions would only take into account the "reliable" 15 significant digits. (Such as done by the Decimal(double) constructor and the System.Convert.ToDecimal(...) method -- see the remarks section of the linked documentation.)
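The 15-significant-digit limit can be observed directly with .NET's "G" format specifiers (a small BCL-only sketch; the number is the one from the example above):

```csharp
using System;
using System.Globalization;

class SignificantDigitsDemo
{
    static void Main()
    {
        double dbl = double.Parse("12345.123456789012", CultureInfo.InvariantCulture);

        // The closest double to 12345.123456789012 actually stores ...011:
        Console.WriteLine(dbl.ToString("G17", CultureInfo.InvariantCulture));
        // 12345.123456789011

        // Rounded to the 15 "reliable" significant digits -- the same value
        // that new decimal(double) / Convert.ToDecimal(double) produce:
        Console.WriteLine(dbl.ToString("G15", CultureInfo.InvariantCulture));
        // 12345.123456789
    }
}
```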
Converting the double into a string and then parsing that string as a decimal avoids this double-to-decimal conversion and its limitation to 15 significant digits, thus possibly yielding a different conversion result.
Whether the observed behavior would be a bug/issue with Newtonsoft.Json is not that easily answerable, as the issue can also be demonstrated easily without using Newtonsoft.Json at all:
using System;

string number = "12345.123456789012";
double dbl = double.Parse(number);
Console.WriteLine($"source: {number}");
Console.WriteLine($"double: {dbl}");
var d1 = new decimal(dbl);
Console.WriteLine($"decimal(dbl) = {d1}");
var d2 = (decimal)dbl;
Console.WriteLine($"(decimal)dbl = {d2}");
var d3 = decimal.Parse(dbl.ToString());
Console.WriteLine($"decimal.Parse(dbl.ToString()) = {d3}");
will yield the following:
source: 12345.123456789012
double: 12345.123456789011
decimal(dbl) = 12345.123456789
(decimal)dbl = 12345.123456789
decimal.Parse(dbl.ToString()) = 12345.123456789011
I personally tend to see it not so much as a Newtonsoft.Json bug but rather as a case of unfortunate circumstances, where both Newtonsoft.Json's default behavior and .NET's built-in conversion behavior conspire to yield such a surprising result.
Although, this raises the question of whether it would be better and more robust to have Newtonsoft.Json read floating-point numbers as decimal by default instead of defaulting to double. Hmm...
Thanks for the detailed answer.
FloatParseHandling.Decimal is definitely the right way for me to handle this.
As for the dilemma of whether this is a Newtonsoft issue or not, it's up to you guys to decide, but from my point of view the default behavior shouldn't produce loss of information, because some users (like me) aren't aware of it. In cases where lower precision is explicitly needed, the user would investigate a way to handle it and thereby learn about the default behavior and the values to override it with.
Best regards!
default behavior shouldn't produce loss of information
Oh, I absolutely agree with this sentiment and I, too, wish decimal were the default parsing mode for floating-point numbers, but...
I am not in any way associated with the Newtonsoft.Json project, but here is my 2c worth of opinion:
There is a huge elephant in the room with asking to change the default behavior of the library. Newtonsoft.Json is one of the most popular (if not the most popular) json libraries for .NET, with extremely wide adoption in both open/libre and commercial software, and has been for a very long time. As such, changing the default behavior of the library, no matter how good the argument for doing so, incurs a huge risk of breaking compatibility and potentially introducing subtle bugs in those existing software projects and deployments. With libraries that popular and widespread, it is a very good strategy to be extremely conservative and defensive about changing established behaviors or APIs. Since the solution is relatively cheap and simple -- instructing Newtonsoft.Json to read floating-point numbers as decimal -- I don't see compelling reasons so far that would justify taking the risk involved in breaking with a long-established default behavior (even if that default behavior happens to really suck sometimes...). Kinda like not enough straws having broken the backs of not enough camels yet, so to speak... But, as I said, I am not involved with the Newtonsoft.Json project, so who knows what's going to happen in future Newtonsoft.Json versions. :-D
(In any case, despite my view, I would suggest keeping this issue open, and the author and maintainer of the library can then decide whether it would be feasible and worthwhile to change this default behavior...)
The right behavior is to not convert the value to anything when parsing and leave it as a char[] until a .NET type is requested. But that is a design decision that needed to be made in 2007. It's too late to change it now.
Calling ToObject<decimal>() on a JToken of type JTokenType.Float whose value has 12 decimal places will round the number to 11 decimal places. Calling ToObject<double>() will result in the correct value with all 12 decimal places.
Example
ReadJson method in custom JsonConverter
My workaround
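The linked workaround is not reproduced here; as a sketch, the FloatParseHandling.Decimal approach discussed above can be applied when deserializing a whole object as well (the Order type below is made up for illustration):

```csharp
using System;
using Newtonsoft.Json;

class Order
{
    public decimal Price { get; set; }
}

class WorkaroundDemo
{
    static void Main()
    {
        var settings = new JsonSerializerSettings
        {
            // Read JSON floating-point numbers as decimal so no precision
            // is lost before the value reaches the decimal property.
            FloatParseHandling = FloatParseHandling.Decimal
        };

        var order = JsonConvert.DeserializeObject<Order>(
            "{\"Price\": 12345.123456789012}", settings);

        Console.WriteLine(order.Price); // 12345.123456789012
    }
}
```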