Closed: DamianEdwards closed this issue 1 year ago.
| Author | DamianEdwards |
|---|---|
| Assignees | - |
| Labels | `area-System.Text.Json` |
| Milestone | - |
> The JSON specification doesn't support integers larger than 64-bits
Which spec are you looking at? Throughout its history, JSON has supported arbitrarily large and precise numbers.
RFC 7159 recognizes that implementations may not be able to support this, so it allows implementations to enforce limitations. But the spec itself imposes no such limitations.
> This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754-2008 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.
@gregsdennis ah indeed, I wasn't looking at the latest it seems, or was inferring too much from related specs like OpenAPI.
So then, given the JSON spec does allow such numbers, what should the .NET behavior be, given it's likely expected to balance some level of platform interoperability with accurate round-tripping of values between same versions of .NET?
Related to #60780. TL;DR it's possible to support arbitrarily large numeric values using a custom converter:
```csharp
using System;
using System.Buffers;
using System.Numerics;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;

var options = new JsonSerializerOptions { Converters = { new BigIntegerConverter() } };
var value = new MyPoco { Value = BigInteger.Parse("1111111111111111111111111111111111111111111111111111111111111111") };
string json = JsonSerializer.Serialize(value, options);
Console.WriteLine(json);
value = JsonSerializer.Deserialize<MyPoco>(json, options);
Console.WriteLine(value.Value);

public class MyPoco
{
    public BigInteger Value { get; set; }
}

public class BigIntegerConverter : JsonConverter<BigInteger>
{
    public override BigInteger Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        if (reader.TokenType != JsonTokenType.Number)
        {
            throw new JsonException();
        }

        // ValueSpan is only valid for single-segment payloads; fall back to
        // ValueSequence when the number spans multiple segments.
        byte[] utf8 = reader.HasValueSequence ? reader.ValueSequence.ToArray() : reader.ValueSpan.ToArray();
        return BigInteger.Parse(Encoding.UTF8.GetString(utf8));
    }

    public override void Write(Utf8JsonWriter writer, BigInteger value, JsonSerializerOptions options)
    {
        // A BigInteger's default string form is digits with an optional leading
        // '-', which is always a valid JSON number, so it's safe to write raw.
        writer.WriteRawValue(value.ToString());
    }
}
```
This is just like DateOnly and TimeOnly. When we add new data types like this there's usually a bunch of follow-up work items:
I wonder if we have a checklist like this somewhere...
We don't have a checklist, but I just had the same thought, @davidfowl. I'd support doing this in RC1/RC2 ask mode with the bar check of "incomplete feature."
I get that these are new low-level primitives, but do we have folks/projects that actually need all these APIs on the lower-level (reader, writer) and DOM types at this time (as proposed in https://github.com/dotnet/runtime/issues/74139)? Is support only in JsonSerializer sufficient?
I don't believe anyone is using APIs which haven't shipped yet. I think the question is how frequently we expect people to use it in JSON and how... Given those types met the bar to be added to the System namespace, I'd assume there was high enough demand, but perhaps @tannergooding will have better data.
I've talked with @eiriktsarpalis about this, and we'll delay adding support for this in System.Text.Json until we have more data on scenarios and how frequently this will be used. I initially assumed that, given these APIs met the bar to be added to the System namespace, they would have high expected usage, but it seems that might be an incorrect assumption. Instead, for 7.0 I'll add a test validating that a custom converter can be added, and share the code here in case someone needs it. If we get strong feedback that we should add this support, we can scavenge the PR.
Please see the tests added in the PR above to see how users can add converters for Int128/UInt128/Half/BigInteger. If we get enough data proving these are not niche scenarios, we will add them to System.Text.Json.
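For readers who land here before checking the PR, a converter for `Int128` can follow the same shape as the `BigInteger` converter shared earlier in this thread. The sketch below is illustrative only, not the code from the PR; the `Int128Converter` name and the parse/format choices are my own assumptions.

```csharp
using System;
using System.Buffers;
using System.Globalization;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical sketch of an Int128 converter, modeled on the BigInteger
// converter above. Requires .NET 7+ (where Int128 was introduced).
public class Int128Converter : JsonConverter<Int128>
{
    public override Int128 Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        if (reader.TokenType != JsonTokenType.Number)
        {
            throw new JsonException();
        }

        // Copy the raw UTF-8 number text and parse it; handles both the
        // contiguous and multi-segment reader cases.
        byte[] utf8 = reader.HasValueSequence ? reader.ValueSequence.ToArray() : reader.ValueSpan.ToArray();
        return Int128.Parse(Encoding.UTF8.GetString(utf8), NumberStyles.Integer, CultureInfo.InvariantCulture);
    }

    public override void Write(Utf8JsonWriter writer, Int128 value, JsonSerializerOptions options)
    {
        // Invariant-culture formatting emits only digits and an optional
        // leading '-', which is always a valid JSON number.
        writer.WriteRawValue(value.ToString(CultureInfo.InvariantCulture));
    }
}
```

Usage mirrors the `BigInteger` example: register the converter via `JsonSerializerOptions.Converters` and serialize/deserialize as normal. Note that a plain JSON number larger than 64 bits may still be rejected or truncated by other JSON implementations, per the RFC interoperability caveat quoted above.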
Closing in favor of https://github.com/dotnet/runtime/issues/87994
The JSON specification doesn't support integers larger than 64 bits as far as I can tell, which makes sense given its JavaScript origins. This leaves the question of how to represent larger numbers somewhat open today, leading to different approaches in the wild, e.g. encoding them as strings, ignoring the spec and using a customized JSON parser, etc.
What do we expect folks to do in the case of `System.Text.Json`, and by extension, ASP.NET Core (dotnet/aspnetcore#43119)? Create and use a custom `JsonConverter<Int128>`? Would we consider including a default converter in the runtime, and if so, what would it be?