dotnet / runtime

.NET is a cross-platform runtime for cloud, mobile, desktop, and IoT apps.
https://docs.microsoft.com/dotnet/core/
MIT License

The JSON value of length n is too large and not supported #2 #39953

Open albracko opened 4 years ago

albracko commented 4 years ago

I'm referring to issue https://github.com/dotnet/runtime/issues/30746, which was closed with the 125 MB limit staying fixed rather than being made configurable.

It was argued that no common cases would hit the 125 MB limit. Such cases probably never occur in UI scenarios, but they do exist: for example, a REST API that returns a JSON document with another document base64-encoded in a single field, or with another JSON document nested inside a single field as a string for digital signing and verification. I have exactly such a case: I digitally sign a JSON document and then create a new JSON document whose payload field (containing the original document) is larger than the 125 MB limit.
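For illustration, such a signed envelope might look roughly like the following (the field names are hypothetical, not taken from any specific standard); it is the single payload string that exceeds the limit:

```json
{
  "payload": "<the entire original JSON document, serialized into this single string field>",
  "signature": "MEUCIQDxu3...base64-encoded signature...",
  "algorithm": "RS256"
}
```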

Dotnet-GitSync-Bot commented 4 years ago

I couldn't figure out the best area label to add to this issue. If you have write-permissions please help me learn by adding exactly one area label.

thracx commented 3 years ago

Why set arbitrary limits that cannot be adjusted, when all use cases could be accommodated by sensible defaults that can be overridden where needed? System.Text.Json is designed to be standards compliant, and the standard does not specify such limits: per RFC 8259 section 9, implementations may set limits, but no particular values are suggested or recommended. This restriction is specific to this implementation, and it is not clearly called out in the documentation.

For my use case, the migration guide did not mention these limitations, and I only learned about this hard stop late in testing, long after development was complete.

I suggest adding options similar to JsonSerializerOptions.MaxDepth to replace or override constants such as JsonConstants.MaxCharacterTokenSize, JsonConstants.MaxBase64ValueTokenSize, etc.
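To make the proposal concrete, here is a purely hypothetical sketch of how such options might be used; MaxStringTokenSize and MaxBase64TokenSize do not exist on JsonSerializerOptions today and are shown only to illustrate the idea:

```csharp
using System.Text.Json;

var options = new JsonSerializerOptions
{
    MaxDepth = 128,                        // exists today
    // MaxStringTokenSize = 500_000_000,   // proposed: would override JsonConstants.MaxCharacterTokenSize
    // MaxBase64TokenSize = 500_000_000,   // proposed: would override JsonConstants.MaxBase64ValueTokenSize
};
```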

If support for larger values is not added, then my second suggestion is to update all the documentation so that these limitations are clear and upfront, so others can know in advance whether they need to stay with JavaScriptSerializer, Newtonsoft.Json, etc.

eiriktsarpalis commented 2 years ago

Agree that this should likely be made configurable.

ghost commented 2 years ago

This issue has been marked with the api-needs-work label. This may suggest that the proposal requires further refinement before it can be considered for API review. Please refer to our API review guidelines for a detailed description of the process.

When ready to submit an amended proposal, please ensure that the original post in this issue has been updated, following the API proposal template and examples as provided in the guidelines.

minhtuanit commented 2 years ago

Any update on this issue?

inf9144 commented 2 years ago

This hardcoded limit just crashed my production site. Is a fix ahead, or do I need a new serializer?

erdihu commented 2 years ago

Is the only solution to this issue to go back to Newtonsoft? I really do not want to lose all the perks of using System.Text.Json, but I also need to support large JSON data.

albracko commented 2 years ago

Yeah, sadly, for the time being the only solution is to use Newtonsoft.Json for big JSON fields.
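For anyone looking for a concrete starting point, here is a minimal sketch of that workaround (assuming the Newtonsoft.Json package is referenced, and using a hypothetical SignedEnvelope type for the scenario described earlier). Json.NET does not impose System.Text.Json's per-token size limit, so large string fields round-trip:

```csharp
using Newtonsoft.Json;

public sealed class SignedEnvelope              // hypothetical shape for the signing scenario above
{
    public string Payload { get; set; } = "";   // may be larger than 125 MB
    public string Signature { get; set; } = "";
}

public static class LargeJson
{
    // Serialize/deserialize through Json.NET instead of System.Text.Json.
    public static string Serialize(SignedEnvelope value) =>
        JsonConvert.SerializeObject(value);

    public static SignedEnvelope? Deserialize(string json) =>
        JsonConvert.DeserializeObject<SignedEnvelope>(json);
}
```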

kasperk81 commented 2 years ago

> Agree that this should likely be made configurable.

Instead of making it configurable, how about removing the limit? Let it scale with the system's available resources and let the stack overflow in the worst case.

albracko commented 2 years ago

I'm fine with either of those two options, as long as there is no hard limit.

Luk164 commented 2 years ago

Just hit the issue as well; I wanted to serialize a dataset and it was just over the hard limit.

eiriktsarpalis commented 1 year ago

This discussion https://github.com/dotnet/corefx/pull/40792/files#r320546088 might provide insight into why the limits were introduced originally and why they might actually not be necessary.

eiriktsarpalis commented 1 year ago

Related issue: #61089

osexpert commented 1 year ago

"640K ought to be enough for anybody."

albracko commented 1 year ago

Sorry, but if YOU don't need it, that doesn't automatically mean everybody else doesn't need it. YOU are not ANYBODY/EVERYBODY ;)

krwq commented 1 year ago

Would either of these two help here?

albracko commented 1 year ago

Neither of those would help in my case, since both are about streaming. I, on the other hand, need to serialize the object into a JSON string, which I also have to store. If it were just a matter of returning this JSON through a REST API, then yes, those two streaming-related issues would help.

AlanMacdonald commented 1 year ago

@stephentoub this is pretty painful. I just hit this on a .NET 6 production system with a background job that processes customer documents, calls out to other internal APIs for some of the work, and uses HttpClient.PostAsJsonAsync. In this case the customer document was huge and way above average size. Your comments on https://github.com/dotnet/corefx/pull/40792/files#r320546088 sounded like you were hoping these limits could be removed in a future version.

It is very counterintuitive to hit an arbitrary limit in an otherwise enterprise-level framework. I will need to look into whether I can switch the serializer to Newtonsoft.Json while keeping these HTTP client calls the same, change the code to stop using them so we can control the serializer, or find some other way to handle the rare cases of extremely large documents.

It would be great if the limit here was available memory and not an arbitrary restriction.
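As a rough sketch of that first option (not an officially recommended pattern, and using a hypothetical CustomerDocument type and endpoint), PostAsJsonAsync can be replaced by serializing with Newtonsoft.Json and posting the result as StringContent:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public sealed record CustomerDocument(string Id, string Content);   // placeholder for the real document type

public static class LargeDocumentClient
{
    // Roughly equivalent to httpClient.PostAsJsonAsync(requestUri, document),
    // but the body is produced by Json.NET, avoiding the hardcoded token-size limit.
    public static async Task<HttpResponseMessage> PostLargeDocumentAsync(
        HttpClient httpClient, string requestUri, CustomerDocument document)
    {
        string json = JsonConvert.SerializeObject(document);
        using var content = new StringContent(json, Encoding.UTF8, "application/json");
        return await httpClient.PostAsync(requestUri, content);
    }
}
```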

eiriktsarpalis commented 1 year ago

Reopening to track some of the other hardcoded limits such as the following:

https://github.com/dotnet/runtime/blob/39e022d338e3d78c79b938aef8b14201a4546ab6/src/libraries/System.Text.Json/src/System/Text/Json/JsonConstants.cs#L71-L73

EmanueleBaron commented 10 months ago

Good evening!

Has the JSON size limit been removed in the latest version of System.Text.Json?

I'm hitting this problem in my production API :(

eiriktsarpalis commented 10 months ago

If you're asking about the fix in https://github.com/dotnet/runtime/pull/85334, it should be available with .NET 8. You can try it out today using the RC2 NuGet package: https://www.nuget.org/packages/System.Text.Json/8.0.0-rc.2.23479.6

sxotney commented 7 months ago

I'm not sure if this is resolved or I'm missing something, but I came across this problem in .NET 8.0 GA. My use case is that users of our systems want to download a CSV of their data from our Blazor-based reporting tool, and these downloads can easily exceed 125 MB.

I send a base64 string representation of the CSV to the JSRuntime, and a download is then triggered via a JavaScript call. It's a neat solution that I'd prefer to keep, and I can't think of any workaround because of the JSRuntime dependency.

Luk164 commented 7 months ago

> I'm not sure if this is resolved or I'm missing something, but I came across this problem in .NET 8.0 GA. My use case is that users of our systems want to download a CSV of their data from our Blazor-based reporting tool, and these downloads can easily exceed 125 MB.
>
> I send a base64 string representation of the CSV to the JSRuntime, and a download is then triggered via a JavaScript call. It's a neat solution that I'd prefer to keep, and I can't think of any workaround because of the JSRuntime dependency.

Honestly, for that case it might be better to generate a file in multiple steps and download that instead; this issue has been with us for far too long to wait for it to be resolved.
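As a hedged sketch of that idea (assuming .NET 6 or later Blazor, where Microsoft.JSInterop provides DotNetStreamReference, and where downloadFileFromStream is an assumed JavaScript helper that turns the stream into a Blob download), the CSV bytes can be streamed to JavaScript instead of being pushed through JSON as one giant base64 string:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.JSInterop;

public static class CsvDownloader
{
    // Streams the CSV bytes over JS interop, avoiding a single huge base64-in-JSON payload.
    public static async Task DownloadCsvAsync(IJSRuntime js, byte[] csvBytes, string fileName)
    {
        using var stream = new MemoryStream(csvBytes);
        using var streamRef = new DotNetStreamReference(stream);
        await js.InvokeVoidAsync("downloadFileFromStream", fileName, streamRef);
    }
}
```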

rohkhann commented 2 months ago

What is the defined limit for deserialization of a JSON string into a JSON document? I see the max limit defined as 166_666_666 for MaxCharacterTokenSize, but in the fix in PR #85534 it seems to be int.MaxValue / 4 * 3.
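For what it's worth, a back-of-the-envelope reading of those numbers (an illustrative sketch only, assuming the limit is applied to the base64-encoded character count, where 4 characters decode to 3 bytes):

```csharp
// Base64 encodes 3 raw bytes as 4 characters, so a character-count limit maps
// to roughly 3/4 of that many decoded bytes.
const int oldLimitChars = 166_666_666;                // MaxCharacterTokenSize mentioned above
long oldDecodedBytes = (long)oldLimitChars / 4 * 3;   // 124,999,998 bytes, consistent with the ~125 MB limit discussed in this thread
long newDecodedBytes = (long)int.MaxValue / 4 * 3;    // 1,610,612,733 bytes, roughly 1.5 GiB

System.Console.WriteLine($"{oldDecodedBytes:N0} vs {newDecodedBytes:N0}");
```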