Per https://github.com/dotnet/runtime/issues/63747#issuecomment-1158624112 this is a known limitation of the metadata reader. In the future we might consider adding read-ahead support specifically for type discriminators (I don't believe reference preservation to have such requirements), but this would be at the expense of potentially buffering excessive data in the case of async serialization.
Is there any known workaround for this? I want to deserialize an object that does not come from System.Text.Json but has a discriminator that can be used, which is of course not guaranteed to be first. The only thing I have managed so far is to deserialize just that one field first and then specify the type explicitly (i.e. not using the new polymorphic deserialization at all).
Maybe there is something else that can be done, e.g. explicit reordering or something like that?
No workaround that doesn't involve writing a custom converter, unfortunately. In the future we might consider exposing a flag that reads ahead for type discriminators, but obviously that would come at the expense of performance so it would be turned off by default.
Thanks for the answer! I assume the "future" will not be .NET 7, so this isn't coming anytime soon, right?
Correct, .NET 7 is currently in RC so feature development has been concluded.
Why can't I use my own property for the type discriminator? My object already has a string Type property. According to this documentation:
Avoid a JsonPolymorphicAttribute.TypeDiscriminatorPropertyName if it conflicts with a property in your type hierarchy.
I would like to tell System.Text.Json that there is a property (the first one) named Type; just use it.
[JsonPolymorphic(TypeDiscriminatorPropertyName = "type")]
[JsonDerivedType(typeof(ChatbotSettings), typeDiscriminator: ChatbotSettings.TypeName)]
[JsonDerivedType(typeof(UnattendedSettings), typeDiscriminator: UnattendedSettings.TypeName)]
[JsonDerivedType(typeof(AttendedSettings), typeDiscriminator: AttendedSettings.TypeName)]
internal abstract class RobotSettings
{
    /// <summary>
    /// Do not use. This is for Json serialization.
    /// </summary>
    internal RobotSettings() { }

    internal RobotSettings(string type) => Type = type;

    internal required string Type { get; init; }
}
Why can't I use my own property for the type discriminator?

The value of a string property can be arbitrary and might not necessarily reflect the runtime type of the object you are trying to serialize. In your example I would simply annotate the Type property with JsonIgnoreAttribute.
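For illustration, a minimal sketch of that suggestion applied to the RobotSettings hierarchy above (this assumes the CLR Type property no longer needs to be required or populated from JSON, since the serializer now owns the "type" metadata):

[JsonPolymorphic(TypeDiscriminatorPropertyName = "type")]
[JsonDerivedType(typeof(ChatbotSettings), typeDiscriminator: ChatbotSettings.TypeName)]
[JsonDerivedType(typeof(UnattendedSettings), typeDiscriminator: UnattendedSettings.TypeName)]
[JsonDerivedType(typeof(AttendedSettings), typeDiscriminator: AttendedSettings.TypeName)]
internal abstract class RobotSettings
{
    protected RobotSettings(string type) => Type = type;

    // [JsonIgnore] keeps the CLR property out of the JSON contract, so it
    // cannot clash with the "type" discriminator the serializer writes and reads.
    [JsonIgnore]
    internal string Type { get; }
}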
I have this issue as well. When serializing JSON to MySQL using EF Core, it seems that the ordering is not preserved, so the data cannot be deserialized anymore.
There is a bit of a hack to work around this, since MySQL seems to do deterministic ordering:
[JsonPolymorphic(TypeDiscriminatorPropertyName = "!")]
You might be able to get away with " " (a space) as well; basically, any valid property name with a very low ASCII value will be sorted first. I definitely would not recommend doing this in production, but I am using it for myself.
With .NET 7 I have a solution that should at least be fast.
I have a custom converter (in this case it is abstract because I have different ways to resolve the type and discriminator) to handle writing and reading:
public abstract class InheritanceConverterBase<T> : JsonConverter<T>, IInheritanceConverter where T : notnull
{
    private readonly JsonEncodedText discriminatorProperty;

    public string DiscriminatorName { get; }

    protected InheritanceConverterBase(string discriminatorName)
    {
        discriminatorProperty = JsonEncodedText.Encode(discriminatorName);
        DiscriminatorName = discriminatorName;
    }

    public abstract Type GetDiscriminatorType(string name, Type typeToConvert);

    public abstract string GetDiscriminatorValue(Type type);

    public override T? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        // Create a copy of the reader (the derived deserialization has to be done from the start)
        Utf8JsonReader typeReader = reader;

        if (typeReader.TokenType != JsonTokenType.StartObject)
        {
            throw new JsonException();
        }

        while (typeReader.Read())
        {
            if (typeReader.TokenType == JsonTokenType.PropertyName &&
                typeReader.ValueTextEquals(discriminatorProperty.EncodedUtf8Bytes))
            {
                // Advance the reader to the property value
                typeReader.Read();
                if (typeReader.TokenType != JsonTokenType.String)
                {
                    ThrowHelper.JsonException($"Expected string discriminator value, got '{typeReader.TokenType}'");
                    return default!;
                }

                // Resolve the type from the discriminator value.
                var type = GetDiscriminatorType(typeReader.GetString()!, typeToConvert);

                // Perform the actual deserialization with the original reader
                return (T)JsonSerializer.Deserialize(ref reader, type, options)!;
            }
            else if (typeReader.TokenType == JsonTokenType.StartObject || typeReader.TokenType == JsonTokenType.StartArray)
            {
                // Skip nested objects/arrays so a nested property with the same name
                // is not mistaken for the discriminator.
                if (!typeReader.TrySkip())
                {
                    typeReader.Skip();
                }
            }
        }

        ThrowHelper.JsonException($"Object has no discriminator '{DiscriminatorName}'.");
        return default!;
    }

    public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options)
    {
        JsonSerializer.Serialize<object>(writer, value!, options);
    }
}

public interface IInheritanceConverter
{
    string DiscriminatorName { get; }

    Type GetDiscriminatorType(string name, Type typeToConvert);

    string GetDiscriminatorValue(Type type);
}
The problem was always the writing part, because you have to buffer the resulting object:
public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options)
{
    var name = GetDiscriminatorValue(value.GetType());

    writer.WriteStartObject();
    writer.WriteString(discriminatorProperty, name);

    using (var document = JsonSerializer.SerializeToDocument(value, value.GetType(), options))
    {
        foreach (var property in document.RootElement.EnumerateObject())
        {
            property.WriteTo(writer);
        }
    }

    writer.WriteEndObject();
}
So you effectively serialize the object, then parse it, and then write it out again. So my idea with .NET 7 was to add a custom property to the contract with a custom resolver. You cannot use modifiers, because you need the type info of the base class.
public sealed class PolymorphicTypeResolver : DefaultJsonTypeInfoResolver
{
    public override JsonTypeInfo GetTypeInfo(Type type, JsonSerializerOptions options)
    {
        var typeInfo = base.GetTypeInfo(type, options);

        var baseType = type.BaseType;
        if (baseType == null || baseType == typeof(object))
        {
            return typeInfo;
        }

        var baseInfo = GetTypeInfo(baseType, options);
        if (baseInfo.Converter is IInheritanceConverter inheritanceConverter)
        {
            var discriminatorField = typeInfo.CreateJsonPropertyInfo(typeof(string), inheritanceConverter.DiscriminatorName);
            var discriminatorValue = inheritanceConverter.GetDiscriminatorValue(type);

            discriminatorField.Get = x => discriminatorValue;

            typeInfo.Properties.Insert(0, discriminatorField);
        }

        return typeInfo;
    }
}
I don't like that you need the type resolver and the converter right now, but I do not see another option.
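To make the resolver/converter pairing above concrete, here is a rough wiring sketch under assumed names (Shape, Circle, and Square are hypothetical types, and a dictionary lookup is just one possible way to implement the two abstract methods; System.Linq is used for the reverse lookup):

// Hypothetical concrete converter: resolves the discriminator via a static map.
public sealed class ShapeConverter : InheritanceConverterBase<Shape>
{
    private static readonly Dictionary<string, Type> Map = new()
    {
        ["circle"] = typeof(Circle),
        ["square"] = typeof(Square),
    };

    public ShapeConverter() : base("$type") { }

    public override Type GetDiscriminatorType(string name, Type typeToConvert)
        => Map.TryGetValue(name, out var type)
            ? type
            : throw new JsonException($"Unknown discriminator '{name}' for {typeToConvert}.");

    public override string GetDiscriminatorValue(Type type)
        => Map.First(pair => pair.Value == type).Key;
}

// Wiring: the resolver injects the discriminator property into derived-type
// contracts, and the converter reads it back regardless of its position.
var options = new JsonSerializerOptions
{
    TypeInfoResolver = new PolymorphicTypeResolver(),
    Converters = { new ShapeConverter() },
};

Shape? shape = JsonSerializer.Deserialize<Shape>("{\"radius\":1.5,\"$type\":\"circle\"}", options);
string json = JsonSerializer.Serialize<Shape>(shape!, options); // "$type" is emitted first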
Currently running into this issue. We are required to consume JSON matching a predefined contract from an OpenAPI spec provided by a third party. They use type discriminators liberally, and because of this we cannot consume their requests without a workaround.
Type discrimination is part of the OpenAPI spec, but ordering of members within a JSON object is by definition not a thing. I would prefer correctness by default on this issue, and rather have an optional spec-violating performance mode for when you can control both the consumer and producer.
There also seems to be an issue with the error messages.
If the base type is abstract and the payload doesn't have the typeDiscriminator as the first property, then this error is shown:
System.NotSupportedException: 'Deserialization of types without a parameterless constructor, a singular parameterized constructor, or a parameterized constructor annotated with 'JsonConstructorAttribute' is not supported.
However, if the base type is not abstract and the payload doesn't have the typeDiscriminator as the first property, then this error is shown:
The metadata property is either not supported by the type or is not the first property in the deserialized JSON object.
It should say the latter in both cases.
This would also break things like Cosmos DB's patching, where you don't control the order in which properties are stored.
Honestly, I would say this feature is not ready for prime time at all without this issue resolved.
This issue makes using the new JsonDerivedType attribute with EF Core impossible. Seems to me that two core .NET libraries should work together, right?
I wrote a custom converter I'd like to share. I'm not sure how performant it is or if the error handling is up to snuff, but the code is straightforward. I think it works in .NET 6 too. Any feedback would be appreciated.
EDIT/UPDATE: I updated the code because serialization would only work if the type parameter was the base type (like JsonSerializer.Serialize<BaseType>(subtypeInstance, options)).
New .NET Fiddle to see it in action. Here's the old fiddle.
Some features:
- [JsonIgnore] and [JsonPropertyName] are respected
- [JsonPolymorphicAttribute]/[JsonDerivedTypeAttribute] isn't necessary

Basically it's:
- JsonDocument (JsonDocument.TryParseValue(...))
- doc.RootElement.GetProperty(discriminatorPropName)
- JsonObject (JsonSerializer.SerializeToElement, JsonObject.Create)
- jObj[discriminatorPropName] = ...

The code:
#nullable enable
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.Json.Serialization.Metadata;
public sealed class PolymorphicJsonConverter<T> : JsonConverter<T>
{
    private readonly string discriminatorPropName;
    private readonly Func<Type, string> getDiscriminator;
    private readonly IReadOnlyDictionary<string, Type> discriminatorToSubtype;

    public PolymorphicJsonConverter(
        string typeDiscriminatorPropertyName,
        Func<Type, string> getDiscriminatorForSubtype,
        IEnumerable<Type> subtypes)
    {
        discriminatorPropName = typeDiscriminatorPropertyName;
        getDiscriminator = getDiscriminatorForSubtype;
        discriminatorToSubtype = subtypes.ToDictionary(getDiscriminator, t => t);
    }

    public override bool CanConvert(Type typeToConvert)
        => typeof(T).IsAssignableFrom(typeToConvert);

    // When a custom converter is defined for a JsonSerializerOptions instance,
    // you can't use the options to get the JsonTypeInfo for any types the
    // converter can convert, so unfortunately we have to create a copy with
    // the converters removed.
    JsonSerializerOptions? originalOptions = null;
    JsonSerializerOptions? optionsWithoutConverters = null;

    JsonTypeInfo getTypeInfo(Type t, JsonSerializerOptions givenOpts)
    {
        if (optionsWithoutConverters is null)
        {
            originalOptions = givenOpts;
            optionsWithoutConverters = new(givenOpts);
            optionsWithoutConverters.Converters.Clear();
        }
        if (originalOptions != givenOpts)
        {
            throw new Exception(
                $"A {typeof(PolymorphicJsonConverter<>).Name} instance cannot " +
                $"be used in multiple {nameof(JsonSerializerOptions)} instances!");
        }
        return optionsWithoutConverters.GetTypeInfo(t);
    }

    public override T Read(
        ref Utf8JsonReader reader, Type objectType, JsonSerializerOptions options)
    {
        using var doc = JsonDocument.ParseValue(ref reader);
        JsonElement root = doc.RootElement;
        JsonElement typeField = root.GetProperty(discriminatorPropName);

        if (typeField.GetString() is not string typeName)
        {
            throw new JsonException(
                $"Could not find string property {discriminatorPropName} " +
                $"when trying to deserialize {typeof(T).Name}");
        }

        if (!discriminatorToSubtype.TryGetValue(typeName, out Type? type))
        {
            throw new JsonException($"Unknown type: {typeName}");
        }

        JsonTypeInfo info = getTypeInfo(type, options);
        T instance = (T)info.CreateObject!();

        foreach (var p in info.Properties)
        {
            if (p.Set is null) continue;

            if (!root.TryGetProperty(p.Name, out JsonElement propValue))
            {
                if (p.IsRequired)
                {
                    throw new JsonException($"Required property {p.Name} was not found.");
                }
                else
                {
                    continue;
                }
            }

            p.Set(instance, propValue.Deserialize(p.PropertyType, options));
        }

        return instance;
    }

    public override void Write(
        Utf8JsonWriter writer, T? value, JsonSerializerOptions options)
    {
        Type type = value!.GetType();
        if (type == typeof(T))
        {
            throw new NotSupportedException(
                $"Cannot serialize an instance of type {typeof(T)}, only its subtypes.");
        }

        writer.WriteStartObject();
        writer.WriteString(discriminatorPropName, getDiscriminator(type));

        JsonTypeInfo info = getTypeInfo(type, options);
        foreach (var p in info.Properties)
        {
            if (p.Get is null) continue;

            writer.WritePropertyName(p.Name);
            object? pVal = p.Get(value);
            JsonSerializer.Serialize(writer, pVal, options);
        }

        writer.WriteEndObject();
    }
}
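For reference, a rough usage sketch of the converter above (Animal, Cat, and Dog are hypothetical types with parameterless constructors, which Read relies on via info.CreateObject; the fiddles linked above show the author's own setup):

var options = new JsonSerializerOptions
{
    Converters =
    {
        new PolymorphicJsonConverter<Animal>(
            typeDiscriminatorPropertyName: "type",
            getDiscriminatorForSubtype: t => t.Name.ToLowerInvariant(),
            subtypes: new[] { typeof(Cat), typeof(Dog) }),
    },
};

// The discriminator can appear anywhere in the object.
Animal? pet = JsonSerializer.Deserialize<Animal>("{\"name\":\"Felix\",\"type\":\"cat\"}", options);

// Serialization writes the discriminator first, followed by the subtype's properties.
string json = JsonSerializer.Serialize<Animal>(pet!, options);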
@dragorosson you might want to check out https://github.com/wivuu/Wivuu.JsonPolymorphism/
it does basically what you're doing above but with a source generator
This is a huge issue. Azure Logic Apps adds the discriminator as the last property, so if we want to deserialize a workflow that was created using the Azure Portal, we can't, just because of this. It's a bummer.
We shouldn't rely on the order of properties.
I can understand the performance impact, but my thinking is that primary functionality not working is a bigger issue than that.
I'm using Marten, which under the hood uses Postgres jsonb columns.
For my scenario, this currently prevents me from using the built-in polymorphism support with jsonb columns in Postgres, as object properties have no guaranteed order.
jsonb columns (re-)order JSON object keys by length:
By contrast, jsonb does not preserve white space, does not preserve the order of object keys, and does not keep duplicate object keys. If duplicate keys are specified in the input, only the last value is kept. [...] In general, most applications should prefer to store JSON data as jsonb, unless there are quite specialized needs, such as legacy assumptions about ordering of object keys. [...] Note that object keys are compared in their storage order; in particular, since shorter keys are stored before longer keys
So a workaround for the case of Postgres is choosing a discriminator that's shorter than any other object key (i.e. property name), like [JsonPolymorphic(TypeDiscriminatorPropertyName = "$t")].
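For illustration, a sketch of that workaround under hypothetical type names (it relies on Postgres jsonb sorting keys by length, not on any guarantee from System.Text.Json, so every serialized property name must stay longer than the discriminator):

// "$t" (2 characters) sorts ahead of the longer property names, so jsonb
// returns it first and the built-in polymorphic deserialization still works.
[JsonPolymorphic(TypeDiscriminatorPropertyName = "$t")]
[JsonDerivedType(typeof(EmailNotification), typeDiscriminator: "email")]
[JsonDerivedType(typeof(SmsNotification), typeDiscriminator: "sms")]
public abstract class Notification
{
    public string Recipient { get; set; } = "";
}

public sealed class EmailNotification : Notification
{
    public string Subject { get; set; } = "";
}

public sealed class SmsNotification : Notification
{
    public string PhoneNumber { get; set; } = "";
}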
What an unbelievably unreasonable requirement. You barely ever have any control over the ordering of properties in a JSON document, since most of the time it's being received from some external API. This limitation makes no sense whatsoever, and is making this feature practically unusable in countless perfectly valid scenarios.
This honestly needs to be prioritized.
@eiriktsarpalis Any chance this makes it to .NET 8?
@aradalvand Agreed, this feature should either be deprecated, with alternative recommendations put into the documentation, or be properly implemented. The JSON spec doesn't guarantee property order.
Per the spec:
An object is an unordered collection of zero or more name/value pairs, where a name is a string and a value is a string, number, boolean, null, object, or array.
Emphasis mine. Source: https://www.rfc-editor.org/rfc/rfc7159.html
In the meantime, if anyone's interested, I've written a minimalistic JsonPolymorphicConverter that doesn't have this limitation, which you can set up and use pretty easily if it fits your use case:
public class JsonPolymorphicConverter<TBaseType> : JsonConverter<TBaseType>
{
    private readonly string _discriminatorPropertyName;
    private readonly Dictionary<string, Type> _discriminatorToSubtype;
    private readonly Dictionary<Type, string> _subtypeToDiscriminator;

    public JsonPolymorphicConverter(
        string discriminatorPropertyName,
        Dictionary<string, Type> mappings
    )
    {
        _discriminatorPropertyName = discriminatorPropertyName;
        _discriminatorToSubtype = mappings;
        _subtypeToDiscriminator = _discriminatorToSubtype.ToDictionary(kvp => kvp.Value, kvp => kvp.Key);
    }

    public override bool CanConvert(Type typeToConvert)
        => typeToConvert.IsAssignableTo(typeof(TBaseType)); // NOTE: By default (in the parent class's implementation), this only returns true if `typeToConvert` is *equal* to `T`.

    public override TBaseType? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        using var jsonDoc = JsonDocument.ParseValue(ref reader);
        string typeDiscriminator = jsonDoc.RootElement
            .GetProperty(_discriminatorPropertyName)
            .GetString()!;
        var type = _discriminatorToSubtype[typeDiscriminator];
        return (TBaseType?)jsonDoc.Deserialize(type, RemoveThisFromOptions(options));
    }

    public override void Write(Utf8JsonWriter writer, TBaseType value, JsonSerializerOptions options)
    {
        var type = value!.GetType();
        writer.WriteStartObject();
        writer.WriteString(_discriminatorPropertyName, _subtypeToDiscriminator[type]);

        using var jsonDoc = JsonSerializer.SerializeToDocument(value, type, RemoveThisFromOptions(options));
        foreach (var prop in jsonDoc.RootElement.EnumerateObject())
        {
            writer.WritePropertyName(prop.Name);
            prop.Value.WriteTo(writer);
        }

        writer.WriteEndObject();
    }

    private JsonSerializerOptions RemoveThisFromOptions(JsonSerializerOptions options)
    {
        JsonSerializerOptions newOptions = new(options);
        newOptions.Converters.Remove(this); // NOTE: We'll get an infinite loop if we don't do this
        return newOptions;
    }
}
var obj = JsonSerializer.Deserialize<Parent>(input, new JsonSerializerOptions
{
    Converters =
    {
        new JsonPolymorphicConverter<Parent>(
            discriminatorPropertyName: "type",
            new()
            {
                ["child-1"] = typeof(Child1),
                ["child-2"] = typeof(Child2),
            }
        ),
    },
});
Is it possible to encode this differently, so we can keep the memory usage low
Is it possible to encode this differently, so we can keep the memory usage low
The library linked above (mine) uses a source generator to generate the polymorphic deserialization at compile time
Does it avoid reading the whole object into memory first? I don’t see how you can avoid that since the order can’t be guaranteed.
Does it avoid reading the whole object into memory first? I don’t see how you can avoid that since the order can’t be guaranteed.
Not practical without being able to rewind a JSON reader
Agreed, this feature should either be deprecated, with alternative recommendations put into the documentation, or be properly implemented.
I don't mind the decision of prioritizing performance over spec compliance. Performance was one of the reasons why System.Text.Json was developed from scratch. It's only really important that we get a compliant deserializer in the end.
So let's upvote this issue 👍 and hope we get it with .NET 8!
Perhaps it could peek to see if the type is on top and fall back to less performant behavior if not.
Perhaps it could peek to see if the type is on top and fall back to less performant behavior if not.
So it should fall back to suboptimal performance for, what, 90%, 95%, 99% of the JSON out there not containing any metadata fields? (only the types decorated with a JsonDiscriminator, obviously.)
@eiriktsarpalis put it pretty well, already:
In the future we might consider exposing a flag that reads ahead for type discriminators, but obviously that would come at the expense of performance so it would be turned off by default.
I think it becomes a question of whether we want System.Text.Json (STJ) to be spec compliant by default or performant by default.
Users handling JSON without metadata should IMHO get the best performance possible without having to enable a feature they don't even know about. Users handling JSON containing metadata fields will run into these limitations and have to make a decision depending on their requirements.
E.g. when deciding between Postgres json and jsonb columns and combining them with STJ, I can either have:
- jsonb columns with better indexing and query performance but worse STJ deserialization performance
- json columns with worse indexing and query performance but better STJ deserialization performance

It's always a trade-off ...
So it should fall back to suboptimal performance for, what, 90%, 95%, 99% of the JSON out there not containing any metadata fields?
No, only the types decorated with a JsonDiscriminator, obviously. I would be fine with being able to disable the fallback behavior on the attribute, for cowboys who don't believe in specs.
The nature of the JSON reader being a struct makes peeking seem pretty doable; I could definitely be missing something obvious here:
var reader = new Utf8JsonReader(Encoding.UTF8.GetBytes(json));

// Copy the reader struct by value to avoid advancing `reader`
var peekCopy = reader;
Assert.True(peekCopy.Read());

// Start reading the object
Assert.Equal(JsonTokenType.StartObject, peekCopy.TokenType);
Assert.True(peekCopy.Read());
Assert.Equal(JsonTokenType.PropertyName, peekCopy.TokenType);

// Peek at the first property name
var propertyName = peekCopy.GetString();
if (propertyName == "$type")
{
    // Assign the peek copy back to the input ref
    reader = peekCopy;

    // Read the type value
    Assert.True(reader.Read());
    Assert.Equal(JsonTokenType.String, reader.TokenType);
    var typeName = reader.GetString();

    // Read the remaining properties for the descendant type here...
}
else if (propertyName != null)
{
    // The discriminator is not first, so buffer the whole object instead
    var element = JsonElement.ParseValue(ref reader);

    // Get the $type property
    if (!element.TryGetProperty("$type", out var typeName))
    {
        throw new JsonException("Missing $type property.");
    }

    // Deserialize the `element` as the desired type
}
The nature of the JSON reader being a struct makes peeking seem pretty doable; I could definitely be missing something obvious here
You're right that reader instances can be checkpointed using a copy; however, the problem lies with async deserialization. In that case it isn't guaranteed that the Utf8JsonReader contains the entire payload for the object being read, so to support proper read-ahead we first need to fill the buffer with all the necessary data. This is already being done for custom converters, but we try to avoid it for something as pervasive as the built-in object converter, hence the requirement to have all metadata up-front.
@eiriktsarpalis Then can't there be an opt-in option for this (which perhaps could also be turned on by default for synchronous deserializations)? The rationale behind the limitation doesn't apply to synchronous deserialization as you explained (which makes up a great portion of use cases), and even for async deserialization, in some scenarios, you do need this nonetheless.
Then can't there be an opt-in option for this
Yes, that's probably what we would do once we get around to addressing the issue.
Running into the same issue. Enphase's Envoy communication gateway returns:
//...
"network": {
    "web_comm": true,
    "ever_reported_to_enlighten": true,
    "last_enlighten_report_time": 1684757038,
    "primary_interface": "eth0",
    "interfaces": [
        {
            "type": "ethernet", // <<< Type
            "interface": "eth0",
            "mac": "XXX",
            "dhcp": true,
            "ip": "XXX",
            "signal_strength": 1,
            "signal_strength_max": 1,
            "carrier": true
        },
        {
            "signal_strength": 0,
            "signal_strength_max": 0,
            "type": "wifi", // <<< Type
            "interface": "wlan0",
            "mac": "XXX",
            "dhcp": true,
            "ip": null,
            "carrier": false,
            "supported": true,
            "present": true,
            "configured": false,
            "status": "connecting"
        }
    ]
},
//...
Let me be clear though; I think whoever came up with this (Enphase) should be consistent in the first place, but, hey, here we are. Now I need to implement custom converters and whatnot just because they are inconsistent.
@RobThree isn't that type at the start of the object (representing an interface), as required by System.Text.Json? Your problem may be with recognizing type as opposed to $type - in that case take a look at the docs.
@roji Good spot! No, I actually have specified the type discriminator property name explicitly:
[JsonPolymorphic(TypeDiscriminatorPropertyName = "type")]
[JsonDerivedType(typeof(EthernetInterface), typeDiscriminator: "ethernet")]
[JsonDerivedType(typeof(WiFiInterface), typeDiscriminator: "wifi")]
public record NetworkInterface
(
    //...
);
I should've mentioned that. My bad. Good find!
@eiriktsarpalis Do you have any estimate on when this issue might be resolved? Will it, for example, make it into .NET 8?
It's currently not planned for inclusion in .NET 8.
One thing that would help MASSIVELY is a better error message. I was aware of the current limitations but still spent the better part of an hour debugging a problem, until I realized that the fault lies in some old persisted JSON in which the (custom) type property wasn't at the beginning of the JSON object.
Something like "type is polymorphic, but no type discriminator found" or "type discriminator not at beginning of the object" would have made the problem immediately clear, whereas the current message
Deserialization of types without a parameterless constructor, a singular parameterized constructor, or a parameterized constructor annotated with 'JsonConstructorAttribute' is not supported.
is just mystifying, as the objects in question have an attributed constructor.
@EnCey check out the library I linked above; you should be able to set a 'default' type to deserialize objects to if there is no matching type
(Quoting the JsonPolymorphicConverter<TBaseType> converter and usage example posted above.)
@aradalvand This almost works for me; however, my model contains nested polymorphic types, and the RemoveThisFromOptions method strips the type converter for any nested objects. Any recommendations for enabling type converters for the nested polymorphic classes?
cough the library I wrote should support nested polymorphic types...
cough the library I wrote should support nested polymorphic types...
@onionhammer You're right, the term "code generation" completely put me off without investigating it further. I'll give it a go now! :)
(Quoting the JsonPolymorphicConverter<TBaseType> converter and usage example posted above.)
I am trying to use this with a reference handler and it throws on Serialize, saying that the converter does not support metadata writes or reads. Any idea? :(
Speaking of solutions, I wrote a source generator for that: https://github.com/aviationexam/json-converter-source-generator/, which generates this and PolymorphicJsonConvertor.
I was digging a bit and it seems that reference handling in custom converters is not possible. We do not have access to the state and therefore to the instance of the ReferenceResolver:
I found this: https://github.com/dotnet/docs/issues/21777#issuecomment-736751404 but it's a big hack and it will break when you use options multiple times.
There are lots of interesting ways the STJ system could be modded and made better, but MS made too much of it internal. I wish they had made it public but put it under an Internal namespace with a disclaimer (like EF Core), to let the community at least have seamless ways to plug the gaps during the year-long gulfs between .NET releases.
I'm also unfortunately blocked on migrating to STJ due to this. It would be nice to have a temporary package that allows the reader to find the discriminator in whichever position it is. We could use such a package until the limitation is lifted.
That's why https://github.com/aviationexam/json-converter-source-generator/ happened :) Sure, it's not an MS package, but it does the job.
Great job. You implement support for polymorphism in System.Text.Json, only to immediately make it unusable by imposing an unrealistic ordering requirement that is rarely met in the wild. So you can basically only use it with JSON whose creation you control yourself. Why bother at all, then? Per the spec, JSON has no ordering, so why require it in System.Text.Json? Performance is always welcome, but not at the expense of functionality.
At least some opt-in workaround should've been offered for those who can sacrifice some performance for compatibility. How the heck can you guarantee that the discriminator will come first if JSON has no ordering?? That's a poor engineering decision. And I thought STJ was finally ready for prime time, after many years of development.
EDIT See https://github.com/dotnet/runtime/issues/72604#issuecomment-1904569859 for an API proposal.
Description
Attempting to deserialize a polymorphic structure with System.Text.Json that doesn't feature $type at the start of the object results in an exception.
For my scenario, this currently prevents me from using the built-in polymorphism support with jsonb columns in Postgres, as object properties have no guaranteed order.
Developer comment on this issue: https://github.com/dotnet/runtime/issues/63747#issuecomment-1158624112
Reproduction Steps
Expected behavior
Deserialization should work.
Actual behavior
Regression?
Limitation of initial implementation
Known Workarounds
I currently use PolyJson as an alternative, as the implementation reads ahead to find the discriminator.
Configuration
Impacts any version of STJ 7.0
Other information
No response