dotnet / orleans

Cloud Native application framework for .NET
https://docs.microsoft.com/dotnet/orleans
MIT License

Version-Tolerant Serialization #2653

Closed: ReubenBond closed this 3 years ago

ReubenBond commented 7 years ago

Our current serializer is very fast & flexible, but it does not offer any guarantees about tolerance as users change data types. Currently, the advice is that users should use a version-tolerant serializer such as Bond or Protocol Buffers if that's required (as it is in upgrade scenarios and for storage). These serializers require that users specify a unique id for each field/property. They have the following downsides:

  1. Reduced type fidelity: no support for ambiguous types (such as object or non-sealed classes) and limited support for generics.
  2. Users must define special schema classes and map their own types to and from them.

Those serializers do offer cross-language support, whereas ours does not.

Our serializer is special in that it supports:

  1. Ambiguous types such as object
  2. Generic types
  3. Read-only/Get-only/inaccessible fields
  4. Minimal fuss, so users can serialize just about any POCO without additional markup

The question is, can we retain some of this low-fuss nature while also improving robustness with version-tolerance? I believe we can while adding only minimal fuss.

The benefits are clear: much greater ability to reason about upgrade scenarios and data integrity.

These are the changes I would propose if we were to implement this. In rough pseudocode, a generated deserializer would read a sequence of length-prefixed, tagged fields like so:

```C#
while (consumed < length)
{
    var fieldId = input.ReadFieldId();
    switch (fieldId)
    {
        case 1:
            result.field1 = input.ReadInt32(); // For primitive types (we do this today).
            break;
        case 2:
            // For complex types.
            result.field2 = serializationManager.DeserializeInner(input, typeof(field2Type));
            break;
        default:
            // If the result type has a field marked '[ExtensionData]', retain the unknown field:
            if (result.extensionData == null) result.extensionData = new ExtensionData();
            result.extensionData.Add(fieldId, input.ReadUnknown());
            // Else, if the type does not support extension data, skip it:
            // input.ReadUnknown();
            break;
    }

    consumed = input.Position - start;
}

byte[] ReadUnknown(Reader input)
{
    // Find the length of the unknown field, then copy & return the raw bytes.
    var typeTag = input.PeekTypeTag();
    switch (typeTag)
    {
        case TypeTag.LengthPrefixed: // Complex type.
        {
            var length = input.PeekInt(offset: sizeof(TypeTag));
            return input.ReadRaw(sizeof(TypeTag) + sizeof(int) + length);
        }
        case TypeTag.Int32:
            return input.ReadRaw(sizeof(TypeTag) + sizeof(int));
        // ... other primitive types
    }
}
```

The code for serializers is conceptually similar, but simpler.
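For illustration, here is a minimal sketch of the corresponding write side, under the same assumptions as the pseudocode above; the `Writer` API, the field ids, and the `extensionData` member are all hypothetical:

```C#
void Serialize(MyType value, Writer output)
{
    // Primitive field: write a header (field id + type tag), then the value.
    output.WriteFieldHeader(fieldId: 1, TypeTag.Int32);
    output.WriteInt32(value.MyInt);

    // Complex field: write a header, then a length-prefixed payload.
    output.WriteFieldHeader(fieldId: 2, TypeTag.LengthPrefixed);
    output.WriteLengthPrefixed(w => serializationManager.SerializeInner(value.MyString, w));

    // Unknown fields captured during deserialization are copied back verbatim,
    // so data written by a newer version survives a round-trip through this version.
    if (value.extensionData != null)
    {
        foreach (var unknownField in value.extensionData)
            output.WriteRaw(unknownField.RawBytes);
    }
}
```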

A version-tolerant object would then look like this:
```C#
[TypeId(41592714)] // Omitting this means that the type would be serialized using its full name.
public class MyType<T>
{
  [FieldId(1)]
  public int MyInt { get; set; }

  [FieldId(2)]
  public string MyString { get; set; }

  [FieldId(3)]
  public T SomeOtherType { get; set; }
}
```

In the past I've suggested that we shouldn't invest in these changes, but my mind has been slowly changing over time.

What are your thoughts?

jason-bragg commented 7 years ago

I agree with your goals, but I'm not terribly giddy about the proposed solution. I'd very much prefer not to be in the serializer business, not only because it's a duplication of effort, but also because having our own serialization tech will inevitably lead to integration issues.

The backwards compatibility problem in Orleans has bothered me for some time, and I've given a fair amount of thought to the problem, to little avail. I can't advocate that we solve this with our own custom serialization logic, because that duplicates existing technologies like Bond and Protobuf, but, as you mentioned, those technologies hinder application development.

What has bounced around in my head (I can't defend it as a good idea, merely the least bad of the bad ideas I've considered) is a two-phase serializer generation. In the first phase, we detect attributed types and types used in grain interfaces; then, instead of generating serializers, we generate simplified type definitions (an Orleans IDL?). In the second phase, we generate serializers from those type definitions. The second phase should be configurable at build or run time, so that integrations with other protocols (bond, protobuf, next-new-hotness) can be used instead of our default serializers.

For bond integration, an Orleans bond nuget would include a converter that generates .bond idl files from the Orleans idl, then runs bond codegen on the bond types. Protobuf would do the same. Each idl-based serialization tech would only need to provide an Orleans-idl-to-their-idl converter to integrate with Orleans.
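As a rough illustration of what the first phase might emit (all names here are hypothetical, not a proposed API), the intermediate type definitions could be a simple, serializer-agnostic model that each second-phase generator translates into its own IDL:

```C#
using System.Collections.Generic;

// Hypothetical phase-one output: a minimal, serializer-agnostic description
// of a type, which a bond, protobuf, or default generator could consume.
public class TypeDescription
{
    public string FullName { get; set; }
    public List<FieldDescription> Fields { get; } = new List<FieldDescription>();
}

public class FieldDescription
{
    public int Id { get; set; }          // Stable field id.
    public string Name { get; set; }
    public string TypeName { get; set; }

    // Pass-through tags for serializer-specific markers (e.g. "bonded"),
    // as discussed later in this thread.
    public Dictionary<string, string> Tags { get; } = new Dictionary<string, string>();
}
```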

ReubenBond commented 7 years ago

@jason-bragg I went through the same thing. We're not a serialization library, we're a distributed systems library/framework. The problem is that we have certain requirements on our serializer, eg: that we support a type system which involves ambiguous types like non-sealed classes & generics. Neither ProtoBufs nor Bond supports that with full fidelity.

Hopefully, we can someday separate the serialization code into its own library & others can use it.

I believe that creating a bond-idl generator + generating interop code would be more complicated & wouldn't provide a better outcome than implementing versioning ourselves.

We would still have to improve type serialization & make users mark up their types & fields with well-defined, stable identifiers. We would also have to write (generate?) code to convert bond-generated types to user-defined types and vice-versa, as well as generate the IDL used by Bond to create those types. The IDL markup could possibly be specified on the interop type so that we only need to generate one class and have Bond generate its own code at runtime (if it supports that).

It's really not a case of 'not invented here' syndrome - as I've said before, I'd love to just delete all of our serialization code & use someone else's.

jason-bragg commented 7 years ago

we have certain requirements on our serializer, eg: that we support a type system which involves ambiguous types like non-sealed classes & generics. Neither ProtoBufs or Bond support that with full fidelity.

I've not taken the time to deeply consider this, but it seems likely that there are reasons why neither bond nor protobuf supports these requirements. Should we take the hint and also not support them?

I believe that creating a bond-idl generator + generating interop code would be more complicated & wouldn't provide a better outcome than implementing versioning ourselves.

I may have been unclear, or am misunderstanding what you are saying here, but I was suggesting generating our own idl, which our own minimal serialization logic (with no backwards-compatibility support) would read to create serializers. Bond, Protobuf, or any other external serialization tech would need to convert our idl to serializers as well. In the case of bond and protobuf, this would be done by generating their respective idls, then using those idls to generate the classes used in serialization. This may well be more complicated all up, in that it means integrating with every serialization tech we want to support, but in the narrow sense I don't see it being very complicated. It's a matter of defining an idl for our classes, then splitting serializer generation into two steps: converting to the idl, then from it. That first minimal piece is all we need at first. We could then add support for bond as a proof of concept, then leave it to the community to add support for protobuf or any other serialization protocol we need.

In the long run, IMO, this saves us a lot of work and complexity because we can leverage all the work being done by others in the serialization space, without having to solve or duplicate those advances in our system. It also allows us to integrate with existing solutions that are likely already being used in the environments Orleans is being adopted into. In short, I suspect it would be less complicated, and better.

We would still have to improve type serialization & make users mark up their types & fields with well-defined, stable identifiers.

Could you provide a little more detail here? I'm unsure what you mean. I suspect that you may be referring to the data version information that is needed in the bond or protobuf idls, the assumption being that we'd need to support a superset of this data to support all protocols. I don't agree with this assumption. The serialization integration would need some version-delta logic to perform basic versioning, and we'd need some generic property-passing system to pass custom tags from user-defined classes intended for specific serialization technologies, but our framework classes would never use anything specific. That is, we'd limit our framework classes (and any shared extensions like storage providers) to a minimal feature set, while enabling application code to specify serialization-specific markers.

For instance, a framework class may look like:

```C#
class A
{
    public int MyInt { get; set; }
    public string MyString { get; set; }
}
```

then be changed to

```C#
class A
{
    public int MyInt { get; set; }
    public long MyLong { get; set; }
}
```

For a versioning serialization integration (like bond), the integration would need to track the last released version and build its IDL to signal the version change (in bond this would mean having all three fields).
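For illustration (a hypothetical hand-written schema, not generated output), a bond-side type spanning both released versions of A would carry all three fields:

```C#
[Bond.Schema]
public class ABondSchema
{
    [Bond.Id(0)] public int MyInt { get; set; }       // Present in both versions.
    [Bond.Id(1)] public string MyString { get; set; } // Removed after the first version.
    [Bond.Id(2)] public long MyLong { get; set; }     // Added in the second version.
}
```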

If a user of bond wanted to include A as a bonded object, they could write a class like:

```C#
class MyClass
{
    public int MyInt { get; set; }

    [SerializationPropertyTag("bonded")]
    public A MyA { get; set; }
}
```

The serialization tag would be passed to the idl and interpreted by the bond idl converter, enabling serialization-specific signals. No framework or extension class could be defined this way, because those types should be limited to minimal common functionality; this type of tagging is only for application code and depends on the serialization tech being used.

If adding minimal versioning support to our serialization generation is not hard, and we want to support that in our idl via some attribution scheme, I'm all for that, but that doesn't get us a good integration story or enable us to leverage the ongoing efforts of externally developed serialization technologies.

ReubenBond commented 7 years ago

@jason-bragg

I've not taken the time to deeply consider this, but it seems likely that there are reasons behind why neither bond nor protobuf support these requirements. Should we take the hint and also not support them?

It's simply too convenient for users to give up, and there is no fundamental reason why it cannot be supported. The thing which we need in order to support our type system - which they don't have - is a type catalog. We already catalog types.

For versioning serialization integration (like bond) the integration would need to track the last released version and build it's IDL to signal the version change (in bond this would mean having all three fields).

Maintaining a database of schema changes (wherever we store that) is incredibly complicated compared to implementing 'sequence of length-prefixed, tagged fields' support ourselves. If we go with the Code->IDL->IDL->Code approach, then I think we should still have some markup in fields and require that users follow some kind of discipline. The approach you're suggesting is similar to EF Migrations, but EF has a database which they can store schemas/metadata in. Would we store some kind of blob in Git?

If we generate Bond/Protobuf IDL, then we still also need to generate some interop classes to convert between IBonded<T> and string, or whatever the runtime field type is.

I say we need to improve type serialization under the assumption that we still serialize type info before fields.

ReubenBond commented 7 years ago

@jason-bragg it might be worthwhile to have a vid call to discuss all of this, what do you think?

jason-bragg commented 7 years ago

@ReubenBond Not sure a call is necessary. In general I trust your judgment on this subject. My goal here was only to present my thinking thus far. Follow-up comments have mainly been to clarify my suggestion and reasoning with the intention of being understood, rather than being convincing. I'm not entirely convinced of my own idea. :)

I do have a follow-up question if I still have your patience. How do you envision Orleans being used in an environment already using an existing serialization technology like bond? Imagine a distributed service that has many types defined using bond and wishes to use Orleans. The service engineers are accustomed to relying on the versioning support bond provides and change types regularly. Grain calls pass existing bond objects as well as new POCO classes for the new Orleans code. How would the integration work? What would this look like?

ReubenBond commented 7 years ago

@jason-bragg you always have my patience :) I much prefer when a suggestion is challenged versus when everyone simply agrees.

How do you envision Orleans being used in an environment already using an existing serialization technology like bond?

They should use the Bond serializer - that way the semantics are identical with their other systems. We support Bond, and we definitely shouldn't cut support. This suggestion will not affect that; it will just mean that the envelope around those Bond messages is serialized in a more stable manner.

I added the suggestion to use Bond/ProtoBuf to the bottom of the Serialization Docs:

Serialization Best Practices

Serialization serves two primary purposes in Orleans:

  1. As a wire format for transmitting data between grains and clients at runtime.
  2. As a storage format for persisting long-lived data for later retrieval.

The serializers generated by Orleans are suitable for the first purpose due to their flexibility, performance, and versatility. They are not as suitable for the second purpose, since they are not explicitly version-tolerant. It is recommended that users configure a version-tolerant serializer such as Protocol Buffers or Bond for persistent data. The best practices for the chosen serializer should be followed in order to ensure version tolerance. These third-party serializers can be configured using the SerializationProviders configuration property as described above.
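For example, a provider can be registered programmatically on the silo configuration (a sketch; the provider type name here is illustrative):

```C#
using System.Reflection;

// Register a third-party serialization provider with the global configuration.
config.Globals.SerializationProviders.Add(
    typeof(MyBondSerializer).GetTypeInfo());
```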

jdom commented 7 years ago

I just started reading the thread, but there are a few things that I'd like to clarify from the main post:

Our serializer is special in that it supports: ambiguous types such as object; generic types; read-only/get-only/inaccessible fields; minimal fuss, so users can serialize just about any POCO without additional markup.

I'm not really sure ambiguous types at the application level are that valuable, and I would prefer fewer surprises than just assuming we support any object but then not always being able to fulfill that promise (which we can't always, and it could be even worse with different versions of the type/framework/platform). Generic types are supported by Bond, at least. Read-only/Get-only/inaccessible fields strike me as something we only support because we are doing a "magic" serializer, but they can surprise you big time if the environments are not identical on both ends of the wire (which is now more likely with .NET Core and also our intention to support heterogeneous clusters and versioning).

The big win for our serializer is indeed less fuss when there are NO differences on each end of the wire. Other than that, I see no reason not to embrace these other well-known serialization protocols (if users intend to support no-downtime upgrades or different platforms on each end of the wire). That would mean really endorsing one of these other serializers, and yes, users would not be able to send just any random object in their messages (when not opting into the magic serializer), but instead send a message that was created with serialization in mind (eg: a bond object). They get the benefit of all the documentation and community out there on how to use it (and we can augment that community if we endorse one solution). It also means easier code reviews for end users, as it is easier to notice when you update a class meant for serialization, as opposed to updating a random class that a grain just happens to pass through inadvertently in a message that may fail silently or noisily (depending on weather conditions 😛).

ReubenBond commented 7 years ago

In my opinion, we can do better than to force users to choose between:

  1. Being restricted to a very limited type system; and
  2. Losing the option for sensible upgrades.

The end result looks almost identical to how a class would look with Bond, except that more types can be used more easily.

Bond:

```C#
[Bond.Schema]
public class Foo
{
    [Bond.Id(0)]
    public string message { get; set; }
}
```

This proposal:

```C#
[Id(923045)] // Unique per asm
public class Foo
{
    [Id(0)]
    public string message { get; set; }
}
```

So the code review process looks the same. If a user wants, they can disable 'non-version-tolerant' mode via an option and have the serializer accept only version-tolerant types, while still getting the advantage of an expressive, flexible type system.

Whatever the outcome, version-tolerant serialization also needs to be implemented in the framework itself so that our types don't have to remain forever frozen.

I wonder what the experience of deserializing a GrainReference nested in some Bond type would look like once non-static clients are in place.

jason-bragg commented 7 years ago

Been thinking on this and have suggestions (I know, so many opinions on this!?)

I'd like to narrow this down by requirements, because I think there are several issues intertwined here. I'd also like to break this down into current and future needs.

Messaging

We need to preserve the ease of use and the resemblance of object-oriented programming in the actor model. This means passing simple POCO objects with no special knowledge.

Now: for this, our current serializer suffices.

Future: when we move to in-place upgrades that may involve changes to data types, versioning may be an issue. Even in that case, I suggest we do not solve it. I suggest we keep the serializer as is, and advise users to leverage a version-tolerant serializer if they need data versioning. I don't think we should just leave it there, though. I think we should practice what we preach: for all grain and system targets we support and maintain, we should use bond for their data. I've been playing with bond more, and the tech has come a long way. I don't think it will be a challenge to do this, and if it is, bond is an MS project and open source, so we should work with them to remove any pain points.

This allows novice users to ramp up quickly with simple poco objects, as well as provide more serious efforts with a working example of how to use Orleans with a version tolerant serializer (bond).

Storage

An aggravating factor in the versioning problem is that we've used our serializer to write data to storage. To resolve this, the simplest answer is: don't do that.
I'll break this into 2 parts:

Work

  1. Modify the serialization manager to allow an external serializer to be passed (probably through the context) for serialize and deserialize calls.
  2. Take a hard dependency on bond and replace all storage serialization with bond.
  3. Replace the data in all grain and system target calls (except some in tests) with bond.

Results

  1. Novice users and simple services maintain ease of use.
  2. Users that need versioning can use bond and have plenty of examples.
  3. Users with pre-existing serialization, or who want to use a different serialization (protobuf), can do so by modeling their efforts on how we use bond.
  4. Storage is version-safe.

PS: I know I did a bit of hand waving when it comes to the storage problem. shhh! Plz pretend you didn’t notice..

ReubenBond commented 7 years ago

Thanks, @jason-bragg :)

I'm fine with this - we can try to use Bond for everything (every type which is allowed to be transferred over the wire) and see how we go.

"Users who want versioning" is every user who ends up building anything other than a demo/PoC, There are no users who don't need versioning, only users who don't know they will need versioning. Our job is to guide all users into The Pit of Success.

jason-bragg commented 7 years ago

"Users who want versioning" is every user who ends up building anything other than a demo/PoC

When it comes to storage, agreed, but for messaging, which is what our serializer should be for, I don't think this is the case. Only users that need in-place upgrades should need versioned messaging.

guide all users into The Pit of Success

+1

yevhen commented 7 years ago

Only users that need in-place upgrades should need versioned messaging.

"in-place upgrade" is the most common setup for virtually anyone. And I don't see why I need to bother with B/G deployment. It's costly (not everyone have Halo budget), it creates a lot of other problems. For example, our application is reactive-interactive. With B/G I would need much more elaborate deployment routines, like when Green cluster is started I should not run any pollers until switch, since that will create duplicate activations. That's complicated.

Besides, although I'm already using message-based communication, I found modelling every single (even internal-only) message with Protobuf/Bond to be time-wasting and limiting. The power of those protocols lies in cross-platform support, which hardly anybody really needs. I can't imagine calling an Orleans cluster from Python. If such a need arises, it's much cheaper to create a special (JSON-based) endpoint than to compromise on richness and simplicity everywhere.

I'll make a bold claim and say that Orleans owes its popularity mainly to its serializer. I can imagine how frustrated the majority of users will be if you tell them to use ProtoBuf/Bond. What about OOP-ish grain interfaces? What about F# users? I'm not aware of any F# support in Protobuf/Bond.

I understand the desire to "not be in the serialization business", but perhaps other OSS .NET binary serializers could be considered as an alternative? I have good experience with Akka's Hyperion (it's Apache 2 licensed). It's a binary serializer with an option for a version-tolerant format and performance comparable to the Orleans serializer. It lacks some minor features (like readonly fields and ISerializable) but it's polymorphic, supports surrogates and F# discriminated unions, and much more. Maybe join forces and make it a de facto standard for .NET binary serialization?

ReubenBond commented 7 years ago

I would make a bold claim and will say that Orleans owes such popularity mainly due to serializer.

I agree - our serializer and RPC are huge benefits in terms of ease-of-development and I think they go unnoticed a lot of the time. That's what I've come to realize.

I found modelling every single (even internal only) message with Protobuf/Bond to be time wasting and limiting.

This is what I want to avoid - we shouldn't force users to use such a restricted type system. Let them model how they want. I say force because (and I hope we can agree on this) everyone needs version tolerance - some just don't know it when they start out.

From looking at Hyperion, I can't see where their version tolerance comes into play - it looks as though it's not actually implemented. EDIT: Asking in their Gitter, the response I got was that it's not implemented.

yevhen commented 7 years ago

Ye, Hyperion is very immature at the moment ((

ReubenBond commented 7 years ago

Rough proposal for a new serialization library: https://gist.github.com/ReubenBond/edffb6615e02a87ac22f00131290e2c2

pisees commented 7 years ago

Did you also look at whether WCF Data Contracts would meet the need, with their support for forward and backward compatibility? I am not saying that it would work (I have not investigated), just wondering if it was considered. https://docs.microsoft.com/en-us/dotnet/framework/wcf/feature-details/using-data-contracts https://docs.microsoft.com/en-us/dotnet/framework/wcf/feature-details/version-tolerant-serialization-callbacks

ReubenBond commented 7 years ago

@pisees when you say WCF Data Contracts, do you mean using just the attributes in our own serializer or do you mean using DataContractSerializer?

talarari commented 7 years ago

Any progress on this issue? What's the recommended solution for version tolerance?

We're using the built-in serializer, and the last thing I can say about it is that it's easy to use... Things are really easy at the start, since everything just magically works, but then you deploy an upgrade and everything breaks. So now we use versioned grains and keep everything backwards compatible, using multiple deployments to change field types without breaking backward compatibility.

We're considering just switching out the serializer for a JSON serializer, so that we could change types a lot more freely.

It feels like most people would run into this sooner or later, unless you shut down your cluster before deploying a new version. How do most people solve this?

ReubenBond commented 7 years ago

@talarari the recommended solution for version tolerance is to use ProtoBuf or Bond, as mentioned at the bottom of the doc page here: http://dotnet.github.io/orleans/Documentation/Advanced-Concepts/Serialization.html Many people use JSON for the same purpose, which is fine, too.

I am preparing a library for version tolerant serialization as described in the gist link above. The library is coming along well. It's about 90% feature complete. I work on it in my spare time, so I don't have an ETA yet.

talarari commented 7 years ago

Ok, good to know. We will probably try a JSON serializer, since it's easier to work with (no need to define special files like .proto) and it might be fast enough for us.

As a side note, this looks impressive: https://github.com/rpgmaker/NetJSON. If it works as well as they claim, it might be a good option for a fast dynamic serializer.

ReubenBond commented 7 years ago

@talarari also check out this one: https://github.com/neuecc/Utf8Json. I'm not sure how they compare.

talarari commented 6 years ago

@ReubenBond I tried using Orleans's built-in OrleansJsonSerializer as the external serializer by registering it like this in SiloHost:

```C#
config.Globals.SerializationProviders.Add(typeof(Orleans.Serialization.OrleansJsonSerializer).GetTypeInfo());
```

but I'm getting an error that it doesn't have a parameterless ctor:

fail: Orleans.Serialization.SerializationManager[102404]
      Failed to create instance of type: Orleans.Serialization.OrleansJsonSerializer
System.MissingMethodException: No parameterless constructor defined for this object.
   at System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean& canBeCached, RuntimeMethodHandleInternal& ctor)
   at System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
   at System.Activator.CreateInstance(Type type, Boolean nonPublic)
   at System.Activator.CreateInstance(Type type)
   at Orleans.Serialization.SerializationManager.<RegisterSerializationProviders>b__99_0(TypeInfo typeInfo)

Is there a sample project using Json serialization for grain calls?

talarari commented 6 years ago

Linking to this: https://github.com/dotnet/orleans/issues/3331. Is this part of the 1.5.3 release?

ReubenBond commented 6 years ago

@talarari this issue is specifically for the discussion of a version tolerant serializer. It's ok to ask questions on closed issues/prs. 1.5.3 includes only one specific fix over 1.5.2, #3753.

talarari commented 6 years ago

@ReubenBond I asked here since my plan was to use OrleansJsonSerializer as my version-tolerant serializer of choice, and I thought it would be helpful for others wanting to do the same. It would have been better to ask in the other issue, though. Thanks for the answer.

ReubenBond commented 6 years ago

No worries, @talarari :)

mrpantsuit commented 5 years ago

Any update on this? I'm investigating using Orleans at my company, but we need to better understand what our deployment and persistence strategies will be before we proceed, both of which are largely affected by version-tolerant serialization.

ReubenBond commented 5 years ago

You can customize serialization out of the box, particularly for cases where you're persisting data (we certainly don't recommend using the generated serializers for persistence). You can customize serialization on a per-type basis if you want. One team in PlayFab which sits nearby serializes certain types using JSON so that they can use its form of version tolerance.

EDIT: I am intending to continue working on improved out-of-the-box version tolerance regardless of the existing workarounds.

mrpantsuit commented 5 years ago

So the status quo is that if we want to allow for in-place upgrades and proper persistence, we need to use a 3rd-party serializer (e.g., Protobuf) for all serialization within Orleans, correct?

ReubenBond commented 5 years ago

@mrpantsuit It depends on what kind of changes you want to be able to make. If you're modifying types which are being sent over the wire in grain calls then you need to make sure that those are version tolerant if you want smooth rolling upgrades. If you're using blue/green deployments then you don't need wire compatibility.

If you're persisting some data, then the built-in serializer is not designed for that and you should use a different serializer (eg, protobuf-net if it works for your data models, or JSON.NET).

If you are adding/removing methods/grains then you can deploy new functionality using two or three stages (eg 1: add new functionality, 2: use new functionality), which is a common approach regardless of Orleans, and make use of grain interface versioning/heterogeneous clusters: http://dotnet.github.io/orleans/Documentation/deployment/grain_versioning/grain_versioning.html
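For reference, grain interface versioning (described in the linked docs) is declared per interface. A minimal sketch; the interface name and method are invented for illustration:

```C#
using System.Threading.Tasks;
using Orleans;
using Orleans.CodeGeneration;

// Bump the version number when the interface changes; heterogeneous
// clusters use it to route calls to silos supporting a compatible version.
[Version(2)]
public interface IGreeterGrain : IGrainWithIntegerKey
{
    Task<string> Greet(string name);
}
```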

icanhasjonas commented 3 years ago

Hey All :-)

It's been some time, and I've been lurking in this thread, so here are my 10¢.

If there are any thoughts of migrating to a gRPC-based network, and possibly allowing non-dotnet actors in the future, then ProtoBuf is the obvious choice.

Now, if ProtoBuf were the choice, I think there are ways to keep the Hello World example working with little to no friction, and also solve for the greater plan.

Something like protobuf-net.Grpc would be a good starter:

```C#
services
  .AddCodeFirstProtoSerializer() // using Marc Gravell's protobuf-net.Grpc approach
```

Next, maybe it's not so bad defining actors and their messages as .proto files? Borrowing a bit from Proto.Actor.

ReubenBond commented 3 years ago

@icanhasjonas supporting non-.NET grains/etc is off-topic for this discussion, but feel free to open a new issue with thoughts. I personally believe that there is a lot of value in supporting .NET well, and that means supporting .NET's type system, which largely excludes Protocol Buffers as a serialization format (unless you're layering new type information on top of that, which means largely losing cross-language support). That doesn't mean that support for other languages cannot be added, but they would have to use a different, more limited protocol (eg, like gRPC/ProtoBufs/JSON/HTTP).

zahirtezcan commented 3 years ago

Question: Do you plan to support DataContractAttribute and friends or are you going with your own attribute family?

ReubenBond commented 3 years ago

@zahirtezcan you will be able to define your own attributes as long as they conform: eg, this uses a [GenerateSerializer] attribute on the type and [Id(x)] attributes on the fields, but you can use any attribute on the type, and any attribute which takes a single integer argument on the fields/properties. The attribute is configured via the csproj, which could be in a shared Directory.Build.props file.
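For example, a minimal sketch using the attribute names mentioned above (the class and its members are invented for illustration):

```C#
[GenerateSerializer]
public class CounterState
{
    [Id(0)]
    public int Count { get; set; }

    [Id(1)]
    public string Owner { get; set; }
}
```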

Your feedback is most welcome, so please let us know if you have any preferences. I am considering implementing strict and non-strict modes where, in non-strict mode, in the absence of any configured attribute on the type, types and fields/properties are inferred just as they are today (by examining grain interfaces). That way, a developer can add the attributes later (automatically, using the Roslyn CodeFix), before they start adding/removing type members.

zahirtezcan commented 3 years ago

@ReubenBond thanks for the reply. It is good for us that you support the no-attribute case, because it opens the possibility of quick prototyping etc. But for versioning, one needs to use attributes/annotations to declare the naming and ordering of properties, as well as whether a property is optional, etc. In the past we used DataContractAttribute because it is available out of the box, so it incurs no external dependencies, and it covers the cases above. In fact, even our C++ interop code generator used those attributes for debug text dumps etc.

Some niche cases, though, may need specific attributes such as JsonConstructorAttribute, which is not included in the data-contract family. My deepest wish is for the dotnet team to enrich the data-contract family to keep up with the expanding serialization needs of the community, but that is not relevant to Orleans.

So, IMHO, sticking to data-contracts whenever possible and providing additional attributes for specific use cases is the way to go. But your solution seems much more generic and welcomes a broader family of code generators.