Closed jamescourtney closed 3 years ago
If so, is it something that could be mitigated during the nuget update process?
Looks like we can't script it: https://stackoverflow.com/questions/54410853/how-to-execute-custom-script-when-installing-nuget-package
It is possible that there could be some functional regressions that could lead to bad data:
This would work fine for reference types, but would fail for value types.
```cs
var s = new Struct();
table.Struct = s; // by-value assignment for value structs, so expected C# behavior
s.Value = 5;      // this only works with reference types, so you wouldn't expect this to update table.Struct
```
My point was that if you update FlatSharp, and what was a reference struct suddenly becomes a value struct, that would fail in a very non-obvious way.
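To make the silent break concrete, here's a minimal sketch. The `RefTable`, `RefStruct`, and `ValStruct` names are illustrative, not FlatSharp's generated code; they just contrast C#'s reference vs value assignment semantics:

```cs
var table = new RefTable();

// Reference semantics: the table and the local variable share one object.
var r = new RefStruct();
table.Ref = r;
r.Value = 5;   // table.Ref.Value is also 5

// Value semantics: assignment copies, so the later write is lost.
var v = new ValStruct();
table.Val = v;
v.Value = 5;   // table.Val.Value is still 0 -- the silent break

class RefStruct { public int Value; }   // reference type
struct ValStruct { public int Value; }  // value type

class RefTable
{
    public RefStruct Ref { get; set; }
    public ValStruct Val { get; set; }
}
```

Code that compiled fine against the reference-struct version keeps compiling after the upgrade, but the mutation after assignment quietly stops propagating.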
Unfortunately, I think this needs to be a "won't fix". The rationale is above. I'm happy with breaking changes, insofar as they break noisily. However -- the scenario I described above would be a silent break and would require users to RTFM when upgrading... which won't happen. I certainly don't look at docs each time I update a package.
This isn't ideal, but it seems like a legacy thing that will need to be carried forward. Ultimately, I don't think adding `fs_valueStruct` is enough overhead to justify the risk here.
Breaking change for FlatSharp 6: Should value structs be the default?
This would require replacing `fs_valueStruct` with `fs_referenceStruct` and updating a bunch of tests, but would otherwise be pretty easy.

Are there any cases where a user upgrading to v6 would go from a reference struct to a value struct implicitly and experience a meaningful regression because of that?