Closed by CharliePoole 11 months ago
Using BinaryFormatter was a quick fix in the initial development of our TCP communication protocol because it allowed arbitrary data types in messages. That was useful because it permitted rapid development and modification of the protocol. However, we actually use only a very limited set of types, and it's possible to reduce that set even further. That's what I'll do as the first step.
The TCP communication pipeline is not well tested. In particular, we need tests of the messages themselves. This step will be carried out in parallel with the simplification step.
Messages should be responsible for their own encoding and decoding. Each message type should have an Encode method, which converts that message's content to a byte array. A static Decode method will create the appropriate message instance from a received byte array. Sending the bytes along with a length prefix, and receiving messages from a socket, will remain the responsibility of the BinarySerializationProtocol class.
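The Encode/Decode design described above might look something like the following minimal sketch. The `CommandMessage` class, its properties, and the null-separator wire format are all hypothetical illustrations, not the actual engine types; the real messages and their payloads live in the engine's codebase.

```csharp
using System;
using System.Text;

// Hypothetical message type: each message knows how to encode and
// decode itself, so no general-purpose serializer is needed.
public class CommandMessage
{
    public string CommandName { get; }
    public string Argument { get; }

    public CommandMessage(string commandName, string argument)
    {
        CommandName = commandName;
        Argument = argument;
    }

    // Convert this message's content to a byte array.
    // (Illustrative format: UTF-8 fields separated by a NUL character.)
    public byte[] Encode()
    {
        return Encoding.UTF8.GetBytes(CommandName + "\u0000" + Argument);
    }

    // Create the appropriate message instance from a received byte array.
    public static CommandMessage Decode(byte[] bytes)
    {
        string[] parts = Encoding.UTF8.GetString(bytes).Split('\u0000');
        return new CommandMessage(parts[0], parts.Length > 1 ? parts[1] : string.Empty);
    }
}

public static class Program
{
    public static void Main()
    {
        // Round-trip a message through Encode/Decode.
        var original = new CommandMessage("RunTests", "MyAssembly.dll");
        var roundTripped = CommandMessage.Decode(original.Encode());
        Console.WriteLine(roundTripped.CommandName);
        Console.WriteLine(roundTripped.Argument);
    }
}
```

The length prefix is deliberately absent here: in the design described above, framing (prefixing the byte array with its length and reading complete frames off the socket) stays in BinarySerializationProtocol, while each message type owns only its payload format.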
A number of problems had to be resolved in implementing this change. I'm listing them here, primarily for the benefit of anyone working on the corresponding NUnit issue nunit/nunit-console#1354. The first few listed were envisioned ahead of time, but the rest came to light as I worked. Naturally, I could only address each problem as I found it, but it may turn out to be more effective to address them in some other order. :-) I'll continue to update this list as I find other problems.
...More to come...
:tada: This issue has been resolved in version 2.0.0-beta4 :tada:
The release is available on:
It's considered unsafe and is expected to go away at some point. See https://github.com/dotnet/designs/blob/main/accepted/2020/better-obsoletion/binaryformatter-obsoletion.md