Closed ivan-lapitski closed 2 years ago
Thanks for the report. I'll take this up in version 3.0, which I'm currently working on.
I'm going to keep the default behaviour of returning -1, as the `Read()` method also returns 0 on a timeout (without raising an exception).
My proposal is to add a new property, `ThrowOnReadTimeout`, which is `false` by default. If it is set to `true`, then the behaviour will be as described.

The reasoning for the original behaviour at the start of the project (v1.x) was to reduce the number of exception blocks; the user relies on the return value of `Read()` or `ReadBytes()`.
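A minimal sketch of what the proposed opt-in behaviour might look like. The property name `ThrowOnReadTimeout` comes from the proposal above; the helper method and everything else here is illustrative, not the library's actual implementation:

```csharp
// Illustrative sketch only -- not the actual SerialPortStream source.
public class SerialPortStreamSketch
{
    // Opt-in flag, false by default to keep the v2.x behaviour.
    public bool ThrowOnReadTimeout { get; set; }

    public int ReadByte()
    {
        int result = ReadByteCore();   // hypothetical helper; returns -1 on timeout
        if (result == -1 && ThrowOnReadTimeout)
            throw new System.TimeoutException("No byte received within ReadTimeout.");
        return result;
    }

    private int ReadByteCore() { /* ... */ return -1; }
}
```

Keeping the flag off by default means existing callers that test for -1 keep working, while new code can opt in to exception-based error handling.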
A first implementation is provided. It needs unit test cases and review before it gets squashed/merged.
Thanks! Using a property to keep it backwards compatible sounds sensible.
Just a small note: I saw in your documentation for `ThrowOnReadTimeout` that you mention TCP streams (`NetworkStream`) behave the same way as the original behaviour. According to the `NetworkStream` documentation, `Read`/`ReadByte` throw an `IOException` on a read timeout.
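For comparison, a hedged sketch of the `NetworkStream` behaviour described above, based on its documented contract that an expired `ReadTimeout` surfaces as an `IOException` rather than a -1 return value (the host, port, and timeout are placeholders):

```csharp
using System;
using System.IO;
using System.Net.Sockets;

// Illustrative only: demonstrates the documented NetworkStream timeout contract.
using (var client = new TcpClient("example.com", 80))
{
    NetworkStream stream = client.GetStream();
    stream.ReadTimeout = 500; // milliseconds

    try
    {
        // Blocks until a byte arrives or the timeout elapses.
        int b = stream.ReadByte();
        Console.WriteLine($"Read byte: {b}");
    }
    catch (IOException ex)
    {
        // Thrown on read timeout -- unlike Stream's -1 "end of stream" convention.
        Console.WriteLine($"Read timed out: {ex.Message}");
    }
}
```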
Thanks, I should have researched that more; I've removed the comment. I've now implemented unit tests and need to review them and fix the errors.
I believe the implementation is now complete. Unit test cases are written and all pass. If you see any issues, please let me know. I'll close this in a few days.
This is now integrated and referenced. It will be in the next v3.0.0 release.
Thank you!
First of all, thanks for a great library!
A `SerialPortStream` with `ReadTimeout` set will return -1 when `ReadByte()` times out. According to the `System.IO.Stream` documentation, -1 means "end of stream", which is not the case here. This creates all sorts of problems in higher-layer code that depends on `Stream`.
I think the expected behaviour is to raise a `System.TimeoutException`?
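A minimal sketch of the ambiguity reported above. It assumes the RJCP `SerialPortStream` API (`RJCP.IO.Ports` namespace); the port name and timeout value are placeholders:

```csharp
using System;
using RJCP.IO.Ports;

// Illustrative repro sketch; "COM3" and the timeout value are placeholders.
using (var port = new SerialPortStream("COM3", 115200))
{
    port.ReadTimeout = 1000; // milliseconds
    port.Open();

    int b = port.ReadByte();
    if (b == -1)
    {
        // Per the Stream contract, -1 should mean end of stream.
        // Here it can also mean a timeout, which is ambiguous for
        // higher-layer code (e.g. StreamReader, protocol parsers).
        Console.WriteLine("Got -1: timeout, or end of stream?");
    }
}
```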