jacobsa / go-serial

A Go library for dealing with serial ports.
Apache License 2.0

Why does InterCharacterTimeout or MinimumReadSize have to be set? #20

Open cbrake opened 8 years ago

cbrake commented 8 years ago
    if vmin == 0 && vtime < 100 {
        return nil, errors.New("invalid values for InterCharacterTimeout and MinimumReadSize")
    }

Why is the above check required? I have an application where I want to read whatever data is available from the serial port and return immediately if there is nothing to read. I tried commenting out the check and it seems to work fine.

jacobsa commented 8 years ago

I don't remember for sure, but I suspect it's this (from the termios man page):

MIN == 0; TIME == 0: If data is available, read(2) returns immediately, with the lesser of the number of bytes available, or the number of bytes requested. If no data is available, read(2) returns 0.

It sounds like this puts the port into a non-blocking mode, which is discouraged for io.Reader:

Implementations of Read are discouraged from returning a zero byte count with a nil error, except when len(p) == 0.

Even if you do want to allow this, it may be confusing as a default.
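
To make the concern concrete, here is a small sketch (not from the library; the names are illustrative) of a typical consumer loop over an io.Reader, such as the value returned by serial.Open. If Read could return 0 bytes with a nil error whenever no data happened to be buffered, this loop would busy-spin instead of blocking:

    package example

    import (
        "io"
        "log"
    )

    // consume is a typical read loop over an io.Reader. It assumes Read
    // blocks until at least one byte is available or an error occurs.
    func consume(r io.Reader) error {
        buf := make([]byte, 256)
        for {
            n, err := r.Read(buf)
            if n > 0 {
                log.Printf("read %d bytes", n)
            }
            if err != nil {
                return err
            }
            // If Read instead returned (0, nil) whenever nothing was buffered,
            // this loop would spin at full CPU rather than waiting for input.
        }
    }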

cbrake commented 8 years ago

I can see where the defaults of 0, 0 are not desirable for most scenarios.

In the application I'm working on, I need to probe a number of RS485 addresses to see which devices are present. If a device is present, it will reply with data within 10ms. So with vmin, vtime = 0, I can simply write a packet, wait 10ms, and read; if there is nothing there, I move on. With the 100ms minimum delay, my 10ms wait becomes 100ms, which slows my probe loop down by a factor of 10. I can't use MinimumReadSize = 1 because if no device is at the address, nothing will ever be returned.

I could spawn a goroutine for each address, but they all share one RS485 bus, so I'm not sure how I would synchronize them. Another possibility is to put the RS485 reader in a goroutine and then select on either data returned from it or a 10ms timeout. That would be the optimal approach, but I would need to parse the data as it comes in so that I know when I have everything, rather than just waiting 10ms and assuming I do. Since there are few devices compared to the number of addresses to scan, though, this would not really reduce the scan time and is a needless complication.

What do you think of a config option that disables the vmin == 0 && vtime < 100 check? By default, the check would remain enabled.
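
For illustration, here is a rough sketch of that probe loop using this library's OpenOptions, assuming the check were relaxed so that InterCharacterTimeout: 0 and MinimumReadSize: 0 are accepted (today this requires patching the check out). The port name, baud rate, address range, and probePacket helper are placeholders, and an empty non-blocking read may surface as an error such as io.EOF rather than (0, nil), so the sketch treats either outcome as "no device":

    package main

    import (
        "log"
        "time"

        "github.com/jacobsa/go-serial/serial"
    )

    // probePacket builds a placeholder request frame for one RS485 address;
    // the real framing is application-specific.
    func probePacket(addr byte) []byte {
        return []byte{addr}
    }

    func main() {
        options := serial.OpenOptions{
            PortName: "/dev/ttyUSB0", // placeholder
            BaudRate: 9600,
            DataBits: 8,
            StopBits: 1,
            // Assumed to be accepted once the check is relaxed: reads return
            // immediately when no data is buffered (MIN == 0, TIME == 0).
            InterCharacterTimeout: 0,
            MinimumReadSize:       0,
        }

        port, err := serial.Open(options)
        if err != nil {
            log.Fatal(err)
        }
        defer port.Close()

        buf := make([]byte, 64)
        for addr := byte(1); addr <= 32; addr++ {
            if _, err := port.Write(probePacket(addr)); err != nil {
                log.Fatal(err)
            }
            time.Sleep(10 * time.Millisecond) // devices reply within 10ms

            n, err := port.Read(buf)
            if n > 0 {
                log.Printf("device found at address %d: % x", addr, buf[:n])
            } else if err != nil {
                // No reply: with MIN == 0 / TIME == 0 an empty read may come
                // back as an error (e.g. io.EOF) rather than (0, nil).
                continue
            }
        }
    }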

jacobsa commented 8 years ago

Would it also work to just change the magic value of 100? I don't know why I chose that.

cbrake commented 8 years ago

That would help, but the simplest solution for my scanning task is still 0/0 bytes/timeout. Anything else requires that I implement packet-parsing logic and keep track of elapsed time: if I read immediately after I send, the bytes will trickle in over time, so I'll need to keep re-reading until I either detect the end of the packet or a 10ms window has passed, and every read can add a potential Xms delay. Eventually I may implement that, but there are other priorities right now.
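
For completeness, here is a rough sketch of the reader-goroutine plus select approach mentioned above, which works with the current MinimumReadSize: 1 blocking behaviour. The frame-parsing step is deliberately omitted and the names are placeholders:

    package rs485

    import (
        "io"
        "log"
        "time"
    )

    // readChunks forwards each blocking Read result on a channel so callers
    // can select between incoming data and a timeout. It assumes the port was
    // opened with MinimumReadSize: 1, so Read blocks until at least one byte
    // arrives.
    func readChunks(port io.Reader, out chan<- []byte) {
        for {
            buf := make([]byte, 64)
            n, err := port.Read(buf)
            if err != nil {
                close(out)
                return
            }
            out <- buf[:n]
        }
    }

    // waitForReply waits up to 10ms for the first chunk of a reply. Real code
    // would keep accumulating chunks and parse the frame to decide when a
    // complete packet has arrived; this sketch only reports whether anything
    // answered within the window.
    func waitForReply(data <-chan []byte) bool {
        select {
        case chunk, ok := <-data:
            if !ok {
                return false // port closed or read error
            }
            log.Printf("reply: % x", chunk)
            return true
        case <-time.After(10 * time.Millisecond):
            return false
        }
    }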

This is perhaps an unusual requirement so I may just maintain a forked copy for now.

jacobsa commented 8 years ago

I suppose an AllowNonBlocking bool field that disables the check in question would be fine.
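
For illustration, the gate might look something like this (a sketch only: AllowNonBlocking does not exist in the library yet, and the options variable name inside the open path is assumed):

    // Hypothetical field on OpenOptions (not currently in the library):
    //
    //     // AllowNonBlocking permits InterCharacterTimeout == 0 and
    //     // MinimumReadSize == 0, i.e. reads that return immediately when
    //     // no data is available.
    //     AllowNonBlocking bool

    // The existing check, gated on the new field:
    if vmin == 0 && vtime < 100 && !options.AllowNonBlocking {
        return nil, errors.New("invalid values for InterCharacterTimeout and MinimumReadSize")
    }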