Currently the component code uses a "latency" delay to try to ensure that it does not miss parts of a message, i.e. before checking whether another byte is available at the serial port, the code waits a specified amount of time (documented here). It assumes that, if no byte was transferred during that time period, the complete message has been received.
There is a possibility that parts of the message could get lost if sending a byte takes longer than this latency period.
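To make the timing concrete, here is a small sketch (the actual component code is not shown here, so this is only an illustration) of how long one byte occupies the wire, assuming common 8N1 framing: 1 start bit, 8 data bits, 1 stop bit, i.e. 10 bits per byte.

```python
def byte_time_seconds(baud_rate: int, bits_per_byte: int = 10) -> float:
    """Time for one byte to be transmitted on the wire.

    Assumes 8N1 framing (1 start + 8 data + 1 stop = 10 bits per byte);
    adjust bits_per_byte for other framings, e.g. 11 for 8E1.
    """
    return bits_per_byte / baud_rate

# At 9600 baud, one byte takes about 1.04 ms on the wire, so a latency
# window shorter than that risks declaring the message complete early.
print(round(byte_time_seconds(9600) * 1000, 2))
```

Note that this is only the wire time; OS buffering and driver scheduling can add further gaps between bytes becoming available to the application, which is usually the bigger threat to a latency-based heuristic.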
Is this actually possible in serial communications? Or is each byte guaranteed to arrive at a rate determined by the baud rate?
If this is possible, could we improve the situation by using a STOP character to identify the end of a message?
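A STOP-character scheme could be sketched as follows. This is only an illustration of the framing idea, not the component's actual code; the choice of ETX (`0x03`) as terminator is a hypothetical example, and it only works if the terminator byte cannot occur inside message payloads (otherwise an escaping scheme is needed).

```python
STOP = 0x03  # ETX; hypothetical terminator, must not appear in payloads

def extract_messages(buffer: bytearray) -> list:
    """Split off all complete messages terminated by the STOP byte.

    Returns the complete messages (without their terminators); any
    trailing unterminated bytes stay in the buffer awaiting more data,
    so no timing assumptions are needed to find message boundaries.
    """
    messages = []
    while True:
        idx = buffer.find(bytes([STOP]))
        if idx < 0:
            break
        messages.append(bytes(buffer[:idx]))
        del buffer[:idx + 1]
    return messages

buf = bytearray(b"HELLO\x03WOR")
print(extract_messages(buf))  # [b'HELLO']
print(bytes(buf))             # b'WOR' remains buffered until its STOP arrives
```

With this approach the receiver appends whatever bytes are currently available to the buffer and extracts messages by delimiter, so a slow sender merely delays a message rather than truncating it.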