Closed enshivam closed 4 years ago
How are you measuring latency? Time-based measurements are usually very imprecise at the sub-millisecond range, so you could be seeing a lot of noise.
Sorry for not mentioning that. I have my pub and sub in the same solution, and I use a static stopwatch to measure the time difference between the send and the receive.
Please use a benchmark framework to do your measurements more accurately. You should at least measure the latency of thousands of messages and average them to get a better number. In any case, a 1 ms figure over a TCP connection is OK, I believe. Are you using a loopback connection or binding to an external IP? There are a lot of factors in play.
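To illustrate the advice above, here is a minimal sketch of averaging the round-trip time of many messages instead of trusting a single stopwatch reading. This is not the reporter's setup: it uses a plain loopback TCP echo server in Python rather than Redis pub/sub, purely to show the measurement pattern.

```python
# Illustrative only: average many round trips over a loopback TCP socket
# instead of timing a single message.
import socket
import threading
import time

def echo_server(listener):
    conn, _ = listener.accept()
    with conn:
        while True:
            data = conn.recv(64)
            if not data:
                break
            conn.sendall(data)  # echo the payload straight back

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.socket()
client.connect(server.getsockname())
client.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

N = 1000
t0 = time.perf_counter()
for _ in range(N):
    client.sendall(b"ping")
    client.recv(64)  # block until the echo returns: one full round trip
elapsed = time.perf_counter() - t0
print(f"avg round trip over {N} messages: {elapsed / N * 1000:.3f} ms")
client.close()
```

Even on loopback the per-message average varies between runs, which is why a single measurement is not meaningful at this scale.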
> In any case, a 1ms figure over a tcp connection is ok I believe.

So this is basically the expected behaviour, right?
Do you suggest any other library for ultra-low-latency requirements?
Is it a .NET Stopwatch? How do you get the latency? As @israellot wrote, such a request goes over TCP/IP to Redis, gets parsed and executed, the response comes back and gets parsed, and only then is your SUB handler executed, so 1 ms is not that terrible here, I think.
What I am trying to say is that you can't trust your stopwatch when measuring a single message round trip: even a thread context switch could spoil your results. Averaging your results should converge to a more reliable number, and benchmark frameworks do their best to remove the overhead of the measurement itself. It's like quantum physics: the act of observing alters the result, so you can't measure it precisely.
But the bottom line is: a ~1 ms round trip over a TCP connection is within the expected range.
I have measured it with 1000 messages; across several runs the average varies from 0.9 to 1.4 milliseconds. I got the point, though: that is what is expected over TCP.
Thanks so much for the prompt replies and support!
Getting a single-message (pub/sub) latency of around 1 millisecond when there is no continuous flow of messages in pub/sub. Is this expected behaviour? Please suggest.