At the moment we specify timeouts in microseconds using `Int`. I see a few problems. Microsecond precision seems like clear overkill and a usage overhead; milliseconds or even seconds would be more convenient. Another problem is that the `Int` type carries zero information about what unit we use. So I've given the `DiffTime` type a go here.
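As a rough sketch of how the boundary could look: `DiffTime` (from `Data.Time.Clock`, with picosecond resolution and `Num`/`Fractional` instances) carries the unit in the type, and a small conversion helper recovers the microseconds that APIs like `threadDelay` expect. The helper name `diffTimeToMicroseconds` is hypothetical, not part of `time`:

```haskell
import Data.Time.Clock (DiffTime, secondsToDiffTime, diffTimeToPicoseconds)

-- Hypothetical helper: convert a DiffTime timeout to the microseconds
-- expected by low-level APIs such as threadDelay.
diffTimeToMicroseconds :: DiffTime -> Int
diffTimeToMicroseconds = fromInteger . (`div` 1000000) . diffTimeToPicoseconds

main :: IO ()
main = do
  -- 5 s expressed as DiffTime, converted at the boundary
  print (diffTimeToMicroseconds (secondsToDiffTime 5))
  -- fractional literals work too: 1 ms
  print (diffTimeToMicroseconds 0.001)
```

Callers could then write timeouts as `5` or `0.5` seconds without remembering the unit, and only the boundary code deals with microseconds.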
@robx What's your opinion?