launchdarkly / dotnet-eventsource

Server-sent events (SSE) client implementation for .NET

System.Net.WebException: The operation has timed out. #52

Closed joe-ifit closed 2 years ago

joe-ifit commented 4 years ago

We are seeing the following stack trace in our logs. We are currently using LaunchDarkly.EventSource version 3.2.3.

```
Class: System.Net.WebException
Message: The operation has timed out.
  at System.Net.HttpWebRequest+<RunWithTimeoutWorker>d__241`1[T].MoveNext () <0x101b68a10 + 0x00498> in <8294fc839f2d4b799a08e766e2dfa68e#9da8dd95652572baf7b138c1bdc40608>:0
--- End of stack trace from previous location where exception was thrown ---
  at System.Net.WebResponseStream+<ReadAsync>d__48.MoveNext () <0x101b13b90 + 0x00a2c> in <8294fc839f2d4b799a08e766e2dfa68e#9da8dd95652572baf7b138c1bdc40608>:0
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) <0x1016cc650 + 0x000d3> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) <0x1016cc5a0 + 0x0008b> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) <0x1016cc530 + 0x00053> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter[TResult].GetResult () <0x1017a5c40 + 0x0001b> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.IO.StreamReader+<ReadBufferAsync>d__98.MoveNext () <0x101619770 + 0x0046f> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) <0x1016cc650 + 0x000d3> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) <0x1016cc5a0 + 0x0008b> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) <0x1016cc530 + 0x00053> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter[TResult].GetResult () <0x1017a5c40 + 0x0001b> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.IO.StreamReader+<ReadLineAsyncInternal>d__61.MoveNext () <0x101618060 + 0x001eb> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) <0x1016cc650 + 0x000d3> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) <0x1016cc5a0 + 0x0008b> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) <0x1016cc530 + 0x00053> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at System.Runtime.CompilerServices.TaskAwaiter`1[TResult].GetResult () <0x1016cc910 + 0x0001b> in <3c7b99a36820490fb2cbc5a6fc6b06d8#9da8dd95652572baf7b138c1bdc40608>:0
  at LaunchDarkly.EventSource.EventSourceStreamReader+<ReadLineAsync>d__2.MoveNext () <0x1041d44c0 + 0x00243> in <641c2bc30eae4c1d8d3db4b93ea5a14f#9da8dd95652572baf7b138c1bdc40608>:0
```

eli-darkly commented 4 years ago

Well, without knowing any more about the circumstances, I would say it's possible that it means just what it says: an operation timed out. There are two kinds of timeouts you can have with EventSource, and the duration for each of them is configurable:

  1. Connection timeout: if the server is just not responding to the connection attempt.
  2. Read timeout: if the connection seems to still be alive, but the server has not sent any data in a while.
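
Both of these can be set when building the client's configuration. A minimal sketch, assuming the 3.x `Configuration.Builder` API and a hypothetical stream URL (check the builder method names against the version you're actually running):

```csharp
using System;
using LaunchDarkly.EventSource;

// Hypothetical stream URI for illustration only.
var config = Configuration.Builder(new Uri("https://example.com/stream"))
    .ConnectionTimeout(TimeSpan.FromSeconds(10)) // timeout #1: establishing the connection
    .ReadTimeout(TimeSpan.FromMinutes(5))        // timeout #2: max silence before giving up
    .Build();

var eventSource = new EventSource(config);
```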

The read timeout case is particularly tricky because under some circumstances, a streaming connection (not just for EventSource; for anything) can appear to still be open when really it has died, and there's no way to detect that except by noticing that no data has been received for a long time. So, it's common to add "heartbeat" behavior to the server that's providing the SSE stream: every so often (say 3 minutes) it writes a meaningless data item, or just a ":" comment line, to the stream simply so the client knows it's still alive. Then you would set the client's read timeout to some number greater than that (say 4 minutes).
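
The server side of that heartbeat can be sketched roughly as follows. This is not part of EventSource itself; `output` stands in for whatever response stream your server framework exposes, and the 3-minute interval is just the example value above:

```csharp
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

static class SseHeartbeat
{
    // Periodically writes a ":" comment line, which SSE clients are
    // required to ignore, so the client's read timeout never fires
    // while the connection is genuinely alive.
    public static async Task RunAsync(StreamWriter output, CancellationToken ct)
    {
        while (!ct.IsCancellationRequested)
        {
            await Task.Delay(TimeSpan.FromMinutes(3), ct);
            await output.WriteAsync(":\n\n");
            await output.FlushAsync();
        }
    }
}
```

With a heartbeat like this in place, the client's read timeout only needs to be somewhat longer than the heartbeat interval (e.g. 4 minutes) to distinguish a dead connection from a quiet one.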

In our experience, this is the best way to deal with the kind of network unreliability I mentioned, so EventSource uses a read timeout by default. You can, however, set the read timeout to System.Threading.Timeout.Infinite to disable this.
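
If you do want to disable it, a sketch (again assuming the 3.x builder API, and using the `TimeSpan` form of the infinite-timeout constant):

```csharp
using System;
using System.Threading;
using LaunchDarkly.EventSource;

// Hypothetical stream URI; Timeout.InfiniteTimeSpan is the TimeSpan
// equivalent of Timeout.Infinite.
var config = Configuration.Builder(new Uri("https://example.com/stream"))
    .ReadTimeout(Timeout.InfiniteTimeSpan) // never time out on a silent stream
    .Build();
```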

If you have reason to believe that that's not what's happening, then I would need to know more about your use case: what is generating the stream, how often it produces data, how long your client runs before seeing the error, and anything else you can tell me besides just the stack trace.