datalust / serilog-sinks-seq

A Serilog sink that writes events to the Seq structured log server
https://datalust.co/seq
Apache License 2.0

Missing context information for Event: JSON representation exceeds the body size limit XXX; #202

Closed FynnNikolaus closed 4 months ago

FynnNikolaus commented 1 year ago

I have a question about the "eventBodyLimitBytes" setting of the Serilog Seq sink. Unfortunately, there are places in our code that produce events exceeding the limit, and the entire context of what was actually logged is then lost in retrospect. Is there a practicable way to prevent this other than manually truncating the event in the code?

Thank you and best regards Fynn

liammclennan commented 1 year ago

Hi @FynnNikolaus

You can set eventBodyLimitBytes to a higher value, with the consequence that the events sent to Seq will be larger. Note that Seq also places configurable limits on the size of events.
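For readers unfamiliar with the setting, raising the limit is a configuration-time change; a minimal sketch (the server URL and limit value below are placeholders, and the sink's default is 256 KB):

```csharp
using Serilog;

// Sketch only: raise the per-event body limit from the 256 KB default
// to 512 KB. Events whose JSON representation exceeds this limit are
// dropped by the sink rather than sent to Seq.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Seq(
        "http://localhost:5341",              // placeholder Seq server URL
        eventBodyLimitBytes: 512 * 1024)
    .CreateLogger();
```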

If possible, manually truncating the event before sending is the best solution.

FynnNikolaus commented 1 year ago

Hey @liammclennan, thank you for your quick answer. I think the default value of 256 KB makes sense. If the Seq limit takes effect, the event is no longer displayed in the interface, which is not exactly advantageous. In our code there are 5-7 places where such an error frequently occurs, and I would like to truncate the event at every point where it has occurred in the past.

However, I don't like the fact that the context is lost when the error occurs: you can only find out where the error occurred, not what happened. Such an error can also happen because the user makes a special data query that can no longer be reproduced afterwards. Can the truncation be done in a central location?

I have tried the following approach, which I don't like and which doesn't work (screenshots omitted):

I cannot access the complete event in the TryDestructure() method, so I cannot truncate it at that point. At the point where the error is logged I can of course truncate, but logically I don't know every place where such a problem could potentially occur.

Thanks for your help!
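As an editorial aside (not part of the original exchange): Serilog does offer some central, configuration-time destructuring limits that bound how large a destructured object graph can become, without touching individual call sites. A sketch with illustrative values:

```csharp
using Serilog;

// Sketch: central destructuring limits (all values are illustrative).
// These cap the size of destructured objects globally, which bounds
// event size without editing every logging call site.
Log.Logger = new LoggerConfiguration()
    .Destructure.ToMaximumDepth(5)             // stop recursing past 5 levels
    .Destructure.ToMaximumStringLength(1024)   // truncate long string values
    .Destructure.ToMaximumCollectionCount(32)  // cap captured collection sizes
    .WriteTo.Seq("http://localhost:5341")      // placeholder Seq server URL
    .CreateLogger();
```

Note that these limits apply only when objects are destructured (the `@` operator); they do not rewrite an event that is already oversized for other reasons.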

liammclennan commented 1 year ago

It is probably best to manually build your log data, instead of serializing arbitrarily large objects.
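To illustrate the suggestion above (a sketch; `order` is a hypothetical domain object, not from the thread):

```csharp
using Serilog;

// Risky: the @ operator destructures the entire object graph, which
// can produce an event body that exceeds eventBodyLimitBytes.
Log.Information("Processed {@Order}", order);

// Safer: build the log data manually from known-small properties,
// so the event size is predictable regardless of the object's size.
Log.Information("Processed order {OrderId} with {ItemCount} items",
    order.Id, order.Items.Count);
```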

nblumhardt commented 4 months ago

Thanks again for raising this @FynnNikolaus; we've reviewed this again and think that controlling this at the point the entity is logged will be the better solution here. If you're still running into any issues though or need help figuring out what that might look like, please let us know.