aws / aws-logging-dotnet

.NET Libraries for integrating Amazon CloudWatch Logs with popular .NET logging libraries
Apache License 2.0

Value null at 'logEvents' failed to satisfy constraint: Member must not be null #25

Closed (dietzep closed this issue 5 years ago)

dietzep commented 7 years ago

Occasionally my application encounters an exception which gets logged in the LibraryLog; an example is attached. When this happens, logs stop getting ingested for some time, usually hours, until the application seems to recover and continues delivering logs where it left off. I infer the correlation because log entries appear in CloudWatch Logs with an ingestion timestamp much later than the entry's own timestamp, and the gap begins at the time the exceptions show up in the LibraryLog.

During these periods the internal buffer appears to keep filling, as I also see "The AWS Logger in-memory buffer has reached maximum capacity" entries in the application's log stream. When this happens, the application appears to drop log entries.
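For context, the behavior described above matches a bounded producer/consumer buffer that rejects new events once it is full while a background sender is stalled. The sketch below is purely illustrative of that pattern; the class and member names are invented and are not AWS.Logger.Core's actual internals.

```csharp
using System.Collections.Concurrent;

// Illustrative only: a bounded buffer that drops events when full.
// These names are hypothetical, not AWS.Logger.Core's implementation.
class BoundedLogBuffer
{
    private readonly ConcurrentQueue<string> _events = new ConcurrentQueue<string>();
    private readonly int _capacity;

    public BoundedLogBuffer(int capacity) => _capacity = capacity;

    // If the background sender stalls (e.g. delivery keeps failing),
    // the queue fills and new events start being rejected here; this is
    // the point where a "buffer has reached maximum capacity" style
    // warning would be logged and entries would be lost.
    public bool TryAdd(string logEvent)
    {
        if (_events.Count >= _capacity)
            return false;

        _events.Enqueue(logEvent);
        return true;
    }

    public bool TryTake(out string logEvent) => _events.TryDequeue(out logEvent);
}
```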

It's possible the application is doing something silly and/or preventable, but I don't expect that it should be able to put the library in this state. If I determine that the app is doing something preventable, I will post my findings here.

Thank you!

LibraryLog.txt

dietzep commented 6 years ago

If I can offer any better information or my description is unclear, please advise. I still encounter this scenario every few days, rendering my logs unreliable.

KennethWKZ commented 6 years ago

I'm facing the same issue: if a log event exceeds 256 KB, it hits this error.

snakefoot commented 6 years ago

The issue with large log events has been resolved in AWS.Logger.Core version 1.2.0:

Break up large logging messages into sizes that CloudWatch Logs will accept. This change was made to handle the CloudWatch Logs restriction on event size (256 KB, per https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/cloudwatch_limits_cwl.html). This addresses the following GitHub issue: #40
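The general idea is to split an oversized message into pieces whose UTF-8 encoding each fits under the event-size limit. The sketch below only illustrates that approach and is not AWS.Logger.Core's actual code; the class and method names are made up, and it reserves a small margin for the per-event overhead that CloudWatch Logs counts against the limit.

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Text;

// Hypothetical helper, not part of AWS.Logger.Core.
static class LogEventChunker
{
    // Stay a little under the 256 KB event limit to leave room for
    // the per-event overhead CloudWatch Logs adds.
    private const int MaxEventBytes = 256 * 1024 - 26;

    // Split a message into pieces whose UTF-8 encoding fits within the limit,
    // without cutting through a surrogate pair or combining sequence.
    public static IEnumerable<string> Split(string message, int maxBytes = MaxEventBytes)
    {
        var chunk = new StringBuilder();
        int chunkBytes = 0;

        var elements = StringInfo.GetTextElementEnumerator(message);
        while (elements.MoveNext())
        {
            string element = (string)elements.Current;
            int elementBytes = Encoding.UTF8.GetByteCount(element);

            // Flush the current chunk before it would exceed the limit.
            if (chunkBytes + elementBytes > maxBytes && chunk.Length > 0)
            {
                yield return chunk.ToString();
                chunk.Clear();
                chunkBytes = 0;
            }

            chunk.Append(element);
            chunkBytes += elementBytes;
        }

        if (chunk.Length > 0)
            yield return chunk.ToString();
    }
}
```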

dietzep commented 5 years ago

Following up (better late than never?) to say that the issue I reported was in fact triggered by log messages larger than 256 KB, as snakefoot suggests. I ended up working around it by truncating messages upstream of AWS.Logger.Core and will now test the updates from issue #40. Thank you!
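For anyone who lands here before upgrading, a minimal sketch of that upstream-truncation workaround might look like the following. The helper name and the cap are assumptions (a conservative limit a bit under 256 KB), and the truncation is applied before the message is handed to the logger.

```csharp
using System.Text;

// Hypothetical helper, not part of AWS.Logger.Core: cap messages below
// the 256 KB CloudWatch Logs event limit before they reach the logger.
static class SafeLogMessage
{
    private const int MaxBytes = 250 * 1024;

    public static string Truncate(string message)
    {
        byte[] utf8 = Encoding.UTF8.GetBytes(message);
        if (utf8.Length <= MaxBytes)
            return message;

        // Back off to a UTF-8 sequence boundary so a multi-byte character
        // is not cut in half (continuation bytes look like 10xxxxxx).
        int cut = MaxBytes;
        while (cut > 0 && (utf8[cut] & 0xC0) == 0x80)
            cut--;

        return Encoding.UTF8.GetString(utf8, 0, cut) + " [truncated]";
    }
}
```

Usage is just a wrapper at the call site, e.g. `logger.LogInformation(SafeLogMessage.Truncate(bigPayload));`, so nothing oversized ever reaches the in-memory buffer.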