OlekMotyka opened this issue 1 year ago
I can confirm that this is a bug... The issue occurs in the AmazonS3Sink
class, where the rolling interval is not set from the options... Stay tuned, the new version will be available soon.
Should be fixed with https://github.com/serilog-contrib/Serilog.Sinks.AmazonS3/commit/82161caecfba5b53f65718079eb6bec5ba4f7f57. Sadly, I can't test at the moment as I don't get my data to AWS in any test case somehow (No exceptions thrown in SelfLog).
Please try with version 1.4.0.
Or just hit me up and I'll re-open this issue if it doesn't work ;)
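For reference, upgrading can be done via the .NET CLI (assuming the NuGet package ID matches the repository name; adjust to your project setup):

dotnet add package Serilog.Sinks.AmazonS3 --version 1.4.0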
I am trying to configure the sink so that a single log file is created for each day. Either I haven't figured out how to configure it correctly, or there is still a bug here.
This is my setup...
.WriteTo.AmazonS3(
path: "logs.web..txt",
bucketName: bucketName,
endpoint: amazonRegionEndoint,
awsAccessKeyId: accessKeyId,
awsSecretAccessKey: secretAccessKey,
restrictedToMinimumLevel: LogEventLevel.Verbose,
outputTemplate: null,
formatProvider: null,
levelSwitch: null,
rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day,
encoding: null,
failureCallback: e => { Log.Error(e, $"Sink error: '{e.Message}'."); },
bucketPath: $"{envirotment}/Logs/{now.Year}/{now.Month:00}/web",
batchSizeLimit: 100,
batchingPeriod: TimeSpan.FromSeconds(15),
eagerlyEmitFirstEvent: null,
queueSizeLimit: 10000)
These are the files I have in S3...
I would like to have a single file for each day...
Reviewing the code, I have seen that in S3 it is not possible to append new events to a file that has already been created. Maybe that is the problem when combining a rolling interval of one day with a batching period of 15 seconds and a queueSizeLimit of 10,000.
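To illustrate that constraint: standard S3 offers no append operation, only whole-object writes, so every flushed batch has to become its own object (or overwrite an existing one in full). A minimal sketch with the AWS SDK for .NET; the bucket and key names are made up:

// There is no "append" call in the S3 API; each upload writes a complete object.
using Amazon.S3;
using Amazon.S3.Model;

var client = new AmazonS3Client();
await client.PutObjectAsync(new PutObjectRequest
{
    BucketName = "my-log-bucket",               // hypothetical bucket
    Key = "Logs/2023/05/web20230501_001.txt",   // each flushed batch ends up under a new key like this
    ContentBody = "the batched log events, rendered as text"
});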
I'm having the same setup as @genoher and updating to 1.4.0 does not solve the issue 😬
.WriteTo.AmazonS3(debugfile, "logs", Amazon.RegionEndpoint.USEast1, "xxxx", "xxxx",
restrictedToMinimumLevel: Serilog.Events.LogEventLevel.Debug,
outputTemplate: "{Timestamp:dd-MM-yy HH:mm:ss} [{Level:u3}] <{SourceContext}> {Message}{NewLine}{Exception}",
rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day, bucketPath: s3_path,
batchSizeLimit: 10000, eagerlyEmitFirstEvent: true, batchingPeriod: TimeSpan.FromMinutes(15));
Me too, same issue.
Any update on this ticket? I cannot find a way to avoid the generation of multiple files even though the rolling interval is set to Day. I am using version 1.5.1. The only way I have found to avoid this is to use a batchingPeriod of 24 hours, but that is not ideal. With the following configuration, a new file is generated every 15 seconds:
serilogConfig.WriteTo.AmazonS3(
    "log.txt",
    RegionEndpoint.EUWest2,
    LogEventLevel.Verbose,
    "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff} {Level:u3}] {Message:lj}{NewLine}",
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day,
    bucketPath: "logs/",
    batchSizeLimit: 10000,
    batchingPeriod: TimeSpan.FromSeconds(15));
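For reference, the 24-hour batching workaround mentioned above would look roughly like this (a sketch only, not a recommendation: it keeps up to a day of events in memory before uploading, so anything not yet flushed is lost if the process stops):

serilogConfig.WriteTo.AmazonS3(
    "log.txt",
    RegionEndpoint.EUWest2,
    LogEventLevel.Verbose,
    "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff} {Level:u3}] {Message:lj}{NewLine}",
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day,
    bucketPath: "logs/",
    batchSizeLimit: 1000000,                 // must be large enough to hold a full day of events
    batchingPeriod: TimeSpan.FromHours(24)); // flush once per day so only one object is written per day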
Hi, is this still a thing? I want to use this package but this bug is preventing me from doing so.
Yes, although I haven't yet figured out why this happens.
Hello, lately I have been trying to implement a configuration that pushes logs to an S3 bucket every full hour, with each hour's logs in a single file. The log is generated by a Windows service, so it runs non-stop.
The code I use:
batchingPeriod and batchSizeLimit are set so that every event from the hour ends up in the same batch and, ultimately, in the same file.
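For context, a minimal sketch of the kind of configuration described above, based on the overload shown earlier in this thread (the file name, bucket, region, limit values, and the Hour value on the sink's RollingInterval enum are my assumptions, not the original snippet):

.WriteTo.AmazonS3(
    path: "service-log.txt",                 // hypothetical file name
    bucketName: "my-service-logs",           // hypothetical bucket
    endpoint: Amazon.RegionEndpoint.EUCentral1,
    awsAccessKeyId: "xxxx",                  // credentials redacted as in the examples above
    awsSecretAccessKey: "xxxx",
    restrictedToMinimumLevel: LogEventLevel.Verbose,
    outputTemplate: null,
    formatProvider: null,
    levelSwitch: null,
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Hour, // one file per hour (assuming an Hour value exists)
    encoding: null,
    failureCallback: e => { Log.Error(e, $"Sink error: '{e.Message}'."); },
    bucketPath: "logs/service",
    batchSizeLimit: 100000,                  // large enough to hold a full hour of events
    batchingPeriod: TimeSpan.FromHours(1),   // flush once per hour so all events land in the same batch
    eagerlyEmitFirstEvent: false,            // avoid an early flush creating an extra file at startup
    queueSizeLimit: 100000)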
The issue is rollingInterval: it produces the correct file names, but otherwise behaves as if it were set to Infinite. Log files are pushed in the middle of the hour (e.g. 11:30), and another file appears with the suffix _001.
Debugging showed that:
Can I use the rolling interval and the batching configuration together like this, or is this a bug?
Thanks in advance.