serilog-contrib / Serilog.Sinks.AmazonS3

Serilog.Sinks.AmazonS3 is a library for saving logging information from Serilog to Amazon S3. The idea is to upload log files to Amazon S3 so they can later be evaluated with Amazon EMR services.
MIT License

Rolling interval is not respected in periodic batching #52

Open OlekMotyka opened 1 year ago

OlekMotyka commented 1 year ago

Hello, lately I have been trying to implement a configuration that pushes logs to an S3 bucket every full hour, with all of an hour's logs in one file. The logs are generated by a Windows service, so it runs non-stop.

The code that I use:

logger.WriteTo.AmazonS3(
    restrictedToMinimumLevel: Serilog.Events.LogEventLevel.Debug,
    path: $"{fileName}.log",
    bucketName: "BucketName",
    endpoint: RegionEndpoint.EUCentral1,
    bucketPath: "BucketPath",
    awsAccessKeyId: "AccessKey",
    awsSecretAccessKey: "SecretKey",
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Hour,
    eagerlyEmitFirstEvent: false,
    formatter: new Serilog.Formatting.Json.JsonFormatter(),
    batchingPeriod: TimeSpan.FromHours(2),
    batchSizeLimit: 10000);

batchingPeriod and batchSizeLimit are used to make sure that every event is logged in the same batch and is finally saved in the same file.

The issue here is rollingInterval, which produces correct file names but acts as if it were set to Infinite. Log files are pushed in the middle of the hour (e.g. 11:30), and another file appears with the suffix _001.

Debugging showed that:

logger._logEventSinks[0]._sink._batchedLogEventSink.amazonS3Options.RollingInterval = Infinite

Can I use the rolling interval and batching configuration in that way, or is this a bug?

Thanks in advance.

SeppPenner commented 1 year ago

I can confirm that this is a bug... The issue occurs in the AmazonS3Sink class, where the rolling interval is not set from the options... Stay tuned for the new version; it will be available soon.

SeppPenner commented 1 year ago

Should be fixed with https://github.com/serilog-contrib/Serilog.Sinks.AmazonS3/commit/82161caecfba5b53f65718079eb6bec5ba4f7f57. Sadly, I can't test at the moment, as somehow my data doesn't reach AWS in any test case (no exceptions are thrown in SelfLog).

SeppPenner commented 1 year ago

Please try with version 1.4.0.

SeppPenner commented 1 year ago

Or just hit me up and I'll re-open this issue if it doesn't work ;)

genoher commented 1 year ago

I am trying to configure it so that one log file is created for each day. Either I haven't figured out how to configure it, or there is still a bug here.

This is my setup...

.WriteTo.AmazonS3(
    path: "logs.web..txt",
    bucketName: bucketName,
    endpoint: amazonRegionEndoint,
    awsAccessKeyId: accessKeyId,
    awsSecretAccessKey: secretAccessKey,
    restrictedToMinimumLevel: LogEventLevel.Verbose,
    outputTemplate: null,
    formatProvider: null,
    levelSwitch: null,
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day,
    encoding: null,
    failureCallback: e => { Log.Error(e, $"Sink error: '{e.Message}'."); },
    bucketPath: $"{envirotment}/Logs/{now.Year}/{now.Month:00}/web",
    batchSizeLimit: 100,
    batchingPeriod: TimeSpan.FromSeconds(15),
    eagerlyEmitFirstEvent: null,
    queueSizeLimit: 10000);

These are the files I have in S3...

[image: screenshot of the S3 bucket listing showing multiple log files per day]

I would like to have a single file for each day...

Reviewing the code, I have seen that in S3 it is not possible to append new events to an object that has already been created. Maybe this is the problem... with a rollingInterval of Day, a batchingPeriod of 15 seconds, and a queueSizeLimit of 10,000.
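To make the append limitation concrete: since an existing S3 object cannot be appended to, every batch flush within the same rolling period has to be written under a fresh key, which is where the `_001`-style suffixes come from. A minimal sketch of that kind of key rollover (the `S3KeyRoller` class and `NextKey` method are hypothetical names for illustration, not the sink's actual code):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper illustrating why suffixed files appear: S3 objects
// are immutable, so a batch that flushes while the rolling period's key
// is already taken must go to a new, suffixed key.
static class S3KeyRoller
{
    public static string NextKey(
        string baseName,
        string extension,
        DateTime now,
        ICollection<string> existingKeys)
    {
        // Hour-based rolling stamp, e.g. "log2023051211".
        var stem = $"{baseName}{now:yyyyMMddHH}";
        var candidate = $"{stem}{extension}";
        var sequence = 0;

        // Each additional upload within the same hour gets the next free suffix.
        while (existingKeys.Contains(candidate))
        {
            sequence++;
            candidate = $"{stem}_{sequence:000}{extension}";
        }

        return candidate;
    }
}
```

With an hourly rolling interval and a 15-second batching period, this logic would produce a new suffixed object roughly every 15 seconds, matching the behavior reported above.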

mngobbi commented 1 year ago

I'm having the same setup as @genoher and updating to 1.4.0 does not solve the issue 😬

.WriteTo.AmazonS3(debugfile, "logs", Amazon.RegionEndpoint.USEast1, "xxxx", "xxxx",
    restrictedToMinimumLevel: Serilog.Events.LogEventLevel.Debug,
    outputTemplate: "{Timestamp:dd-MM-yy HH:mm:ss} [{Level:u3}] <{SourceContext}> {Message}{NewLine}{Exception}",
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day, bucketPath: s3_path,
    batchSizeLimit: 10000, eagerlyEmitFirstEvent: true, batchingPeriod: TimeSpan.FromMinutes(15));

HikaruChiu commented 1 year ago

Me too, same issue.

fcobo commented 8 months ago

Any update on this ticket? I cannot find a way to avoid the generation of multiple files, even though the rolling interval is set to Day. I am using version 1.5.1. The only way I figured out to avoid this is to have a batchingPeriod of 24 hours, but that's not ideal. With the following configuration, it generates a file every 15 seconds:

serilogConfig.WriteTo.AmazonS3(
    "log.txt",
    RegionEndpoint.EUWest2,
    LogEventLevel.Verbose,
    "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff} {Level:u3}] {Message:lj}{NewLine}",
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day,
    bucketPath: "logs/",
    batchSizeLimit: 10000,
    batchingPeriod: TimeSpan.FromSeconds(15));
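For reference, the 24-hour-batch workaround mentioned above would look roughly like this. This is only a sketch: it assumes the process stays alive for the full day, and that batchSizeLimit is raised high enough that the size limit never forces an early flush (the large value below is an illustrative assumption, not a recommendation).

```csharp
// Workaround sketch: align the batching window with the daily rolling
// interval so all of a day's events flush in a single upload. Note the
// memory cost: a full day's events stay buffered in the sink's queue
// until the batch flushes, and a crash loses the unflushed day.
serilogConfig.WriteTo.AmazonS3(
    "log.txt",
    RegionEndpoint.EUWest2,
    LogEventLevel.Verbose,
    "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff} {Level:u3}] {Message:lj}{NewLine}",
    rollingInterval: Serilog.Sinks.AmazonS3.RollingInterval.Day,
    bucketPath: "logs/",
    batchSizeLimit: 1000000,
    batchingPeriod: TimeSpan.FromHours(24));
```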

dodomood commented 6 months ago

Hi, is this still a thing? I want to use this package but this bug is preventing me from doing so.

SeppPenner commented 6 months ago

> Hi, is this still a thing? I want to use this package but this bug is preventing me from doing so.

Yes, although I haven't yet figured out why this happens.