megastef opened 7 years ago
Should the plugin ls a bucket and read all files... or simply check periodically?
I think a periodic check is a no-go. A bucket can hold thousands, tens of thousands, or hundreds of thousands of objects, and ls will not be fast enough.
Perhaps we just need an example configuration and documentation showing how to combine s3cmd with other Logagent plugins:
s3cmd sync --exclude '*.zip' --include 'dir2/*.log' . s3://s3tools-demo/demo/
Documentation links for plugins: https://sematext.com/docs/logagent/#plugins
Helpful S3 download module: https://www.npmjs.com/package/s3-download-stream
Logagent should have a plugin that 1) reads S3 data as soon as new files are created and 2) imports historical data.