Closed fribse closed 4 years ago
Hello, there are two types of 'logs' that are available:

1. Information/debug output, which goes to stdout/console. This is only valuable if you're encountering problems.
2. JSON DMARC result output, which goes to the path specified by the `json_output_file` config setting. This is the good data to ingest into Logstash and Elasticsearch, as it gives you the details you'll need for reporting on DKIM and SPF success rates, etc.

So if you set up a volume such as /dmarc-log, configure `json_output_file=/dmarc-log/dmarc.log`, and mount that same volume into both the logstash container and the dmarc2logstash container, then you can configure filebeat's prospector path to monitor that /dmarc-log directory.
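For reference, the shared-volume approach described above might be sketched like this in docker-compose (a minimal illustration only — the image names, service names, and `dmarc-log` volume name are assumptions, not taken from this thread):

```yaml
# Hypothetical docker-compose fragment: both containers mount the same
# named volume, so filebeat can read the JSON file dmarc2logstash writes.
version: "3"
services:
  dmarc2logstash:
    image: example/dmarc2logstash        # placeholder image name
    volumes:
      - dmarc-log:/dmarc-log             # json_output_file=/dmarc-log/dmarc.log
  filebeat:
    image: docker.elastic.co/beats/filebeat:6.5.1
    volumes:
      - dmarc-log:/dmarc-log:ro          # filebeat input path: /dmarc-log/*.log
volumes:
  dmarc-log:
```

The named volume (rather than a bind mount) keeps the log path identical in both containers, which is what lets a single `json_output_file` setting line up with the Filebeat input path.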
Yes, that was the idea. Now I just need to get a working filebeat config; so far all my Filebeat instances have been installed on Windows :-) (for all the terrible apps with lousy logfiles)
Hmm, darn, that I don't understand. I've created this config. In docker-compose:

```yaml
  dmarcfilebeat:
    image: docker.elastic.co/beats/filebeat:6.5.1
    container_name: dmarcfilebeat
    volumes:
      - ./dmarc/dmarclogs:/usr/share/filebeat/data
      - ./dmarc/filebeat.docker.yml:/usr/share/filebeat/filebeat.yml:ro
      - ./dmarc/logs:/logs
```

And in filebeat.docker.yml:

```yaml
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.registry_file: /usr/share/filebeat/filebeat_registry

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /usr/share/filebeat/data/
    json.keys_under_root: true
    json.add_error_key: true
    fields_under_root: true
    fields:
      source_type: json-logs
      logtype: dmarc

output.logstash:
  hosts: ["logstash:5000"]

logging.level: info
logging.to_files: true
logging.files:
  path: /logs
  name: filebeat
  keepfiles: 7
  permissions: 0644
```
That follows the recommendations in the docs, and I've added the logging section myself. But all I see in the log is 'exit 1'.
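One thing worth checking (a guess, not a confirmed diagnosis of the 'exit 1'): Filebeat's `paths` entries are file globs, not bare directories, so a `paths` entry ending in `/data/` matches no files. A corrected input fragment might look like this, assuming the DMARC JSON files end in `.log`:

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /usr/share/filebeat/data/*.log   # glob to files, not a bare directory
```

Running `filebeat test config -c /usr/share/filebeat/filebeat.yml` inside the container will also report config parse or validation errors directly, which is usually more informative than a bare exit code.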
Hi there, wow, this looks very interesting. I'm using docker-compose here, and I've placed your program in a container, and I'm currently trying to get a filebeat in another container to send the data to logstash (in a third container). Logstash, Kibana, and Elasticsearch are already set up and working in the docker-compose, so it should be feasible to do it. I just need to get a handle on running filebeat in a container, I haven't tried that before :-)
Great work, thank you for the effort!