fribse closed this issue 2 years ago
Hi @jertel, I can see that it now just says 'Preserving email message', which is much better than before, thank you for the update! I really love this small project, simple and easy to use. Is there a setting that, instead of preserving the email, would just delete it? Nobody checks the mailbox manually here, so the messages will just build up over time, I guess.
I pushed a change that supports a new flag (delete_failures). By default it's not going to delete failures, but if you set that new param to 1 then it will delete them. Let me know if it works out for you.
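A minimal sketch of how that could look in the configuration (the key name delete_failures comes from the change itself; where it goes depends on how you deploy dmarc2logstash, so treat the placement as an assumption and check the README):
delete_failures: 1   # assumption: 1 = delete messages that fail parsing, 0 (default) = preserve them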
Sounds great! As soon as I get Elastic up and running again on 7.16.2, I'll check it out :-)
Hi @jertel, so I'm finally getting somewhere with this setup... It has taken way too long, and in the process I accidentally deleted all my data 😢 Now I've moved everything to 7.16.3, but I can't get the dmarc filebeat working again. I tried fetching the config from the README, but Filebeat complains about filebeat.prospectors, which is no longer supported, and I'm not allowed to set the 'registry' file either.
The config from the README:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - "/opt/dmarc2logstash/*.log"
  json.keys_under_root: true
  json.add_error_key: true
  fields_under_root: true
  fields:
    source_type: json-logs
output.logstash:
  hosts:
    - logstash:5000
  index: dmarc
  timeout: 15
logging.level: info
And my old config:
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false
#filebeat.registry_file: filebeat_registry.json
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /dmarclogs/*.log
  json.keys_under_root: true
  json.add_error_key: true
  fields:
    source_type: json-logs
    logtype: dmarc
output.logstash:
  hosts: ["logstash:5000"]
#logging.level: debug
#logging.to_files: true
#logging.files:
#  path: /logs
#  name: filebeat
#  keepfiles: 7
#  permissions: 0644
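Filebeat 7.x renamed filebeat.prospectors to filebeat.inputs and replaced filebeat.registry_file with filebeat.registry.path, which would explain both complaints. A sketch of the README config updated accordingly (paths as above; the registry path is an assumption, adjust to a writable location):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - "/opt/dmarc2logstash/*.log"
  json.keys_under_root: true
  json.add_error_key: true
  fields_under_root: true
  fields:
    source_type: json-logs
filebeat.registry.path: /usr/share/filebeat/data/registry
output.logstash:
  hosts:
    - logstash:5000
  index: dmarc
  timeout: 15
logging.level: info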
When I start the containers I see this first:
WARNING: The STACK_VERSION variable is not set. Defaulting to a blank string.
WARNING: The ELASTIC_PASSWORD variable is not set. Defaulting to a blank string.
WARNING: The KIBANA_PASSWORD variable is not set. Defaulting to a blank string.
WARNING: The MEM_LIMIT variable is not set. Defaulting to a blank string.
WARNING: The CLUSTER_NAME variable is not set. Defaulting to a blank string.
WARNING: The LICENSE variable is not set. Defaulting to a blank string.
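Those warnings just mean docker compose found no values for variables referenced in the compose file; putting a .env file next to docker-compose.yml silences them, e.g. (every value below is a placeholder):
STACK_VERSION=7.16.3
ELASTIC_PASSWORD=changeme
KIBANA_PASSWORD=changeme
MEM_LIMIT=1073741824
CLUSTER_NAME=docker-cluster
LICENSE=basic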
Then in the logs I see this:
dmarcfilebeat | 2022-02-23T10:46:47.245Z INFO cfgfile/reload.go:164 Config reloader started
dmarcfilebeat | 2022-02-23T10:46:47.246Z INFO cfgfile/reload.go:224 Loading of config files completed.
dmarcfilebeat | 2022-02-23T10:46:47.246Z INFO [input.harvester] log/harvester.go:309 Harvester started for file. {"input_id": "b996f7a8-5774-4eba-a56a-76f7f31b0359", "source": "/dmarclogs/dmarc.log", "state_id": "native::940178948-64769", "finished": false, "os_id": "940178948-64769", "old_source": "/dmarclogs/dmarc.log", "old_finished": true, "old_os_id": "940178948-64769", "harvester_id": "09881e4f-9dd2-484b-ba49-104dc9ebf964"}
dmarcfilebeat | 2022-02-23T10:46:47.479Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
dmarcfilebeat | 2022-02-23T10:46:47.479Z INFO [publisher] pipeline/retry.go:223 done
dmarcfilebeat | 2022-02-23T10:46:47.478Z INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://logstash:5000))
dmarcfilebeat | 2022-02-23T10:46:49.010Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(async(tcp://logstash:5000)): dial tcp 192.168.112.8:5000: connect: connection refused
dmarcfilebeat | 2022-02-23T10:46:49.010Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(async(tcp://logstash:5000)) with 1 reconnect attempt(s)
dmarcfilebeat | 2022-02-23T10:46:49.010Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
dmarcfilebeat | 2022-02-23T10:46:49.010Z INFO [publisher] pipeline/retry.go:223 done
dmarcfilebeat | 2022-02-23T10:46:52.255Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(async(tcp://logstash:5000)): dial tcp 192.168.112.8:5000: connect: connection refused
dmarcfilebeat | 2022-02-23T10:46:52.256Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(async(tcp://logstash:5000)) with 2 reconnect attempt(s)
dmarcfilebeat | 2022-02-23T10:46:52.256Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
dmarcfilebeat | 2022-02-23T10:46:52.256Z INFO [publisher] pipeline/retry.go:223 done
dmarcfilebeat | 2022-02-23T10:47:00.028Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(async(tcp://logstash:5000)): dial tcp 192.168.112.8:5000: connect: connection refused
dmarcfilebeat | 2022-02-23T10:47:00.028Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(async(tcp://logstash:5000)) with 3 reconnect attempt(s)
dmarcfilebeat | 2022-02-23T10:47:00.028Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
dmarcfilebeat | 2022-02-23T10:47:00.028Z INFO [publisher] pipeline/retry.go:223 done
dmarcfilebeat | 2022-02-23T10:47:15.324Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(async(tcp://logstash:5000)): dial tcp 192.168.112.8:5000: connect: connection refused
dmarcfilebeat | 2022-02-23T10:47:15.324Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(async(tcp://logstash:5000)) with 4 reconnect attempt(s)
dmarcfilebeat | 2022-02-23T10:47:15.324Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
dmarcfilebeat | 2022-02-23T10:47:15.324Z INFO [publisher] pipeline/retry.go:223 done
dmarcfilebeat | 2022-02-23T10:47:17.256Z INFO [monitoring] log/log.go:184 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cgroup":{"cpu":{"cfs":{"period":{"us":100000}},"id":"/"},"cpuacct":{"id":"/","total":{"ns":1351452324}},"memory":{"id":"/","mem":{"limit":{"bytes":9223372036854771712},"usage":{"bytes":74125312}}}},"cpu":{"system":{"ticks":370,"time":{"ms":372}},"total":{"ticks":1260,"time":{"ms":1272},"value":1260},"user":{"ticks":890,"time":{"ms":900}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":10},"info":{"ephemeral_id":"826b4e40-b55a-44b9-9a22-cdb5be650649","uptime":{"ms":30119},"version":"7.16.3"},"memstats":{"gc_next":39791120,"memory_alloc":32220744,"memory_sys":52642824,"memory_total":112168800,"rss":127725568},"runtime":{"goroutines":48}},"filebeat":{"events":{"active":4117,"added":4119,"done":2},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0},"reloads":1,"scans":1},"output":{"events":{"active":0},"type":"logstash"},"pipeline":{"clients":1,"events":{"active":4117,"filtered":2,"published":4116,"retry":8192,"total":4119},"queue":{"max_events":4096}}},"registrar":{"states":{"current":1,"update":2},"writes":{"success":2,"total":2}},"system":{"cpu":{"cores":16},"load":{"1":0.19,"15":0.22,"5":0.19,"norm":{"1":0.0119,"15":0.0138,"5":0.0119}}}}}}
dmarcfilebeat | 2022-02-23T10:47:43.869Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(async(tcp://logstash:5000)): dial tcp 192.168.112.8:5000: connect: connection refused
dmarcfilebeat | 2022-02-23T10:47:43.869Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(async(tcp://logstash:5000)) with 5 reconnect attempt(s)
dmarcfilebeat | 2022-02-23T10:47:43.870Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
dmarcfilebeat | 2022-02-23T10:47:43.870Z INFO [publisher] pipeline/retry.go:223 done
dmarcfilebeat | 2022-02-23T10:47:47.255Z INFO [monitoring] log/log.go:184 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cgroup":{"cpuacct":{"total":{"ns":21448364}},"memory":{"mem":{"usage":{"bytes":2170880}}}},"cpu":{"system":{"ticks":380,"time":{"ms":11}},"total":{"ticks":1280,"time":{"ms":20},"value":1280},"user":{"ticks":900,"time":{"ms":9}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":10},"info":{"ephemeral_id":"826b4e40-b55a-44b9-9a22-cdb5be650649","uptime":{"ms":60119},"version":"7.16.3"},"memstats":{"gc_next":39791120,"memory_alloc":32981792,"memory_total":112929848,"rss":130039808},"runtime":{"goroutines":48}},"filebeat":{"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"active":0}},"pipeline":{"clients":1,"events":{"active":4117,"retry":2048}}},"registrar":{"states":{"current":1}},"system":{"load":{"1":0.17,"15":0.22,"5":0.19,"norm":{"1":0.0106,"15":0.0138,"5":0.0119}}}}}}
So, is that running correctly or not?
I just noticed it says 'connection refused', which is really odd. It does resolve to the correct IP on the Docker network.
Hmm, so it looks like Logstash isn't opening port 5000, even though I have this in the pipeline config file:
input {
  beats {
    port => 5000
    ecs_compatibility => disabled
  }
  syslog {
    add_field => [fields][logtype] = "syslog"
  }
}
OK, so the Logstash pipeline is failing; that's why. Nothing to do with the dmarc filebeat :-)
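What gave it away was the Logstash container's own log, e.g. (assuming the compose service is named logstash):
docker-compose logs logstash | grep -iE "error|exception"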
OK, it was a f...... typo in the old config that the new version doesn't ignore. It works now :-)
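For reference, the add_field line in the pipeline config above is not valid Logstash syntax; add_field takes a hash, so the fix was presumably something like this (the actual corrected line isn't shown in the thread):
syslog {
  add_field => { "[fields][logtype]" => "syslog" }
}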
Sadly, it just stops if a mail in the box is not kosher, and then you have to go and manually dig that specific mail out of the mailbox. It should handle that automatically.