Closed: tmaiorca closed this issue 9 months ago
Yes, parsedmarc checks the inbox every minute. As for applying a new config, rebuilding the stack should be enough. The already-processed messages will probably have been moved to the Processed folder of your mailbox; to re-process them, just move them back into the Inbox.
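For example, a typical rebuild looks like this (a sketch, assuming the stock parsedmarc-dockerized layout where config.ini is mounted from data/conf, run from the repository checkout):

docker-compose down
docker-compose up -d

parsedmarc reads config.ini when it starts, so recreating the containers is enough to pick up the new settings.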
If you add
silent = False
debug = True
to your data/conf/parsedmarc/config.ini
you can observe exactly what parsedmarc is doing. Example:
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1087:Found 0 messages in INBOX
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1095:Processing 0 messages
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1087:Found 3 messages in INBOX
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1095:Processing 3 messages
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1099:Processing message 1 of 3: UID 499
parsedmarc-dockerized-parsedmarc-1 | INFO:__init__.py:805:Parsing mail from "[REDACTED] DMARC Report" <noreply-dmarc@[REDACTED]>
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1099:Processing message 2 of 3: UID 500
parsedmarc-dockerized-parsedmarc-1 | INFO:__init__.py:805:Parsing mail from "[REDACTED] DMARC Report" <noreply-dmarc@[REDACTED]>
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1099:Processing message 3 of 3: UID 501
parsedmarc-dockerized-parsedmarc-1 | INFO:__init__.py:805:Parsing mail from "[REDACTED] DMARC Report" <noreply-dmarc@[REDACTED]>
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1152:Moving aggregate report messages from INBOX to Processed/Aggregate
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1159:Moving message 1 of 3: UID 499
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1159:Moving message 2 of 3: UID 500
parsedmarc-dockerized-parsedmarc-1 | DEBUG:__init__.py:1159:Moving message 3 of 3: UID 501
parsedmarc-dockerized-parsedmarc-1 |
parsedmarc-dockerized-parsedmarc-1 | {
parsedmarc-dockerized-parsedmarc-1 | "aggregate_reports": [],
parsedmarc-dockerized-parsedmarc-1 | "forensic_reports": []
parsedmarc-dockerized-parsedmarc-1 | }
parsedmarc-dockerized-parsedmarc-1 |
parsedmarc-dockerized-parsedmarc-1 | {
parsedmarc-dockerized-parsedmarc-1 | "aggregate_reports": [],
parsedmarc-dockerized-parsedmarc-1 | "forensic_reports": []
parsedmarc-dockerized-parsedmarc-1 | }
parsedmarc-dockerized-parsedmarc-1 |
parsedmarc-dockerized-parsedmarc-1 | {
parsedmarc-dockerized-parsedmarc-1 | "aggregate_reports": [],
parsedmarc-dockerized-parsedmarc-1 | "forensic_reports": []
parsedmarc-dockerized-parsedmarc-1 | }
parsedmarc-dockerized-parsedmarc-1 |
parsedmarc-dockerized-parsedmarc-1 | INFO:elastic.py:295:Saving aggregate report to Elasticsearch
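Coming back to the silent/debug flags above: a minimal sketch of where they usually sit in config.ini (parsedmarc reads them from the [general] section; the save_* lines are illustrative and may already be present in the dockerized default config):

[general]
save_aggregate = True
save_forensic = True
silent = False
debug = True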
I think @hugalafutro did a great job explaining everything, so I'll be closing this one for now as part of an overdue-issues cleanup. Let me know if you disagree or if help is still needed.
I was able to run docker-compose up with some sample data I placed in a test folder of an Office 365 mailbox. I'd now like to re-run this with production data.
Because I'm a little inexperienced with Docker, I have a few questions: after running docker-compose up -d, is parsedmarc running as a service that will periodically pull down messages from my O365 mailbox via MSGraph?
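One way to check this yourself (assuming the compose service is named parsedmarc, as the container name in the log output above suggests):

docker-compose ps                  # the parsedmarc container should stay in the "Up" state
docker-compose logs -f parsedmarc  # with debug = True you should see the periodic "Found N messages in INBOX" lines

If the container keeps running and those log lines repeat roughly every minute, parsedmarc is operating as a long-running service rather than a one-shot job.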