domainaware / parsedmarc

A Python package and CLI for parsing aggregate and forensic DMARC reports
https://domainaware.github.io/parsedmarc/
Apache License 2.0

Parse an Individual Report #89

Closed: mda1125 closed this issue 5 years ago

mda1125 commented 5 years ago

Assuming the email import option isn't working at the moment, how do you manually get the XML or .gz files into Kibana, given that it only lets you upload a JSON object?

Maybe there is a way to SSH into the box and put some XML or .GZ files into a folder and have the service parse and import them?

michaeldavie commented 5 years ago

I have a similar requirement. I already have the reports in XML files, and I would like to flatten them into CSV.

DataSquid commented 5 years ago

Same. I have 140k RUA attachments already pulled from emails, and I just need to import them.

seanthegeek commented 5 years ago

@mda1125, @DataSquid

You can already do this.

Create a parsedmarc config file containing just the elasticsearch section (or Splunk) and no IMAP section, then cd to the directory containing the report files (xml, zip, gzip, or even eml or msg) and run

parsedmarc -c /etc/parsedmarc.ini *
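
For reference, a minimal /etc/parsedmarc.ini for Elasticsearch-only output might look like this (a sketch; the host and ssl values are placeholders for your own instance):

[general]
save_aggregate = True
save_forensic = True

[elasticsearch]
hosts = 127.0.0.1:9200
ssl = False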

@michaeldavie If you just need CSV output, you can skip the config file and just run

parsedmarc -o output.csv *

You can find more details by running parsedmarc --help:

usage: parsedmarc [-h] [-c CONFIG_FILE] [--strip-attachment-payloads]
                  [-o OUTPUT] [-n NAMESERVERS [NAMESERVERS ...]]
                  [-t DNS_TIMEOUT] [-s] [--debug] [--log-file LOG_FILE] [-v]
                  [file_path [file_path ...]]

Parses DMARC reports

positional arguments:
  file_path             one or more paths to aggregate or forensic report
                        files or emails

optional arguments:
  -h, --help            show this help message and exit
  -c CONFIG_FILE, --config-file CONFIG_FILE
                        A path to a configuration file (--silent implied)
  --strip-attachment-payloads
                        remove attachment payloads from forensic report output
  -o OUTPUT, --output OUTPUT
                        write output files to the given directory
  -n NAMESERVERS [NAMESERVERS ...], --nameservers NAMESERVERS [NAMESERVERS ...]
                        nameservers to query (default is Cloudflare's
                        nameservers)
  -t DNS_TIMEOUT, --dns_timeout DNS_TIMEOUT
                        number of seconds to wait for an answer from DNS
                        (default: 6.0)
  -s, --silent          only print errors and warnings
  --debug               print debugging information
  --log-file LOG_FILE   output logging to a file
  -v, --version         show program's version number and exit

Let me know if you have any other questions.

DataSquid commented 5 years ago

> Create a parsedmarc config file with just the elasticsearch config part (or Splunk) without the IMAP part, cd to the directory containing the report files, and run parsedmarc -c /etc/parsedmarc.ini *

Alas, when you do this with a directory containing >140k files,

bash: /usr/local/bin/parsedmarc: Argument list too long

is the result.

seanthegeek commented 5 years ago

Hmm. I didn't realize you could hit the shell's argument list limit (ARG_MAX) that way; it's not an argparse limit.

This probably isn't the most efficient workaround, but try using xargs. This will run parsedmarc separately on each file in the directory:

ls *.xml *.zip *.gz *.eml *.msg | xargs -I {} parsedmarc -c /etc/parsedmarc.ini {}

In the future, once the backlog is smaller, the usual wildcard should work, although I don't know exactly where the limit is.
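
Alternatively, find avoids the shell's argument limit entirely, since the file names never go through glob expansion (a sketch, assuming GNU find and xargs):

find . -maxdepth 1 -type f -print0 | xargs -0 -I {} parsedmarc -c /etc/parsedmarc.ini {}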

DataSquid commented 5 years ago

ls with those globs hits the same limit, since the shell expands the wildcards before ls ever runs. The following seems to be working through my local backlog now:

ls | xargs -I {} parsedmarc -c /etc/parsedmarc.ini {}
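
If the per-file startup cost gets painful at this volume, batching should also work, since parsedmarc accepts multiple file paths per invocation (an untested sketch, assuming file names without spaces):

ls | xargs -n 100 parsedmarc -c /etc/parsedmarc.ini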

Thanks.

mda1125 commented 5 years ago

Saved attachments from 20 reports
Put the attachments in a directory
cd to the directory
parsedmarc -c /etc/parsedmarc.ini *

Dashboard is empty...

sudo nano /etc/parsedmarc.ini

Put the following lines in the file. Replace the placeholders as necessary.

[general]
# Save aggregate and forensic reports to Elasticsearch
save_aggregate = True
save_forensic = True

[elasticsearch]
# Send data to Elasticsearch, which listens on port 9200.
hosts = 127.0.0.1:9200
ssl = False
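
With that in place, one way to verify that reports actually reached Elasticsearch is to list the DMARC indices directly (assuming the default index names, which begin with dmarc_aggregate and dmarc_forensic):

curl -s 'http://127.0.0.1:9200/_cat/indices/dmarc*?v'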

seanthegeek commented 5 years ago

@mda1125 If they are old reports, you may need to adjust the time window at the top of the dashboard. The default view only shows report data from the last 7 days.