blueteambram opened this issue 3 years ago
Have you been able to solve it?
No, I have not
The solution that enabled me to get it up and running is below.
The ingest-into-Elastic script currently only works with .tar.gz files; however, everything needed to ingest a single JSON file is already in there, since that is what is inside the compressed files. I haven't broken that functionality out into a separate script yet.
A workaround that worked for me was `tar -czvf mordor-data.tar.gz json-files-here`.
(I've also got a fork that only has .tar.gz files; I'll work with the authors to figure out whether they want me to merge it.)
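If you'd rather script that conversion than run tar by hand, here's a rough Python sketch (paths are illustrative; it assumes each dataset zip just contains JSON files):

```python
import tarfile
import zipfile
from pathlib import Path

# Unzip every dataset .zip in a folder, then repackage the JSON files
# into one .tar.gz that Mordor-Elastic.py can ingest. Paths are
# illustrative only.
src = Path("datasets/atomic/windows/discovery/host")
extracted = Path("extracted")
extracted.mkdir(exist_ok=True)

for z in src.glob("*.zip"):
    with zipfile.ZipFile(z) as zf:
        zf.extractall(extracted)

with tarfile.open("mordor-data.tar.gz", "w:gz") as tf:
    for j in extracted.glob("*.json"):
        tf.add(str(j), arcname=j.name)
```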
To address the second part @blueteambram mentioned about indices: I also had a yellow status but was still able to use the index. Once Elasticsearch has parsed the winlogbeat-mordor index and it has events, you need to create a Kibana index pattern for it as well before using the search functionality in the Discover tab.
Once you have created the index pattern, select it in the Discover tab to start hunting! Also remember these datasets were created years ago, so set the timeframe accordingly.
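If you'd rather script this than click through Kibana, here's a rough sketch using Kibana's saved objects API (the Kibana URL and time field are assumptions for a default setup; untested):

```python
import requests

# Create a Kibana index pattern for the ingested data. The kbn-xsrf
# header is required by Kibana's API; the URL and time field assume a
# default single-node setup.
resp = requests.post(
    "http://localhost:5601/api/saved_objects/index-pattern",
    headers={"kbn-xsrf": "true"},
    json={"attributes": {"title": "winlogbeat-mordor*", "timeFieldName": "@timestamp"}},
)
print(resp.status_code, resp.json())
```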
This may not be a complete solution, but it is what ended up working for me, so hopefully it can provide some guidance for anyone else experiencing this issue.
I can put it in winlogbeat-mordor, but the events are not shown in the dashboards.
Hello @tim-scythe, would you mind sharing more details about using the files as .tar.gz with Elasticsearch? Please and thank you. Sorry all for the delay. Thank you for taking the time to test and share more details.
> I can put it in winlogbeat-mordor, but the events are not shown in the dashboards.
Long delay between my replies, but an additional item to make sure the data shows up is to expand the time window. Some of this data was generated years ago, so you will need to expand your search window to the past three to four years to accommodate that.
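For example, with an 8.x elasticsearch Python client, a quick sanity check that widens the window to the last five years (index name from this thread; the client version and URL are assumptions):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# These datasets carry timestamps from years ago, so Kibana's default
# "last 15 minutes" window shows nothing; widen the range explicitly.
resp = es.search(
    index="winlogbeat-mordor",
    query={"range": {"@timestamp": {"gte": "now-5y"}}},
)
print(resp["hits"]["total"])
```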
> Hello @tim-scythe, would you mind sharing more details about using the files as .tar.gz with Elasticsearch? Please and thank you. Sorry all for the delay. Thank you for taking the time to test and share more details.
Hey @Cyb3rWard0g! When sending data to Elasticsearch, I was using the https://github.com/OTRF/Security-Datasets/blob/master/scripts/data-shippers/Mordor-Elastic.py script and noticed that it wasn't able to parse the zip files that contain the data, such as those under https://github.com/OTRF/Security-Datasets/tree/master/datasets/atomic/windows/discovery/host
The code below is from Mordor-Elastic.py, lines 63-66:
```python
if args.recursive:
    paths = [ p for path in args.inputs for p in path.glob("**/*.tar.gz") if p.is_file() ]
else:
    paths = [ path for path in args.inputs if path.is_file() ]
```
and lines 81-86 only support tar file types:
```python
tf = tarfile.open(path)
for m in tf.getmembers():
    if m.isfile():
        print(f"- Importing member file {m.name}...")
        logfile = f"{path}/{m.name}"
        mf = tf.extractfile(m)
```
I'll be honest: I tinkered for a few hours trying to get zip file extraction working in a similar manner so the script would support both formats, but converting the zip files to tar ended up being the simpler approach that worked.
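For anyone curious, here is a rough sketch of what a zip branch could look like, mirroring the tarfile loop above (untested, and not something that exists in Mordor-Elastic.py today):

```python
import zipfile

# Hypothetical zip equivalent of the tarfile loop above; each
# Security-Datasets zip holds one or more JSON files.
zf = zipfile.ZipFile(path)
for m in zf.infolist():
    if not m.is_dir():
        print(f"- Importing member file {m.filename}...")
        logfile = f"{path}/{m.filename}"
        mf = zf.open(m)  # file-like object, analogous to tf.extractfile(m)
```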
I am confused about why this script only works with .tar.gz files ("The ingest-into-Elastic script currently only works with .tar.gz files") when the "Ship Data to HELK" page (https://securitydatasets.com/consume/helk.html) clearly shows it working with JSON files:
```bash
mordor/scripts/data-shippers/Mordor-Elastic.py --url http://localhost:9200 inputs empire_dcsync_dcerpc_drsuapi_DsGetNCChanges_2020-09-21185829.json
```
Did something change???
Hey @l0gm0nk3y69, I was looking at ways to mass-ingest the data. At the time, since each JSON file was compressed into a zip file, it required unzipping the file first and then using the ingestion script as you mentioned. The ingestion script worked on tar files and could pull out each of the JSON files within them; however, it did not work the same way with zip-compressed files, and that was the primary way a lot of the data was stored in this repository.
I am stuck importing dataset files into HELK using the Mordor-Elastic.py script and get the following error: `TypeError: Positional arguments can't be used with Elasticsearch API methods. Instead only use keyword arguments.`
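If it helps anyone: that error typically comes from newer versions of the elasticsearch Python client (8.x), which dropped support for positional arguments in API methods, so an older script that calls them positionally breaks. A minimal sketch of the keyword-argument form (index name and document are illustrative, not taken from the script):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

doc = {"message": "example event"}  # illustrative document

# 8.x clients require keyword arguments for every API parameter:
# es.index("winlogbeat-mordor", doc)               # raises the TypeError above
es.index(index="winlogbeat-mordor", document=doc)  # works
```

Pinning an older client (e.g. `pip install "elasticsearch<8"`) may also sidestep the error without touching the script.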
I went through the walkthrough for installing HELK, and when I try to ingest the JSON files using the data-shipper script, I get an error saying that it is unable to open the JSON file. I was able to get it to work by instead passing the script a .tar.gz dataset, and it shows as complete, but when I go to the Discover tab in Kibana there are no logs. Also, when I look at the Elasticsearch index management tab, it shows winlogbeat-mordor and the number of events parsed, but its health status is yellow.
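On the yellow status specifically: on a single-node cluster that usually just means the index's replica shards can't be allocated, and it doesn't block searching. If you want it green, here is a rough sketch of dropping replicas for the index (index name from this thread; the URL is an assumption):

```python
import requests

# Single-node clusters can't allocate replica shards, which leaves the
# index health yellow; setting replicas to 0 turns it green.
resp = requests.put(
    "http://localhost:9200/winlogbeat-mordor/_settings",
    json={"index": {"number_of_replicas": 0}},
)
print(resp.json())
```

The empty Discover tab is more likely the time-window issue mentioned earlier in the thread.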