infobyte / faraday

Open Source Vulnerability Management Platform
https://www.faradaysec.com
GNU General Public License v3.0

Cannot import big AWS Inspector report #495

Closed Yevhen-bc closed 4 months ago

Yevhen-bc commented 4 months ago

BUG - nothing happens while importing an AWS Inspector JSON file

Faraday 5.3.0, installed from faraday-server_amd64.deb

cat /home/faraday/.faraday/config/server.ini

[faraday_server]
port = 5985
bind_address = 0.0.0.0
websocket_port = 9000
debug = false
session_timeout = 12
api_token_expiration = 43200
secret_key = XXXXXXXXXXXXXXXXX
agent_registration_secret = XXXXXXXXXXXXXXXXX

[logger]
use_rfc5424_formatter = false

[storage]
path = /home/faraday/.faraday/storage

[database]
connection_string = postgresql+psycopg2://faraday_postgresql:uwT0LbhzafRIrRBjfHoLwT7lu@localhost/faraday

Component Name

GUI AWS INSPECTOR upload

Steps to reproduce

Upload a big AWS Inspector file; the processing queue gets stuck on "...processing".

The file is uploaded and present in /home/faraday/.faraday/uploaded_reports/, but there are 0 records in the PostgreSQL tables even after 1-2 hours.

The 2 vCPUs sit at 0-10% load.

Expected results

The report is uploaded and imported successfully.

$ cat /etc/lsb-release
$ cat /etc/os-release
PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
NAME="Debian GNU/Linux"
VERSION_ID="12"
VERSION="12 (bookworm)"
VERSION_CODENAME=bookworm

ezk06eer commented 4 months ago

@Yevhen-bc hi, nice to hear from you. We will need some information about the case: is your deployment exposed somehow (nginx, traefik, etc.)?

Also, please run:

faraday-plugins detect-report yourfile.json 
faraday-plugins process-report yourfile.json --summary

to see if the report can be processed or not.

Keep in mind that processing big files requires a lot of RAM, so allocate at least 16 GB in your VM/server/K8s.

Yevhen-bc commented 4 months ago

By the way, how do I clear the processing queue?

faraday-plugins detect-report /home/faraday/.faraday/uploaded_reports/ZWHOO4YCH2MI_aws-inspector_p1.json

Faraday Plugin: AWSInspector_Json

And faraday-plugins process-report yourfile.json --summary generates a long output that ends with the following lines:


      "850783bd14d8d2f042f7fc419d3d99352ff4a1c0"
    ]
}

==>> is your deployment exposed somehow (nginx, traefik, etc.)? Yes, I'm testing Faraday both in AWS and in my homelab.

Yevhen-bc commented 4 months ago

Update: great news! I just stopped the Debian version of Faraday and set it up via pip in a virtual environment.

python3 -m venv faraday_venv
source faraday_venv/bin/activate
git clone ....
pip install .

Repeat the last two commands for faraday-cli and faraday-plugins as well; a fuller sketch follows below.
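
For reference, the whole from-source sequence might look roughly like the following. This is only a sketch; the faraday-cli and faraday_plugins repository URLs are my assumption, so adjust the clone URLs to whatever you actually used.

# create and activate an isolated environment
python3 -m venv faraday_venv
source faraday_venv/bin/activate

# install the server from a local clone (assumed URL)
git clone https://github.com/infobyte/faraday.git
cd faraday && pip install . && cd ..

# repeat the clone + install for the CLI and the plugins (assumed URLs)
git clone https://github.com/infobyte/faraday-cli.git
cd faraday-cli && pip install . && cd ..

git clone https://github.com/infobyte/faraday_plugins.git
cd faraday_plugins && pip install . && cd ..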

I removed the old workspace in the UI, created a new workspace, and chose the same file for upload.

Now I have 3 files in the processing queue:

(screenshot: the processing queue showing three files)

but vulnerabilities are also being imported: over 50k vulnerabilities at the moment, with a file size of ~120 MB.

Yevhen-bc commented 4 months ago

I'm back on the Debian 12 installation. Here is what I have right now:

  1. A report with 1 vulnerability was successfully uploaded and processed - Done.
  2. The 2 GB file is stuck in the processing stage with no records in the DB.
  3. The 120 MB file was processed - Done!

ezk06eer commented 4 months ago

@Yevhen-bc, everything seems fine. However, it's important to note that heavy usage of the community version is not recommended for production purposes. One potential solution could be to split your multiline JSON into separate files. Please remember that the community version is not intended for commercial use.

If you're available, we can schedule a Proof of Concept (PoC) for tomorrow to demonstrate how to handle large files, although you may need to consider a different approach for this task.

The approach required may vary depending on the volume of data you are working with. It seems like you may have over 4 million results to process, so you might need to adjust your cluster accordingly.

Please suggest a convenient time to discuss your requirements and explore Faraday further. Let's schedule a demo session. Get a Demo

Yevhen-bc commented 4 months ago

Thank you. With the 2 GB file I get a timeout. Looks like I need to split it into 5-10 pieces.
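
As a rough illustration of that splitting step (a sketch, not Faraday's own tooling): assuming the export is a single JSON object with a top-level findings array, and assuming jq and GNU split are available, something like the following would break it into chunks of 5000 findings per file. Note that jq still loads the whole 2 GB report into memory, so run it on a machine with enough RAM, and check your report's actual structure first; all file names here are examples.

# sketch only: assumes a top-level "findings" array; file names are examples
jq -c '.findings as $all
       | range(0; ($all | length); 5000) as $i
       | {findings: $all[$i:$i+5000]}' big-inspector-report.json \
  | split -l 1 -d --additional-suffix=.json - inspector_part_

Each resulting inspector_part_NN.json could then be uploaded on its own, keeping every individual import far below the size that stalled here.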