DefectDojo / django-DefectDojo

DevSecOps, ASPM, Vulnerability Management. All on one platform.
https://defectdojo.com
BSD 3-Clause "New" or "Revised" License

PostgreSQL error: string is too long for tsvector #8299

Open lischetzke opened 1 year ago

lischetzke commented 1 year ago

Bug description
PostgreSQL's tsvector type has a hard limit of 1 MB (1,048,575 bytes). Imports of big scans can run into this limit, and the import then fails with an HTTP 500. This happened while importing a Tenable Nessus scan of 7+ MB; the first occurrence was with a 60 MB file.
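
For reference, the limit can be reproduced directly in PostgreSQL, independent of DefectDojo (a minimal sketch; the container name, user and database are assumptions that depend on your deployment):

  # Build a tsvector from 50,000 distinct lexemes (well over the 1 MB cap)
  # to trigger the same error the import runs into:
  docker exec -it defectdojo-postgres psql -U defectdojo -d defectdojo -c \
    "SELECT to_tsvector('simple', string_agg(md5(i::text), ' ')) FROM generate_series(1, 50000) AS i;"
  # ERROR:  string is too long for tsvector (... bytes, max 1048575 bytes)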

Steps to reproduce

  1. Perform Tenable Nessus scan
  2. Add a product in DefectDojo
  3. Start importing scan result using .nessus file
  4. Wait for the import to fail with "500 Internal Server Error"

Expected behavior
A clean, successful import of all findings, no matter how big the Tenable Nessus scan file is.

Logs
Logs are shortened; more detailed logs can be uploaded if needed. docker-postgres.log

Screenshots
[screenshot: Product List page in DefectDojo, Mozilla Firefox, 2023-06-23]

Additional context
Regarding the Tenable Nessus scan configuration: the scan is based on "Basic Network Scan" and scans 70+ systems with all plugins, for all vulnerabilities. The error started to occur depending on the amount of plugin output (even with only 5 hosts, 119 endpoints and ~200 findings total). The import was done with default settings. The issue seems to be the same as this issue for miniflux v2, and the fix/workaround described in a PR there might be a possible solution.
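
A fix along those lines would truncate the text before it is fed to to_tsvector, so the resulting vector stays under the cap. A sketch of the idea in plain SQL (the table and column names here are hypothetical, not DefectDojo's actual schema):

  # Index only a prefix of the text; a sufficiently small cutoff (tune it
  # for your data) keeps the generated tsvector below the 1 MB limit:
  docker exec -it defectdojo-postgres psql -U defectdojo -d defectdojo -c \
    "SELECT to_tsvector('simple', left(plugin_output, 500000)) FROM finding;"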

lischetzke commented 1 year ago

UWSGI Traceback: docker-uwsgi.log

lischetzke commented 1 year ago

The current workaround is to use mysql instead of postgres. For Docker that means using the mysql-redis profile and environment file.

Building on Windows:

  docker-compose --profile mysql-rabbitmq --profile mysql-redis --env-file .\docker\environments\mysql-redis.env build

Running on Windows:

  docker-compose --profile mysql-redis --env-file .\docker\environments\mysql-redis.env up --no-deps

Ran into the problem that the "max-allowed-packet" size was configured too small for mysql. Increased it by editing docker-compose.yml and adding --max-allowed-packet=536870912 (512 MB) to the command array of the mysql container.

docker-compose.yml snippet:

  mysql:
    image: mysql:5.7.42@sha256:f57eef421000aaf8332a91ab0b6c96b3c83ed2a981c29e6528b21ce10197cd16
    profiles: 
      - mysql-rabbitmq
      - mysql-redis
    environment:
      MYSQL_RANDOM_ROOT_PASSWORD: 'yes'
      MYSQL_DATABASE: ${DD_DATABASE_NAME}
      MYSQL_USER: ${DD_DATABASE_USER}
      MYSQL_PASSWORD: ${DD_DATABASE_PASSWORD}
    # --max-allowed-packet raised to 512 MB so large imports fit in a single packet
    command: ['mysqld', '--character-set-server=utf8mb4', '--collation-server=utf8mb4_unicode_ci', '--max-allowed-packet=536870912']
    volumes:
      - defectdojo_data:/var/lib/mysql
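
After rebuilding and restarting, it may be worth confirming that the setting took effect (a sketch; assumes the mysql service from the snippet above is running):

  # Substitute your DB user; the server should report 536870912 (512 MB):
  docker-compose --profile mysql-redis exec mysql mysql -u <DD_DATABASE_USER> -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"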

manuel-sommer commented 10 months ago

Maybe you can make a PR, @lischetzke?

nv-pipo commented 9 months ago

We are running into this bug too.

The workaround (using the mysql-redis profile) works, but it feels considerably slower than postgres-redis. We currently have 4164 engagements, 121174 findings, 30046 endpoints and 989 products, all synced from Qualys (VMDR / WAS).

Any ideas on how to use postgres and not hit this tsvector issue? e.g.: