OpenCTI-Platform / opencti

Open Cyber Threat Intelligence Platform
https://opencti.io

External reference not (re)created when using bundle and OpenCTIStix2.put_attribute_in_extension #7217

Closed: julienloizelet closed this issue 4 months ago

julienloizelet commented 6 months ago

Description

Environment

  1. OS: Docker stack on Ubuntu 22.04
  2. OpenCTI version: OpenCTI 6.1.1 (same bug with 6.0.9)
  3. OpenCTI client: Python
  4. Other environment details:

Here is the docker-compose file I'm using for debugging this.

I'm using the Hygiene connector as an example, but this is not a Hygiene-related bug: the behavior should be the same for all connectors that use OpenCTIStix2.put_attribute_in_extension to add an external reference to an IPv4 observable.

docker-compose.yml

```
version: '3'
services:
  redis:
    image: redis:7.2.4
    restart: always
    volumes:
      - redisdata:/data
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.4
    volumes:
      - esdata:/usr/share/elasticsearch/data
    environment:
      # Comment-out the line below for a cluster of multiple nodes
      - discovery.type=single-node
      # Uncomment the line below for a cluster of multiple nodes
      # - cluster.name=docker-cluster
      - xpack.ml.enabled=false
      - xpack.security.enabled=false
      - thread_pool.search.queue_size=5000
      - logger.org.elasticsearch.discovery="ERROR"
      - "ES_JAVA_OPTS=-Xms${ELASTIC_MEMORY_SIZE} -Xmx${ELASTIC_MEMORY_SIZE}"
    restart: always
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536
  minio:
    image: minio/minio:RELEASE.2024-01-16T16-07-38Z
    volumes:
      - s3data:/data
    ports:
      - "9000:9000"
    environment:
      MINIO_ROOT_USER: ${MINIO_ROOT_USER}
      MINIO_ROOT_PASSWORD: ${MINIO_ROOT_PASSWORD}
    command: server /data
    restart: always
  rabbitmq:
    image: rabbitmq:3.13-management
    environment:
      - RABBITMQ_DEFAULT_USER=${RABBITMQ_DEFAULT_USER}
      - RABBITMQ_DEFAULT_PASS=${RABBITMQ_DEFAULT_PASS}
      - RABBITMQ_NODENAME=rabbit01@localhost
    volumes:
      - amqpdata:/var/lib/rabbitmq
    restart: always
  opencti:
    image: opencti/platform:6.1.1
    environment:
      - NODE_OPTIONS=--max-old-space-size=8096
      - APP__PORT=8080
      - APP__BASE_URL=${OPENCTI_BASE_URL}
      - APP__ADMIN__EMAIL=${OPENCTI_ADMIN_EMAIL}
      - APP__ADMIN__PASSWORD=${OPENCTI_ADMIN_PASSWORD}
      - APP__ADMIN__TOKEN=${OPENCTI_ADMIN_TOKEN}
      - APP__APP_LOGS__LOGS_LEVEL=error
      - REDIS__HOSTNAME=redis
      - REDIS__PORT=6379
      - ELASTICSEARCH__URL=http://elasticsearch:9200
      - MINIO__ENDPOINT=minio
      - MINIO__PORT=9000
      - MINIO__USE_SSL=false
      - MINIO__ACCESS_KEY=${MINIO_ROOT_USER}
      - MINIO__SECRET_KEY=${MINIO_ROOT_PASSWORD}
      - RABBITMQ__HOSTNAME=rabbitmq
      - RABBITMQ__PORT=5672
      - RABBITMQ__PORT_MANAGEMENT=15672
      - RABBITMQ__MANAGEMENT_SSL=false
      - RABBITMQ__USERNAME=${RABBITMQ_DEFAULT_USER}
      - RABBITMQ__PASSWORD=${RABBITMQ_DEFAULT_PASS}
      - SMTP__HOSTNAME=${SMTP_HOSTNAME}
      - SMTP__PORT=25
      - PROVIDERS__LOCAL__STRATEGY=LocalStrategy
    ports:
      - "8080:8080"
    depends_on:
      - redis
      - elasticsearch
      - minio
      - rabbitmq
    restart: always
  worker:
    image: opencti/worker:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - WORKER_LOG_LEVEL=info
    depends_on:
      - opencti
    deploy:
      mode: replicated
      replicas: 3
    restart: always
  connector-export-file-stix:
    image: opencti/connector-export-file-stix:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - CONNECTOR_ID=${CONNECTOR_EXPORT_FILE_STIX_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_EXPORT_FILE
      - CONNECTOR_NAME=ExportFileStix2
      - CONNECTOR_SCOPE=application/json
      - CONNECTOR_LOG_LEVEL=info
    restart: always
    depends_on:
      - opencti
  connector-export-file-csv:
    image: opencti/connector-export-file-csv:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - CONNECTOR_ID=${CONNECTOR_EXPORT_FILE_CSV_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_EXPORT_FILE
      - CONNECTOR_NAME=ExportFileCsv
      - CONNECTOR_SCOPE=text/csv
      - CONNECTOR_LOG_LEVEL=info
    restart: always
    depends_on:
      - opencti
  connector-export-file-txt:
    image: opencti/connector-export-file-txt:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - CONNECTOR_ID=${CONNECTOR_EXPORT_FILE_TXT_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_EXPORT_FILE
      - CONNECTOR_NAME=ExportFileTxt
      - CONNECTOR_SCOPE=text/plain
      - CONNECTOR_LOG_LEVEL=info
    restart: always
    depends_on:
      - opencti
  connector-import-file-stix:
    image: opencti/connector-import-file-stix:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - CONNECTOR_ID=${CONNECTOR_IMPORT_FILE_STIX_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_IMPORT_FILE
      - CONNECTOR_NAME=ImportFileStix
      - CONNECTOR_VALIDATE_BEFORE_IMPORT=true # Validate any bundle before import
      - CONNECTOR_SCOPE=application/json,text/xml
      - CONNECTOR_AUTO=true # Enable/disable auto-import of file
      - CONNECTOR_LOG_LEVEL=info
    restart: always
    depends_on:
      - opencti
  connector-import-document:
    image: opencti/connector-import-document:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - CONNECTOR_ID=${CONNECTOR_IMPORT_DOCUMENT_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_IMPORT_FILE
      - CONNECTOR_NAME=ImportDocument
      - CONNECTOR_VALIDATE_BEFORE_IMPORT=true # Validate any bundle before import
      - CONNECTOR_SCOPE=application/pdf,text/plain,text/html
      - CONNECTOR_AUTO=true # Enable/disable auto-import of file
      - CONNECTOR_ONLY_CONTEXTUAL=false # Only extract data related to an entity (a report, a threat actor, etc.)
      - CONNECTOR_CONFIDENCE_LEVEL=15 # From 0 (Unknown) to 100 (Fully trusted)
      - CONNECTOR_LOG_LEVEL=info
      - IMPORT_DOCUMENT_CREATE_INDICATOR=true
    restart: always
    depends_on:
      - opencti
  connector-hygiene:
    image: opencti/connector-hygiene:6.1.1
    environment:
      - OPENCTI_URL=http://opencti:8080
      - OPENCTI_TOKEN=${OPENCTI_ADMIN_TOKEN}
      - CONNECTOR_ID=d08e788b-d21d-4314-bcaa-e1eb5119c392
      - CONNECTOR_NAME=Hygiene
      - CONNECTOR_SCOPE=IPv4-Addr,IPv6-Addr,Domain-Name,StixFile,Artifact
      - CONNECTOR_AUTO=false
      - CONNECTOR_CONFIDENCE_LEVEL=15 # From 0 (Unknown) to 100 (Fully trusted)
      - CONNECTOR_LOG_LEVEL=debug
      - HYGIENE_WARNINGLISTS_SLOW_SEARCH=true # Enable warning lists slow search mode
      - HYGIENE_ENRICH_SUBDOMAINS=true # Enrich subdomains with hygiene_parent label if the parents are found in warninglists
    restart: always
    depends_on:
      - opencti
volumes:
  esdata:
  s3data:
  redisdata:
  amqpdata:
```

Reproducible Steps

Steps to create the smallest reproducible scenario:

  1. Launch the docker stack with docker-compose up -d --build
  2. Add an observable with the IPv4 value 103.11.223.1 (a pycti sketch for creating it programmatically follows this list)
  3. Enrich the observable using the Hygiene connector
  4. An external reference should have been added to the observable: delete it

(Screenshot omitted: choose "Delete", not "Remove from this object".)

  5. Repeat steps 3 and 4 two more times
  6. From the 4th enrichment onward, the external reference is no longer created
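
For reference, the observable from step 2 can also be created programmatically with the pycti client. This is a minimal sketch based on my recollection of the pycti examples; the URL, the token placeholder, and the simple_observable_* parameter names are assumptions to check against your pycti version. The enrichment and deletion (steps 3 and 4) were done from the UI.

```python
from pycti import OpenCTIApiClient

# Assumed local endpoint and admin token of the docker stack; adjust to your environment.
api_client = OpenCTIApiClient("http://localhost:8080", "<OPENCTI_ADMIN_TOKEN>")

# Step 2: create the IPv4 observable used in the reproduction.
observable = api_client.stix_cyber_observable.create(
    simple_observable_key="IPv4-Addr.value",
    simple_observable_value="103.11.223.1",
)
print(observable["id"])
```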

As mentioned above, I'm only using the Hygiene connector as an example.

I reproduced this behavior in development every time I used this kind of code snippet:

# Imports as typically exposed by pycti (assumption: adjust to your connector's existing imports)
from pycti import OpenCTIStix2, STIX_EXT_OCTI_SCO

# `data` is the parameter of _process_message(self, data: Dict) in the connector
stix_objects = data["stix_objects"]
stix_entity = data["stix_entity"]

OpenCTIStix2.put_attribute_in_extension(
    stix_entity,
    STIX_EXT_OCTI_SCO,
    "external_references",
    {
        "source_name": "some name",
        "url": "https://some-uri",
        "description": "some description",
    },
    True,  # multi-valued attribute: append to the existing list
)

serialized_bundle = self.helper.stix2_create_bundle(stix_objects)
self.helper.send_stix2_bundle(serialized_bundle)
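
For reference, here is a rough sketch (my own illustration, not taken from the connector code) of the shape I expect the observable to have after the call above, assuming put_attribute_in_extension nests the attribute under the OpenCTI SCO extension identified by STIX_EXT_OCTI_SCO; the id and surrounding fields are placeholders:

```python
from pycti import STIX_EXT_OCTI_SCO  # assumed to be pycti's usual export, as in the snippet above

# Illustrative only: approximate shape of stix_entity after put_attribute_in_extension.
# The observable id is a placeholder.
enriched_entity_shape = {
    "type": "ipv4-addr",
    "id": "ipv4-addr--00000000-0000-0000-0000-000000000000",
    "value": "103.11.223.1",
    "extensions": {
        STIX_EXT_OCTI_SCO: {
            "external_references": [
                {
                    "source_name": "some name",
                    "url": "https://some-uri",
                    "description": "some description",
                }
            ]
        }
    },
}
```

The reported bug is that, after the manual deletion in step 4, sending a bundle carrying this extension no longer re-creates the external reference on the platform side.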

Here are some connectors with this kind of code:

Expected Output

The external reference should be re-created every time we re-enrich the observable.

Actual Output

The external reference is created only for the first 3 enrichments.

Additional information

Appears to be worker-related

My tests show that this bug seems to be linked to the worker container.

It works 3 times because my docker-compose.yml file defines 3 replicas for the worker service.

If we set only 1 replica, it works only once.

If we run docker-compose restart worker, it works again (once or three times, depending on the number of replicas).

I ran docker logs docker_worker_1 and here is the output:

{"timestamp": "2024-05-16T08:36:51.000840Z", "level": "INFO", "name": "worker", "message": "Processing a new message, launching a thread...", "taskName": null, "attributes": {"tag": 1}}
{"timestamp": "2024-05-16T08:36:51.005066Z", "level": "INFO", "name": "api", "message": "Listing Labels with filters", "taskName": null, "attributes": {"filters": "{\"mode\": \"and\", \"filters\": [{\"key\": \"value\", \"values\": [\"hygiene\"]}], \"filterGroups\": []}"}}
{"timestamp": "2024-05-16T08:36:51.012405Z", "level": "INFO", "name": "api", "message": "Creating External Reference", "taskName": null, "attributes": {"source_name": "misp-warninglist"}}
{"timestamp": "2024-05-16T08:36:51.056165Z", "level": "INFO", "name": "api", "message": "Creating Stix-Cyber-Observable with indicator", "taskName": null, "attributes": {"type": "IPv4-Addr", "create_indicator": false}}
{"timestamp": "2024-05-16T08:36:51.217262Z", "level": "INFO", "name": "api", "message": "Report expectation", "taskName": null, "attributes": {"work_id": "work_d08e788b-d21d-4314-bcaa-e1eb5119c392_2024-05-16T08:36:50.685Z"}}
{"timestamp": "2024-05-16T08:36:51.253723Z", "level": "INFO", "name": "worker", "message": "Message processed, thread terminated", "taskName": null}
{"timestamp": "2024-05-16T08:36:51.253843Z", "level": "INFO", "name": "worker", "message": "Message acknowledged", "taskName": null, "attributes": {"tag": 1}}
{"timestamp": "2024-05-16T08:37:54.039584Z", "level": "INFO", "name": "worker", "message": "Processing a new message, launching a thread...", "taskName": null, "attributes": {"tag": 2}}
{"timestamp": "2024-05-16T08:37:54.040032Z", "level": "INFO", "name": "api", "message": "Creating Stix-Cyber-Observable with indicator", "taskName": null, "attributes": {"type": "IPv4-Addr", "create_indicator": false}}
{"timestamp": "2024-05-16T08:37:55.553525Z", "level": "INFO", "name": "worker", "message": "Message reprocess", "taskName": null, "attributes": {"tag": 2, "count": 1}}
{"timestamp": "2024-05-16T08:37:55.554748Z", "level": "INFO", "name": "api", "message": "Creating Stix-Cyber-Observable with indicator", "taskName": null, "attributes": {"type": "IPv4-Addr", "create_indicator": false}}
{"timestamp": "2024-05-16T08:37:57.081772Z", "level": "INFO", "name": "worker", "message": "Message reprocess", "taskName": null, "attributes": {"tag": 2, "count": 2}}
{"timestamp": "2024-05-16T08:37:57.082074Z", "level": "INFO", "name": "api", "message": "Creating Stix-Cyber-Observable with indicator", "taskName": null, "attributes": {"type": "IPv4-Addr", "create_indicator": false}}
{"timestamp": "2024-05-16T08:37:57.113968Z", "level": "INFO", "name": "api", "message": "Report expectation", "taskName": null, "attributes": {"work_id": "work_d08e788b-d21d-4314-bcaa-e1eb5119c392_2024-05-16T08:37:53.699Z"}}
{"timestamp": "2024-05-16T08:37:57.210385Z", "level": "INFO", "name": "worker", "message": "Message processed, thread terminated", "taskName": null}
{"timestamp": "2024-05-16T08:37:57.210560Z", "level": "INFO", "name": "worker", "message": "Message acknowledged", "taskName": null, "attributes": {"tag": 2}}

Not only external references

I've confirmed the same behavior for labels: they are not re-created (after a number of attempts that depends on the number of worker replicas).

For labels, the code looks like:

OpenCTIStix2.put_attribute_in_extension(
    stix_entity,
    STIX_EXT_OCTI_SCO,
    "labels",
    "some-value",
    True,
)

Screenshots (optional)

I recorded a screencast of the bug here:

1) With 3 workers https://app.screencastify.com/v3/watch/DHARyxDk2Gh7k2KAocuj

2) Shorter video with 1 worker: https://app.screencastify.com/v3/watch/iguN0kdr0dt11snuaqaD

Megafredo commented 6 months ago

Hello @julienloizelet, we were able to reproduce this bug with another connector (URLScan). We are not sure yet whether the issue lies in the connectors, but we will work on it. Thank you for your feedback.

helene-nguyen commented 4 months ago

@julienloizelet @misje a PR will be opened today to solve the issue. Thank you for your feedback.