OpenCTI-Platform / opencti

Open Cyber Threat Intelligence Platform
https://opencti.io

Cannot start service connector-mitre during docker installation #619

Closed tsitsiflora closed 4 years ago

tsitsiflora commented 4 years ago

Hi all,

I am trying to install the platform using the automatic installation that is mentioned here.

I've encountered a lot of errors but managed to fix most of them by referring to previous issues here such as #322.

However, I cannot find an answer anywhere for this last error I'm facing.

Host: Ubuntu 18.04
RAM: 12 GB
docker-compose version 1.22.0, build f46880fe

Here is the error I'm getting after running docker-compose --compatibility up

docker_connector-export-file-stix_1 is up-to-date
docker_connector-import-file-stix_1 is up-to-date
docker_connector-export-file-csv_1 is up-to-date
docker_rabbitmq_1 is up-to-date
Starting docker_connector-import-file-pdf-observables_1 ... 
Starting docker_connector-mitre_1                       ... 
docker_redis_1 is up-to-date
docker_grakn_1 is up-to-date
docker_connector-opencti_1 is up-to-date
docker_elasticsearch_1 is up-to-date
docker_minio_1 is up-to-date
docker_opencti_1 is up-to-date
docker_worker_1 is up-to-date
Starting docker_connector-import-file-pdf-observables_1 ... error
docker_worker_3 is up-to-date

ERROR: for docker_connector-import-file-pdf-observables_1  Cannot start service connector-import-file-pdf-observables: failed to listen to abstract unix socket "/containerd-shim/d0b1c3c47e5a081799ee67fa599f5806db73b622dd5d24ca9d7cef26a0b10428.sock": listen unix /containerd-shim/d0b1c3c47e5a081799ee6
Starting docker_connector-mitre_1                       ... error

ERROR: for docker_connector-mitre_1  Cannot start service connector-mitre: failed to listen to abstract unix socket "/containerd-shim/5321f70685e2f2d7f2f1a9dac862de9e19c591585ecdf0019fe27856d9ce25a3.sock": listen unix /containerd-shim/5321f70685e2f2d7f2f1a9dac862de9e19c591585ecdf0019fe27856d9ce25a3.sock: bind: address already in use: unknown

ERROR: for connector-import-file-pdf-observables  Cannot start service connector-import-file-pdf-observables: failed to listen to abstract unix socket "/containerd-shim/d0b1c3c47e5a081799ee67fa599f5806db73b622dd5d24ca9d7cef26a0b10428.sock": listen unix /containerd-shim/d0b1c3c47e5a081799ee67fa599f5806db73b622dd5d24ca9d7cef26a0b10428.sock: bind: address already in use: unknown

ERROR: for connector-mitre  Cannot start service connector-mitre: failed to listen to abstract unix socket "/containerd-shim/5321f70685e2f2d7f2f1a9dac862de9e19c591585ecdf0019fe27856d9ce25a3.sock": listen unix /containerd-shim/5321f70685e2f2d7f2f1a9dac862de9e19c591585ecdf0019fe27856d9ce25a3.sock: bind: address already in use: unknown
ERROR: Encountered errors while bringing up the project.

Here is the .env file I'm using. I copied .env.example, then changed OPENCTI_ADMIN_TOKEN and the ChangeMeAccess / ChangeMeKey values:

OPENCTI_ADMIN_EMAIL=admin@opencti.io
OPENCTI_ADMIN_PASSWORD=admin
OPENCTI_ADMIN_TOKEN=df8635b1-39b5-41c2-8873-2f19b0e6ca8c
MINIO_ACCESS_KEY=NewAccess19191919
MINIO_SECRET_KEY=NewAccessKey19191919
RABBITMQ_DEFAULT_USER=guest
RABBITMQ_DEFAULT_PASS=guest
CONNECTOR_EXPORT_FILE_STIX_ID=7a9d21f1-9dbf-4ba5-bb81-99cf1b4cecf2
CONNECTOR_EXPORT_FILE_CSV_ID=2cd3da6c-192e-4f9d-bf66-4876defffb9d
CONNECTOR_IMPORT_FILE_STIX_ID=52ffbcf2-c1cc-4543-85de-f8f0b0179e76
CONNECTOR_IMPORT_FILE_PDF_OBSERVABLES_ID=d2920808-5f71-4658-a738-3534d4a1ca14
CONNECTOR_OPENCTI_ID=6b0dd1dc-023e-4afe-8b8b-f06765ac8b36
CONNECTOR_MITRE_ID=209b4810-5877-47f3-8724-a52a341dd55f

Here is my docker-compose.yml. In it, I changed the original port 8080 to 4000, as suggested in #322, using the command: sudo sed -i 's/8080/4000/g' docker-compose.yml

version: '3'
services:
  grakn:
    image: graknlabs/grakn:1.6.2
    ports:
      - 48555:48555 
    volumes:
      - grakndata:/grakn-core-all-linux/server/db
    restart: always
  redis:
    image: redis:5.0.8
    restart: always
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.6.2
    volumes:
      - esdata:/usr/share/elasticsearch/data
    environment:
      - discovery.type=single-node
    restart: always
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536
  minio:
    image: minio/minio:RELEASE.2020-02-27T00-23-05Z
    volumes:
      - s3data:/data
    ports:
      - "9000:9000"
    environment:
      MINIO_ACCESS_KEY: ${MINIO_ACCESS_KEY}
      MINIO_SECRET_KEY: ${MINIO_SECRET_KEY}
    command: server /data
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"]
      interval: 30s
      timeout: 20s
      retries: 3
    restart: always
  rabbitmq:
    image: rabbitmq:3.7-management
    environment:
      - RABBITMQ_DEFAULT_USER=${RABBITMQ_DEFAULT_USER}
      - RABBITMQ_DEFAULT_PASS=${RABBITMQ_DEFAULT_PASS}    
    restart: always
  opencti:
    image: opencti/platform:3.1.0
    environment:
      - APP__PORT=4000
      - APP__ADMIN__EMAIL=${OPENCTI_ADMIN_EMAIL}
      - APP__ADMIN__PASSWORD=ThisIsNotAUniquePassword
      - APP__ADMIN__TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - APP__LOGS_LEVEL=error
      - APP__LOGS=./logs
      - APP__REACTIVE=true
      - APP__COOKIE_SECURE=false
      - GRAKN__HOSTNAME=grakn
      - GRAKN__PORT=48555
      - GRAKN__TIMEOUT=30000
      - REDIS__HOSTNAME=redis
      - REDIS__PORT=6379
      - ELASTICSEARCH__URL=http://elasticsearch:9200
      - MINIO__ENDPOINT=minio
      - MINIO__PORT=9000
      - MINIO__USE_SSL=false
      - MINIO__ACCESS_KEY=${MINIO_ACCESS_KEY}
      - MINIO__SECRET_KEY=${MINIO_SECRET_KEY}
      - RABBITMQ__HOSTNAME=rabbitmq
      - RABBITMQ__PORT=5672
      - RABBITMQ__PORT_MANAGEMENT=15672
      - RABBITMQ__MANAGEMENT_SSL=false
      - RABBITMQ__USERNAME=${RABBITMQ_DEFAULT_USER}
      - RABBITMQ__PASSWORD=${RABBITMQ_DEFAULT_PASS}
      - PROVIDERS__LOCAL__STRATEGY=LocalStrategy
    ports:
      - "4000:4000"
    depends_on:
      - grakn
      - redis
      - elasticsearch
      - minio
      - rabbitmq
    restart: always
  worker:
    image: opencti/worker:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - WORKER_LOG_LEVEL=info
    depends_on:
      - opencti
    deploy:
      mode: replicated
      replicas: 3
    restart: always
  connector-export-file-stix:
    image: opencti/connector-export-file-stix:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - CONNECTOR_ID=${CONNECTOR_EXPORT_FILE_STIX_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_EXPORT_FILE
      - CONNECTOR_NAME=ExportFileStix2
      - CONNECTOR_SCOPE=application/json
      - CONNECTOR_CONFIDENCE_LEVEL=3
      - CONNECTOR_LOG_LEVEL=info
    restart: always
  connector-export-file-csv:
    image: opencti/connector-export-file-csv:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - CONNECTOR_ID=${CONNECTOR_EXPORT_FILE_CSV_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_EXPORT_FILE
      - CONNECTOR_NAME=ExportFileCsv
      - CONNECTOR_SCOPE=application/csv
      - CONNECTOR_CONFIDENCE_LEVEL=3
      - CONNECTOR_LOG_LEVEL=info
    restart: always    
  connector-import-file-stix:
    image: opencti/connector-import-file-stix:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - CONNECTOR_ID=${CONNECTOR_IMPORT_FILE_STIX_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_IMPORT_FILE
      - CONNECTOR_NAME=ImportFileStix2
      - CONNECTOR_SCOPE=application/json
      - CONNECTOR_CONFIDENCE_LEVEL=3
      - CONNECTOR_LOG_LEVEL=info
    restart: always
  connector-import-file-pdf-observables:
    image: opencti/connector-import-file-pdf-observables:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - CONNECTOR_ID=${CONNECTOR_IMPORT_FILE_PDF_OBSERVABLES_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=INTERNAL_IMPORT_FILE
      - CONNECTOR_NAME=ImportFilePdfObservables
      - CONNECTOR_SCOPE=application/pdf
      - CONNECTOR_CONFIDENCE_LEVEL=3
      - CONNECTOR_LOG_LEVEL=info
      - PDF_OBSERVABLES_CREATE_INDICATOR=False
    restart: always    
  connector-opencti:
    image: opencti/connector-opencti:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - CONNECTOR_ID=${CONNECTOR_OPENCTI_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=EXTERNAL_IMPORT
      - CONNECTOR_NAME=OpenCTI
      - CONNECTOR_SCOPE=identity,sector,region,country,city
      - CONNECTOR_CONFIDENCE_LEVEL=5
      - CONNECTOR_UPDATE_EXISTING_DATA=true
      - CONNECTOR_LOG_LEVEL=info
      - CONFIG_SECTORS_FILE_URL=https://raw.githubusercontent.com/OpenCTI-Platform/datasets/master/data/sectors.json
      - CONFIG_GEOGRAPHY_FILE_URL=https://raw.githubusercontent.com/OpenCTI-Platform/datasets/master/data/geography.json
      - CONFIG_INTERVAL=7 # Days
    restart: always
  connector-mitre:
    image: opencti/connector-mitre:3.1.0
    environment:
      - OPENCTI_URL=http://localhost:4000
      - OPENCTI_TOKEN=b0908485-db8c-4072-af02-4e976e6a8681
      - CONNECTOR_ID=${CONNECTOR_MITRE_ID} # Valid UUIDv4
      - CONNECTOR_TYPE=EXTERNAL_IMPORT
      - CONNECTOR_NAME=MITRE ATT&CK
      - CONNECTOR_SCOPE=identity,attack-pattern,course-of-action,intrusion-set,malware,tool,report
      - CONNECTOR_CONFIDENCE_LEVEL=3
      - CONNECTOR_UPDATE_EXISTING_DATA=true
      - CONNECTOR_LOG_LEVEL=info
      - MITRE_ENTERPRISE_FILE_URL=https://raw.githubusercontent.com/mitre/cti/master/enterprise-attack/enterprise-attack.json
      - MITRE_PRE_ATTACK_FILE_URL=https://raw.githubusercontent.com/mitre/cti/master/pre-attack/pre-attack.json
      - MITRE_INTERVAL=7 # Days
    restart: always
volumes:
  grakndata:
  esdata:
  s3data:
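One thing worth noting about the sed command I used: the global substitution rewrites every occurrence of 8080 in the file, including both sides of each HOST:CONTAINER port mapping. Its effect can be previewed on a throwaway sample first (hypothetical /tmp path, not my real file):

```shell
# Preview what the global substitution does on a throwaway sample
# before running it on the real docker-compose.yml.
printf 'services:\n  opencti:\n    ports:\n      - "8080:8080"\n' > /tmp/compose-sample.yml
sed -i 's/8080/4000/g' /tmp/compose-sample.yml
# Both sides of the mapping get rewritten, so host AND container port change:
grep '4000' /tmp/compose-sample.yml
# prints:       - "4000:4000"
```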

Any suggestions will be appreciated. I've been on this for almost a week now.

richard-julien commented 4 years ago

bind: address already in use: unknown

Looks like a Docker problem :( https://github.com/moby/moby/issues/38726. Reading that ticket, it seems rebooting the machine can solve the problem...
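If a full reboot is undesirable, a lighter-weight variant of the same idea is to restart the Docker daemon so the stale containerd-shim sockets are released, then bring the stack back up. A minimal sketch, assuming a systemd-managed Docker install (I haven't verified this against that moby ticket):

```shell
# Restart the Docker daemon to release the stale containerd-shim sockets,
# then recreate the stack in detached mode.
sudo systemctl restart docker
docker-compose --compatibility up -d
```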

cyamal1b4 commented 4 years ago

Try this. It's a Docker port-mapping detail I learned recently, and the 4000:4000 mapping is probably your actual issue:

It is important to note the distinction between HOST_PORT and CONTAINER_PORT. In the above example, the HOST_PORT is 4000 and the container port is 4000. Networked service-to-service communication uses the CONTAINER_PORT. When HOST_PORT is defined, the service is accessible outside the swarm as well.

If you have a conflict with 8080, change the host port to 4000 but leave the container port as 8080. You'll then access your UI at 4000, but within the Docker network the containers will talk to OCTI on 8080 on their own network, which happens by default btw. So:

ports:
  - "4000:8080" # HOST_PORT:CONTAINER_PORT
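Applied to the compose file above, the suggestion would look roughly like this (a sketch, not the full file; it assumes the opencti service keeps its default internal APP__PORT=8080, and that the other containers reach it via the compose service name rather than localhost):

```yaml
  opencti:
    environment:
      - APP__PORT=8080                    # container-side port stays at the default
    ports:
      - "4000:8080"                       # HOST_PORT:CONTAINER_PORT
  connector-mitre:
    environment:
      - OPENCTI_URL=http://opencti:8080   # service-to-service traffic uses the CONTAINER_PORT
```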

SamuelHassine commented 4 years ago

I am closing this due to inactivity. Please feel free to re-open it if you still have some issues after the 4.0.0 release.