akvorado / akvorado

Flow collector, enricher and visualizer
https://demo.akvorado.net
GNU Affero General Public License v3.0

Exporters akvorado: cannot convert slice with length 0 to pointer to array with length 16 #263

Closed: shatovilya closed this issue 1 year ago

shatovilya commented 1 year ago

Bug description

The exporter containers keep restarting; the application panics with the error: "panic: runtime error: cannot convert slice with length 0 to pointer to array with length 16".

Attached is the config file.

Steps to reproduce the problem

akvorado v1.6.2 (build date: 2022-11-03T21:10:15+0000, built with go1.19.2)

The quick-start bundle from the release page is used.

docker-compose up -d

Expected outcome

The exporter Docker containers run stably.

Current outcome

CONTAINER ID   IMAGE                               COMMAND                  CREATED        STATUS                    PORTS                                                                                                                                           NAMES
aabdc252cc2f   bitnami/kafka:2.8.1                 "/opt/bitnami/script…"   16 hours ago   Up 16 hours (healthy)     9092/tcp                                                                                                                                        akvorado_kafka_1
d6268064dd34   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 34 seconds (healthy)   8080/tcp                                                                                                                                        akvorado_akvorado-exporter2_1
7d492fffb7c0   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 33 seconds (healthy)   8080/tcp                                                                                                                                        akvorado_akvorado-exporter1_1
993019b1e519   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 34 seconds (healthy)   8080/tcp                                                                                                                                        akvorado_akvorado-exporter0_1
bd9a4903605a   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 33 seconds (healthy)   8080/tcp                                                                                                                                        akvorado_akvorado-exporter3_1
9703350bf835   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 16 hours                                                                                                                                                               akvorado_akvorado-conntrack-fixer_1
ea41a7f49cf9   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Exited (0) 16 hours ago                                                                                                                                                   akvorado_akvorado-service_1
944132d36acb   maxmindinc/geoipupdate:v4           "/usr/bin/entry.sh"      16 hours ago   Exited (1) 16 hours ago                                                                                                                                                   akvorado_geoip_1
50072dde13d3   clickhouse/clickhouse-server:22.8   "/entrypoint.sh"         16 hours ago   Up 16 hours (healthy)     8123/tcp, 9000/tcp, 9009/tcp                                                                                                                    akvorado_clickhouse_1
aa521af9c336   bitnami/zookeeper:3.6               "/opt/bitnami/script…"   16 hours ago   Up 16 hours               2181/tcp, 2888/tcp, 3888/tcp, 8080/tcp                                                                                                          akvorado_zookeeper_1
9256c43943bc   traefik:2.6                         "/entrypoint.sh --ap…"   16 hours ago   Up 16 hours               80/tcp, 127.0.0.1:8080->8080/tcp, 0.0.0.0:8081->8081/tcp, :::8081->8081/tcp                                                                     akvorado_traefik_1
e6e6de20750d   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 16 hours (healthy)     0.0.0.0:2055->2055/udp, :::2055->2055/udp, 0.0.0.0:6343->6343/udp, :::6343->6343/udp, 8080/tcp, 0.0.0.0:10179->10179/tcp, :::10179->10179/tcp   akvorado_akvorado-inlet_1
cdbce6ef1470   provectuslabs/kafka-ui:v0.4.0       "/bin/sh -c 'java $J…"   16 hours ago   Up 16 hours               8080/tcp                                                                                                                                        akvorado_kafka-ui_1
bbd96b81a608   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 16 hours (healthy)     8080/tcp                                                                                                                                        akvorado_akvorado-orchestrator_1
c7d4a1dd4088   ghcr.io/akvorado/akvorado:latest    "/usr/local/bin/akvo…"   16 hours ago   Up 16 hours (healthy)     8080/tcp                                                                                                                                        akvorado_akvorado-console_1

Docker logs:

goroutine 77 [running]:
akvorado/demoexporter/flows.getNetflowData.func1()
        akvorado/demoexporter/flows/nfdata.go:72 +0x879
created by akvorado/demoexporter/flows.getNetflowData
        akvorado/demoexporter/flows/nfdata.go:31 +0x474
{"level":"info","listen":"0.0.0.0:8080","time":"2022-11-15T06:53:43Z","caller":"akvorado/common/http/root.go:135","module":"akvorado/common/http","message":"starting HTTP server"}
{"level":"info","time":"2022-11-15T06:53:43Z","caller":"akvorado/demoexporter/snmp/root.go:54","module":"akvorado/demoexporter/snmp","message":"starting SNMP component"}
{"level":"info","time":"2022-11-15T06:53:43Z","caller":"akvorado/demoexporter/flows/root.go:71","module":"akvorado/demoexporter/flows","message":"starting flows component"}
{"level":"info","version":"v1.6.2","build-date":"2022-11-03T21:10:15+0000","time":"2022-11-15T06:53:43Z","caller":"akvorado/cmd/components.go:37","module":"akvorado/cmd","message":"akvorado has started"}
panic: runtime error: cannot convert slice with length 0 to pointer to array with length 16

Config file:

---
# This configuration file is documented in docs/02-configuration.md.
# You can get all default values with `akvorado orchestrator /dev/null
# --dump --check` or `docker-compose run akvorado-orchestrator
# orchestrator /dev/null --dump --check`.
kafka:
  topic: flows
  version: 2.8.1
  brokers:
    - kafka:9092
  topic-configuration:
    num-partitions: 8
    replication-factor: 1
    config-entries:
      # The retention policy in Kafka is mainly here to keep a buffer
      # for ClickHouse.
      segment.bytes: 1073741824
      retention.ms: 86400000 # 1 day
      cleanup.policy: delete
      compression.type: producer

clickhouse:
  orchestrator-url: http://akvorado-orchestrator:8080
  kafka:
    consumers: 4
  servers:
    - clickhouse:9000
  # asns:
  #   64501: ACME Corporation
  networks:
    # You should customize this section with your networks. This
    # populates the Src/DstNetName/Role/Site/Region/Tenant fields.
    172.16.0.0/16:
      name: ipv4-customers
      role: customers
  network-sources: []
    # amazon:
    #   url: https://ip-ranges.amazonaws.com/ip-ranges.json
    #   interval: 6h
    #   transform: |
    #     (.prefixes + .ipv6_prefixes)[] |
    #     { prefix: (.ip_prefix // .ipv6_prefix), tenant: "amazon", region: .region, role: .service|ascii_downcase }
    # gcp:
    #   url: https://www.gstatic.com/ipranges/cloud.json
    #   interval: 6h
    #   transform: |
    #     .prefixes[] |
    #     { prefix: (.ipv4Prefix // .ipv6Prefix), tenant: "google-cloud", region: .scope }

inlet:
  kafka:
    compression-codec: zstd
  geoip:
    optional: true
    # When running on Docker, these paths are inside the container.
    # Check docker-compose.yml for details.
    asn-database: /usr/share/GeoIP/GeoLite2-ASN.mmdb
    geo-database: /usr/share/GeoIP/GeoLite2-Country.mmdb
  snmp:
    workers: 10
  flow:
    inputs:
      # - type: udp
      #   decoder: netflow
      #   listen: 0.0.0.0:2055
      #   workers: 6
      #   receive-buffer: 10485760
      - type: udp
        decoder: sflow
        listen: 0.0.0.0:6343
        workers: 6
        receive-buffer: 10485760
  core:
    workers: 6
    exporter-classifiers:
      # This is an example. This should be customized depending on how
      # your exporters are named.
      - ClassifySiteRegex(Exporter.Name, "^([^-]+)-", "$1")
      - ClassifyRegion("europe")
      - ClassifyTenant("acme")
      - ClassifyRole("edge")
    interface-classifiers:
      # This is an example. This must be customized depending on the
      # descriptions of your interfaces. In the following, we assume
      # external interfaces are named "Transit: Cogent" Or "IX:
      # FranceIX".
      - |
        ClassifyConnectivityRegex(Interface.Description, "^(?i)(transit|pni|ppni|ix):? ", "$1") &&
        ClassifyProviderRegex(Interface.Description, "^\\S+?\\s(\\S+)", "$1") &&
        ClassifyExternal()
      - ClassifyInternal()
demo-exporter:
  - snmp:
      name: 172.16.10.17
      interfaces:
        1: "uplink-230cab"
        2: "uplink-eltex24p-417cab"
      listen: 127.0.0.1:161
    flows: &flows2
      samplingrate: 50000
      target: 127.0.0.1:6343
      flows:
        - per-second: 0.01
          in-if-index: 1
          out-if-index: 2
          peak-hour: 24h
          multiplier: 3
          src-port: 0
          dst-port: 0
          DstAS: 1
          SrcAS: 2
          protocol: tcp
          size: 1300
vincentbernat commented 1 year ago

Remove the whole demo-exporter section; you don't need it. Many people try to keep it in some way, even though there is a comment at the top saying to remove it. Could you explain why you kept it, so that I can update the comment to avoid this kind of error?

The crash is a bug: you should have received an error about the missing src-net and dst-net keys, but the requirement check does not work.
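Per the comment above, each entry under flows apparently requires src-net and dst-net. A hedged sketch of what a corrected flow entry might look like; the prefixes below are placeholder documentation ranges, not values from the original config:

```yaml
demo-exporter:
  - flows:
      samplingrate: 50000
      target: 127.0.0.1:6343
      flows:
        - per-second: 0.01
          in-if-index: 1
          out-if-index: 2
          src-net: 192.0.2.0/24    # placeholder prefix
          dst-net: 198.51.100.0/24 # placeholder prefix
          protocol: tcp
          size: 1300
```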