Telefonica / prometheus-kafka-adapter

Use Kafka as a remote storage database for Prometheus (remote write only)

Unable to send data from prometheus to kafka #113

Open babvin opened 1 year ago

babvin commented 1 year ago

{"error":"Local: Queue full","level":"debug","msg":"Failing metric [123 34 108 97 98 101 108 115 34 58 123 34 95 95 110 97 109 101 95 95 34 58 34 110 111 100 101 95 115 121 115 116 101 109 100 95 117 110 105 116 95 115 116 97 116 101 34 44 34 101 110 118 34 58 34 100 101 118 34 44 34 105 110 115 116 97 110 99 101 34 58 34 49 48 46 50 50 55 46 53 51 46 56 51 58 57 49 48 48 34 44 34 106 111 98 34 58 34 110 111 100 101 34 44 34 110 97 109 101 34 58 34 112 114 111 109 101 116 104 101 117 115 45 110 111 100 101 45 101 120 112 111 114 116 101 114 45 105 112 109 105 116 111 111 108 45 115 101 110 115 111 114 46 116 105 109 101 114 34 44 34 115 116 97 116 101 34 58 34 105 110 97 99 116 105 118 101 34 125 44 34 110 97 109 101 34 58 34 110 111 100 101 95 115 121 115 116 101 109 100 95 117 110 105 116 95 115 116 97 116 101 34 44 34 116 105 109 101 115 116 97 109 112 34 58 34 50 48 50 51 45 48 52 45 48 54 84 48 52 58 52 53 58 51 49 90 34 44 34 118 97 108 117 101 34 58 34 49 34 125]","time":"2023-04-06T04:45:39Z"}

{"error":"Local: Queue full","level":"error","msg":"couldn't produce message in kafka topic metrics","time":"2023-04-06T04:45:39Z"} {"fields.time":"2023-04-06T04:45:39Z","ip":"10.244.128.91","latency":4495050,"level":"info","method":"POST","msg":"","path":"/receive","status":500,"time":"2023-04-06T04:45:39Z","user-agent":"Prometheus/2.43.0"} host os: ubuntu kafka broker: 10.244.129.91 Prometheus server: 10.244.128.91 kafka-adapter: 10.244.128.91 (running on docker) env.lst: KAFKA_BROKER_LIST=10.244.128.91:29092 KAFKA_TOPIC=metrics KAFKA_COMPRESSION= KAFKA_BATCH_NUM_MESSAGES=1000 SERIALIZATION_FORMAT=json PORT=8080 BASIC_AUTH_USERNAME: BASIC_AUTH_PASSWORD: LOG_LEVEL=debug GIN_MODE=release

Able to produce using the Python client:

    root@ubuntu2004:~/kafka# python3 prod.py
    Message produced: <cimpl.Message object at 0x7f833a95bf40>
    root@ubuntu2004:~/kafka#

prod.py:

    from confluent_kafka import Producer
    import socket

    conf = {'bootstrap.servers': "0.0.0.0:29092", 'client.id': socket.gethostname()}
    topic = 'metrics'
    producer = Producer(conf)

    producer.produce(topic, key="key", value="value")

    def acked(err, msg):
        if err is not None:
            print("Failed to deliver message: %s: %s" % (str(msg), str(err)))
        else:
            print("Message produced: %s" % (str(msg)))

    producer.produce(topic, key="key", value="value", callback=acked)
    producer.flush()

    # Wait up to 1 second for events. Callbacks will be invoked during
    # this method call if the message is acknowledged.
    producer.poll(1)

    [appuser@b31dece626dc ~]$ kafka-console-consumer --topic metrics --bootstrap-server 0.0.0.0:9092 --from-beginning
    value
    value
    12345
    abcde
    !@#$%
    value

babvin commented 1 year ago

Docker Compose file for Kafka:

    version: '2'
    services:
      zookeeper:
        image: confluentinc/cp-zookeeper:latest
        environment:
          ZOOKEEPER_CLIENT_PORT: 2181
          ZOOKEEPER_TICK_TIME: 2000
        ports:

johnseekins commented 1 year ago

This is your error:

{"error":"Local: Queue full","level":"error","msg":"couldn't produce message in kafka topic metrics","time":"2023-04-06T04:45:39Z"}

It seems that either:

  1. Prometheus is writing faster than the adapter can process and forward the data to Kafka, or
  2. the adapter is failing to write to Kafka for some other reason.

I would verify that the metrics topic exists in Kafka and that metrics are flowing into it. Otherwise, this is a configuration issue.
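A quick way to do that check is with the confluent-kafka AdminClient against the broker address from env.lst (a minimal sketch, assuming the Python client is installed):

    # Sketch: confirm the broker answers on the configured address and that the
    # `metrics` topic exists there.
    from confluent_kafka.admin import AdminClient

    admin = AdminClient({'bootstrap.servers': '10.244.128.91:29092'})
    metadata = admin.list_topics(timeout=5)   # raises KafkaException if the broker is unreachable

    print('brokers:', metadata.brokers)
    print("'metrics' topic present:", 'metrics' in metadata.topics)

Running the same check from inside the adapter's container network is the more telling test, since that is where the adapter produces from.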