Closed: bkslash closed this issue 2 years ago
Hi, how can I achieve the opposite data flow? If the data is already in a Kafka topic, how can I send it to remote storage using remote_write? Something like this:
some_metric_source -> remote_write -> prometheus-kafka-adapter -> kafka_topic -> ??????? -> remote_write -> some_remote_storage
I don't want to scrape the data using Prometheus' pull mechanism; I'd like all data to be pushed in real time...
Thanks for asking @bkslash. prometheus-kafka-adapter doesn't support this. I'm not sure whether https://github.com/Telefonica/prometheus-kafka-adapter/issues/55 is related.
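For anyone exploring a custom bridge: the adapter's default JSON serializer writes one message per sample to the topic, so the consuming side of the missing `???????` step would first have to parse those messages back into samples before re-encoding them as a remote_write `WriteRequest` (protobuf + snappy, not shown here). The sketch below is only an illustration of that parsing step, assuming the JSON shape documented in the adapter's README (`timestamp`, `value`, `name`, `labels`); the function name `adapter_json_to_sample` is mine, not part of the project.

```python
import json
from datetime import datetime, timezone

# Assumed message shape (default JSON serializer of prometheus-kafka-adapter):
#   {"timestamp": "1970-01-01T00:00:00Z", "value": "9876543210",
#    "name": "up", "labels": {"__name__": "up", "label1": "value1"}}
# A real bridge would batch the resulting samples into a remote_write
# WriteRequest, snappy-compress it, and POST it to the remote storage
# endpoint; that encoding step is deliberately omitted here.

def adapter_json_to_sample(raw: bytes):
    """Parse one adapter message into (labels, timestamp_ms, value)."""
    msg = json.loads(raw)
    # Timestamps are assumed to be RFC 3339 in UTC without fractional seconds.
    ts = datetime.strptime(msg["timestamp"], "%Y-%m-%dT%H:%M:%SZ")
    ts_ms = int(ts.replace(tzinfo=timezone.utc).timestamp() * 1000)
    # remote_write expects a stable label ordering per series.
    labels = tuple(sorted(msg["labels"].items()))
    return labels, ts_ms, float(msg["value"])

if __name__ == "__main__":
    raw = (b'{"timestamp":"2023-01-01T00:00:00Z","value":"1",'
           b'"name":"up","labels":{"__name__":"up","job":"demo"}}')
    print(adapter_json_to_sample(raw))
    # -> ((('__name__', 'up'), ('job', 'demo')), 1672531200000, 1.0)
```

In practice you would feed this from a Kafka consumer (e.g. confluent-kafka or kafka-python) and point the resulting remote_write requests at a backend that accepts them, such as Thanos Receive or Mimir.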