Kong / charts

Helm chart for Kong

Expose Admin API Status endpoint through service #777

Closed hnajib-sym closed 1 year ago

hnajib-sym commented 1 year ago

The current chart does not support exposing the status endpoint through a k8s Service, for the reason stated here:

# Specify Kong status listener configuration
# This listen is internal-only. It cannot be exposed through a service or ingress.

But on the other hand, the status port 8100 is listening on 0.0.0.0 by default:

KONG_STATUS_LISTEN:                    0.0.0.0:8100
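
For reference, the listener seems to come from the `status` block in values.yaml; a rough sketch of the relevant fields (names may differ by chart version):

```yaml
# Sketch of the status listener settings in values.yaml (assumed layout,
# check your chart version); note there is no matching Service block.
status:
  enabled: true
  http:
    enabled: true
    containerPort: 8100
  tls:
    enabled: false
```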

This prevents the Datadog integration from checking the status of Kong, since they run in different pods.

I can call the status endpoint from outside using the Kong pod IP address, so there is no point in preventing a Service, but we cannot use a pod IP in the Datadog config.
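
To illustrate the use case, this is roughly what I would like to point the agent at; the Service name below is hypothetical, since the chart does not create one:

```yaml
# Hypothetical Datadog Kong check config (conf.d/kong.d/conf.yaml);
# kong-status and its namespace are made-up names for a Service the
# chart would need to expose.
init_config:

instances:
  - kong_status_url: http://kong-status.kong.svc.cluster.local:8100/status
```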

pmalek commented 1 year ago

Hi @hnajib-sym 👋

How would you see this working with a Service? You'd get a random instance's status endpoint that way.

Do you mind sharing your use case for this? What are you trying to get out of this endpoint and for what purpose?

rainest commented 1 year ago

The Service isn't exposed because its functionality doesn't really work through a Service.

Information exposed through the status endpoint is per-instance. The /status and /metrics responses are valid for that instance only; they do not expose aggregate information for all Pods in the Deployment. Requests sent through a Service are delivered only to a single instance, chosen at random.

Kubernetes integrations need to instead keep track of the Deployment replicas and enumerate the Pod IPs to send requests directly to them (AFAIK this is how Prometheus handles it) or run locally in a sidecar so they can access it over localhost.
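
For example, Prometheus commonly does this with pod-level service discovery rather than a Service; a minimal sketch (the Pod label used for filtering is an assumption about how your release labels its Pods):

```yaml
# Sketch of a Prometheus scrape job that enumerates the Kong Pods directly;
# the app.kubernetes.io/name=kong selector is an assumption.
scrape_configs:
  - job_name: kong-status
    metrics_path: /metrics
    kubernetes_sd_configs:
      - role: pod
    relabel_configs:
      # keep only Pods belonging to the Kong release
      - source_labels: [__meta_kubernetes_pod_label_app_kubernetes_io_name]
        regex: kong
        action: keep
      # scrape the status port on each Pod IP
      - source_labels: [__meta_kubernetes_pod_ip]
        regex: (.+)
        replacement: "$1:8100"
        target_label: __address__
```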

It looks like Datadog's config is pointing to the latter: they only allow configuring a single URL and use localhost as an example, suggesting that it should be a per-instance agent that runs alongside Kong rather than a single agent that watches every replica.

Do they provide additional instructions about how you should install it? Given the above, I think you'll want to create a sidecar container for the agent and then configure URLs with a localhost:8100 host/port.
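
If your chart version exposes the `sidecarContainers` value, that could look roughly like the following; the image tag, Secret name, and the agent's own configuration are placeholders for whatever Datadog's install docs prescribe:

```yaml
# Sketch of a values.yaml override adding a Datadog agent sidecar next to Kong;
# image tag and Secret name are placeholders, not verified settings.
sidecarContainers:
  - name: datadog-agent
    image: gcr.io/datadoghq/agent:7
    env:
      - name: DD_API_KEY
        valueFrom:
          secretKeyRef:
            name: datadog-secret   # hypothetical Secret holding the API key
            key: api-key
# the Kong check configured inside that container could then use
# kong_status_url: http://localhost:8100/status
```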

hnajib-sym commented 1 year ago

Hello,

The Datadog Kong integration supports multi-instance scraping (see conf). I tried testing with a Service, and it seems the integration is broken since they switched to the OpenMetrics implementation, rendering the kong_status_url parameter useless.
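
For reference, the multi-instance path appears to rely on autodiscovery annotations on the Kong Pods, so the agent targets each Pod IP via `%%host%%`; the `openmetrics_endpoint` parameter and the `proxy` container name below are assumptions about the newer check:

```yaml
# Sketch of values.yaml podAnnotations for Datadog autodiscovery; parameter
# names follow the newer OpenMetrics-based check and are not verified here.
podAnnotations:
  ad.datadoghq.com/proxy.checks: |
    {
      "kong": {
        "init_config": {},
        "instances": [
          {"openmetrics_endpoint": "http://%%host%%:8100/metrics"}
        ]
      }
    }
```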

Thanks for the feedback.