honeycombio / terraform-provider-honeycombio

A Terraform provider for Honeycomb.io
https://registry.terraform.io/providers/honeycombio/honeycombio/latest
MIT License

Wrong types inferred in query specification #553

Open purkhusid opened 2 weeks ago

purkhusid commented 2 weeks ago

Versions: 0.27.1

Steps to reproduce

If you have the following query specification:

data "honeycombio_query_specification" "board-istio-data-plane-system-4xx-errors" {
  calculation {
    op     = "RATE_SUM"
    column = "istio_requests_total"
  }

  filter {
    column = "istio_requests_total"
    op     = "exists"
  }

  filter {
    column = "k8s.namespace.name"
    op     = "="
    value  = "istio-system"
  }

  filter {
    column = "response_code"
    op     = "starts-with"
    value  = "4"
  }

  breakdowns = ["k8s.pod.name"]
  time_range = 3600
}

The filter value for response_code is inferred as an integer when the spec is passed to, e.g., a honeycombio_query resource. The filter should infer the value as a string.
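For context, this kind of misclassification is what number-first type inference produces. The sketch below is hypothetical (the provider's actual coercion code is not shown in this issue): `coerceValue` tries integer, float, and bool parses before falling back to string, so a string like "4" is swallowed by the integer branch.

```go
package main

import (
	"fmt"
	"strconv"
)

// coerceValue is a hypothetical sketch of number-first inference:
// try int, then float, then bool, and only fall back to string.
// A value like "4" never reaches the string fallback.
func coerceValue(raw string) interface{} {
	if i, err := strconv.ParseInt(raw, 10, 64); err == nil {
		return i
	}
	if f, err := strconv.ParseFloat(raw, 64); err == nil {
		return f
	}
	if b, err := strconv.ParseBool(raw); err == nil {
		return b
	}
	return raw
}

func main() {
	fmt.Printf("%T %v\n", coerceValue("4"), coerceValue("4"))                       // int64 4
	fmt.Printf("%T %v\n", coerceValue("istio-system"), coerceValue("istio-system")) // string istio-system
}
```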

The query json ends up as:

"{\"calculations\":[{\"op\":\"RATE_SUM\",\"column\":\"istio_requests_total\"}],\"filters\":[{\"column\":\"istio_requests_total\",\"op\":\"exists\"},{\"column\":\"k8s.namespace.name\",\"op\":\"=\",\"value\":\"istio-system\"},{\"column\":\"response_code\",\"op\":\"starts-with\",\"value\":4}],\"breakdowns\":[\"k8s.pod.name\"],\"time_range\":3600}"

but should end up as:

"{\"calculations\":[{\"op\":\"RATE_SUM\",\"column\":\"istio_requests_total\"}],\"filters\":[{\"column\":\"istio_requests_total\",\"op\":\"exists\"},{\"column\":\"k8s.namespace.name\",\"op\":\"=\",\"value\":\"istio-system\"},{\"column\":\"response_code\",\"op\":\"starts-with\",\"value\":\"4\"}],\"breakdowns\":[\"k8s.pod.name\"],\"time_range\":3600}"
jharley commented 2 weeks ago

@purkhusid thanks for reporting this!

I'm afraid the solution -- as I understand it -- is not trivial at the moment, but I have started work on moving to Dynamic Attributes instead of the type coercion/inference we're currently doing.