grafana / influxdb-flux-datasource

Grafana datasource plugin for Flux (InfluxDB)
Apache License 2.0

Metric request error in 7.0.0 #105

Closed: hugodlg closed this issue 4 years ago

hugodlg commented 4 years ago
t=2020-05-25T13:17:32+0000 lvl=info msg="from(bucket: \"test\")\n  |> range(start: 2020-05-25T07:17:32Z, stop: 2020-05-25T13:17:32Z)\n  |> filter(fn: (r) => r[\"_measurement\"] == \"test_event\")\n  |> filter(fn: (r) => r[\"_field\"] == \"event\")\n  |> drop(columns: [\"host\"])\n  |> duplicate(column: \"_value\", as: \"event\")\n  |> group(columns: [\"event\"])\n  |> aggregateWindow(every: 1m, fn: count) => { test test}" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource
t=2020-05-25T13:17:34+0000 lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=IP_ADDR
t=2020-05-25T13:17:34+0000 lvl=eror msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=POST path=/api/ds/query status=500 remote_addr=IP_ADDR time_ms=1309 size=34 referer="http://grafana:3000/d/ID/test?editPanel=4&orgId=1"
t=2020-05-25T13:17:35+0000 lvl=info msg=Profiler logger=plugins.backend pluginId=grafana-influxdb-flux-datasource enabled=false

Using the grafana/grafana:7.0.0 Docker image with the plugin (version 7.0.0) installed via the command-line utility.

jjkrol commented 4 years ago

I am getting this error for a query such as:

from(bucket: "bucket_name")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._field == "field_name")
  |> keep(columns: ["_time","_value", "fieldA", "fieldB"])

However, when I remove the keep command, or use drop instead of keep (which isn't ideal), I get results.
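
For comparison, here is a minimal sketch of the drop-based variant of the query above; the column names being dropped ("tagC", "tagD") are placeholders for whatever extra tag columns the real data carries:

from(bucket: "bucket_name")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._field == "field_name")
  // instead of keep(columns: ["_time", "_value", "fieldA", "fieldB"]),
  // drop the unwanted columns; "tagC" and "tagD" are hypothetical names
  |> drop(columns: ["tagC", "tagD"])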

Edit: I've found it isn't happening consistently across different queries (they were working in the previous version and I'm checking them one by one). In some queries keep works; in others it's the pivot that causes the error.

petersalemink95 commented 4 years ago

Hi @jjkrol I have the same issue as you have. Did you find a solution?

jjkrol commented 4 years ago

@peter-git-s unfortunately not yet; I didn't have time to investigate further, but I will have another look. Maybe we can narrow it down to some specific cases.

petersalemink95 commented 4 years ago

@jjkrol I just found out that I get the same issue when using drop. So both drop and keep give the same error.

cyanghsieh commented 4 years ago

I hit a similar issue when I used elapsed. Here's my Flux script:

from(bucket: "kscatest3/autogen")
  |> range(start: -5d)
  |> filter(fn: (r) => r._measurement == "FakeState2")
  |> elapsed(unit:1s)
  |> filter(fn: (r) => r._value == 1)
  |> sum(column:"elapsed")
  |> filter(fn: (r) => r.elapsed > 0)

The last two lines can be ignored.

In Grafana 7 it shows an Internal Server Error ("Metric request error"), as in the attached screenshot.

In Grafana 6, the final result is put into the series name: in the screenshot the series becomes "FakeState2 value elapsed=46322", where 46322 is my query result.

And in Chronograf it looks fine.

andig commented 4 years ago

My log messages for the query are:

[httpd] 172.22.0.1 - - [02/Jun/2020:11:59:04 +0000] "POST /api/v2/query?org= HTTP/1.1" 500 78 "-" "influxdb-client-go/1.1.0  (linux; amd64)" 74725c1d-a4c8-11ea-865d-0242ac160002 437
ts=2020-06-02T11:59:04.229313Z lvl=error msg="[500] - \"loc 2:12-2:18: unexpected token for property key: ILLEGAL (\\\"$\\\")\"" log_id=0N0wkt60000 service=httpd

It looks as if the $ sign gets passed through from Grafana to Influx when it shouldn't?

jjkrol commented 4 years ago

@andig did you try changing the

|> range($range)

to

|> range(start: v.timeRangeStart, stop: v.timeRangeStop)

?

For me the $range variable was no longer being substituted

andig commented 4 years ago

For me the $range variable was no longer being substituted

Ah thanks, that had escaped my attention. WFM. I'd suggest renaming the issue to something like "Breaking change in template variable substitution with Grafana 7"?

jjkrol commented 4 years ago

@andig I agree that the breaking change is a problem, but this issue is about the Metric request error that is caused by (identified until now) keep, drop, pivot and elapsed.

@peter-git-s what are the field names you are using in the problematic lines? I noticed that drop and keep work for me as long as I do not use _time (or other underscored fields) in the columns.
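
For illustration, paraphrasing the reports so far (both lines are taken from examples earlier in this thread):

  // reportedly fine: no underscored columns in the list
  |> drop(columns: ["host"])
  // reportedly triggers the error: underscored columns in the list
  |> keep(columns: ["_time", "_value", "fieldA", "fieldB"])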

This seems consistent with #109, however it doesn't explain @cyanghsieh 's issue with elapsed.

I've changed the Grafana log level to debug, but the only thing I've noticed is these two lines:

lvl=dbug msg="panic: interface conversion: interface {} is nil, not time.Time" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource
lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=ed.it.ed.xx

petersalemink95 commented 4 years ago

@jjkrol an example of where it goes wrong is:

|> drop(columns:["_measurement", "device", "test_id", "type", "unique_id"])

andig commented 4 years ago

Regarding variables, the same problem exists with $__interval, which is broken:

|> aggregateWindow(every: $__interval,fn: mean)

M0rdecay commented 4 years ago

Same situation: it fails when the request contains a group() function (Grafana log below; version 7.0.1, image grafana/grafana:7.0.1):

t=2020-06-01T06:24:30+0000 lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=xxx.xxx.xxx.xxx
t=2020-06-01T06:24:30+0000 lvl=eror msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=POST path=/api/ds/query status=500 remote_addr=xxx.xxx.xxx.xxx time_ms=1752 size=34 referer="http://grafana:3001/d/NtNnqDjWk/area?orgId=1&refresh=1m&editPanel=49"
t=2020-06-01T06:24:31+0000 lvl=info msg=Profiler logger=plugins.backend pluginId=grafana-influxdb-flux-datasource enabled=false
t=2020-06-01T06:24:31+0000 lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = connection error: desc = \"transport: error while dialing: dial unix /tmp/plugin105981337: connect: connection refused\"" remote_addr=xxx.xxx.xxx.xxx
t=2020-06-01T06:24:31+0000 lvl=eror msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=POST path=/api/ds/query status=500 remote_addr=xxx.xxx.xxx.xxx time_ms=0 size=34 referer="http://grafana:3001/d/NtNnqDjWk/area?orgId=1&refresh=1m&editPanel=49"

Very strange, but everything is fine in the InfluxDB log:

[httpd] 172.17.0.1 - - [01/Jun/2020:06:24:30 +0000] "POST /api/v2/query?org= HTTP/1.1" 200 948 "-" "influxdb-client-go/1.1.0  (linux; amd64)" aec707f3-a3d1-11ea-8d6e-0242ac110002 1596765

This is an example of a query that has the problem. The first filter has been intentionally replaced with neutral data, but it still shows the essence of the situation:

errs = from(bucket: "$database/$ret_policy")
  |> range(start: v.timeRangeStart, stop:v.timeRangeStop)
  |> filter(fn:(r) => (r._measurement == "cpu" 
                       or r._measurement == "mem")
                   and (r._field == "used_percent"
                       or r._field == "usage_system"))
  |> group(columns: ["host"], mode:"by")
  |> derivative(unit: $coll_interval, nonNegative: true)
  |> aggregateWindow(every: $coll_interval, fn: sum)
  |> filter(fn: (r) => r["_value"] >= 0)
  |> yield()

But this query works fine:

errs = from(bucket: "$database/$ret_policy")
  |> range(start: v.timeRangeStart, stop:v.timeRangeStop)
  |> filter(fn:(r) => (r._measurement == "cpu" 
                       or r._measurement == "mem")
                   and (r._field == "used_percent"
                       or r._field == "usage_system"))

  |> derivative(unit: $coll_interval, nonNegative: true)
  |> aggregateWindow(every: $coll_interval, fn: sum)
  |> filter(fn: (r) => r["_value"] >= 0)
  |> yield()

fhriley commented 4 years ago

This query gives me the Metric request error. The same query works fine in the InfluxDB 2 GUI.

from(bucket: "unbound")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "unbound" and (r["_field"] == "num_rpz_action_nxdomain" or r["_field"] == "total_num_queries"))
  |> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
  |> map(fn: (r) => ({ r with _value: r.num_rpz_action_nxdomain / r.total_num_queries * 100.0 }))
  |> last(column: "_value")

Commenting out the last two lines, it still fails. Commenting out the last three, it does not, so it appears the pivot is the issue.

fhriley commented 4 years ago

Here is a really simple query that generates the error. Remove the sum() at the end and the error goes away:

from(bucket: "unbound")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "unbound" and r["_field"] == "total_num_queries")
  |> difference(nonNegative: true)
  |> sum()

gji commented 4 years ago

Regarding variables, the same problem exists with $__interval, which is broken:

|> aggregateWindow(every: $__interval,fn: mean)

I'm pretty sure $__interval no longer exists. The upgrade to version 7 introduces breaking changes; anything prefixed with "$" no longer exists. The following works:

  |> aggregateWindow(every: v.windowPeriod, fn: mean)

You can look in pkg/influx/macros.go to see the new code that does substitutions.

Digging through the code (the most relevant parts are in Init and Append of builder.go), I also found that if you want to plot something as a time series (which the code assumes if you have the _measurement column), you also need the _value and _time columns. If your operations drop these columns (e.g. after pivot the value field is gone, and after group the time column is gone), the plugin crashes. The way around this is to add the necessary columns back in, e.g. via a map operation, as sketched below.
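
As a concrete illustration of that map workaround, here is a minimal sketch (the bucket, measurement, and field names are placeholders, not taken from any report above) that re-introduces a _value column after a pivot so that _time, _value, and _measurement are all present again:

from(bucket: "example_bucket")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._measurement == "example" and (r._field == "a" or r._field == "b"))
  |> pivot(rowKey: ["_time"], columnKey: ["_field"], valueColumn: "_value")
  // pivot replaced _value with the "a" and "b" columns; compute a new _value
  |> map(fn: (r) => ({ r with _value: r.a + r.b }))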

Moreover, there's a bug where if you don't have the _measurement column, the code tries to cast a time as a string, which just errors out.

ryantxu commented 4 years ago

@gji & @fhriley -- thanks for digging into this and finding queries that hit the problems. Can you post some sample responses so we can take a look at how to support them? Ideally, https://github.com/grafana/influxdb-flux-datasource/tree/master/pkg/influx/testdata will contain a wide variety of response formats that we can make sure work reasonably.

ZeroPointTwo commented 4 years ago

@ryantxu I'm seeing the same error for spread() as well. The query works in Grafana if I remove the call to spread. I've attached the results csv file from Chronograf. I hope that's what you meant by sample response, sorry if I misunderstood.

from(bucket: "costs")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r._measurement == "energy_usage" and r._field == "total_kwh")
  |> drop(columns: ["hostname", "ip", "alias"])
  |> spread()

spread.txt

M0rdecay commented 4 years ago

I can confirm what @gji said: the plugin crashes when the response from InfluxDB does not contain _measurement, i.e. after almost any transformation. In practice this means we cannot use the aggregation functions.

M0rdecay commented 4 years ago

An example of a query result that breaks the plugin:

#datatype,string,long,dateTime:RFC3339,dateTime:RFC3339,dateTime:RFC3339,string,double
#group,false,false,true,true,false,true,false
#default,_result,,,,,,
,result,table,_start,_stop,_time,host,_value
,,0,2020-06-05T12:03:27.444072266Z,2020-06-05T12:08:27.444072266Z,2020-06-05T12:06:00Z,hostname.ru,8.291381590647958
,,0,2020-06-05T12:03:27.444072266Z,2020-06-05T12:08:27.444072266Z,2020-06-05T12:07:00Z,hostname.ru,0.5341565263056448
,,0,2020-06-05T12:03:27.444072266Z,2020-06-05T12:08:27.444072266Z,2020-06-05T12:08:00Z,hostname.ru,0.6676119389260387

We are waiting for news on this issue!

gji commented 4 years ago

@ryantxu, I think @M0rdecay covered my use case, which is when using group(). However, rather than supporting every single use case, I think it's more important that the plugin returns useful error messages (instead of just crashing). I don't think the plugin should try to guess what the X and Y axes are and do a lot of data formatting, which introduces a lot of mysterious black-box behavior. I can see why there's an exception for data with _measurement, _value, and _time columns, as that is the format of the response for most generic queries. But if you're doing fancy data aggregation, it's quite easy to also add a map command that formats the data to some specification (see the sketch below).
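
As a sketch of the kind of formatting map described here, one could explicitly set the columns the time-series path looks for at the end of an aggregation pipeline; the names "ratio" and "someField" are hypothetical placeholders:

  // explicitly re-create the columns the time-series path expects;
  // "ratio" and "someField" are hypothetical names
  |> map(fn: (r) => ({ r with _measurement: "ratio", _value: r.someField }))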

If the _measurement, _value, and _time columns don't all exist, then the data shouldn't be marked as a time series; e.g. the check around line 100 of builder.go, currently the following, could be made more rigorous:

        case col.Name() == "_measurement":
            fb.isTimeSeries = true

Though I think the distinction between treating data as a "time series" and as a "table view" should be made explicit in the documentation.

As an aside, I noticed that influxdb-client-go returns dateTime:RFC3339 columns as time.Time objects, while getConverter returns a function that expects an int64. This causes the plugin to crash in "table view". I'm pretty sure that is why the group() example above is crashing.

ryantxu commented 4 years ago

Untested... but the issue is likely solved in: https://github.com/grafana/influxdb-flux-datasource/pull/112

https://400-137349011-gh.circle-artifacts.com/0/ci/packages/grafana-influxdb-flux-datasource-7.0.0-dev.zip

M0rdecay commented 4 years ago

I updated the plugin from your link:

bash-5.0$ grafana-cli --pluginUrl /var/lib/grafana/grafana-influxdb-flux-datasource-7.0.0-dev.zip plugins install grafana-influxdb-flux-datasource

installing grafana-influxdb-flux-datasource @ from: /var/lib/grafana/grafana-influxdb-flux-datasource-7.0.0-dev.zip into: /var/lib/grafana/plugins

Installed grafana-influxdb-flux-datasource successfully

And restarted the container. Result:

bash-5.0$ grafana-cli plugins ls
installed plugins:
grafana-influxdb-flux-datasource @ 7.0.0
grafana-piechart-panel @ 1.5.0

But it seems that the plugin did not install properly - it displays version 7.0.0, not 7.0.0-dev. Also, requests with group() do not work. Am I doing something wrong?

M0rdecay commented 4 years ago

The contents of the archives are also different; on the left of the attached screenshot is the archive from the page https://grafana.com/grafana/plugins/grafana-influxdb-flux-datasource/installation

M0rdecay commented 4 years ago

Unfortunately, even after completely removing the old plugin and unpacking the new one, requests with the group() function do not work. Same error:

t=2020-06-08T07:20:02+0000 lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=172.30.96.212

Appelg commented 4 years ago

I'm having trouble with this too.

The query works in the InfluxDB Cloud Data Explorer, but when I copy it to Grafana I get "Metric request error". Removing keep(..) "solves" the problem, but I need keep...

from(bucket: "disk")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "diskspacefolders")
  |> keep(columns: ["_time", "_value", "_field"])

wuyb66 commented 4 years ago

group() can work with https://400-137349011-gh.circle-artifacts.com/0/ci/packages/grafana-influxdb-flux-datasource-7.0.0-dev.zip .

But "Metric request error" is returned when pivot() function used.

The following query is OK both in Grafana and as a plain Flux query:

from(bucket: "meas2")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "MS_DISK_MEAS")
  |> filter(fn: (r) => r["_field"] == "KBYTES" or r["_field"] == "USED" or r["_field"] == "CAPACITY")

Grafana returns "Metric request error" when pivot() is added, while the plain Flux query is fine:

from(bucket: "meas2")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "MS_DISK_MEAS")
  |> filter(fn: (r) => r["_field"] == "KBYTES" or r["_field"] == "USED" or r["_field"] == "CAPACITY")
  |> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")

wuyb66 commented 4 years ago

The pivot() function error may be related to time: it works when all time-related columns are dropped. The query below is OK in Grafana.

from(bucket: "meas2")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "MS_DISK_MEAS")
  |> filter(fn: (r) => r["_field"] == "KBYTES" or r["_field"] == "USED" or r["_field"] == "CAPACITY")
  |> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
  |> drop(columns: ["_time", "_start", "_stop"])

jimmythom commented 4 years ago

I get the same error (Metric request error) in Grafana 7.0.3 with every query (including queries without special functions like pivot, drop, ...), but not with every panel type; the stat panel, for example, works fine. I see the correct result while editing the panel, but after saving and refreshing the dashboard page the error shows up.

lasertiger commented 4 years ago

I have the issue on 7.0.3 / Ubuntu. The query works in the Influx UI (it was copied from there). It's flaky: sometimes it works and sometimes it fails, though it mostly fails.

Works sometimes: if I query for 1 hour. Fails: if I query for 2 hours, or for 5 minutes.

from(bucket: "a_bucket") |> range(start: v.timeRangeStart, stop: v.timeRangeStop) |> filter(fn: (r) => r["_measurement"] == "number_running") |> filter(fn: (r) => r["_field"] == "value")

Grafana-server logs that might be useful:

Here's the log with the query:

Jun 20 01:44:16 host grafana-server[31623]: t=2020-06-20T01:44:16+0000 lvl=info msg="from(bucket: \"a_bucket\")\n |> range(start: 2020-06-19T22:44:16Z, stop: 2020-06-20T01:44:16Z)\n |> filter(fn: (r) => r[\"_measurement\"] == \"number_running\")\n |> filter(fn: (r) => r[\"_field\"] == \"value\") => { REDACTED REDACTED}" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource

Panic (interface conversion: interface {} is nil, not string):

Jun 20 01:44:17 host grafana-server[31623]: t=2020-06-20T01:44:17+0000 lvl=dbug msg="panic: interface conversion: interface {} is nil, not string" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource

RPC error (remote transport is closing):

Jun 20 01:44:17 host grafana-server[31623]: t=2020-06-20T01:44:17+0000 lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=x.x.x.x

Plugin exits with status 2:

Jun 20 01:44:17 host grafana-server[31623]: t=2020-06-20T01:44:17+0000 lvl=dbug msg="plugin process exited" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 pid=17115 error="exit status 2"

Grafana returns a 500 error:

Jun 20 01:44:17 host grafana-server[31623]: t=2020-06-20T01:44:17+0000 lvl=eror msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=POST path=/api/ds/query status=500 remote_addr=x.x.x.x time_ms=93 size=62 referer="http://_REDACTED_:3000/plugin/grafana/d/LtLlsCmMz/url-performance-copy?editPanel=5&orgId=1&from=now-3h&to=now"

Here is the full debug log:

lvl=info msg="from(bucket: \"a_bucket\")\n |> range(start: 2020-06-19T22:44:16Z, stop: 2020-06-20T01:44:16Z)\n |> filter(fn: (r) => r[\"_measurement\"] == \"number_running\")\n |> filter(fn: (r) => r[\"_field\"] == \"value\") => { REDACTED REDACTED }" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource

lvl=dbug msg="panic: interface conversion: interface {} is nil, not string" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg= logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="goroutine 20 [running]:" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/influxdb-flux-datasource/pkg/influx.(FrameBuilder).Append(0xc00013b4e0, 0xc0002b5b30, 0x0, 0x0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/project/pkg/influx/builder.go:140 +0xd28" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/influxdb-flux-datasource/pkg/influx.readDataFrames(0xc0001f4000, 0x5cd, 0xc8, 0xc000306010, 0x1, 0x1, 0x0, 0x0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/project/pkg/influx/executor.go:58 +0x1e5" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/influxdb-flux-datasource/pkg/influx.ExecuteQuery(0xde87a0, 0xc000032390, 0xc0001ee000, 0xce, 0x0, 0x0, 0xc0001b2538, 0x5, 0xc0001b2540, 0xe, ...)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/project/pkg/influx/executor.go:27 +0x164" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/influxdb-flux-datasource/pkg/influx.(InfluxDataSource).QueryData(0xc000223500, 0xde8820, 0xc0001a0600, 0xc0001b1950, 0xb974c0, 0xdd7920, 0x8f1ccb)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/project/pkg/influx/datasource.go:150 +0x394" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/backend.(dataSDKAdapter).QueryData(0xc000223520, 0xde8820, 0xc0001a0600, 0xc0001b0280, 0xb974c0, 0x0, 0xdd7920)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/backend/data_adapter.go:21 +0x63" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/backend/grpcplugin.(dataGRPCServer).QueryData(0xc0002237c0, 0xde8820, 0xc0001a0600, 0xc0001b0280, 0xc0002237c0, 0x7f92b69d1108, 0xc0001b4340)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/backend/grpcplugin/grpc_data.go:47 +0x51" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/genproto/pluginv2._Data_QueryData_Handler.func1(0xde8820, 0xc0001a0600, 0xc73940, 0xc0001b0280, 0x18, 0xc0001b1900, 0x203000, 0xc000148380)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/genproto/pluginv2/backend.pb.go:1306 +0x86" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grpc-ecosystem/go-grpc-prometheus.(ServerMetrics).UnaryServerInterceptor.func1(0xde8820, 0xc0001a0600, 0xc73940, 0xc0001b0280, 0xc000196f60, 0xc000196f80, 0x96f93a, 0xc38ea0, 0xc000196fa0, 0xc000196f60)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-prometheus@v1.2.0/server_metrics.go:107 +0xad" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug 
msg="github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1(0xde8820, 0xc0001a0600, 0xc73940, 0xc0001b0280, 0xc0001cc000, 0x0, 0xc0001d7b08, 0x40cc08)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware@v1.2.0/chain.go:25 +0x63" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1(0xde8820, 0xc0001a0600, 0xc73940, 0xc0001b0280, 0xc000196f60, 0xc000196f80, 0xc000313b78, 0x5bf618, 0xc50ea0, 0xc0001a0600)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware@v1.2.0/chain.go:34 +0xd5" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/genproto/pluginv2._Data_QueryData_Handler(0xb974c0, 0xc0002237c0, 0xde8820, 0xc0001a0600, 0xc0001ae840, 0xc00027a060, 0xde8820, 0xc0001a0600, 0xc0001d0700, 0x307)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/genproto/pluginv2/backend.pb.go:1308 +0x14b" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="google.golang.org/grpc.(Server).processUnaryRPC(0xc00027c180, 0xdef8c0, 0xc000308180, 0xc0001cc000, 0xc00027a810, 0x131b010, 0x0, 0x0, 0x0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:1024 +0x501" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="google.golang.org/grpc.(Server).handleStream(0xc00027c180, 0xdef8c0, 0xc000308180, 0xc0001cc000, 0x0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:1313 +0xd3d" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="google.golang.org/grpc.(Server).serveStreams.func1.1(0xc00033e020, 0xc00027c180, 0xdef8c0, 0xc000308180, 0xc0001cc000)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:722 +0xa1" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="created by google.golang.org/grpc.(*Server).serveStreams.func1" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:720 +0xa1" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource

lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=x.x.x.x

lvl=dbug msg="plugin process exited" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 pid=17115 error="exit status 2"

lvl=eror msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=POST path=/api/ds/query status=500 remote_addr=x.x.x.x time_ms=93 size=62 referer="http://REDACTED:3000/plugin/grafana/d/LtLlsCmMz/url-performance-copy?editPanel=5&orgId=1&from=now-3h&to=now"

lvl=dbug msg="Restarting plugin" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource lvl=dbug msg="starting plugin" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 args=[/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64] lvl=dbug msg="plugin started" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 pid=17504 lvl=dbug msg="waiting for RPC address" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 lvl=info msg=Profiler logger=plugins.backend pluginId=grafana-influxdb-flux-datasource enabled=false lvl=dbug msg="Serving plugin" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource plugins="[diagnostics resource data]" lvl=dbug msg="using plugin" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource version=2 lvl=dbug msg="plugin address" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource address=/tmp/plugin888959841 network=unix lvl=dbug msg="Plugin restarted" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource

hex0r commented 4 years ago

Hi everyone

I have the same problem with Grafana 7.0: complex queries work in the InfluxDB Data Explorer but not in Grafana 7.0.

I have some single-stat panels which query the same data source, just with different host variables. Sometimes some of them work without changing anything (just reloading the page); the next time, reloading the same panels gives me "Metric request error". Could it be some kind of timeout issue?

ryantxu commented 4 years ago

Support for Flux is now in Grafana core and will be released in the next version, 7.1 (beta in early July).

@hex0r, can you check with the latest nightly release of grafana?

https://grafana.com/grafana/download (see the "Nightly builds" link on the right)

If it is not working there, can you post the query and response that are giving you troubles?

andig commented 4 years ago

The docker tag on the nightlies page doesn't exist; how can we get a nightly docker image?

ryantxu commented 4 years ago

https://hub.docker.com/r/grafana/grafana/tags?page=1&name=master

(I think)

hex0r commented 4 years ago

Thanks. I have tested version v7.1.0-503e4700pre (4488a4cd5a), but it's still the same issue with "Metric request error". Sorry.

ryantxu commented 4 years ago

Can you post your query and the response you see in Chronograf (CSV text)?

ryantxu commented 4 years ago

Ideally something we can eventually put in https://github.com/grafana/grafana/tree/master/pkg/tsdb/influxdb/flux/testdata to make sure that response works OK.

hex0r commented 4 years ago

This one never works in Grafana. Result: 2020-06-29-19-50_chronograf_data.txt

st = "1" futter= from(bucket: "prod_data") |> range(start: v.timeRangeStart, stop: v.timeRangeStop) |> filter(fn: (r) => r["_measurement"] == "mqtt_consumerprod") |> filter(fn: (r) => r["_field"] == "f") |> filter(fn: (r) => r["host"] == "iot") |> filter(fn: (r) => r["st"] == st) |> sum()

wasser = from(bucket: "prod_data")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "mqtt_consumerprod")
  |> filter(fn: (r) => r["_field"] == "w")
  |> filter(fn: (r) => r["host"] == "iot")
  |> filter(fn: (r) => r["st"] == st)
  |> sum()

join(tables: {water: wasser, food: futter}, on: ["st"])
  |> map(fn: (r) => ({ r with _value: if r._value_water == 0.0 or r._value_food == 0.0 then 0.0 else r._value_water / r._value_food * 100.0 }))
  |> drop(columns: ["topic_food", "topic_water", "st_water", "host_food", "host_water", "_stop_food", "_start_food", "_measurement_food", "_measurement_water", "_field_food", "_field_water", "_value_food", "_value_water"])

This one sometimes works in Grafana. Result: 2020-06-29-19-54_chronograf_data.txt

st="2" field = "temp"

from(bucket: "env_data") |> range(start: v.timeRangeStart, stop: v.timeRangeStop) |> filter(fn: (r) => r["_measurement"] == "mqtt_consumerenv") |> filter(fn: (r) => r["stall"] == st) |> filter(fn: (r) => r["_field"] == field ) |> drop(columns: ["dev_id", "location"]) |> sort(columns: ["_time"], desc: false) |> movingAverage(n: 4) |> last()

ryantxu commented 4 years ago

@hex0r -- it looks like the root of that issue is that the "_time" column in the time series is renamed to "_start_food". Hopefully https://github.com/grafana/grafana/pull/25910 will fix it.

The alternative approach is that if "_time" does not exist, the response should be parsed as a table.
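
For anyone experimenting with a query-side workaround in the meantime, here is a minimal sketch of that kind of rename applied to the join above; the suffixed column name produced by the join is an assumption here, and per the follow-up below a rename alone may not be enough:

join(tables: {water: wasser, food: futter}, on: ["st"])
  // the join suffixes the per-table time columns; rename one of them back
  // to "_time" so the plugin can find it ("_time_food" is an assumed name)
  |> rename(columns: {_time_food: "_time"})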

andig commented 4 years ago

@ryantxu here is another query that works fine in the CLI (against 1.8) but produces a flat line in the UI (it does not error though):

❯ docker run influxdb:1.8 influx -host nas.fritz.box -type flux -path-prefix /api/v2/query -execute \
'from(bucket: "evcc") |> range(start: -1m) |> filter(fn: (r) => r._measurement != "chargedEnergy") |> filter(fn: (r) => r.loadpoint == "wallbe" or not exists r.loadpoint)'
Result: _result
Table: keys: [_start, _stop, _field, _measurement, loadpoint]
                   _start:time                      _stop:time           _field:string     _measurement:string        loadpoint:string                      _time:time                  _value:float  
------------------------------  ------------------------------  ----------------------  ----------------------  ----------------------  ------------------------------  ----------------------------  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value             chargePower                  wallbe  2020-06-30T06:24:11.000000000Z                             0  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value             chargePower                  wallbe  2020-06-30T06:24:21.000000000Z                             0  
Table: keys: [_start, _stop, _field, _measurement]
                   _start:time                      _stop:time           _field:string     _measurement:string                      _time:time                  _value:float  
------------------------------  ------------------------------  ----------------------  ----------------------  ------------------------------  ----------------------------  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value               gridPower  2020-06-30T06:24:11.000000000Z                      -465.019  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value               gridPower  2020-06-30T06:24:21.000000000Z                      -467.342  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value               gridPower  2020-06-30T06:24:31.000000000Z                      -490.024  
Table: keys: [_start, _stop, _field, _measurement]
                   _start:time                      _stop:time           _field:string     _measurement:string                      _time:time                  _value:float  
------------------------------  ------------------------------  ----------------------  ----------------------  ------------------------------  ----------------------------  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value                 pvPower  2020-06-30T06:24:11.000000000Z                      -891.298  
2020-06-30T06:24:09.808635700Z  2020-06-30T06:25:09.808635700Z                   value                 pvPower  2020-06-30T06:24:21.000000000Z                       -892.55  

The query inspector is useless as it shows only a "data stream".

I've also tried 7.1-master (thank you for the tag, it isn't documented on the nightly page), but I cannot even define the datasource:

Screenshot 2020-06-30 at 08 24 18

Maybe the datasource definition needs some logic along the lines of -path-prefix /api/v2/query?

hex0r commented 4 years ago

If I rename the time column, e.g. |> rename(columns: {"_start_food": "_time"}), it is still not working.

lasertiger commented 4 years ago

This problem is intermittent. I am including Grafana debug logs which show a problem. The Grafana Influx plugin exits and restarts (i.e. it crashes).

Note: queries which were working yesterday don't work today. Sometimes a query works; sometimes it doesn't (Metric request error). Sometimes changing the date range helps. Today it isn't helping and most of my views show 'metric_error'. Others work sometimes.

Grafana: 7.0.3
InfluxDB: 2.0.0 (d00706d) (from the GUI)

If you check my previous report here 17 days ago:

https://github.com/grafana/influxdb-flux-datasource/issues/105#issuecomment-646921009

The problem appears in the logs of the Grafana Influx driver, which is written in Go.

Grafana Server logs:

Jul 6 22:42:58 grafana-server[17278]: t=2020-07-06T22:42:58+0000 lvl=dbug msg="plugin process exited" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/pluginsgrafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 pid=17482 error="exit status 2"
Jul 6 22:42:58 grafana-server[17278]: t=2020-07-06T22:42:58+0000 lvl=eror msg="Metric request error" logger=context userId=1 orgId=1 uname=admin error="rpc error: code = Unavailable desc = transport is closing" remote_addr=99.230.229.86

Here are the debug logs for Influx:

Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/backend/data_adapter.go:21 +0x63" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/backend/grpcplugin.(dataGRPCServer).QueryData(0xc00019ca40, 0xde8820, 0xc0003001b0, 0xc0003080a0, 0xc00019ca40, 0x7fa5520a3030, 0xc000306100)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/backend/grpcplugin/grpc_data.go:47 +0x51" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/genproto/pluginv2._Data_QueryData_Handler.func1(0xde8820, 0xc0003001b0, 0xc73940, 0xc0003080a0, 0x18, 0xc000309720, 0x203000, 0xc000180000)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/genproto/pluginv2/backend.pb.go:1306 +0x86" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="github.com/grpc-ecosystem/go-grpc-prometheus.(ServerMetrics).UnaryServerInterceptor.func1(0xde8820, 0xc0003001b0, 0xc73940, 0xc0003080a0, 0xc0003162c0, 0xc0003162e0, 0x96f93a, 0xc38ea0, 0xc000316300, 0xc0003162c0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-prometheus@v1.2.0/server_metrics.go:107 +0xad" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1(0xde8820, 0xc0003001b0, 0xc73940, 0xc0003080a0, 0xc000318000, 0x0, 0xc000335b08, 0x40cc08)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware@v1.2.0/chain.go:25 +0x63" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1(0xde8820, 0xc0003001b0, 0xc73940, 0xc0003080a0, 0xc0003162c0, 0xc0003162e0, 0xc00032cb78, 0x5bf618, 0xc50ea0, 0xc0003001b0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware@v1.2.0/chain.go:34 +0xd5" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="github.com/grafana/grafana-plugin-sdk-go/genproto/pluginv2._Data_QueryData_Handler(0xb974c0, 0xc00019ca40, 0xde8820, 0xc0003001b0, 0xc000314060, 0xc0001f6000, 0xde8820, 0xc0003001b0, 0xc000326700, 0x330)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug 
msg="\t/root/go/pkg/mod/github.com/grafana/grafana-plugin-sdk-go@v0.65.0/genproto/pluginv2/backend.pb.go:1308 +0x14b" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="google.golang.org/grpc.(*Server).processUnaryRPC*(0xc000182480, 0xdef8c0, 0xc000183380, 0xc000318000, 0xc0001f67b0, 0x131b010, 0x0, 0x0, 0x0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:1024 +0x501" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="google.golang.org/grpc.(Server).handleStream(0xc000182480, 0xdef8c0, 0xc000183380, 0xc000318000, 0x0)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:1313 +0xd3d" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="google.golang.org/grpc.(*Server).serveStreams.func1.1*(0xc000194750, 0xc000182480, 0xdef8c0, 0xc000183380, 0xc000318000)" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:722 +0xa1" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="created by google.golang.org/grpc.(Server).serveStreams.func1" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="\t/root/go/pkg/mod/google.golang.org/grpc@v1.27.1/server.go:720 +0xa1" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="plugin process exited" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource path=/var/lib/grafana/plugins/grafana-influxdb-flux-datasource/dist/gpx_influx_linux_amd64 pid=19760 error="exit status 2" Jul 6 22:44:22 grafana-server[17278]: t=2020-07-06T22:44:22+0000 lvl=dbug msg="Restarting plugin" logger=plugins.backend pluginId=grafana-influxdb-flux-datasource

ryantxu commented 4 years ago

Flux support was added to Grafana core in 7.1 (released today :tada:): https://www.influxdata.com/blog/how-grafana-dashboard-influxdb-flux-influxql/

If you still have issues, please post them at https://github.com/grafana/grafana/issues