akumuli / akumuli-datasource

Datasource plugin for grafana
Apache License 2.0

Doesn't show any data and gives error #9

Open bitc opened 6 years ago

bitc commented 6 years ago

I'm using Grafana 5.2.1.

I installed the plugin, and in Grafana added the "Akumuli" data source.

Then I clicked the "Create new Graph" button.

Immediately the new graph showed a red exclamation point with the error tooltip "query parsing error".

I clicked to edit the graph, clicked the "Metric" field, and all of my metrics that I loaded show up :+1:

When I choose a metric, nothing shows up in the graph. The red exclamation point error tooltip changes to "not found".

When I look at the grafana logs, I see this line whenever I click one of my metrics:

2018/07/13 14:06:56 httputil: ReverseProxy read error during body copy: unexpected EOF
Lazin commented 6 years ago

Thank you for the report. What version of Akumuli are you using, and are there any errors in the Akumuli log?

bitc commented 6 years ago

I am using Akumuli compiled from source. Version 0.7.50 bbc0af666ee4ce45e80fe18aa3df206bb3334ac0

I don't see any output in the Akumuli log (other than the 3 initial "OK" lines)

Lazin commented 6 years ago

By default logs are stored in /tmp/akumuli.log

bitc commented 6 years ago

Here is what shows up in the Akumuli log as soon as I click the "Create new Graph" button in Grafana:

2018-07-13 16:54:05,094 http [INFO] Cursor 140703128619896 created
2018-07-13 16:54:05,095 http [INFO] Cursor 140703128619896 started
2018-07-13 16:54:05,096 Main [INFO] Parsing query:
2018-07-13 16:54:05,096 Main [INFO] {
    "group-aggregate": {
        "step": "20s",
        "func": [
            "null"
        ]
    },
    "range": {
        "from": "20180713T105405.046",
        "to": "20180713T165405.046"
    },
    "where": "",
    "order-by": "series",
    "apply": "",
    "limit": "958"
}

2018-07-13 16:54:05,096 Main [ERROR] Invalid aggregation function `null`
2018-07-13 16:54:05,096 Main [ERROR] Can't validate `group-aggregate` statement, `metric` field required
2018-07-13 16:54:05,097 http [INFO] Cursor 140702927329216 done
2018-07-13 16:54:05,097 http [INFO] Cursor 140702927329216 destroyed
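For reference, the two errors above correspond to two separate problems with the query body the plugin sent: the aggregation function was the literal string "null", and the group-aggregate statement had no metric field. A minimal sketch of what a well-formed body looks like, with validation mirroring those two errors (the set of accepted function names below is an assumption, not taken from the Akumuli source):

```python
# Sketch: build a minimal Akumuli group-aggregate query body.
# Field names follow the queries shown in the log above; VALID_FUNCS
# is an assumed list -- check the Akumuli docs for the real set.
VALID_FUNCS = {"min", "max", "mean", "sum", "count", "first", "last"}

def make_group_aggregate(metric, func="mean", step="20s",
                         ts_from="20180713T105405.046",
                         ts_to="20180713T165405.046"):
    if not metric:
        # Mirrors: "Can't validate `group-aggregate` statement, `metric` field required"
        raise ValueError("`metric` field required")
    if func not in VALID_FUNCS:
        # Mirrors: "Invalid aggregation function `null`"
        raise ValueError("Invalid aggregation function `%s`" % func)
    return {
        "group-aggregate": {"metric": metric, "step": step, "func": [func]},
        "range": {"from": ts_from, "to": ts_to},
        "order-by": "series",
    }
```

The first logged query would fail both checks; the later ones (with "metric": "cpu.0.cpu.user" and "func": ["mean"]) would pass, which matches the error disappearing once a metric and aggregator are selected.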

Then when I edit the graph and select one of the metrics this shows up in the log:

2018-07-13 16:55:02,579 http [INFO] Cursor 140703128619896 created
2018-07-13 16:55:02,579 http [INFO] Cursor 140703128619896 started
2018-07-13 16:55:02,583 http [INFO] Cursor 140702927333536 done
2018-07-13 16:55:02,584 http [INFO] Cursor 140702927333536 destroyed
2018-07-13 16:55:04,185 http [INFO] Cursor 140703128619896 created
2018-07-13 16:55:04,185 http [INFO] Cursor 140703128619896 started
2018-07-13 16:55:04,187 http [INFO] Cursor 140702927333408 done
2018-07-13 16:55:04,188 http [INFO] Cursor 140702927333408 destroyed
2018-07-13 16:55:04,344 http [INFO] Cursor 140703128619896 created
2018-07-13 16:55:04,344 http [INFO] Cursor 140703128619896 started
2018-07-13 16:55:04,345 http [INFO] Cursor 140702927329216 done
2018-07-13 16:55:04,346 http [INFO] Cursor 140702927329216 destroyed
2018-07-13 16:55:04,348 http [INFO] Cursor 140703128619896 created
2018-07-13 16:55:04,348 http [INFO] Cursor 140703128619896 started
2018-07-13 16:55:04,349 http [INFO] Cursor 140702927333232 done
2018-07-13 16:55:04,350 http [INFO] Cursor 140702927333232 destroyed
2018-07-13 16:55:10,302 http [INFO] Cursor 140703128619896 created
2018-07-13 16:55:10,302 http [INFO] Cursor 140703128619896 started
2018-07-13 16:55:10,303 Main [INFO] Parsing query:
2018-07-13 16:55:10,303 Main [INFO] {
    "group-aggregate": {
        "metric": "cpu",
        "step": "20s",
        "func": [
            "mean"
        ]
    },
    "range": {
        "from": "20180713T105510.260",
        "to": "20180713T165510.260"
    },
    "where": "",
    "order-by": "series",
    "apply": "",
    "limit": "958"
}

2018-07-13 16:55:10,304 http [INFO] Cursor 140702927333664 done
2018-07-13 16:55:10,305 http [INFO] Cursor 140702927333664 destroyed
2018-07-13 16:55:12,736 http [INFO] Cursor 140703128619896 created
2018-07-13 16:55:12,736 http [INFO] Cursor 140703128619896 started
2018-07-13 16:55:12,737 Main [INFO] Parsing query:
2018-07-13 16:55:12,737 Main [INFO] {
    "group-aggregate": {
        "metric": "cpu.0.cpu.user",
        "step": "20s",
        "func": [
            "mean"
        ]
    },
    "range": {
        "from": "20180713T105512.690",
        "to": "20180713T165512.691"
    },
    "where": "",
    "order-by": "series",
    "apply": "",
    "limit": "958"
}

2018-07-13 16:55:12,738 Main [TRACE] Can't open next iterator because not found
2018-07-13 16:55:12,738 Main [TRACE] Can't open next iterator because not found
2018-07-13 16:55:12,749 http [INFO] Cursor 140702927334096 destroyed
2018-07-13 16:55:23,894 http [INFO] Cursor 140703128653512 created
2018-07-13 16:55:23,894 http [INFO] Cursor 140703128653512 started
2018-07-13 16:55:23,895 Main [INFO] Parsing query:
2018-07-13 16:55:23,895 Main [INFO] {
    "group-aggregate": {
        "metric": "cpu.0.cpu.user",
        "step": "20s",
        "func": [
            "mean"
        ]
    },
    "range": {
        "from": "20180713T105523.847",
        "to": "20180713T165523.847"
    },
    "where": "",
    "order-by": "series",
    "apply": "",
    "limit": "958"
}

2018-07-13 16:55:23,896 Main [TRACE] Can't open next iterator because not found
2018-07-13 16:55:23,896 Main [TRACE] Can't open next iterator because not found
2018-07-13 16:55:23,907 http [INFO] Cursor 140702927334272 destroyed

Thank you

Lazin commented 6 years ago

Now it's getting clearer. Can you try setting the aggregation function manually from the drop-down list called "Aggregator"? It should be set by default (at least it's supposed to be). Another possible fix is to disable downsampling. I'll try to reproduce this issue locally. What browser are you using?

Lazin commented 6 years ago

The second query looks good. Is it possible that there are no data-points in this time-range? The message "Can't open next iterator because not found" means that it tried to scan the series but it was empty.
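The logged range covers the six hours ending at "now", so a time-zone offset on the write side could push every point outside the window. A small sketch parsing the basic ISO-8601 timestamps from the log to check whether a given point falls in the queried range (parse_aku_ts and in_range are illustrative helpers, not part of Akumuli):

```python
from datetime import datetime

def parse_aku_ts(ts):
    # Timestamps in the log use basic ISO-8601: YYYYMMDDTHHMMSS.fff
    return datetime.strptime(ts, "%Y%m%dT%H%M%S.%f")

def in_range(point_ts, ts_from, ts_to):
    # True if the data-point timestamp falls inside the query window.
    return parse_aku_ts(ts_from) <= parse_aku_ts(point_ts) <= parse_aku_ts(ts_to)
```

For the query above, a point written at local noon is inside the window, but the same point shifted back six hours (e.g. written with a UTC clock against a UTC+6 query window) falls outside it and would produce exactly this "not found" behaviour.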

bitc commented 6 years ago

The "Aggregator" was set to "mean"; I tried changing it, and also changing it back to "mean". I also tried disabling downsampling. The red exclamation point goes away, but the graph still shows no data (and says "No data points"). Every time I change one of these fields in Grafana, the Grafana log prints

2018/07/13 17:22:56 httputil: ReverseProxy read error during body copy: unexpected EOF

I'm using latest Chrome, and also tried Firefox. The data is coming from collectd with its "write_tsdb" plugin pointed at Akumuli port 4242. The collectd logs show no problems, and the metric names do show up in the Grafana autocomplete.
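For context, collectd's write_tsdb plugin sends data over the OpenTSDB telnet-style "put" protocol, which is what Akumuli's port-4242 listener accepts. A sketch of the line format involved, with made-up sample values (tsdb_put_line is an illustrative helper, not part of collectd or Akumuli):

```python
def tsdb_put_line(metric, timestamp, value, tags):
    # OpenTSDB telnet protocol: put <metric> <unix-ts> <value> <tag=val> ...
    tag_str = " ".join("%s=%s" % kv for kv in sorted(tags.items()))
    return "put %s %d %s %s" % (metric, timestamp, value, tag_str)
```

Note that the timestamp here is a Unix epoch value from the collectd host's clock; if that clock (or its time zone handling) disagrees with the querying side, points land outside the dashboard's time window even though ingestion succeeds.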

Lazin commented 6 years ago

OK, I'll try to replicate the problem using collectd. Maybe the data doesn't actually get written because of some compatibility issue (you can check the /api/stats endpoint to see if this is the case). Or maybe there is a time-zone issue and all data-points are shifted several hours back. BTW, are there any other kinds of errors in the log?
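One way to use /api/stats for this check is to fetch it twice (e.g. with curl, before and after writing some points) and diff the numeric counters between the two JSON snapshots. Since the exact field names in the stats payload may vary by version, the sketch below compares snapshots generically rather than assuming specific counter names:

```python
def flatten(d, prefix=""):
    # Flatten nested JSON into {"a.b": value} pairs so any numeric
    # counter can be compared between two snapshots.
    out = {}
    for k, v in d.items():
        key = prefix + k
        if isinstance(v, dict):
            out.update(flatten(v, key + "."))
        elif isinstance(v, (int, float)):
            out[key] = v
    return out

def changed_counters(before, after):
    # Returns {counter: (old, new)} for every numeric value that moved.
    fb, fa = flatten(before), flatten(after)
    return {k: (fb.get(k, 0), fa[k]) for k in fa if fa[k] != fb.get(k, 0)}
```

If no write-related counter moves between the two snapshots while collectd is running, the points never reach Akumuli and the problem is on the ingestion side; if the counters do move, the time-zone hypothesis becomes the more likely one.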