Closed: mkristiansen closed this issue 8 years ago
Hi,
Do you have a pro or higher level NewRelic subscription?
Can you include the output of:
curl -X GET 'https://api.newrelic.com/v2/applications.json' \
-H 'X-Api-Key:xxxxxxxxxx' -i
curl -X GET 'https://api.newrelic.com/v2/applications/{application_id}/metrics.json' \
-H 'X-Api-Key:xxxxxxxxxx' -i
(where application_id is one of the id fields in the first response)
Obviously it'd be useful to have as much data as possible, but you might want to filter out anything sensitive in there.
Thanks!
Thank you for looking into this!
I collected the following data
All 4 files are here
It is my employer's NR account. Subscriptions included: Web Enterprise Annual, Mobile Enterprise Trial, Insights Pro Annual, Browser Pro Annual, Synthetics Pro Annual.
Hi,
I haven't yet been able to reproduce the issue. However, I've just merged a change to the API-fetching code that fixes a significant memory leak and somewhat improves performance. It's possible that it was failing previously because you have considerably more applications than I'm able to test with; if so, this might fix it.
Would you mind checking if the latest version ( f2545ce ) fixes it?
At a cursory look (cloned the repo afresh, ran make, ran the binary, used a browser to access /metrics) I get the same result. At least the output is the same as on previous attempts.
Really sorry this has taken so long - I've been swamped with other things. I've created a branch: https://github.com/jfindley/newrelic_exporter/tree/issue-4 which I think should fix the issue - can you please check it out and let me know?
If it doesn't fix the problem, can you possibly give me the debug logs from the new version?
Thanks,
James
It took over 45 minutes to scrape the metrics once, but it did complete. Thanks for your support!
Would you be open to adding a filter/parameter to select which application(s) are scraped?
45 minutes! Ouch! I probably should parallelise the retrieval, although this would come at the cost of increased memory usage. 45 minutes is clearly unreasonable.
A filter is also a reasonable idea, though it'd require some thought to implement, because it'd be good to be able to filter on both application IDs and metric types. It probably means introducing a config file (which could store the API key too).
api_key: zxy
1234:
  - all
4567:
  - metric1
  - metric2
Might be reasonable. I'll have a think.
I've created https://github.com/jfindley/newrelic_exporter/tree/parallel-reqs which should improve the speed of the retrieval.
My testing shows that it's a bit over 3 times faster than it was, but that's on my (relatively) small data set. Sadly I don't have access to a data set anything like as huge as yours, but I'd hope the gains are bigger in your case. I'd be interested to see what difference it makes, if you have a few minutes to test?
I'll have a closer look at the filtering soon.
Great progress!! I ran the new branch against the metrics from one of our test systems:
$ curl http://localhost:9126/metrics > NRMetrics.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 73.7M  100 73.7M    0     0   688k      0  0:01:49  0:01:49 --:--:-- 18.0M

$ curl http://localhost:9126/metrics > NRMetrics.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 73.5M  100 73.5M    0     0   527k      0  0:02:22  0:02:22 --:--:-- 18.1M

$ curl http://localhost:9126/metrics > NRMetrics.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 73.5M  100 73.5M    0     0   529k      0  0:02:22  0:02:22 --:--:-- 16.6M

$ curl http://localhost:9126/metrics > NRMetrics.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 73.6M  100 73.6M    0     0   738k      0  0:01:42  0:01:42 --:--:-- 20.7M
As we haven't run into any stability issues, I've merged the parallel-reqs branch. Given that it now completes in under 5 minutes, do you still want the ability to filter out applications, or can I close this?
OK to close. If we require additional support once we have done more exploration, I'll open a new issue. Thanks for your support.
If you need anything tested on a large application/metric set in the future please do not hesitate to reach out.
Steps:
1. Download and install Go (go1.4.2.darwin-amd64-osx10.8.pkg) from https://golang.org/dl/
2. git clone https://github.com/jfindley/newrelic_exporter.git (note: I took the master branch to get logging)
3. cd newrelic_exporter
4. make
5. ./newrelic_exporter -api.key=xxxxxxxxxxxxxxx -log.level=debug
Supporting data
Log output when I query http://exporter_ip:9126/metrics with curl or in a browser:
Browser / curl output:
Go version
go version go1.4.2 darwin/amd64
Can you assist?