Closed: theo-mesnil closed this issue 1 year ago
Did it result in an out-of-memory exception? Also, did the problem appear after a dd-trace upgrade or did you just install the library?
Hi @rochdev :)
We ran a load test on our code before and after adding dd-trace:
Before dd-trace, result of load testing:
After dd-trace, result of load testing:
Looks like you're using profiling. Can you share what your heap profiler data looks like during that load test?
@theo-mesnil Did you try disabling the profiler? Since you have runtime metrics enabled as well, I'd try to disable that too so that we can narrow down the issue to tracing. Also, we've seen reports of upgrading Node fixing memory leak issues, so might be worth a shot to try upgrading to the latest 16.x. And lastly, did the issue start after upgrading dd-trace or was it a fresh install?
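Both features can be turned off with the same init options used in the original snippet; a minimal sketch of the narrowed-down configuration (the `'xxxx'` service name is the placeholder from the report):

```javascript
// Sketch: same init call as the reproduction snippet, but with the
// profiler and runtime metrics disabled to isolate tracing itself.
import tracer from 'dd-trace'

tracer.init({
  service: 'xxxx',       // placeholder service name from the report
  profiling: false,      // disable the continuous profiler
  runtimeMetrics: false, // disable runtime metrics collection
})

export default tracer
```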
@Qard @rochdev i'll keep you updated tomorrow after new load testing :)
We upgraded Node.js to v16.18.0 for our tests.
With profiling enabled we still got a memory leak; it just takes longer to reach the crash:
When we removed the profiling we got the same result:
@theo-mesnil What about runtime metrics and the last version of dd-trace that was not causing a leak?
@rochdev It's a fresh install of dd-trace; we're on the latest version, 3.7.1 (same result with 3.7.0). I removed runtime metrics and profiling, and here is the result:
I would recommend opening a support issue so that we can collect more information about your environment and look into your account.
Thanks @rochdev i will open a support issue :)
@theo-mesnil were you able to open a support issue? If so I'll close this GH issue out.
Hi @tlhunter, yes, thanks. I still need to run some tests with them :)
Expected behaviour
No memory leak on the server side
Actual behaviour
We ran a load test with vegeta to check for a memory leak and got these results:
Steps to reproduce
import tracer from 'dd-trace'

tracer.init({ service: 'xxxx', profiling: true, runtimeMetrics: true })

export default tracer
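To confirm a leak independently of the Datadog dashboards, heap usage can be sampled in-process during the vegeta run; a leak shows up as `heapUsed` growing monotonically even after GC cycles. A minimal sketch using only Node built-ins (the sampling interval is arbitrary):

```javascript
// Sketch: periodically log process memory so leak growth is visible
// in plain stdout during the load test.
function sampleHeap() {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  return { rss, heapUsed, heapTotal };
}

const timer = setInterval(() => {
  const m = sampleHeap();
  console.log(`rss=${m.rss} heapUsed=${m.heapUsed} heapTotal=${m.heapTotal}`);
}, 10000);
timer.unref(); // don't keep the process alive just for sampling
```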