I'm encountering a memory issue that seems related to #748 while using sentry-go in a project that launches several hundred goroutines. I believe it may be addressed by that issue, but since that one focused on CPU, I thought I'd open a separate issue for tracking. In my setup, each goroutine opens its own transaction and spans for tracing, similar to the example from #748. However, I am using Sentry alongside OpenTelemetry (Otel) to handle third-party integrations such as Gorm and Redis.
In my implementation, transactions are created at the goroutine level to avoid exceeding the span limit per job. The particular job I used for debugging can involve up to 1300 goroutines. However, when profiling is enabled, memory usage escalates significantly: what typically takes around 200MB of heap increases to 4GB.

(pprof SVG attached)
pprof top:
Watching sentry-go's profileRecorder, I see its stacks and frames variables grow throughout the lifetime of the job (screenshots attached, taken over roughly two minutes). Though I'll admit I haven't explored the actual size of these variables, as I need to move on at the moment.
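For reference, the heap data above was pulled via pprof; a minimal sketch of that setup, assuming the standard net/http/pprof endpoint on an arbitrary port:

```go
import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/* handlers
)

func startPprof() {
	// Heap profiles can then be inspected with, e.g.:
	//   go tool pprof -top http://localhost:6060/debug/pprof/heap
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()
}
```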
Go is a relatively new language for me (I've been using it for only about a year), so let me know if I missed some obvious debugging step.
Below is my initialization of Sentry/Otel, with profiling disabled to stop the memory from sprawling. I use the Otel sampler with Sentry's sample rate set to 100%, and it seems to have played nicely for me so far, though I have seen other open issues that seemed to state otherwise?
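Roughly, the setup looks like this (a simplified sketch using sentry-go's otel integration; the DSN and option values are placeholders rather than the exact config):

```go
import (
	"log"

	"github.com/getsentry/sentry-go"
	sentryotel "github.com/getsentry/sentry-go/otel"
	"go.opentelemetry.io/otel"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func initTracing() {
	err := sentry.Init(sentry.ClientOptions{
		Dsn:              "https://public@example.ingest.sentry.io/1", // placeholder
		EnableTracing:    true,
		TracesSampleRate: 1.0,
		// ProfilesSampleRate: 1.0, // profiling left disabled; enabling it is what blows up the heap
	})
	if err != nil {
		log.Fatalf("sentry.Init: %v", err)
	}

	// Route OpenTelemetry spans (Gorm, Redis, etc.) through Sentry using the
	// span processor, sampler, and propagator from sentry-go's otel package.
	tp := sdktrace.NewTracerProvider(
		sdktrace.WithSpanProcessor(sentryotel.NewSentrySpanProcessor()),
		sdktrace.WithSampler(sentryotel.NewSentrySampler()),
	)
	otel.SetTracerProvider(tp)
	otel.SetTextMapPropagator(sentryotel.NewSentryPropagator())
}
```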
My util function for creating spans:
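It's roughly along these lines (a simplified sketch; the tracer name and the helper's signature are illustrative, not the exact code):

```go
import (
	"context"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/trace"
)

// startSpan opens an Otel span; with the Sentry span processor installed,
// a root span becomes a Sentry transaction and child spans attach to it.
func startSpan(ctx context.Context, name string, opts ...trace.SpanStartOption) (context.Context, trace.Span) {
	return otel.Tracer("jobs").Start(ctx, name, opts...)
}
```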
Example usage:
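Something like this, simplified (the item type and processItem are placeholders for the real per-item work):

```go
import (
	"context"
	"sync"
)

func runJob(ctx context.Context, items []string) {
	var wg sync.WaitGroup
	for _, item := range items {
		wg.Add(1)
		go func(item string) {
			defer wg.Done()
			// Each goroutine opens its own root span, so it ends up as its
			// own transaction and stays under the per-transaction span limit.
			spanCtx, span := startSpan(ctx, "process-item")
			defer span.End()

			processItem(spanCtx, item) // placeholder
		}(item)
	}
	wg.Wait()
}
```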