brightalley / lighthouse-apollo

Integration between Laravel Lighthouse and Apollo Studio

Submit-tracing high memory consumption #3

Closed: pyrou closed this issue 3 years ago

pyrou commented 3 years ago

When using the tracing feature with the redis send_tracing_mode, I noticed high memory consumption in the submit-tracing artisan command: around 500 MB for around 1,000 requests.

A quick win is to reduce the proposed scheduling window from everyFiveMinutes() to everyMinute(), which statistically divides the number of requests to process per run by five. But this is not a viable long-term strategy. Moreover, if the cron stops for several hours for any reason, this could end up with an unrecoverably overflowing queue.
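
For illustration, this is roughly what the quick win looks like in the scheduler (the command signature lighthouse-apollo:submit-tracing is an assumption on my side; adjust it to the package's actual command name):

```php
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // Run every minute instead of every five minutes, so each run has
    // roughly a fifth of the queued tracings to process.
    $schedule->command('lighthouse-apollo:submit-tracing')->everyMinute();
}
```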

After a quick look at the implementation, I noticed that all tracings are loaded into a single PHP array $results. This same array is later mapped to its protobuf representation, which causes the huge memory consumption.

One possible improvement would be to limit the number of tracings in this results array to, say, 100, and make multiple requests to Apollo, 100 tracings at a time, until the queue is empty.
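
A rough sketch of that idea, assuming the tracings are stored serialized in a Redis list (the 'tracings' key and the sendTracingsToApollo() call are placeholders for the package's actual internals):

```php
use Illuminate\Support\Facades\Redis;

$chunkSize = 100;

do {
    $chunk = [];

    // Pop up to $chunkSize raw tracings off the Redis list
    // ('tracings' is a placeholder key name).
    while (count($chunk) < $chunkSize) {
        $raw = Redis::lpop('tracings');
        if ($raw === null || $raw === false) {
            break; // the list is empty
        }
        $chunk[] = unserialize($raw);
    }

    if ($chunk !== []) {
        // Map and submit only this chunk, so peak memory is bounded by
        // the chunk size instead of the total queue length.
        $this->sendTracingsToApollo($chunk); // placeholder for the existing submit logic
    }
} while ($chunk !== []);
```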

Another way would be to completely revamp the way tracings are handled to use Jobs and Batches (see https://laravel.com/docs/8.x/queues#job-batching), and delegate asynchronous processing to Horizon. It would remain possible to send events synchronously in a non-Redis environment by using QUEUE_CONNECTION="sync" to process those jobs.
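
To make that idea concrete, a minimal sketch of what such a job could look like (SendTracingChunkToApollo does not exist in the package, and $allTracings is a placeholder for the tracings pulled from Redis; this is only an illustration):

```php
use Illuminate\Bus\Batchable;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Bus;

class SendTracingChunkToApollo implements ShouldQueue
{
    use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** @var array A chunk of collected tracings (at most 100). */
    private $tracings;

    public function __construct(array $tracings)
    {
        $this->tracings = $tracings;
    }

    public function handle()
    {
        // Map only this chunk to protobuf and submit it to Apollo Studio.
    }
}

// Dispatch one job per chunk of 100 tracings; Horizon (or the sync driver,
// with QUEUE_CONNECTION=sync) takes care of processing them.
Bus::batch(
    collect($allTracings)->chunk(100)->map(
        fn ($chunk) => new SendTracingChunkToApollo($chunk->all())
    )->all()
)->dispatch();
```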

@MaartenStaa I would be happy to get involved in the implementation. Let me know your thoughts about the two proposed approaches.

MaartenStaa commented 3 years ago

@pyrou Thanks for your feedback. I have just pushed some changes implementing your suggestion of sending 100 tracings at a time, and have updated the readme as well. I also looked at the batching option, but apart from being a bigger change, I'm not sure it's a great fit.

I'll probably hold off on tagging a new release until your PR #2 is merged.