Open · raulamoretti opened this issue 10 months ago
Hi, sorry for the delay... I don't think there is a memory leak in the package, since the worker is the standard Laravel worker itself. Maybe Horizon? But I don't know. If you want "a special method to decrease the PHP footprint in worker memory", I think you should look for it in the standard Laravel worker. By the way, is the 30 GB of memory usage uniformly distributed across the worker processes? And have you compared memory usage with N separate Laravel installations, one for each domain?
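For what it's worth, the standard Laravel worker already has flags to cap per-process memory and recycle workers. Just a sketch, the values are illustrative (--domain is this package's option, the rest are stock queue:work options):

# Restart the worker once it exceeds 128 MB of memory, and recycle it
# after 100 jobs or one hour of uptime, whichever comes first.
php artisan queue:work --memory=128 --max-jobs=100 --max-time=3600 --domain=api17005.localhost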
For the second idea, "a common queue for many clients", I don't think you can do this by forking my package, but you could consider rethinking your application by building a "central" module which processes the jobs of all your domains. Generally, though, it's not that simple :)
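Just to sketch that idea (assuming all your tenant apps share one Redis connection and push their jobs onto a single queue name; "shared" here is purely hypothetical), a central installation could then drain everything with one worker pool:

# One central worker pool consumes the queue that every tenant pushes onto,
# instead of running a separate worker per domain.
php artisan queue:work redis --queue=shared --memory=128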
Anyway, I'll leave the issue open for quite a bit: let's see if someone has an idea to improve the situation.
Thanks
Giacomo
Hey @raulamoretti, have you solved this issue?
My dumbest solution for this is to run a loop and work each queue separately. Set up a MULTI_DOMAINS env var, e.g.:
#!/bin/bash
# Comma-separated list of tenant domains; can be overridden from the environment.
MULTI_DOMAINS=${MULTI_DOMAINS:-api17005.localhost,api16576.localhost,api22800.localhost}

if [ -n "$MULTI_DOMAINS" ]; then
    IFS=',' read -ra domains <<< "$MULTI_DOMAINS"
    while true; do
        for domain in "${domains[@]}"; do
            # Drain up to 20 jobs for this domain, then move on to the next one.
            php /var/www/html/artisan queue:work --max-jobs=20 --stop-when-empty --domain="$domain"
        done
        # Avoid a busy loop when all the queues are empty.
        sleep 1
    done
else
    echo "Environment variable MULTI_DOMAINS is empty."
fi
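Saved as, say, run-queues.sh (the file name is just an example), it can be started like this:

chmod +x run-queues.sh
MULTI_DOMAINS=api17005.localhost,api16576.localhost ./run-queues.sh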
Hey Giacomo!
Some time ago I opened an issue about configuring Horizon.
Now I have a problem with memory/queues.
We run 70 domains with huge queues: 4 queues for the small clients and 18 queues for the bigger clients.
With all this we use about 30 GB of memory and a lot of CPU.
I'm thinking of a special method to decrease the PHP footprint in worker memory, or a common queue shared by many clients.