martin-coded opened 1 year ago
If you have a similar problem and just want to get some sleep (or a lot of coffee) for the next several hundred thousand seconds, without checking whether the task was killed because it ran out of memory, I present to you the super bad quick-fix loop ;) This is not a solution! I only use it for the initial generation in a dev environment.
while true
do
    # Start the generate command in the background...
    nohup php craft imager-x/generate -f articleImage -t=articleHeaderAvif &
    pid=$!
    # ...and wait for it: restart on a non-zero exit (e.g. OOM kill), stop on success
    wait $pid || continue
    break
done
I've noticed something similar to this, maybe there's some kind of memory leak related to console commands. I'll have a look.
4.2.0 has a fix that will greatly improve memory usage when using the generate command. Let me know if that helps.
@aelvan I am also running into memory issues when generating all assets (4 named transforms with 3-5 sizes each, jpeg and avif) of a volume with php craft imager-x/generate -v [myvolume]
After about an hour, the process has consumed about 8 GB of memory and is killed by the kernel:
kernel: php invoked oom-killer: gfp_mask=0x140cca(GFP_HIGHUSER_MOVABLE|__GFP_COMP), order=0, oom_score_adj=0
[...]
kernel: Out of memory: Killed process 323612 (php) total-vm:6616156kB, anon-rss:3270960kB, file-rss:0kB, shmem-rss:0kB, UID:33 pgtables:12520kB oom_score_adj:0
From watching the script run I can see that each transform adds a few MB of memory that never gets freed, so the process is bound to hit some limit at some point.
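To make the growth visible, a small watcher can log the resident set size of the generate process while it runs. This is a sketch, assuming a Linux system with procps `ps`; `watch_rss` is a hypothetical helper name, not part of Imager X:

```shell
# watch_rss PID [INTERVAL]: print PID's resident set size every INTERVAL
# seconds (default 10) until the process exits.
watch_rss() {
    pid=$1
    interval=${2:-10}
    while kill -0 "$pid" 2>/dev/null; do
        # `ps -o rss=` reports RSS in kilobytes; convert to MB for readability
        ps -o rss= -p "$pid" | awk '{printf "RSS: %.1f MB\n", $1/1024}'
        sleep "$interval"
    done
}

# Usage:
# php craft imager-x/generate -v myvolume &
# watch_rss $!
```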
I'd have two suggestions here:
After some further testing it turns out, most of our memory problems result while creating AVIF images. There definitely seems to be a memory leak here. We found two workarounds:
Similarly to https://github.com/spacecatninja/craft-imager-x/issues/210#issuecomment-1527109597 we use a loop, but kill the process after a defined time (e.g. 30m) instead of waiting for it to use up all available memory:
#!/bin/sh
# timeout kills the generate command after 30 minutes (exit status 124),
# so the loop keeps restarting it until a run finishes cleanly with status 0
while timeout 30m php craft imager-x/generate -v [YOUR_VOLUME] ; ret=$? ; [ $ret -ne 0 ]; do
    echo "Restarting generate"
done
The second solution is to use the customEncoders setting in config/imager-x.php, and define an external encoder like avifenc (https://github.com/AOMediaCodec/libavif), cavif (https://github.com/kornelski/cavif-rs), etc.:
<?php
return [
    // [...]
    'customEncoders' => [
        'avif' => [
            'path' => '/usr/bin/cavif',
            'options' => [
                'quality' => 80,
                'speed' => 7,
            ],
            'paramsString' => '--quality {quality} --speed {speed} --overwrite -o {dest} {src}'
        ],
    ],
];
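Before pointing Imager X at an external binary, it may be worth verifying that the configured path actually exists and is executable; a minimal sketch (`check_encoder` is a hypothetical helper name):

```shell
# check_encoder PATH: succeed if PATH is an executable file, fail otherwise
check_encoder() {
    if [ -x "$1" ]; then
        echo "ok: $1"
    else
        echo "missing or not executable: $1" >&2
        return 1
    fi
}

# Usage (path taken from the config above):
# check_encoder /usr/bin/cavif
```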
@dgsiegel Could you post more details about your environment (php version, image driver, etc).
@aelvan sure thing! It's a pretty standard Debian 12 install, with:
This is a hard one to confidently resolve, mostly because it's hard to create a reliable test case. Imagick uses external libraries, and PHP's wonky garbage collection doesn't help.
Anyway, there might've been a leak in my code related to avif specifically. I've swapped out my custom code and instead use the Imagine library directly (when I implemented support for avif and jxl, Imagine didn't support them, but as of Craft 4.4 it does), which should at least increase the chance that this is resolved. But I'll do more tests and see if I can find more issues that need to be resolved. If you test this again, let me know if there are any improvements.
I've also added some new features to the generate command; it now has parameters for limit, offset and queue:
./craft imager-x/generate -v images -t myTransform --limit=20 --offset=20 --queue
When adding --queue, transforms will not be done at runtime in the console; queue jobs for the transforms will be created instead. That kinda defeats the point of running a console command, but in cases like this it's more reliable to let a queue handler handle it, and have each queue job spin up a separate PHP process for each transform.
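Combining this with the restart loops earlier in the thread, one could create the queue jobs once and then keep a worker alive until the queue is drained. A sketch: `run_queue` is a hypothetical wrapper name, while `php craft queue/run` is Craft's standard console command for working off pending queue jobs:

```shell
# run_queue [CMD]: run the queue worker, restarting it whenever it exits
# with a non-zero status (e.g. after being OOM-killed mid-job).
run_queue() {
    worker=${1:-"php craft queue/run"}
    until $worker; do
        echo "queue worker exited with status $?; restarting" >&2
        sleep 5
    done
}

# Usage:
# php craft imager-x/generate -v images -t myTransform --queue
# run_queue
```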
Just for information: when transforming a lot of images with "imager-x/generate", after some time we run out of memory. I'm not sure if this is a Docker problem or a general issue. At the moment the memory usage increases up to 12 GB, which is the max defined in the Docker settings. After about 275 images we're out of memory and have to restart the process.
craft imager-x/generate -f my-superField -t=articleHeaderJpeg