Apparently using exec.(*Cmd).Output() isn't a good idea here, as it keeps bytes.Buffer instances alive that cannot be closed properly (an io.Writer has no Close method, so there is no way to trigger the write error that would shut down the goroutines depending on it inside the exec package ...).
This commit fixes that issue by replacing the Output() call with a stdout pipe plus Start()+Wait(), and closing the pipe afterwards (which makes io.Copy fail with an error in the associated exec package code, so the heap artifacts can finally be freed).
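Roughly, the pattern now looks like the sketch below; the function name and the example command are illustrative assumptions, not copied from the actual code:

```go
package main

import (
	"fmt"
	"io"
	"os/exec"
)

// runWithPipe is a minimal sketch of the stdout-pipe approach described above.
func runWithPipe(name string, args ...string) ([]byte, error) {
	cmd := exec.Command(name, args...)

	// Instead of cmd.Output(), request a pipe we can close ourselves.
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		return nil, err
	}

	if err := cmd.Start(); err != nil {
		return nil, err
	}

	// Drain stdout before Wait, as the exec docs require.
	out, readErr := io.ReadAll(stdout)

	// Wait reaps the child process and releases its resources.
	waitErr := cmd.Wait()

	// Close the pipe explicitly so anything still copying from it errors out
	// and the associated buffers become eligible for collection.
	stdout.Close()

	if readErr != nil {
		return nil, readErr
	}
	return out, waitErr
}

func main() {
	out, err := runWithPipe("echo", "hello")
	fmt.Printf("out=%q err=%v\n", out, err)
}
```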
However, there is still the issue of heap artifacts randomly remaining allocated even though they are no longer in use (regexp buffers, json.Marshal buffers, opcodes slices (probably also due to json.Marshal), ...).
We'll have to find a solution for this at some point.
It doesn't seem too problematic though: used memory doesn't exceed 20 MB in my experiments and even drops fairly sharply at times of low load (the extra call to the GC seems to do wonders here after the stdout fix).
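For reference, the "extra call to the GC" is assumed to be a manual nudge along these lines; the hook name is illustrative, not the actual code:

```go
package main

import (
	"runtime"
	"runtime/debug"
)

// afterWork forces a collection cycle once a unit of work is done.
func afterWork() {
	runtime.GC()
	debug.FreeOSMemory() // assumption: the commit may only call runtime.GC()
}

func main() {
	afterWork()
}
```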
We'll need to run another stress test on this service once this pull is accepted to see whether the issue is now solved or whether we need to investigate those orphaned heap artifacts further.