Open CoryXie opened 2 years ago
Hmm, this may be the reason for the poor memory utilization that causes [Bug] numpy.core._exceptions.MemoryError #728
Wow, I never realised this, but after checking I can also see that multiple Python processes are created and don't terminate. It looks like an issue with the `separator.separate_to_file`
method, as the multiprocessing pools are not being closed there. I referenced
this to arrive at the solution.
So you could subclass `Separator`
and implement an extra method that closes the pool. This closes all processes at the end if `synchronous`
is set to `True`
in the `separate_to_file`
method, e.g.
```python
from spleeter.separator import Separator

class MySeparator(Separator):
    def close_pool(self):
        # Separator keeps an internal multiprocessing pool in self._pool;
        # close and terminate it so the worker processes actually exit.
        if self._pool:
            self._pool.close()
            self._pool.terminate()

separator = MySeparator('spleeter:2stems')
separator.separate_to_file(filepath, self.filedirectory, synchronous=True)

# Wait for the batch to finish, then terminate the pool.
separator.join()
separator.close_pool()
```
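For anyone wanting to see the underlying behaviour without Spleeter installed, here is a minimal stdlib sketch showing that a `multiprocessing.Pool` keeps its worker processes resident until it is explicitly closed — which is exactly what the subclass above does for `Separator._pool`:

```python
import multiprocessing

def square(x):
    return x * x

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=2)
    print(pool.map(square, [1, 2, 3]))  # [1, 4, 9]
    # At this point the two worker processes are still alive,
    # just like Separator's internal pool after separate_to_file.
    pool.close()  # stop accepting new tasks
    pool.join()   # wait for the workers to exit
```

Without the `close()`/`join()` calls, the worker processes only go away when the parent process exits.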
@ls-milkyway #728 is not related to this. Memory usage is just high by default and you need high specs for things to work properly.
Description
I am trying to use spleeter for vocal extraction on a song, with the following code. I found that it spawns several sub processes for its work (and the separation itself works fine), but even after the separation is done, these sub processes keep running.
I even tried to kill these sub processes with the code below, but it seems they are re-created after being killed.
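The re-creation is expected behaviour: `multiprocessing.Pool` maintains its configured worker count, so when a worker dies the pool's maintenance thread forks a replacement. A small POSIX-only stdlib sketch of that effect (a hypothetical illustration, not the original screenshot code):

```python
import multiprocessing
import os
import signal
import time

def work(x):
    return x + 1

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=2)
    pool.map(work, range(4))
    # Kill one of the two worker processes, as in the screenshots.
    victim = multiprocessing.active_children()[0]
    os.kill(victim.pid, signal.SIGKILL)
    time.sleep(0.5)  # give the pool's maintenance thread time to react
    # The pool notices the dead worker and spawns a replacement, so
    # the worker count goes back up -- killing alone doesn't help.
    print(len(multiprocessing.active_children()))
    pool.terminate()  # terminate()/close() is what actually stops them
    pool.join()
```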
See below for the debug screenshot before kill (after it was run).
See below for the debug screenshot after kill.
Each sub process takes around 170 MB of memory. See below for the memory usage even after the kill (these sub processes are not created elsewhere).
Steps to reproduce
```
pip install spleeter
```
Output
Memory usage stays high even after use, and the sub processes keep running; even after being killed, they are re-created.
Environment
Additional context