Open jdxyw opened 3 years ago
Hi, thanks for reporting! I will try to look into this, but I can't guarantee that I'll do it quickly, I'm a bit overloaded right now :(
Looking over the code, I think this is caused by the generator thread not joining after the iteration stops, leaking its resources.
A solution might be adding a handle to join/terminate the thread when `GeneratorExit` and/or `StopIteration` is raised.
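For illustration, here is a minimal sketch of that idea, assuming the wrapper is a worker thread feeding a bounded queue (roughly how `BackgroundGenerator` is structured); the class and names below are illustrative, not the library's actual code:

```python
import queue
import threading

class JoiningBackgroundGenerator(threading.Thread):
    """Illustrative only: prefetch in a worker thread and join that thread
    once the wrapped generator is exhausted, so per-epoch threads are not
    left lingering."""

    _sentinel = object()

    def __init__(self, generator, max_prefetch=1):
        super().__init__(daemon=True)
        self.queue = queue.Queue(max_prefetch)
        self.generator = generator
        self._done = False
        self.start()

    def run(self):
        # Producer: push items, then a sentinel marking the end of iteration.
        for item in self.generator:
            self.queue.put(item)
        self.queue.put(self._sentinel)

    def __iter__(self):
        return self

    def __next__(self):
        if self._done:
            raise StopIteration
        item = self.queue.get()
        if item is self._sentinel:
            self._done = True
            # The producer has finished its loop, so joining returns quickly
            # and the thread's resources are released right away.
            self.join()
            raise StopIteration
        return item
```

Early termination by the consumer (e.g. `break`-ing out of the loop, which raises `GeneratorExit` in a generator-based wrapper) would still need extra handling, for example a `close()` method that drains the queue before joining.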
Hi,
I've run into a memory leak. Here is my usage.
The code below is fine with the default dataloader in PyTorch; the memory usage stays stable.
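The original snippet is not reproduced here; a hypothetical stand-in for a plain `DataLoader`-driven loop might look like this (dataset and loop details are assumed, not taken from the report):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset and loader; the real code would use the reporter's dataset/model.
dataset = TensorDataset(torch.randn(1000, 16), torch.randint(0, 2, (1000,)))
loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=2)

for epoch in range(100):
    for inputs, targets in loader:
        pass  # forward/backward pass would go here; memory stays flat across epochs
```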
If I use `BackgroundGenerator` to replace the PyTorch dataloader, the CPU memory increases over time. My new dataloader:
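(The exact code from the report is not preserved; the sketch below assumes the common pattern of wrapping the `DataLoader` iterator in `BackgroundGenerator` from `prefetch_generator`.)

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from prefetch_generator import BackgroundGenerator

class DataLoaderX(DataLoader):
    def __iter__(self):
        # A new BackgroundGenerator (and its worker thread) is created every
        # epoch; if the old threads are never joined, memory grows over time.
        return BackgroundGenerator(super().__iter__())

dataset = TensorDataset(torch.randn(1000, 16), torch.randint(0, 2, (1000,)))
loader = DataLoaderX(dataset, batch_size=32, shuffle=True, num_workers=2)

for epoch in range(100):
    for inputs, targets in loader:
        pass  # same training loop as before; only the loader class changed
```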
The memory usage comparison shows this clearly; the only difference between the two runs is the dataloader.
Best Regards,