snakers4 / silero-models

Silero Models: pre-trained speech-to-text, text-to-speech and text-enhancement models made embarrassingly simple

Bug report - [TTS Memory Leak in Python 3.10, torch 2.0.1, CPU mode] #237

Closed Likelihoood closed 1 year ago

Likelihoood commented 1 year ago

🐛 Bug

Memory leak when running TTS

To Reproduce

Inference on CPU:

from torch import package

imp = package.PackageImporter(model_path)
model = imp.load_pickle("tts_models", "model")
model.model.eval()
model.apply_tts(text=text, speaker=speaker, sample_rate=sample_rate, put_accent=False, put_yo=False)

Expected behavior

No memory leak

Environment

Please copy and paste the output from this environment collection script (or fill out the checklist below manually).

You can get the script and run it with:

wget https://raw.githubusercontent.com/pytorch/pytorch/master/torch/utils/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
python collect_env.py

Additional context

snakers4 commented 1 year ago

Hi,

Memory leak when running TTS

How does the leak manifest itself? Does the RAM consumption grow up to a certain limit, or does it continue to grow?
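(A minimal way to check, assuming psutil is installed and that model, text, speaker and sample_rate are defined as in the reproduction above:)

import gc
import psutil

proc = psutil.Process()
for i in range(100):
    model.apply_tts(text=text, speaker=speaker, sample_rate=sample_rate,
                    put_accent=False, put_yo=False)
    gc.collect()
    # RSS should plateau after warm-up; steady growth points to a leak
    print(f"iter {i}: RSS = {proc.memory_info().rss / 2**20:.1f} MiB")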

torch 2.0.1

Does using older PyTorch versions make any difference, e.g. 1.10 or 1.12?

pip

Does using the repo with torch.hub directly make any difference?
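For reference, a minimal torch.hub loading sketch along the lines of the repo README (the language and speaker ids below are illustrative, not taken from this issue):

import torch

device = torch.device('cpu')
# illustrative ids; substitute the language/speaker you actually use
model, example_text = torch.hub.load(repo_or_dir='snakers4/silero-models',
                                     model='silero_tts',
                                     language='ru',
                                     speaker='v3_1_ru')
model.to(device)

audio = model.apply_tts(text=example_text,
                        speaker='baya',
                        sample_rate=48000)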

Likelihoood commented 1 year ago

Thanks for your response.

I am going to try a lower PyTorch version to see whether it makes a difference.

Likelihoood commented 1 year ago

Using torch 1.10 does not fix the memory leak.

Likelihoood commented 1 year ago

Loading via torch.hub still does not fix the memory leak.

snakers4 commented 1 year ago

Yes, I deploy the TTS model on k8s; the memory consumption of the pod grows quickly and it restarts due to OOM.

Does the leak happen without Kubernetes? How much RAM is allocated per pod?
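(A quick way to see the actual limit from inside the container, assuming a standard cgroup setup; the path differs between cgroup v1 and v2:)

from pathlib import Path

# cgroup v2 exposes memory.max, cgroup v1 exposes memory/memory.limit_in_bytes
for path in ("/sys/fs/cgroup/memory.max",
             "/sys/fs/cgroup/memory/memory.limit_in_bytes"):
    f = Path(path)
    if f.exists():
        print(path, "->", f.read_text().strip())
        break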