yifanai / video2anime

Turn your videos (and selfies) into anime with a generative adversarial network (GAN)

CUDA out of memory #3

Closed · ofirkris closed this 10 months ago

ofirkris commented 4 years ago

I'm trying to keep the Python interpreter running as a web service, and I constantly get out-of-memory errors and increasing GPU usage after converting several images. The only thing that helps is restarting the service and Python.

How can I free the GPU memory after an anime conversion? I tried gc.collect() and sess.close() with no success.

Please advise

yifanai commented 4 years ago

I think the "recommended" way to serve TF models may be TF Serving: https://www.tensorflow.org/tfx/serving/serving_basic.
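If you go the TF Serving route, the core step from the linked guide is exporting the session as a SavedModel under a versioned directory. A rough sketch for a TF 1.x graph; the tensor names below are placeholders, not the actual names used in this repo:

```python
import tensorflow as tf

# Assumes `sess` is a live TF 1.x session with the generator graph already loaded.
tf.saved_model.simple_save(
    sess,
    export_dir="export/video2anime/1",  # version subdirectory expected by TF Serving
    inputs={"image": sess.graph.get_tensor_by_name("generator_input:0")},   # placeholder name
    outputs={"anime": sess.graph.get_tensor_by_name("generator_output:0")}, # placeholder name
)
```

TF Serving then loads and serves the exported model itself, which keeps GPU memory management out of your own Python process.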

It's hard to tell where the issue is coming from without more context, so here are a few sanity checks to try:

  1. Load the model outside of the API function calls (see the first sketch after this list)
  2. Enable GPU memory growth: https://www.tensorflow.org/guide/using_gpu#allowing_gpu_memory_growth (also covered in the first sketch)
  3. Hard reset the GPU memory with CUDA: https://stackoverflow.com/questions/43930871/how-to-release-the-occupied-gpu-memory-when-calling-keras-model-by-apache-mod-ws (see the second sketch)
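
A minimal sketch of points 1 and 2 for a TF 1.x graph served from Flask. The file name `model.pb` and the tensor names `generator_input:0` / `generator_output:0` are placeholders, not the actual names used by video2anime:

```python
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)

# 1. Build the graph and session ONCE at startup, not inside the request handler.
graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile("model.pb", "rb") as f:  # placeholder path
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name="")

# 2. Let TensorFlow grow GPU memory on demand instead of pre-allocating the whole card.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(graph=graph, config=config)  # reused by every request

input_t = graph.get_tensor_by_name("generator_input:0")    # placeholder name
output_t = graph.get_tensor_by_name("generator_output:0")  # placeholder name

@app.route("/convert", methods=["POST"])
def convert():
    # Image decoding/preprocessing omitted; assume a float32 HxWx3 array.
    img = np.asarray(request.json["image"], dtype=np.float32)
    out = sess.run(output_t, feed_dict={input_t: img[None, ...]})
    return jsonify(result=out.tolist())
```

With the session created once and allow_growth enabled, repeated requests should reuse the same allocation instead of growing it on every call.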
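For point 3, the linked answer comes down to letting the CUDA context die with the process that owns it. One hedged way to get that effect in a long-running web service is to run each conversion in a short-lived worker process; `run_conversion` and `_worker` below are hypothetical helpers, not part of this repo:

```python
import multiprocessing as mp

def _worker(image_path, result_queue):
    # Import TensorFlow inside the worker so the CUDA context lives
    # (and dies) with this process, not with the web service.
    import tensorflow as tf

    # ... load the graph and run inference on image_path ...
    result_queue.put("done")

def run_conversion(image_path):
    queue = mp.Queue()
    proc = mp.Process(target=_worker, args=(image_path, queue))
    proc.start()
    result = queue.get()
    proc.join()  # all GPU memory held by the worker is released here
    return result
```

The trade-off is that the model is reloaded on every request, so this only makes sense if the memory leak is a bigger problem than the per-request latency.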