Open · minsuk00 opened 2 months ago
Hi @minsuk00, you can also try the `release_memory` utility method from `accelerate.utils` - cc @muellerzr
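For illustration, here is a sketch of the suggested pattern. It assumes `accelerate` is installed; the `free_model` helper and the unpacking style are illustrative, not from the thread (the accelerate docs use the form `a, b = release_memory(a, b)`), and the import is guarded so the sketch degrades gracefully without the library.

```python
# Hedged sketch of the release_memory pattern suggested above.
# `free_model` is a hypothetical helper, not an accelerate API.
try:
    from accelerate.utils import release_memory
except ImportError:  # accelerate not installed; fall back to plain rebinding
    release_memory = None

def free_model(model):
    """Drop the caller's reference to `model` and trigger cleanup."""
    if release_memory is not None:
        # release_memory deletes the passed objects, runs gc.collect() and
        # torch.cuda.empty_cache(), and hands back None placeholders so the
        # caller's variable no longer points at the model.
        (model,) = release_memory(model)
    else:
        model = None
    return model
```

Note that rebinding the caller's variable to the returned placeholder matters: if any variable still points at the model, the memory cannot be reclaimed.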
@younesbelkada - cc @muellerzr
Thanks for the suggestion, but it doesn't seem to work.

```python
clip_text_model = accelerate.utils.release_memory(clip_text_model)
```

does not free any GPU memory. Additionally, calling `clip_text_model.cpu()` or `torch.cuda.empty_cache()` simply results in the behavior described above.
cc @muellerzr regarding the accelerate behaviour.

Regarding `torch.cuda.empty_cache()`: it is recommended not to call this function manually - cf. a related issue, and this discussion in the PyTorch forum.
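The underlying reason cache-emptying calls don't help here can be shown in plain Python, with no torch involved: memory backing an object can only be reclaimed once every strong reference to that object is gone. `FakeModel` below is a stand-in invented for this demo, not anything from the thread.

```python
import gc
import weakref

class FakeModel:
    """Stand-in for a model holding a large allocation."""
    def __init__(self):
        self.weights = bytearray(10**6)  # pretend this is GPU memory

model = FakeModel()
alias = model             # e.g. another variable still alive in a notebook
ref = weakref.ref(model)  # lets us observe when the object is collected

del model                 # deleting one name is not enough...
gc.collect()
assert ref() is not None  # ...the alias still keeps the object alive

del alias                 # drop the last strong reference
gc.collect()
assert ref() is None      # now the object (and its memory) can be reclaimed
```

The same logic applies to CUDA tensors: `torch.cuda.empty_cache()` can only return cached blocks whose tensors have already been garbage-collected, so a surviving reference anywhere (a variable, a closure, an optimizer state) keeps the memory allocated.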
System Info

transformers version: 4.26.1

Who can help?
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
I can't free GPU memory after using CLIPTextModel. Also, memory is allocated on another device for some reason. The problem should be reproducible with the following code snippet.
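The original snippet is not preserved in this thread, so the following is only a hedged sketch of the kind of reproduction being described. The model name (`openai/clip-vit-base-patch32`) and device id are assumptions, and the whole block is guarded so it only exercises the GPU path when torch, transformers, and CUDA are actually available.

```python
# Hypothetical reproduction sketch -- the issue's original snippet is not
# preserved here. Model name and device are assumptions.
import gc

try:
    import torch
    from transformers import CLIPTextModel
    deps_ok = True
except ImportError:
    deps_ok = False

if deps_ok and torch.cuda.is_available():
    model = CLIPTextModel.from_pretrained(
        "openai/clip-vit-base-patch32"
    ).to("cuda:0")
    print("after load:", torch.cuda.memory_allocated("cuda:0"))

    del model
    gc.collect()
    torch.cuda.empty_cache()
    # If memory is still reported on cuda:0 (or shows up on another device
    # via torch.cuda.memory_allocated(i) for other i), some object is still
    # holding a reference to the weights.
    print("after del:", torch.cuda.memory_allocated("cuda:0"))
```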
Expected behavior

I've also tried using garbage collection and explicitly moving the model to CPU, but they don't work.