Open xd009642 opened 2 days ago
In the tch and torch-sys crates there doesn't appear to be a version of https://pytorch.org/docs/stable/generated/torch.cuda.empty_cache.html#torch-cuda-empty-cache or `torch._C.cuda_emptyCache`, which it calls. I'll take a deeper look into this and potentially open a PR, but any guidance would be appreciated, as this is a fairly important feature when sharing GPUs with other jobs.
That and the other memory controls - but for my own issue I'm happy going with the blunt-force approach to get torch to free up some of its excessive allocations.
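For reference, a rough sketch of what a binding might look like, following the extern "C" shim pattern torch-sys uses elsewhere. The shim name `atc_cuda_empty_cache` is hypothetical (not an existing torch-sys symbol), and the C++ entry point is assumed to be the caching-allocator call that `torch.cuda.empty_cache` ultimately reaches; in older libtorch versions it lives under `at::cuda` rather than `c10::cuda`. This won't compile or link without libtorch, so treat it as a starting point rather than a working implementation:

```cpp
// C++ side (e.g. in torch-sys's shim file) -- hypothetical wrapper name.
#include <c10/cuda/CUDACachingAllocator.h>

extern "C" void atc_cuda_empty_cache() {
  // Releases cached, currently-unused device memory back to the driver,
  // so other processes sharing the GPU can allocate it.
  c10::cuda::CUDACachingAllocator::emptyCache();
}
```

```rust
// Rust side: declare the shim and expose a safe wrapper.
extern "C" {
    fn atc_cuda_empty_cache();
}

/// Release unused cached CUDA memory held by the allocator.
/// Note: this does not free tensors that are still alive, only the cache.
pub fn empty_cache() {
    unsafe { atc_cuda_empty_cache() }
}
```

One caveat worth carrying over from the Python docs: `empty_cache` doesn't reduce the memory used by live tensors, it only returns the allocator's cached free blocks, so it mainly helps other jobs on the same GPU rather than the calling process.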