torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 96.00 MiB (GPU 0; 31.74 GiB total capacity; 27.71 GiB already allocated; 91.12 MiB free; 31.22 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
🐛 Bug description
Finetuning suddenly runs out of memory partway through. Do I need to limit the input length, and does the code truncate inputs internally? At the moment the input length is not restricted at all (a minimal truncation sketch follows the traceback below).
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 96.00 MiB (GPU 0; 31.74 GiB total capacity; 27.71 GiB already allocated; 91.12 MiB free; 31.22 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
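A minimal, hedged sketch of how inputs could be truncated on the user side before training, assuming a Hugging Face-style tokenizer is used for preprocessing; the model name, `MAX_SOURCE_LENGTH`, and the `"text"` field below are placeholders, not the repo's actual code:

```python
import os

# Optional mitigation suggested by the error message itself: reduce allocator
# fragmentation. Must be set before torch initializes CUDA.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

from transformers import AutoTokenizer

# Placeholder checkpoint name; replace with the base model actually being finetuned.
tokenizer = AutoTokenizer.from_pretrained("your-base-model", trust_remote_code=True)

MAX_SOURCE_LENGTH = 1024  # assumed cap; tune so the longest batch still fits in GPU memory

def preprocess(example):
    # Truncate explicitly so a single overlong sample cannot spike memory mid-epoch.
    return tokenizer(
        example["text"],          # placeholder field name
        max_length=MAX_SOURCE_LENGTH,
        truncation=True,
        padding=False,
    )
```

Note that `max_split_size_mb` only mitigates fragmentation; capping the tokenized sequence length is what actually bounds peak activation memory per batch.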
Python Version
None