Closed Joao-L-S-Almeida closed 1 month ago
Wipes the cache and forces the Python garbage collector to run in order to avoid "CUDA out of memory" errors during inference (`predict`).
However, the issue still remains for `batch_size > 1`.
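
The cleanup described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the function name `free_inference_memory` is hypothetical, and it assumes PyTorch while degrading gracefully when no GPU (or no `torch`) is present.

```python
import gc


def free_inference_memory():
    """Hypothetical sketch: force garbage collection, then release
    PyTorch's cached GPU memory back to the driver."""
    # Reclaim unreachable Python objects first, so tensors they hold
    # become eligible for CUDA cache release below.
    collected = gc.collect()
    try:
        import torch
        if torch.cuda.is_available():
            # Return cached, unused blocks to the CUDA driver.
            torch.cuda.empty_cache()
    except ImportError:
        pass  # torch not installed; nothing GPU-side to release
    return collected
```

Calling this between predictions frees memory held by dead references, but it cannot shrink the live memory a single forward pass needs, which is consistent with the OOM persisting for `batch_size > 1`.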