@sanbuphy You can reduce GPU memory usage by lowering the batch size during training, e.g. `python bin/train.py -cn lama-fourier data.batch_size=8`. Adjust the batch size to fit your available memory.
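If a batch size of 8 still does not fit, the same override accepts any smaller value (the 4 below is only an illustration; pick whatever your GPU allows):

```
python bin/train.py -cn lama-fourier data.batch_size=4
```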
How can I save memory during inference if the input size is fixed?
I have the same question about GPU memory usage. One thing you could try is running part of the model on the CPU.
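A minimal sketch of that idea in plain PyTorch (the `model` and `batch` below are placeholders, not names from this repo; substitute the actual LaMa generator and your image/mask tensors): run the forward pass under `torch.inference_mode()`, which drops autograd bookkeeping and saves a large share of GPU memory, and fall back to the CPU only if the GPU still runs out.

```python
import torch

# Placeholder model and input; replace with the real generator and tensors.
model = torch.nn.Sequential(torch.nn.Conv2d(4, 8, 3, padding=1))
batch = torch.randn(1, 4, 512, 512)  # e.g. image + mask stacked (assumed shape)

device = "cuda" if torch.cuda.is_available() else "cpu"

# inference_mode disables gradient tracking, which is usually the cheapest
# way to cut inference memory before resorting to CPU offload.
with torch.inference_mode():
    try:
        out = model.to(device)(batch.to(device))
    except torch.cuda.OutOfMemoryError:
        # GPU memory exhausted: free the cache and redo the pass on the CPU
        # (slower, but it avoids the OOM entirely).
        torch.cuda.empty_cache()
        out = model.cpu()(batch.cpu())
```

This is a generic pattern, not the repo's own API; how much it helps depends on which stage (e.g. refinement) actually dominates the memory footprint.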
Hello, I think this model works very well, but the GPU memory usage during refinement is too high. Could you please advise me on how to reduce it? Thank you!