Closed moritzbrantner closed 7 months ago
What's currently the minimum amount of VRAM needed to perform inference?
The operation required more VRAM than my 3060 could provide. Lowering the batch size to 1 or 2 solved the issue.
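To see why lowering the batch size helps, here is a minimal back-of-the-envelope sketch: total VRAM is roughly a fixed cost for the model weights plus an activation cost that grows with batch size. All numbers below (`model_gb`, `per_sample_gb`) are hypothetical placeholders, not measurements from any specific model.

```python
def estimate_vram_gb(batch_size, model_gb=7.0, per_sample_gb=1.5):
    """Rough total: fixed weight memory + activations that scale with batch size.

    The constants are illustrative only; real usage depends on the model,
    precision, and sequence length.
    """
    return model_gb + batch_size * per_sample_gb

# A 12 GB card (e.g. an RTX 3060) fits small batches but runs out at larger ones.
VRAM_GB = 12.0
for b in (1, 2, 8):
    fits = estimate_vram_gb(b) <= VRAM_GB
    print(f"batch_size={b}: {'fits' if fits else 'out of memory'}")
```

With these example constants, batch sizes 1 and 2 fit in 12 GB while 8 does not, which matches the behavior described above.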