Closed: liclshixiaokeaiya closed this issue 2 years ago.
When I run inference with the trained model, certain images occupy too much GPU memory; the peak can exceed 10 GB.
You can slightly reduce the limit here if GPU memory is tight.
Do I need to retrain after reducing the limit?
No.
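The exchange above concerns an inference-time cap on how many candidates (e.g. proposals or queries) are processed per image; the linked code location is not shown here, and the names below (`top_k_candidates`, `limit`) are hypothetical. The sketch illustrates why no retraining is needed: the cap only truncates inference-time work, leaving model weights untouched.

```python
# Hypothetical sketch: capping per-image candidates at inference time.
# Reducing `limit` shrinks every downstream tensor (masks, boxes, ...),
# which lowers peak GPU memory; model weights are unchanged, so the
# same checkpoint can be reused without retraining.

def top_k_candidates(scores, limit=100):
    """Keep only the indices of the `limit` highest-scoring candidates."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return order[:limit]

scores = [0.9, 0.1, 0.8, 0.3, 0.7]
print(top_k_candidates(scores, limit=3))  # -> [0, 2, 4]
```

With a framework such as PyTorch, the actual peak usage before and after lowering the limit can be compared via `torch.cuda.max_memory_allocated()` (reset with `torch.cuda.reset_peak_memory_stats()`).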