Closed. VLadImirluren closed this issue 12 months ago.
I do not have a 3090 on my side, so training is not guaranteed to work. You can decrease the batch size and give it a try. You can definitely run the inference code on it: we have a Colab demo where even a smaller GPU than a 3090 can host the model.
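If the default batch size does not fit in 24 GB, a common workaround (a general PyTorch pattern, not something specific to this repo) is to shrink the per-step batch and accumulate gradients over several micro-batches, which keeps the effective batch size unchanged. A minimal sketch with a toy model:

```python
import torch
from torch import nn


def train_step(model, optimizer, loss_fn, micro_batches, accum_steps):
    """One effective optimizer step built from `accum_steps` micro-batches.

    Smaller micro-batches lower peak GPU memory; dividing each loss by
    `accum_steps` makes the summed gradient match one large-batch step.
    """
    optimizer.zero_grad()
    total_loss = 0.0
    for x, y in micro_batches:  # each micro-batch must fit in GPU memory
        loss = loss_fn(model(x), y) / accum_steps
        loss.backward()  # gradients accumulate across calls
        total_loss += loss.item()
    optimizer.step()  # single parameter update for the whole group
    return total_loss


# Toy usage: 4 micro-batches of size 2 behave like one batch of size 8.
model = nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
micro_batches = [(torch.randn(2, 4), torch.randn(2, 1)) for _ in range(4)]
loss = train_step(model, opt, nn.MSELoss(), micro_batches, accum_steps=4)
```

How the repo's training script exposes batch size is config-specific; this only illustrates the memory/batch-size trade-off.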
thanks!
Can this code run on a 24 GB 3090 GPU? Thanks.