Closed chunhualiao closed 2 months ago
It is shown in the Hardware requirements part (We used 6 A100 GPUs with 80GB memory for the training).
I asked for the minimal GPU memory for inference.
You responded with the configuration for training.
Do you think we are talking about the same thing?
For inference, the information is also shown in the Hardware requirements part (We used 2 A100 GPUs with 80GB memory for the inference).
Users may need this information to check whether they have enough GPU resources.
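For readers doing that check, here is a minimal back-of-the-envelope sketch. All numbers in it (fp16 weights, a 1.2x overhead factor for activations/KV cache, the 65B parameter count in the example) are illustrative assumptions, not values confirmed in this repository.

```python
def estimate_inference_gb(num_params: float,
                          bytes_per_param: int = 2,
                          overhead_factor: float = 1.2) -> float:
    """Rough inference memory estimate in GB.

    bytes_per_param=2 assumes fp16/bf16 weights; overhead_factor adds
    headroom for activations and KV cache (a guess, workload-dependent).
    """
    return num_params * bytes_per_param * overhead_factor / 1024**3

def fits(num_params: float, num_gpus: int, gb_per_gpu: float) -> bool:
    """Check whether the estimated footprint fits across the given GPUs."""
    return estimate_inference_gb(num_params) <= num_gpus * gb_per_gpu

# Example: a hypothetical 65B-parameter model on 2 x A100 80GB
print(fits(65e9, num_gpus=2, gb_per_gpu=80))   # fits on two GPUs
print(fits(65e9, num_gpus=1, gb_per_gpu=80))   # too large for one
```

This only sizes the weights plus a rough overhead; real usage also depends on batch size, sequence length, and the serving framework.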