Hello, thank you for your wonderful work! I noticed that the GPU memory requirements for different models during inference are listed in the README. Could you also provide the GPU memory requirements for training these models? This would be very helpful for selecting an appropriate model for different devices.