Closed — chrislouis0106 closed this issue 11 months ago
Hi, we are preparing the code for the training stage. You can WATCH our repo; we will let you know when it is ready.
For now, you can refer to our paper for training configuration requirements.
For inference, I think any GPU with more than 24 GB of memory should work.
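If it helps, here is a minimal sketch (assuming PyTorch and a CUDA device; the 24 GB figure is taken from this thread, not an official requirement) for checking whether your local GPU meets that suggestion before running inference:

```python
import torch

MIN_VRAM_GB = 24  # suggested minimum from this thread, not an official requirement

# Inference as discussed here assumes a CUDA-capable GPU is present.
if not torch.cuda.is_available():
    raise RuntimeError("No CUDA device found.")

props = torch.cuda.get_device_properties(0)
total_gb = props.total_memory / 1024 ** 3
print(f"GPU 0: {props.name}, {total_gb:.1f} GB total memory")

if total_gb < MIN_VRAM_GB:
    print(f"Warning: less than {MIN_VRAM_GB} GB of VRAM; "
          "inference may fail or need a smaller batch size / lower precision.")
```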
The code and data have been updated in commit fdc593901f2cd6056d73a1e8016557abe4d2f72e.
Hi, could you upload the code for training the model? Also, please add some notes on the configuration requirements, i.e., what kind of machine and GPU are needed, and how much VRAM?