Closed — JoseMoFi closed this issue 2 years ago
Thanks for your attention. It takes about half an hour to train one epoch on Phoenix14 with batch size = 2 on a single 3090, so the full training run (40 epochs) costs about 20 hours. Using multiple GPUs can speed this up.
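As a rough sanity check on the numbers above (0.5 h/epoch x 40 epochs = 20 h), here is a minimal sketch of the arithmetic, assuming idealized near-linear scaling with GPU count; the function name is hypothetical, not from the repo:

```python
def estimate_total_hours(hours_per_epoch: float, num_epochs: int, num_gpus: int = 1) -> float:
    # Assumes near-linear speedup across GPUs; real data-parallel
    # training only approximates this due to communication overhead.
    return hours_per_epoch * num_epochs / num_gpus

# Reported setup: ~0.5 h/epoch on one RTX 3090, 40 epochs.
print(estimate_total_hours(0.5, 40))      # single GPU -> 20.0 hours
print(estimate_total_hours(0.5, 40, 2))   # two GPUs, idealized -> 10.0 hours
```

In practice multi-GPU runs fall short of this ideal, so treat the divided figure as a lower bound on wall-clock time.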
Great! Thank you for the prompt answer.
Hello, great work on this paper and repo! I would like to ask how much time you spent training the model (on the Phoenix12 dataset) and what kind of GPU you used. I am trying to replicate it with another dataset (specifically Phoenix14-T), and in my first test it took around 14 hours to train 10 epochs on a TitanXP with 12 GB and batch size = 1.
Thank you again for your work, and congratulations on this repo.