Open AlexSteveChungAlvarez opened 2 years ago
1% would indicate the GPU is not being used here, but the training speed suggests the GPU is in use. Maybe the speed is OK for a 2060, but there are also some known bugs affecting training speed which unfortunately have no real solutions yet, like #700.
Hello, I have been trying to train a Spanish model for the synthesizer for a month, and each time I trained the models on a different dataset, it took a long time to finish each schedule. Right now I am training on another dataset, but my monitor shows this: [screenshot]

It seems to me that it is not training in parallel as it should be, and it is using only about 5% of my GPU: [screenshot]

Is there any parameter I should turn on to enable training in parallel? This is the output of the training in progress: [screenshot]
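Before tuning any training parameters, it is worth confirming that the framework can see the GPU at all: very low utilization often just means training silently fell back to the CPU. Below is a minimal diagnostic sketch, assuming the training code uses PyTorch (as this repo does); the function name `gpu_report` is my own, not part of the repo.

```python
def gpu_report():
    """Return a one-line diagnosis of whether PyTorch can use a CUDA GPU.

    Hypothetical helper for debugging low GPU utilization; not part of
    the repo's own code.
    """
    try:
        import torch
    except ImportError:
        return "PyTorch is not installed in this environment"
    if not torch.cuda.is_available():
        # Common causes: CPU-only torch build, driver/CUDA version mismatch.
        return "CUDA not available: training will fall back to CPU"
    # Report the device PyTorch would use by default (device index 0).
    return "CUDA available: training should run on " + torch.cuda.get_device_name(0)

if __name__ == "__main__":
    print(gpu_report())
```

If this reports that CUDA is not available, the fix is usually to reinstall a CUDA-enabled PyTorch build matching your driver, rather than to change any training flags.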