I am using the following training function and the LibriSpeech dataset. Every time, the output of the model during training becomes NaN, and as a result the loss is also NaN. What could be the possible issue?
    class IterMeter(object):
        """keeps track of total iterations"""
        def __init__(self):
            self.val = 0
    def train(model, device, train_loader, criterion, optimizer, scheduler, epoch):
        model.train()
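The body of train() is cut off here; in a CTC setup like this it typically continues along the lines of the sketch below. The batch layout (spectrograms, labels, input_lengths, label_lengths) and the clipping threshold are assumptions rather than the original code. The log_softmax before nn.CTCLoss and the gradient clip are the two spots most relevant to the NaN symptom.

    import torch
    import torch.nn.functional as F

    def train(model, device, train_loader, criterion, optimizer, scheduler, epoch):
        model.train()
        for batch_idx, _data in enumerate(train_loader):
            # Assumed batch layout: padded spectrograms plus the per-sample
            # lengths that nn.CTCLoss needs.
            spectrograms, labels, input_lengths, label_lengths = _data
            spectrograms, labels = spectrograms.to(device), labels.to(device)

            optimizer.zero_grad()
            output = model(spectrograms)            # (batch, time, n_classes)
            output = F.log_softmax(output, dim=2)   # CTCLoss expects log-probabilities
            output = output.transpose(0, 1)         # (time, batch, n_classes)

            loss = criterion(output, labels, input_lengths, label_lengths)
            loss.backward()
            # Clipping guards against exploding gradients, a frequent cause of
            # NaN activations a few iterations into training.
            torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=5.0)
            optimizer.step()
            scheduler.step()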
    def test(model, device, test_loader, criterion, epoch, batch_size=20):
        print('\nevaluating...')
        model.eval()
        test_loss = 0
        test_cer, test_wer = [], []
        n_classes = 29
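Similarly for test(): the evaluation pass would normally wrap the same forward computation in torch.no_grad(). This is a sketch under the same assumptions as above; the CER/WER decoding is omitted because the decoder helpers are not shown.

    import torch
    import torch.nn.functional as F

    def test(model, device, test_loader, criterion, epoch, batch_size=20):
        print('\nevaluating...')
        model.eval()
        test_loss = 0
        with torch.no_grad():   # no gradients needed during evaluation
            for _data in test_loader:
                spectrograms, labels, input_lengths, label_lengths = _data
                spectrograms, labels = spectrograms.to(device), labels.to(device)
                output = F.log_softmax(model(spectrograms), dim=2).transpose(0, 1)
                loss = criterion(output, labels, input_lengths, label_lengths)
                test_loss += loss.item() / len(test_loader)
        print('Test set: Average loss: {:.4f}'.format(test_loss))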
    def main(learning_rate=5e-4, batch_size=20, epochs=10, train_url="train-clean-100", test_url="test-clean"):
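Finally, main() would wire these together. The sketch below shows a common setup with torchaudio's LibriSpeech dataset; SpeechRecognitionModel and data_processing are placeholders for the model and collate function that are not shown. Two settings in it bear directly on the NaN symptom: nn.CTCLoss(zero_infinity=True) zeroes the infinite losses CTC produces when an input sequence is shorter than its target (assuming the blank token is index 28 of the 29 classes), and torch.autograd.set_detect_anomaly(True) reports the first operation that produces a NaN during the backward pass.

    import torch
    import torch.nn as nn
    import torchaudio
    from torch.utils.data import DataLoader

    def main(learning_rate=5e-4, batch_size=20, epochs=10,
             train_url="train-clean-100", test_url="test-clean"):
        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

        train_dataset = torchaudio.datasets.LIBRISPEECH("./data", url=train_url, download=True)
        test_dataset = torchaudio.datasets.LIBRISPEECH("./data", url=test_url, download=True)
        # data_processing is a placeholder for the collate function that pads the
        # spectrograms and encodes the transcripts (not shown here).
        train_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True,
                                  collate_fn=lambda x: data_processing(x, 'train'))
        test_loader = DataLoader(test_dataset, batch_size=batch_size, shuffle=False,
                                 collate_fn=lambda x: data_processing(x, 'valid'))

        model = SpeechRecognitionModel().to(device)  # placeholder for the actual model

        optimizer = torch.optim.AdamW(model.parameters(), learning_rate)
        # zero_infinity=True stops infinite CTC losses (input shorter than target)
        # from turning the weights into NaN on the next update.
        criterion = nn.CTCLoss(blank=28, zero_infinity=True).to(device)
        scheduler = torch.optim.lr_scheduler.OneCycleLR(optimizer, max_lr=learning_rate,
                                                        steps_per_epoch=len(train_loader),
                                                        epochs=epochs)

        # Pinpoints the first backward operation that produces a NaN.
        torch.autograd.set_detect_anomaly(True)

        for epoch in range(1, epochs + 1):
            train(model, device, train_loader, criterion, optimizer, scheduler, epoch)
            test(model, device, test_loader, criterion, epoch, batch_size)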