Closed: priyanka36 closed this issue 1 year ago
I have used many versions of Lightning, including the latest one, but I never override `lr_scheduler_step`. Does it report an error?
Yes, I encountered it in all the versions of Lightning I tried; every version throws the same error.
The code is upgraded for lightning >= 2.0.0 now, and I just verified that the code is OK.
> Yes, I encountered it in all the versions of Lightning I tried; every version throws the same error.
You can post the error, and I'll see if I can help.
The error says pytorch-lightning requires `lr_scheduler_step` to be overridden. By the way, what is the default epoch number set for NBSS?
> The error says pytorch-lightning requires `lr_scheduler_step` to be overridden.
You can post the error. Lightning doesn't require overriding `lr_scheduler_step`; implementing the `configure_optimizers` function is enough in Lightning to configure an optimizer and LR scheduler.
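For reference, a minimal sketch of that pattern (the module, layer sizes, and optimizer choices here are illustrative placeholders, not the actual NBSS code):

```python
import torch
import pytorch_lightning as pl

class ExampleModule(pl.LightningModule):
    """Illustrative module; NBSS uses its own architecture and settings."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(16, 16)

    def training_step(self, batch, batch_idx):
        return self.layer(batch).sum()

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.99)
        # Returning the optimizer and a standard PyTorch scheduler is enough;
        # Lightning steps the scheduler itself, with no lr_scheduler_step override.
        return {"optimizer": optimizer, "lr_scheduler": scheduler}
```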
> By the way, what is the default epoch number set for NBSS?
100 for NBC2. For other architectures, you can refer to our paper.
Sorry, but I encountered that error, and it was at the end of an epoch, so I overrode it. Is there any way to change the epoch number?
> Is there any way to change the epoch number?
You can use `--trainer.max_epochs=100`.
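For example, assuming a LightningCLI-style entry point (the script name below is a placeholder; substitute the repo's actual training script):

```
python train.py fit --trainer.max_epochs=100
```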
> Sorry, but I encountered that error.
It's OK. There might be something that I didn't notice.
Thank you. How long should we train the model? Does 10 epochs give as good a performance as 100?
> Does 10 epochs give as good a performance as 100?
You can check the validation loss curve using TensorBoard: `tensorboard --logdir=logs --bind_all`. In my experience, the number of epochs needed depends on the task.
Thank you.
Why is the `lr_scheduler_step()` function not overridden in the script? The pytorch-lightning version requires this function to be overridden as well.
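For anyone hitting this: Lightning only raises that error when the configured scheduler doesn't follow PyTorch's standard `LRScheduler` API (e.g., a custom scheduler class). A minimal sketch of the override, mirroring what Lightning >= 2.0 does by default (in 1.x the hook also received an `optimizer_idx` argument):

```python
import pytorch_lightning as pl

class ExampleModule(pl.LightningModule):
    # Only needed when the scheduler doesn't subclass
    # torch.optim.lr_scheduler.LRScheduler (e.g., a custom scheduler).
    def lr_scheduler_step(self, scheduler, metric):
        if metric is None:
            scheduler.step()        # plain schedulers
        else:
            scheduler.step(metric)  # e.g., ReduceLROnPlateau with a monitored metric
```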