Closed krisho007 closed 4 years ago
Hi! Thanks for your contribution, great first issue!
The problem does not seem to be present on the master branch, could you try upgrading?
So this seems to be a bug to be fixed by #1988?
> The problem does not seem to be present on the master branch, could you try upgrading?

I am already on 0.7.6, so I am not sure how to upgrade to the master branch. Can you please guide me?
See the bottom of the docs, under “bleeding edge”.
I now have the same question. I am using `self.hparams` as a dict, on 0.7.6. Could someone give some suggestions?
I get the same error with `hparams.lr`, even on 0.8.0.
We need to adjust the learning rate finder to work with the new hparams. @SkafteNicki
@SkafteNicki is this still broken on master?
@edenlightning I checked this morning, and the problem still seems to be present. I will create a PR soon with a fix.
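For context on what the fix has to handle: the learning-rate finder needs to locate an overridable `lr` or `learning_rate` field both as a plain model attribute and inside the newer dict-style `hparams`. Below is a minimal, hypothetical sketch of such a lookup in plain Python (illustrative names only, not Lightning's actual implementation):

```python
# Hypothetical sketch of the lookup an LR finder would need to perform.
# It must accept both attribute-style fields (model.lr) and
# dict-style hparams (model.hparams["lr"]), as reported in this thread.
def find_lr_field(model, fields=("lr", "learning_rate")):
    """Return the name of the learning-rate field on the model, or None."""
    for name in fields:
        if hasattr(model, name):
            return name
        hparams = getattr(model, "hparams", None)
        if isinstance(hparams, dict) and name in hparams:
            return name
    # The caller would raise MisconfigurationException in this case.
    return None


class DictHparamsModel:
    """Stand-in for a model that stores its hparams as a dict."""
    def __init__(self):
        self.hparams = {"lr": 1e-3}
```

With this shape of lookup, `find_lr_field(DictHparamsModel())` resolves to `"lr"` even though the model has no `lr` attribute, which is exactly the case that was failing.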
🐛 Bug
I am using auto_lr_find feature as below.
trainer = pl.Trainer(fast_dev_run=False, gpus=1, auto_lr_find=True)
My model has a `self.learning_rate` parameter (part of the model), as below.
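The model snippet itself was not captured in the report; as an illustrative stand-in (plain Python, not the actual `LightningModule`), the relevant part of such a constructor might look like:

```python
# Illustrative stand-in only: the model stores the learning rate as an
# attribute, which is the field auto_lr_find expects to find and override.
class TweetModel:
    def __init__(self, learning_rate=2e-5):
        self.learning_rate = learning_rate
```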
When I call `fit` using the line below,
trainer.fit(tweetModel, train_dataloader=training_loader, val_dataloaders=valid_loader)
I still get the error:

MisconfigurationException: When auto_lr_find is set to True, expects that hparams either has field `lr` or `learning_rate` that can be overridden
Expected behavior
No error while running `fit`.
Environment