Lightning-AI / pytorch-lightning

Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0
28.35k stars 3.38k forks

auto_lr_find does not work #1983

Closed krisho007 closed 4 years ago

krisho007 commented 4 years ago

🐛 Bug

I am using the `auto_lr_find` feature as below: `trainer = pl.Trainer(fast_dev_run=False, gpus=1, auto_lr_find=True)`

My model sets `self.learning_rate` as an attribute, as below.

import torch
import torch.nn as nn
import pytorch_lightning as pl
from transformers import BertModel

class TweetSegment(pl.LightningModule):
    def __init__(self, config, lr=3e-5):
        super().__init__()
        self.bert = BertModel.from_pretrained('bert-base-uncased', config=config)
        self.drop_out = nn.Dropout(0.1)
        self.fullyConnected = nn.Sequential(nn.Linear(2 * 768, 2), nn.ReLU())
        self.learning_rate = lr
        self._init_initial()

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.learning_rate)

When I `fit` using the line `trainer.fit(tweetModel, train_dataloader=training_loader, val_dataloaders=valid_loader)`, I still get the error: `MisconfigurationException: When auto_lr_find is set to True, expects that hparams either has field lr or learning_rate that can overridden`
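The error message suggests that the finder in 0.7.6 looked for the field on `hparams` only, not on attributes set directly on the module (like `self.learning_rate`). A simplified, stdlib-only sketch of that kind of check (a hypothetical reconstruction, not the actual Lightning source):

```python
from argparse import Namespace

class MisconfigurationException(Exception):
    """Stand-in for pytorch_lightning's MisconfigurationException."""

def check_lr_field(model):
    # Hypothetical reconstruction of the 0.7.x behaviour: only model.hparams
    # is inspected, so an attribute set directly on the module is missed.
    hparams = getattr(model, "hparams", None)
    if hparams is None or not any(hasattr(hparams, f) for f in ("lr", "learning_rate")):
        raise MisconfigurationException(
            "When auto_lr_find is set to True, expects that hparams either has "
            "field lr or learning_rate that can overridden"
        )

class WithAttrOnly:
    def __init__(self):
        self.learning_rate = 3e-5  # set on the module, not on hparams

class WithHparams:
    def __init__(self):
        self.hparams = Namespace(lr=3e-5)

check_lr_field(WithHparams())  # passes: the field is found on hparams
try:
    check_lr_field(WithAttrOnly())
except MisconfigurationException:
    print("raises, as reported in this issue")
```

Under this reading, the model in the report is configured sensibly; the lookup is simply too narrow.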

Expected behavior

No error while running `fit`.

Environment

github-actions[bot] commented 4 years ago

Hi! Thanks for your contribution, great first issue!

SkafteNicki commented 4 years ago

The problem does not seem to be present on the master branch, could you try upgrading?

krisho007 commented 4 years ago

So this seems to be a bug that will be fixed by #1988?

krisho007 commented 4 years ago

> The problem does not seem to be present on the master branch, could you try upgrading?

I am already on 0.7.6, so I am not sure how to upgrade to the master branch. Can you please guide me?

williamFalcon commented 4 years ago

Bottom of the docs, under “bleeding edge”.
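For reference, installing the bleeding-edge version generally means installing straight from the GitHub repository (the repository path shown here is as it was at the time; check the docs for the current URL):

```shell
pip install --upgrade git+https://github.com/PyTorchLightning/pytorch-lightning.git@master
```

This replaces the released 0.7.6 package with whatever is currently on the `master` branch.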

Makoto1733 commented 4 years ago

I am now having the same issue. I am using `self.hparams` as a `dict` on 0.7.6. Could someone give some suggestions?
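One plausible reason a `dict`-typed `self.hparams` trips the same check: the finder looks the field up as an attribute, which an `argparse.Namespace` exposes but a plain `dict` does not. A stdlib-only illustration of that difference (an assumption about the lookup style, not the actual Lightning code):

```python
from argparse import Namespace

hparams_ns = Namespace(lr=3e-5)
hparams_dict = {"lr": 3e-5}

# Attribute-style lookup works on a Namespace...
assert hasattr(hparams_ns, "lr")

# ...but not on a plain dict, whose keys live behind __getitem__:
assert not hasattr(hparams_dict, "lr")
assert hparams_dict["lr"] == 3e-5
```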

dscarmo commented 4 years ago

I get the same error even on 0.8.0, despite having `hparams.lr`.

williamFalcon commented 4 years ago

We need to adjust the learning rate finder to work with the new hparams. @SkafteNicki
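One way to make the finder tolerant of the different storage styles reported in this thread is to look the field up on the module first, then on `hparams`, handling both attribute- and dict-style access. A stdlib-only sketch of that idea (`find_lr_field` is a hypothetical helper, not the actual fix):

```python
def find_lr_field(model, fields=("lr", "learning_rate")):
    """Look for a learning-rate field on the module itself, then on
    model.hparams, supporting both attribute- and dict-style hparams."""
    # 1. Attribute set directly on the module, e.g. self.learning_rate = lr
    for name in fields:
        if hasattr(model, name):
            return name, getattr(model, name)
    # 2. Field stored on hparams (Namespace-like or plain dict)
    hparams = getattr(model, "hparams", None)
    if hparams is not None:
        for name in fields:
            if hasattr(hparams, name):
                return name, getattr(hparams, name)
            if isinstance(hparams, dict) and name in hparams:
                return name, hparams[name]
    raise AttributeError("no lr/learning_rate field found")

class AttrModel:
    def __init__(self):
        self.learning_rate = 3e-5  # the pattern from the original report

class DictModel:
    def __init__(self):
        self.hparams = {"lr": 0.01}  # the dict-typed hparams pattern

print(find_lr_field(AttrModel()))
print(find_lr_field(DictModel()))
```

Both failing patterns from the thread resolve under this lookup order; a module with neither field still fails loudly.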

edenlightning commented 4 years ago

@SkafteNicki is this still broken on master?

SkafteNicki commented 4 years ago

@edenlightning I checked this morning, and the problem still seems to be present. I will create a PR soon with a fix.