Lightning-AI / pytorch-lightning

Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

save_hyperparameters incorrect documentation #2345

Closed · david-waterworth closed 4 years ago

david-waterworth commented 4 years ago

📚 Documentation

The documentation contains examples such as:

self.save_hyperparameters(['layer_1_dim', 'learning_rate'])

This fails because type list is not supported as a hyperparameter. However, looking at the source for save_hyperparameters, it appears that the correct usage is:

self.save_hyperparameters('layer_1_dim', 'learning_rate')

As an aside, I do have a list as a hyperparameter for one model: it's a temporal convolutional network with stacked temporal convolutions, and I was originally passing in the number of filters per layer as a list.
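
For reference, a minimal sketch of the working positional-string form (the TCN class, its arguments, and the defaults here are illustrative, not from the docs; only save_hyperparameters and hparams are the real API):

import pytorch_lightning as pl

class TCN(pl.LightningModule):
    def __init__(self, num_filters=(32, 64, 128), learning_rate=1e-3):
        super().__init__()
        # pass the *names* of the __init__ arguments as positional strings;
        # a list/tuple *value* such as num_filters is saved without issue
        self.save_hyperparameters('num_filters', 'learning_rate')

model = TCN()
print(model.hparams.learning_rate)  # 0.001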

github-actions[bot] commented 4 years ago

Hi! Thanks for your contribution, great first issue!

Borda commented 4 years ago

True, @david-waterworth, mind sending a PR?

awaelchli commented 4 years ago

Isn't it supposed to be passed as keyword arguments?

self.save_hyperparameters(layer_1_dim=layer_1_dim, learning_rate=learning_rate)

Borda commented 4 years ago

> Isn't it supposed to be passed as keyword arguments? self.save_hyperparameters(layer_1_dim=layer_1_dim, learning_rate=learning_rate)

That was not implemented because it was considered quite dangerous... but it could be an extension in the future.

awaelchli commented 4 years ago

Dangerous in what sense? This is just passing in variables with named args; it happens all the time in Python :)

Borda commented 4 years ago

> Dangerous in what sense? This is just passing in variables with named args; it happens all the time in Python :)

I think the point was about creating a new instance from a checkpoint: with the new mapping, your parameter names no longer match the argument names in the class __init__. We do not want to do any deep/recursive frame inspection, and moreover, when you update any of these parameters/arguments before calling save_hyperparameters, there is no simple way to maintain the init-argument-to-hparams mapping:

from pytorch_lightning import LightningModule

class MyModule(LightningModule):
    def __init__(self, arg1, arg2):
        super().__init__()
        arg2 += 1  # argument mutated before saving
        # hypothetical keyword form: the saved names (param1, param2)
        # no longer match the __init__ signature (arg1, arg2)
        self.save_hyperparameters(param1=arg1, param2=arg2)
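
To spell out the failure mode, a minimal sketch (the checkpoint path is hypothetical): load_from_checkpoint re-instantiates the class from the saved hyperparameters, roughly MyModule(**checkpoint["hyper_parameters"]), so the renamed keys break __init__, and even with matching names the mutated arg2 would be restored with the wrong value.

model = MyModule.load_from_checkpoint("example.ckpt")
# roughly equivalent to MyModule(param1=..., param2=...), which raises
# TypeError: __init__() got an unexpected keyword argument 'param1'
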
awaelchli commented 4 years ago

It looks like these docs were fixed. Thanks @david-waterworth