Closed david-waterworth closed 4 years ago
Hi! Thanks for your contribution, great first issue!
True, @david-waterworth, mind sending a PR?
isn't it supposed to be passed as keyword arguments?
self.save_hyperparameters(layer_1_dim=layer_1_dim, learning_rate=learning_rate)
that was not implemented as it was quite dangerous... but it could be an extension in future
dangerous in what sense? this is just passing in variables with named args, it happens all the time in Python :)
I think the point was that when creating a new instance from a checkpoint, with the new mapping your parameter names no longer match the names in the class init. We don't want to do any deep/recursive frame inspection, and moreover, if you update any of these parameters/arguments before you call save_hyperparameters,
there is no simple way to do the init-argument-to-hparams mapping:
class MyModule(...):
    def __init__(self, arg1, arg2):
        arg2 += 1  # argument mutated before saving
        # saved under new names that no longer match the __init__ signature
        self.save_hyperparameters(param1=arg1, param2=arg2)
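The mismatch can be illustrated without Lightning at all. A minimal sketch (all names here are hypothetical, and this is not Lightning's actual restore logic) that checks saved hyperparameter names against the `__init__` signature, the way reinstantiating from a checkpoint implicitly does:

```python
import inspect

class MyModule:
    def __init__(self, arg1, arg2):
        arg2 += 1  # argument mutated before saving
        # saved under new names that no longer match __init__
        self.hparams = {"param1": arg1, "param2": arg2}

def can_reinstantiate(cls, hparams):
    """Toy check: True only if every saved name matches an
    __init__ parameter, i.e. cls(**hparams) would succeed."""
    params = set(inspect.signature(cls.__init__).parameters) - {"self"}
    return set(hparams) <= params

m = MyModule(1, 2)
print(can_reinstantiate(MyModule, m.hparams))  # False: param1/param2 vs arg1/arg2
```

Because the saved names (`param1`, `param2`) don't match the init arguments (`arg1`, `arg2`), `cls(**hparams)` would raise a `TypeError`, which is the danger being described.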
It looks like these docs were fixed. Thanks @david-waterworth!
📚 Documentation
The documentation has examples like
self.save_hyperparameters(['layer_1_dim', 'learning_rate'])
This fails due to type list not being supported as a hyperparameter. However, looking at the source for
save_hyperparameters
it appears that the correct usage is self.save_hyperparameters('layer_1_dim', 'learning_rate')
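The positional-string form can be emulated in plain Python: the names are looked up in the caller's local variables via frame inspection. A toy sketch of that pattern (hypothetical helper, not Lightning's real implementation):

```python
import inspect

def save_hyperparameters_toy(obj, *names):
    """Toy emulation: look up each name in the caller's locals and
    store it on obj.hparams. Names are passed as separate strings,
    not wrapped in a list."""
    caller_locals = inspect.currentframe().f_back.f_locals
    obj.hparams = {name: caller_locals[name] for name in names}

class LitModel:
    def __init__(self, layer_1_dim=128, learning_rate=1e-3):
        # separate string arguments, not ['layer_1_dim', 'learning_rate']
        save_hyperparameters_toy(self, "layer_1_dim", "learning_rate")

m = LitModel()
print(m.hparams)  # {'layer_1_dim': 128, 'learning_rate': 0.001}
```

This is why the list form fails: each positional argument is treated as a single hyperparameter name (or value), so a list is not a valid name to capture.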
As an aside, I also have a list as a hyperparameter for one model: it's a temporal convolutional network with stacked temporal convolutions, and I was originally passing in the number of filters per layer as a list.