Closed riversdark closed 2 years ago
hi, riversdark. Everything you say is right. In the currently released version we have not fully exposed the config options, so you need to modify the code directly if you want to change the config. Sorry for the poor user experience. We will complete the config feature in the next release. Thanks for your issue.
Hey no need to apologize and thanks for all your great work! I guess I can close this now, or should I wait for you to do it?
I'm looking at the TAPE protein pretraining tutorial. In the model config dict there is a keyword `layer_num`, but there is no such keyword argument in any of the TAPE models, only an `n_layers` argument. So whatever number of hidden layers we specify via `layer_num`, since the models only look up the number of hidden layers under the `n_layers` keyword, and since there is no such keyword in `model_config`, they will always use the default value `8`. Am I reading this right?

TLDR: I think the `layer_num` keyword in the model configs should be `n_layers`.
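A minimal sketch of the mismatch described above (the function name `build_model` and the exact lookup are assumptions for illustration, not the library's actual code): a constructor that reads `n_layers` from a config dict silently falls back to its default when the config instead spells the key `layer_num`.

```python
# Hypothetical sketch: a model constructor that only reads "n_layers".
def build_model(**model_config):
    # Unknown keys such as "layer_num" are simply ignored, so the
    # lookup falls back to the default of 8 hidden layers.
    n_layers = model_config.get("n_layers", 8)
    return n_layers

# Key spelled as in the tutorial's config: ignored, default wins.
print(build_model(layer_num=4))  # -> 8
# Key spelled as the model expects: takes effect.
print(build_model(n_layers=4))   # -> 4
```

This is why the tutorial's `layer_num` entry has no effect: nothing validates or rejects the unrecognized key, so the typo goes unnoticed.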