santhoshkolloju / Abstractive-Summarization-With-Transfer-Learning

Abstractive summarisation using Bert as encoder and Transformer Decoder

ValueError: Unknown hyperparameter: position_embedder_type. Only hyperparameters named 'kwargs' hyperparameters can contain new entries undefined in default hyperparameters. #15

Open huhanGitHub opened 5 years ago

huhanGitHub commented 5 years ago

ValueError: Unknown hyperparameter: position_embedder_type. Only hyperparameters named 'kwargs' hyperparameters can contain new entries undefined in default hyperparameters.

I cloned the code and ran main.py without any changes. The error is raised from model.py, line 90:

encoder = tx.modules.TransformerEncoder(hparams=bert_config.encoder)

What's wrong? What should I do next? Thanks.

santhoshkolloju commented 5 years ago

Please check the version of texar you are using.
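For example, the installed version can be checked like this (assuming texar was installed with pip; the `__version__` attribute is an assumption, and `pip show` alone is enough):

```bash
# Show the installed texar version as pip sees it
pip show texar

# Or ask the package itself (only works if it exposes __version__)
python -c "import texar; print(texar.__version__)"
```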

nutalk commented 5 years ago

I have the same problem. The version of texar is v0.2.0

claudiohfg commented 5 years ago

Same problem here. Using version 0.2.1.

nlp-shuai commented 5 years ago

You should use texar version 0.1, but it is no longer possible to install version 0.1 directly. I built it myself from source.
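If you want to try the same route, a rough sketch of a source build looks like this (the revision to check out is a placeholder; I have not verified which tag or commit corresponds to version 0.1):

```bash
# Remove the newer texar release first
pip uninstall -y texar

# Clone texar and check out an old revision that still matches this repo's code
# (<old-revision> is a placeholder; see the pinned commit suggested later in this thread)
git clone https://github.com/asyml/texar.git
cd texar
git checkout <old-revision>

# Install the checked-out source as an editable package
pip install -e .
```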

ghost commented 4 years ago

Is there any fix for this? I have the same problem. My texar version is 0.2.4.

thuytrinht4 commented 4 years ago

Uninstall texar and then install texar-pytorch: `pip install texar-pytorch==0.1.1`. This fixed the problem for me.
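Spelled out as shell commands (this assumes pip manages both packages in the same environment):

```bash
# Remove the TensorFlow-based texar package
pip uninstall -y texar

# Install the PyTorch port at the version that reportedly works
pip install texar-pytorch==0.1.1
```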

Gauravchats commented 3 years ago

Delete your previous texar and use: `pip3 install -e git://github.com/asyml/texar.git@573a80d818e7e1d24fe58d99d5a5100b56b571f7#egg=texar`
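GitHub no longer serves the unauthenticated git:// protocol, so the same pinned install may need to be rewritten with pip's git+https syntax (same commit, only the URL scheme changes; whether your environment needs this is an assumption):

```bash
# Editable install of texar pinned to the commit suggested above, over HTTPS
pip3 install -e git+https://github.com/asyml/texar.git@573a80d818e7e1d24fe58d99d5a5100b56b571f7#egg=texar
```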