atulkum / pointer_summarizer

pytorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Apache License 2.0
907 stars 242 forks

Retraining model cause optimizer duplicate parameter error #61

Closed Sixy1204 closed 3 years ago

Sixy1204 commented 3 years ago

Hello

When I use `python3 train.py -m model_path` to retrain the model, it throws a warning:

UserWarning: optimizer contains a parameter group with duplicate parameters; in future, this will cause an error; see github.com/pytorch/pytorch/issues/40967 for more information
  super(Adagrad, self).__init__(params, defaults)

and then the code stops running. I believe it is related to reloading the trained model parameters. How can I fix it? Thank you.
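A minimal workaround sketch, assuming the warning comes from the same parameter tensor appearing twice in the list handed to Adagrad (e.g. because of weight tying between the encoder and decoder embeddings): deduplicate the parameter list by object identity before building the optimizer. The names below are illustrative stand-ins, not the repo's actual code.

```python
def dedup_params(params):
    """Drop repeated parameter objects, keeping first-seen order.
    (Sketch: in the real training script the list would come from
    chaining encoder.parameters() and decoder.parameters().)"""
    seen = set()
    unique = []
    for p in params:
        if id(p) not in seen:
            seen.add(id(p))
            unique.append(p)
    return unique

# With weight tying, the same object shows up in both modules' lists:
shared = ["tied-embedding"]           # stand-in for a shared weight tensor
enc_params = [shared, ["enc-weight"]]
dec_params = [shared, ["dec-weight"]]
params = dedup_params(enc_params + dec_params)
# passing the deduplicated list (3 unique objects) to the optimizer
# avoids the duplicate-parameter warning
```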

atulkum commented 3 years ago

The code was written for Python 2.7; it won't be a tough task to upgrade it to Python 3. Sorry for the confusion.

Sixy1204 commented 3 years ago

> The code was written for Python 2.7; it won't be a tough task to upgrade it to Python 3. Sorry for the confusion.

Sorry, my bad. When retraining, the user should set max_iteration to a value larger than the iteration already recorded in the checkpoint log.

thx! lol
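For reference, why training exits immediately in that case can be sketched as a simplified resume-aware loop (a hypothetical simplification; the parameter names are assumptions, not the repo's actual config):

```python
def run_training(start_iter, max_iterations):
    """Toy model of a resume-aware training loop: if the checkpoint's
    iteration already meets or exceeds max_iterations, the loop body
    never executes and training appears to stop right away."""
    it, steps = start_iter, 0
    while it < max_iterations:
        it += 1
        steps += 1
    return steps

# resuming a checkpoint saved at iteration 5000 with max_iterations=1000
# performs zero steps; raising max_iterations past 5000 resumes normally
```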

Wenjun-Peng commented 2 years ago

> The code was written for Python 2.7; it won't be a tough task to upgrade it to Python 3. Sorry for the confusion.

I wonder why you share the encoder's embedding parameters with the decoder (`decoder.embedding.weight = encoder.embedding.weight`). It causes the same problem mentioned above: UserWarning: optimizer contains a parameter group with duplicate parameters; in future, this will cause an error; see github.com/pytorch/pytorch/issues/40967 for more information
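Why tying produces the warning can be illustrated with a toy module (the class below is a hypothetical stand-in for an `nn.Module`; the point is that tying is aliasing, not copying, so chaining both modules' parameter lists yields the same object twice):

```python
from itertools import chain

class Toy:
    """Hypothetical stand-in for an nn.Module holding one weight."""
    def __init__(self, weight):
        self.weight = weight
    def parameters(self):
        return [self.weight]

encoder = Toy(weight=["embedding-matrix"])
decoder = Toy(weight=["decoder-embedding"])
decoder.weight = encoder.weight   # weight tying: an alias, not a copy

params = list(chain(encoder.parameters(), decoder.parameters()))
# params[0] is params[1], so an optimizer scanning this list sees the
# same parameter twice and emits the duplicate-parameter warning
```

Sharing the embedding between encoder and decoder is a common parameter-saving choice (one embedding matrix instead of two); the warning is a side effect of collecting parameters naively from both modules, so the fix is to deduplicate the list rather than to stop tying.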