Sleepychord / CogLTX

The source code of the NeurIPS 2020 paper "CogLTX: Applying BERT to Long Texts"

Checkpoint contains hyperparameters but IntrospectorModule's __init__ is missing the argument 'hparams'. #4

Open · exol-forlife opened this issue 3 years ago

exol-forlife commented 3 years ago

```
/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/utilities/warnings.py:18: UserWarning: The dataloader, train dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument in the `DataLoader` init to improve performance.
  warnings.warn(*args, **kwargs)
Traceback (most recent call last):
  File "run_20news.py", line 45, in <module>
    main_loop(config)
  File "/data/CogLTX-main/main_loop.py", line 57, in main_loop
    trainer.fit(introspector)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 695, in fit
    self.load_spawn_weights(model)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py", line 373, in load_spawn_weights
    loaded_model = original_model.__class__.load_from_checkpoint(path)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/core/lightning.py", line 1509, in load_from_checkpoint
    model = cls._load_model_state(checkpoint, *args, **kwargs)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/core/lightning.py", line 1533, in _load_model_state
    f"Checkpoint contains hyperparameters but {cls.__name__}'s __init__ "
pytorch_lightning.utilities.exceptions.MisconfigurationException: Checkpoint contains hyperparameters but IntrospectorModule's __init__ is missing the argument 'hparams'. Are you loading the correct checkpoint?
```
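
For context: in pytorch-lightning 0.6–0.7, `load_from_checkpoint` inspects the signature of the module's `__init__`. If the checkpoint stores hyperparameters but `__init__` has no parameter named `hparams`, `_load_model_state` raises exactly this `MisconfigurationException`. A minimal sketch of the constructor shape those versions expect (`ExampleModule` is illustrative, and the `Namespace` conversion is an assumption about how the checkpoint may store the values):

```python
from argparse import Namespace

import pytorch_lightning as pl


class ExampleModule(pl.LightningModule):
    def __init__(self, hparams):       # the parameter must be named `hparams`
        super().__init__()
        if isinstance(hparams, dict):  # some checkpoints store hparams as a plain dict
            hparams = Namespace(**hparams)
        self.hparams = hparams         # in PL 0.6/0.7, hparams is a plain attribute
```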

miaomi1994 commented 3 years ago

@exol-forlife hi~, did you solve the problem?

exol-forlife commented 3 years ago

> @exol-forlife hi~, did you solve the problem?

I couldn't find a solution, so I rewrote the code in plain PyTorch rather than pytorch-lightning, but the results were not satisfying :(

miaomi1994 commented 3 years ago

> @exol-forlife hi~, did you solve the problem?
>
> I couldn't find a solution, so I rewrote the code in plain PyTorch rather than pytorch-lightning, but the results were not satisfying :(

Do you mean that the results are not as good as reported in the paper?

Sleepychord commented 3 years ago

Hi, sorry for overlooking this problem earlier. I think this is because this repo depends on an old version of pytorch-lightning. @exol-forlife, could you describe your re-implementation problem in more detail?

miaomi1994 commented 3 years ago

I updated pytorch-lightning to 0.7.3 and still have this problem. @Sleepychord

ParanoidW commented 2 years ago

> I updated pytorch-lightning to 0.7.3 and still have this problem. @Sleepychord

Hi, I have the same problem. Did you solve it? @miaomi1994

ParanoidW commented 2 years ago

Same problem with both pytorch-lightning 0.6.0 and 0.7.3; it looks like the version given in the README is wrong. @Sleepychord @dm-thu, please help.

yudong27 commented 2 years ago

Modify the parameters of the `__init__` function in `class IntrospectorModule(pl.LightningModule)`: change `def __init__(self, config):` to `def __init__(self, hparams):`, and add `config = hparams` at the top of the body.

Do the same for `class ReasonerModule`. A sketch of the change is below.
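
A minimal sketch of that change, assuming the rest of the original `__init__` body stays as-is (everything after the rename is illustrative):

```python
import pytorch_lightning as pl


class IntrospectorModule(pl.LightningModule):
    def __init__(self, hparams):  # was: def __init__(self, config)
        super().__init__()
        config = hparams          # keep the original body working unchanged
        self.hparams = hparams    # lets load_from_checkpoint restore the values
        # ... rest of the original __init__, which keeps using `config` ...
```

The rename works because `load_from_checkpoint` passes the stored hyperparameters positionally to a parameter that must be named `hparams`; aliasing it back to `config` avoids touching the rest of the module.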