TheurgicDuke771 closed this issue 3 years ago.
Thank you for your interest. What kind of data did you use for preprocessing? If you used Spider, did you use all three files: train_spider, train_other, and dev? There seems to be a mismatch in embedding sizes between the trained model and the initialized model.
Yes, I used the Spider dataset only. Edit: Thanks for pointing that out. As you suggested, the issue was with the dataset; I downloaded Spider again and it works fine now.
Hey @Impavidity @TheurgicDuke771, I'm facing the same issue, and I don't think it's caused by the data (Spider) I'm using.
Loading model from logdir/bart_run_1/bs=12,lr=1.0e-04,bert_lr=1.0e-05,end_lr=0e0,att=1/model_checkpoint-00041000
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-45-2534e992a832> in <module>()
----> 1 model = inferer.load_model(model_dir, checkpoint_step)
... (6 intermediate frames omitted)
/content/gap-text2sql/rat-sql-gap/seq2struct/models/variational_lstm.py in _hook_remove_dropout_masks_from_state_dict(cls, instance, state_dict, prefix, local_metadata)
75 @classmethod
76 def _hook_remove_dropout_masks_from_state_dict(cls, instance, state_dict, prefix, local_metadata):
---> 77 del state_dict[prefix + '_input_dropout_mask']
78 del state_dict[prefix + '_h_dropout_mask']
79
KeyError: 'decoder.state_update._input_dropout_mask'
You can see my folder structure below.
Please guide me on how to tackle this issue. Thanks in advance.
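As a quick diagnostic for this KeyError: the hook is trying to `del` the dropout-mask entries from the checkpoint's state dict, so the error means those keys are not in the saved checkpoint. You can confirm this by inspecting the checkpoint directly; a minimal sketch, assuming the checkpoint is a plain `torch.save` dict that stores the model weights under a `"model"` key (as the `saver_mod.Saver({"model": model})` usage later in this thread suggests):

```python
import torch

# Path copied from the load message above; substitute your own checkpoint file.
ckpt_path = ("logdir/bart_run_1/bs=12,lr=1.0e-04,bert_lr=1.0e-05,end_lr=0e0,att=1/"
             "model_checkpoint-00041000")

# map_location="cpu" keeps this usable on machines without a GPU.
ckpt = torch.load(ckpt_path, map_location="cpu")

# Assumption: weights live under a "model" key; fall back to the raw dict otherwise.
state_dict = ckpt.get("model", ckpt) if isinstance(ckpt, dict) else ckpt

# The load hook deletes keys containing these suffixes; if nothing prints here,
# the KeyError above is expected and the guarded-del fix further down applies.
for key in state_dict:
    if "dropout_mask" in key:
        print(key)
```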
@Impavidity, I'm also getting an error when executing the following:
dataset = registry.construct('dataset_infer',{
"name": "spider", "schemas": schema, "eval_foreign_key_maps": eval_foreign_key_maps,
"db_path": "data/sqlite_files/"
})
ValueError Traceback (most recent call last)
<ipython-input-57-de0cacb4ab13> in <module>()
1 dataset = registry.construct('dataset_infer',{
2 "name": "spider", "schemas": schema, "eval_foreign_key_maps": eval_foreign_key_maps,
----> 3 "db_path": "data/sqlite_files/"
4 })
... (1 intermediate frame omitted)
/content/gap-text2sql/rat-sql-gap/seq2struct/utils/registry.py in instantiate(callable, config, unused_keys, **kwargs)
42 signature = inspect.signature(callable.__init__)
43 print('signature:',signature)
---> 44 for name, param in signature.parameters.items():
45 print("name:",name)
46 print("param:",param)
ValueError: Unsupported kind for param args: 2
Please help me out. Thanks again.
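A note on what that ValueError means: `registry.construct` inspects the target's `__init__` signature and fills named parameters from the config dict, and kind `2` is `inspect.Parameter.VAR_POSITIONAL`, i.e. a `*args` parameter that cannot be filled by name. A small self-contained illustration with hypothetical classes (not the repo's code):

```python
import inspect

class KeywordsOnly:
    def __init__(self, name, db_path):
        pass

class TakesVarArgs:
    def __init__(self, *args, db_path=None):
        pass

def check_constructible(cls):
    """Mimic the registry's loop: reject parameters that cannot be filled by name."""
    sig = inspect.signature(cls.__init__)
    for name, param in sig.parameters.items():
        if param.kind == inspect.Parameter.VAR_POSITIONAL:
            raise ValueError(f"Unsupported kind for param {name}: {param.kind.value}")

check_constructible(KeywordsOnly)  # fine
check_constructible(TakesVarArgs)  # ValueError: Unsupported kind for param args: 2
```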
@ujjawalcse Did you resolve the issue? I am also facing this KeyError: 'decoder.state_update._input_dropout_mask' issue.
I also encountered this problem, and my environment can't be changed to PyTorch 1.5 (mine is PyTorch 1.9), so I changed the code in `variational_lstm.py`: in `_hook_remove_dropout_masks_from_state_dict`, I wrapped each deletion in a check, i.e. `if prefix + '_input_dropout_mask' in state_dict: del state_dict[prefix + '_input_dropout_mask']`, and likewise for `'_h_dropout_mask'`. Then the model works. Hope it helps.
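For reference, the modified hook described above would look roughly like this (a sketch reconstructed from that description; the decorator and surrounding class are as in the original `variational_lstm.py`):

```python
@classmethod
def _hook_remove_dropout_masks_from_state_dict(cls, instance, state_dict, prefix, local_metadata):
    # Only delete the dropout-mask entries when they are actually present,
    # so checkpoints that lack them (e.g. under PyTorch 1.9) load cleanly.
    if prefix + '_input_dropout_mask' in state_dict:
        del state_dict[prefix + '_input_dropout_mask']
    if prefix + '_h_dropout_mask' in state_dict:
        del state_dict[prefix + '_h_dropout_mask']
```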
I have a problem: "Attempting to infer on untrained model in {logdir}, step={step}". I fixed it like this:

```python
def load_model(self, logdir, step):
    '''Load a model (identified by the config used for construction) and return it'''
    model = registry.construct('model', self.config['model'], preproc=self.model_preproc, device=self.device)
    model.to(self.device)
    model.eval()
    # 2. Restore its parameters
    saver = saver_mod.Saver({"model": model})
    # last_step = saver.restore(logdir, step=step, map_location=self.device, item_keys=["model"])
    # if not last_step:
    #     raise Exception(f"Attempting to infer on untrained model in {logdir}, step={step}")
    return model
```
What happens if I fix it like this?
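One caution about that change: `saver.restore` is the step that actually loads the trained weights into the freshly constructed model, so commenting it out means inference runs on randomly initialized parameters. The exception usually just indicates that no checkpoint was found for that `logdir`/`step`. An alternative is to keep the restore call and first verify the checkpoint file exists; a minimal sketch, assuming the `model_checkpoint-<8-digit step>` naming shown in the load message earlier in this thread:

```python
import os

logdir = "logdir/bart_run_1/bs=12,lr=1.0e-04,bert_lr=1.0e-05,end_lr=0e0,att=1"
step = 41000

# Assumption: checkpoints are named like model_checkpoint-00041000; adjust the
# pattern if your saver writes different filenames.
ckpt_path = os.path.join(logdir, f"model_checkpoint-{step:08d}")
if not os.path.exists(ckpt_path):
    raise FileNotFoundError(
        f"No checkpoint at {ckpt_path}; download or train a model, or point "
        f"logdir/step at an existing checkpoint."
    )

# With the file in place, the original load_model (saver.restore included)
# should no longer raise "Attempting to infer on untrained model".
```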
Hi,
Thanks for open-sourcing the project. I was trying this on a non-GPU Windows 10 machine (conda environment, Python 3.7.9, PyTorch 1.5). I was able to run the dataset preprocessing, but got the error below while running inference.
Can you guide me on where I need to make changes?