Tushar-Faroque opened this issue 3 years ago
@Tushar-Faroque Hello, I'd like to ask which file you use as the `-pm_p` parameter. I tried each of the files in the BERT model directory in turn:
bert_config.json
bert_model.ckpt.data-00000-of-00001
bert_model.ckpt.index
vocab.txt
but the program reported an error:

Traceback (most recent call last):
  File "E:/2-Unrunable code/exBERT-master/Pretraining.py", line 119, in <module>
    stat_dict = t.load(args['pretrained_model_path'], map_location='cpu')
  File "D:\Anaconda3\lib\site-packages\torch\serialization.py", line 593, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "D:\Anaconda3\lib\site-packages\torch\serialization.py", line 762, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, '\x0a'.
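The cause of that error: `torch.load` expects a file saved by `torch.save`, which is pickle-based. The files you tried are either TensorFlow checkpoint shards (`bert_model.ckpt.*`) or plain text (`bert_config.json`, `vocab.txt`), so the unpickler rejects the very first byte; `\x0a` is a newline, typical of a text file. A minimal sketch reproducing the same failure with the standard `pickle` module (the synthetic bytes below are just an illustration, not an exBERT file):

```python
import io
import pickle

# torch.load ultimately unpickles the file. Feeding it anything that is
# not a pickle (a TF checkpoint, a JSON config, a vocab file) fails on
# the first byte with "invalid load key". A leading newline gives '\x0a'.
not_a_pickle = io.BytesIO(b"\n{ \"hidden_size\": 768 }")
try:
    pickle.load(not_a_pickle)
except pickle.UnpicklingError as err:
    print(err)  # invalid load key, '\x0a'.
```

So `-pm_p` needs a PyTorch state dict (a `.pt`/`.bin` file saved with `torch.save`), not the raw TensorFlow checkpoint files.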
Hello,
I used the file /content/exBERT-master/state_dict_model.pt for `-pm_p`. The placeholder path_to_state_dict_of_the_OFF_THE_SHELF_MODEL can point to the state dict of any off-the-shelf pre-trained model with the BERT architecture.
I have successfully run the model, and now I have the file Best_stat_dic_exBERT. How can I load this file and use it for tokenization and creating embeddings?
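A minimal sketch of the standard PyTorch pattern for loading a saved state dict such as Best_stat_dic_exBERT. Note the assumptions: `TinyEncoder` is a stand-in class, not the exBERT model; to use the real file you would instead instantiate the exBERT model with the same config used for training, and obtain token ids from `vocab.txt` (e.g. via a BERT tokenizer), since tokenization is separate from the saved weights:

```python
import os
import tempfile

import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Stand-in for the exBERT model class (assumption, not the real repo code)."""
    def __init__(self, vocab_size=30522, hidden=16):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden)

    def forward(self, input_ids):
        # Returns (batch, seq_len, hidden) token embeddings.
        return self.emb(input_ids)

model = TinyEncoder()

# Simulate how the training script would have produced the file:
path = os.path.join(tempfile.mkdtemp(), "Best_stat_dic_exBERT")
torch.save(model.state_dict(), path)

# Loading for inference; map_location='cpu' mirrors Pretraining.py.
stat_dict = torch.load(path, map_location="cpu")
model.load_state_dict(stat_dict)
model.eval()

with torch.no_grad():
    # Hard-coded ids for illustration; real ids come from the tokenizer/vocab.txt.
    input_ids = torch.tensor([[101, 7592, 102]])
    embeddings = model(input_ids)
print(embeddings.shape)  # torch.Size([1, 3, 16])
```

The key point is that `torch.load` only returns the weights; you must first construct a model object with the matching architecture and then call `load_state_dict` on it.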
Hi, do you know how to load this model now? I recently came across this problem as well.