Dear Mr @Ravoxsg,
I'm trying to reproduce the evaluation results by following your suggested steps.
I modified two lines in the file `main_candidate_generation.py`:

```python
# Line 7:
sys.path.append("/content/SummaReranker/src/") # todo: change to your folder path
# Line 49:
default = "/content/SummaReranker/models/summareranker_reddit_bs_dbs_rouge_1_2_l/checkpoint-1000/pytorch_model.bin") # todo: change to where you saved the finetuned checkpoint
```
The command `!bash candidate_generation.sh` ran for a while and then threw the following error:
```
Traceback (most recent call last):
  File "main_candidate_generation.py", line 182, in <module>
    main(args)
  File "main_candidate_generation.py", line 155, in main
    model.load_state_dict(torch.load(args.load_model_path))
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py", line 1407, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for FTModel:
	Missing key(s) in state_dict: "pretrained_model.final_logits_bias", "pretrained_model.model.shared.weight", "pretrained_model.model.encoder.embed_tokens.weight", "pretrained_model.model.encoder.embed_positions.weight", "pretrained_model.model.encoder.layers.0.self_attn.k_proj.weight", "pretrained_model.model.encoder.layers.0.self_attn.k_proj.bias", "pretrained_model.model.encoder.layers.0.self_attn.v_proj.weight", "pretrained_model.model.encoder.layers.0.self_attn.v_proj.bias", "pretrained_model.model.encoder.layers.0.self_attn.q_proj.weight", "pretrained_model.model.encoder.layers.0.self_attn.q_proj.bias", "pretrained_model.model.encoder.layers.0.self_attn.out_proj.weight", "pretrained_model.model.encoder.layers.0.self_attn.out_proj.bias", "pretrained_model.model.encoder.layers.0.self_attn_layer_norm.weight", "pretrained_model.model.encoder.layers.0.self_attn_layer_norm.bias", "pretrained_model.model.encoder.layers.0.fc1.weight", "pretrained_model.model.encoder.layers.0.fc1.bias", "pretrained_model.model.encoder.layers.0.fc2.weight", "pretrained_model.model.encoder.layers.0.fc2.bias", "pretrained_model.model.encoder.layers.0.final_layer_norm.weight", "pretrained_model.model.encoder.layers.0.final_layer_norm.bias", "pretrained_model.model.encoder.layers.1.self_attn.k_proj.weight", "pretrained_model.model.encoder.layers.1.self_attn.k_proj.bias", "pretrained_model.model.encoder.layers.1.self_attn.v_proj.weight", "pretrained_model.model.encoder.layers.1.self_attn.v_proj.bias", "pretrained_model.model.encoder.layers.1.self_attn.q_proj.weight", "pretrained_model.model.encoder.layers.1.self_attn.q_proj.bias", "pretrained_model.model.encoder.layers.1.self_attn.out_proj.weight", "pretrained_model.model.encoder.layers.1.self_attn.out_proj.bias", "pretrained_model.model.encoder.layers.1.self_attn_layer_norm.weight", "pretrained_model.model.encoder.layers.1.self_attn_layer_norm.bias", "pretrained_model.model.encoder.layers.1.fc1.weight", "pretrained_model.model.encoder.layers.1.fc1.bias", "pretrained_model.model.encoder.layers.1.fc2.weight", "pretrained_model.model.encoder.layers.1.fc2.bias", "pretrained_model.model.encoder.layers.1.final_layer_norm.weight", ... "pretrained_model.encoder.layer.23.intermediate.dense.weight", "pretrained_model.encoder.layer.23.intermediate.dense.bias", "pretrained_model.encoder.layer.23.output.dense.weight", "pretrained_model.encoder.layer.23.output.dense.bias", "pretrained_model.encoder.layer.23.output.LayerNorm.weight", "pretrained_model.encoder.layer.23.output.LayerNorm.bias", "pretrained_model.pooler.dense.weight", "pretrained_model.pooler.dense.bias".
```
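Since every missing key starts with the `pretrained_model.` prefix, I suspect the checkpoint was saved with different key names than the `FTModel` wrapper expects. Here is a minimal sketch of how the two key sets could be compared (the `diff_keys` helper and the toy key names are my own illustration, not from the repository):

```python
# Compare the parameter names stored in a checkpoint against the names the
# model expects, to see whether they differ only by a wrapper prefix
# (e.g. "model.shared.weight" vs "pretrained_model.model.shared.weight").
def diff_keys(checkpoint_keys, model_keys):
    ckpt, model = set(checkpoint_keys), set(model_keys)
    missing = sorted(model - ckpt)     # expected by the model, absent from the file
    unexpected = sorted(ckpt - model)  # present in the file, unknown to the model
    return missing, unexpected

# Toy illustration: a checkpoint saved without the wrapper prefix.
saved = ["model.shared.weight", "model.encoder.layers.0.fc1.weight"]
expected = ["pretrained_model." + k for k in saved]
missing, unexpected = diff_keys(saved, expected)
```

In the actual script, the same comparison could be run on `torch.load(args.load_model_path).keys()` versus `model.state_dict().keys()`. If the names differ only by the prefix, remapping the keys before calling `load_state_dict` should work; `load_state_dict(..., strict=False)` would silence the error but leave those weights unloaded, so it is probably not a real fix.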
Could you please look into this issue when you have a chance? Thank you.