kosugi11037 / bert-int


result = self.forward(*input, **kwargs) File "/data/my_project/nlp_text/nlp_models/bert-int-master/basic_bert_unit/Basic_Bert_Unit_model.py", line 20, in forward cls_vec = sequence_output[:,0] #14

Closed KangChou closed 2 years ago

KangChou commented 2 years ago

In Batch_TrainData_Generator, train ill num: 4500
In Batch_TrainData_Generator, ent_ids1 num: 15000
In Batch_TrainData_Generator, ent_ids2 num: 15000
start training...
+++++++++++
Epoch:  0
+++++++++++
train ent1s num: 4500 train ent2s num: 4500 for_Candidate_ent1s num: 15000 for_candidate_ent2s num: 15000
Traceback (most recent call last):
  File "main.py", line 75, in <module>
    main()
  File "main.py", line 70, in main
    train(Model,Criterion,Optimizer,Train_gene,train_ill,test_ill,ent2data)
  File "/data/my_project/nlp_text/nlp_models/bert-int-master/basic_bert_unit/train_func.py", line 115, in train
    for_candidate_ent2s,entid2data,Train_gene.index2entity)
  File "/data/my_project/nlp_text/nlp_models/bert-int-master/basic_bert_unit/train_func.py", line 44, in generate_candidate_dict
    temp_emb = entlist2emb(Model,train_ent1s[i:i+batch_size],entid2data,CUDA_NUM).cpu().tolist()
  File "/data/my_project/nlp_text/nlp_models/bert-int-master/basic_bert_unit/train_func.py", line 26, in entlist2emb
    batch_emb = Model(batch_token_ids,batch_mask_ids)
  File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/data/my_project/nlp_text/nlp_models/bert-int-master/basic_bert_unit/Basic_Bert_Unit_model.py", line 20, in forward
    cls_vec = sequence_output[:,0]
TypeError: string indices must be integers
kosugi11037 commented 2 years ago

Thanks for your interest in our work! I think the error is due to API differences between versions of the transformers library (I guess you are using transformers 4.x). You can resolve this error in two ways:

(a) use transformers version 2.1.1, or (b) modify the code so that it works with the transformers version you are using. For example, you can apply a change along the lines of the sketch below.
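In transformers 4.x, `BertModel.forward` returns a `ModelOutput` object by default instead of a plain tuple, so tuple-unpacking the result yields the string keys of that object; indexing such a string with `[:,0]` is exactly what raises `TypeError: string indices must be integers`. Here is a minimal sketch of option (b); the attribute and variable names (`self.bert_model`, `batch_token_ids`, `batch_mask_ids`) are assumptions based on the traceback and may differ from the actual code in `Basic_Bert_Unit_model.py`:

```python
# Inside Basic_Bert_Unit_model.forward() -- names are illustrative.
# Option 1: request the old tuple-style return value (transformers >= 4.x).
outputs = self.bert_model(input_ids=batch_token_ids,
                          attention_mask=batch_mask_ids,
                          return_dict=False)
sequence_output = outputs[0]        # (batch_size, seq_len, hidden_size)
cls_vec = sequence_output[:, 0]     # embedding of the [CLS] token

# Option 2: keep the default ModelOutput and read the named field.
# outputs = self.bert_model(input_ids=batch_token_ids,
#                           attention_mask=batch_mask_ids)
# cls_vec = outputs.last_hidden_state[:, 0]
```

For option (a), pinning the dependency with `pip install transformers==2.1.1` restores the tuple return behavior the original code was written against.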

If you have any questions, please feel free to contact me.