LeeSureman / Flat-Lattice-Transformer

code for ACL 2020 paper: FLAT: Chinese NER Using Flat-Lattice Transformer

Code error when running the project #132

Open wl1055834702 opened 5 months ago

wl1055834702 commented 5 months ago

Can anyone help me figure out what the problem is? The code fails with: ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (17831,) + inhomogeneous part.

The bert-wwm-cn model was downloaded manually and contains the files pytorch_model.bin, bert_config.json, and vocab.txt.

Full traceback:

```
Traceback (most recent call last):
  File "/mnt/d/pythonProject/pythonProject/Flat-Lattice-Transformer-master/V1/flat_main.py", line 522, in <module>
    bert_embedding = BertEmbedding(vocabs['lattice'], model_dir_or_name='../bertmodels/chinese_wwm_pytorch', requires_grad=False,
  File "/mnt/d/pythonProject/pythonProject/Flat-Lattice-Transformer-master/fastNLP_module.py", line 364, in __init__
    self.model = _WordBertModel(model_dir_or_name=model_dir_or_name, vocab=vocab, layers=layers,
  File "/home/ubuntu/anaconda3/envs/pytorch/lib/python3.8/site-packages/fastNLP/embeddings/bert_embedding.py", line 314, in __init__
    self.word_to_wordpieces = np.array(word_to_wordpieces)
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (17831,) + inhomogeneous part.
```
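For context (not from the original thread): this ValueError typically appears when `np.array(...)` is called on a ragged list of lists, i.e. sub-lists of different lengths, which NumPy 1.24+ no longer converts implicitly. `word_to_wordpieces` is ragged because different vocabulary words split into different numbers of wordpieces. A minimal sketch of the symptom and the usual workaround, assuming a NumPy 1.24+ environment:

```python
import numpy as np

# A ragged nested sequence: rows have different lengths, like a
# word -> wordpiece-ids mapping where words split into 1..N pieces.
ragged = [[1, 2], [3, 4, 5]]

# On NumPy >= 1.24 this raises the same ValueError as in the traceback;
# older versions only emitted a deprecation warning.
try:
    np.array(ragged)
except ValueError as e:
    print("reproduced:", e)

# Passing dtype=object keeps the ragged structure and avoids the error;
# patching the failing fastNLP line this way is one common workaround
# (downgrading NumPy below 1.24 is another).
arr = np.array(ragged, dtype=object)
print(arr.shape)  # (2,)
```

Whether to patch the installed fastNLP source or pin an older NumPy depends on your environment; both address the same root cause.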