LeeSureman / Batch_Parallel_LatticeLSTM

Chinese NER using Lattice LSTM. Reproduction for ACL 2018 paper.

关于一些参数的问题 #16

Open 980202006 opened 4 years ago

980202006 commented 4 years ago

What does the `bi` argument in `args` mean? When I set `bi=True` and run `main_without_fitlog.py` (batch_size is 10, dataset is resumeNER), I get the following error:

```
Traceback (most recent call last):
  File "D:/work/NerAPIS/Batch_Parallel_LatticeLSTM-master/Batch_Parallel_LatticeLSTM-master/main_without_fitlog.py", line 205, in <module>
    callbacks=callbacks)
  File "C:\ProgramData\Anaconda3\lib\site-packages\fastNLP\core\trainer.py", line 520, in __init__
    batch_size=check_batch_size)
  File "C:\ProgramData\Anaconda3\lib\site-packages\fastNLP\core\trainer.py", line 920, in _check_code
    pred_dict = model(refined_batch_x)
  File "C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\work\NerAPIS\Batch_Parallel_LatticeLSTM-master\Batch_Parallel_LatticeLSTM-master\models.py", line 214, in forward
    embed_word_back, lexicon_count_back)
  File "C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\work\NerAPIS\Batch_Parallel_LatticeLSTM-master\Batch_Parallel_LatticeLSTM-master\modules.py", line 598, in forward
    skip_word_flat = skip_word_flat.view(batch_size*max_lexicon_count,self.word_input_size)
RuntimeError: shape '[8, 50]' is invalid for input of size 300
```

LeeSureman commented 4 years ago

It means "whether the model is bidirectional".

980202006 commented 4 years ago

Is this parameter currently usable? When I set it to True, I get the error above.

LeeSureman commented 4 years ago

I just ran it on a Linux system and got no error.

LeeSureman commented 4 years ago

Yes, it can be used.

980202006 commented 4 years ago

OK, I'll try it on Linux.

LeeSureman commented 4 years ago

It looks to me like the embedding_size setting may be the problem; the 300 in the error looks familiar.
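For context, the failing line is a plain `Tensor.view`, which raises exactly this `RuntimeError` whenever the configured `word_input_size` does not match the dimension of the word embeddings actually loaded. The sketch below uses toy numbers (not the repo's real shapes) just to reproduce the same class of mismatch:

```python
import torch

# Toy stand-ins for batch_size * max_lexicon_count and the embedding dims;
# these numbers are illustrative, not the values from the repo.
batch_times_count = 6
actual_embed_dim = 50      # dimension of the embeddings actually loaded
configured_input_size = 300  # what word_input_size was set to

# Flattened lexicon-word embeddings, as produced before the view() call.
skip_word_flat = torch.zeros(batch_times_count, actual_embed_dim)

try:
    # view() requires the element count to stay the same, so this fails
    # when the configured size disagrees with the real embedding dim.
    skip_word_flat.view(batch_times_count, configured_input_size)
except RuntimeError as e:
    print(e)  # "shape ... is invalid for input of size ..."
```

So checking that `word_embedding_size` (or the pretrained embedding file) matches `word_input_size` in the config should resolve this class of error.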

980202006 commented 4 years ago

好的