What does the `bi` option in args mean? When I set `bi` to True and run main_without_fitlog.py, I get the following error (batch_size is 10, and the dataset is resumeNER):
Traceback (most recent call last):
  File "D:/work/NerAPIS/Batch_Parallel_LatticeLSTM-master/Batch_Parallel_LatticeLSTM-master/main_without_fitlog.py", line 205, in <module>
    callbacks=callbacks)
  File "C:\ProgramData\Anaconda3\lib\site-packages\fastNLP\core\trainer.py", line 520, in __init__
    batch_size=check_batch_size)
  File "C:\ProgramData\Anaconda3\lib\site-packages\fastNLP\core\trainer.py", line 920, in _check_code
    pred_dict = model(refined_batch_x)
  File "C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\work\NerAPIS\Batch_Parallel_LatticeLSTM-master\Batch_Parallel_LatticeLSTM-master\models.py", line 214, in forward
    embed_word_back, lexicon_count_back)
  File "C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\work\NerAPIS\Batch_Parallel_LatticeLSTM-master\Batch_Parallel_LatticeLSTM-master\modules.py", line 598, in forward
    skip_word_flat = skip_word_flat.view(batch_size*max_lexicon_count, self.word_input_size)
RuntimeError: shape '[8, 50]' is invalid for input of size 300
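For context, the failing `view` call is essentially an element-count check: PyTorch's `view` only succeeds when the requested shape holds exactly as many elements as the flattened tensor. A minimal sketch of the arithmetic behind the error message (the numbers 8, 50, and 300 are taken straight from the traceback; the variable names are hypothetical stand-ins for `batch_size*max_lexicon_count` and `self.word_input_size`):

```python
# Reproduces the size check behind "shape '[8, 50]' is invalid for input of size 300".
# torch's Tensor.view() requires that the requested shape contain exactly as
# many elements as the tensor being reshaped.
numel = 300                    # elements actually in skip_word_flat (from the error)
requested_shape = (8, 50)      # (batch_size * max_lexicon_count, word_input_size)
needed = requested_shape[0] * requested_shape[1]

print(needed)                  # 400
print(needed == numel)         # False -> this is where torch raises RuntimeError
```

So when `bi=True`, the backward-direction inputs (`embed_word_back`, `lexicon_count_back`) arriving at modules.py line 598 do not flatten to a multiple of `word_input_size`, which suggests the backward lexicon tensors are being built with mismatched dimensions rather than a problem with `view` itself.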