brightmart / bert_language_understanding

Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN

Three undefined names #2

Closed cclauss closed 6 years ago

cclauss commented 6 years ago

flake8 testing of https://github.com/brightmart/bert_language_understanding on Python 3.7.1

```
$ flake8 . --count --select=E901,E999,F821,F822,F823 --show-source --statistics

./pretrain_task.py:260:21: F821 undefined name 'count'
            count = count + 1
                    ^
./model/bert_cnn_model.py:318:41: F821 undefined name 'batch_size'
            p_mask_lm=[i for i in range(batch_size)]
                                        ^
./model/bert_model.py:227:41: F821 undefined name 'batch_size'
            p_mask_lm=[i for i in range(batch_size)]
                                        ^
3     F821 undefined name 'batch_size'
3
```
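For reference, a minimal sketch of how each F821 warning could be resolved if the code were kept rather than deleted. The surrounding loop, the placeholder data, and the helper function `mask_lm_positions` are assumptions for illustration, not code taken from the repository:

```python
# Hypothetical fixes for the three F821 warnings (sketch, not the repo's code).

# pretrain_task.py: `count` is incremented without ever being initialized,
# so define it before the loop.
examples = ["a", "b", "c"]        # placeholder data
count = 0
for example in examples:
    count = count + 1

# bert_model.py / bert_cnn_model.py: `batch_size` is read from an undefined
# name; pass it in explicitly (or use self.batch_size inside the class).
def mask_lm_positions(batch_size):
    p_mask_lm = [i for i in range(batch_size)]
    return p_mask_lm

print(count)                      # 3
print(mask_lm_positions(4))       # [0, 1, 2, 3]
```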
guotong1988 commented 6 years ago

The code in question is not used anywhere.

cclauss commented 6 years ago

Should the dead code be eliminated?

brightmart commented 6 years ago

Yeah, thank you.