
KG-BERT: BERT for Knowledge Graph Completion

'NoneType' object has no attribute 'lower' #4

Closed: By-he closed this issue 4 years ago

By-he commented 4 years ago

Hello, first of all, thank you for your contribution to the field; I was very happy to learn a lot from your paper. While reproducing your triple classification experiment, I ran into this error: 'NoneType' object has no attribute 'lower', raised at line 542 of run_bert_triple_classifier.py. I could not find a solution, so I am taking the liberty of writing to you, and I look forward to your reply; it would help resolve my doubts. Thanks again.

yao8839836 commented 4 years ago

@By-he

Thanks for your interest.

You may have forgotten to pass --task_name kg in the command: the only call to .lower() I can find in run_bert_triple_classifier.py is task_name = args.task_name.lower(), and it is not on line 542.
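For anyone hitting the same error, here is a minimal sketch of the failure mode, assuming --task_name is declared as an ordinary optional argparse flag (so it defaults to None when omitted); the actual declaration in the script may differ:

```python
import argparse

# Sketch: an optional argparse flag that is not supplied on the command
# line defaults to None, and calling a string method on None then fails.
parser = argparse.ArgumentParser()
parser.add_argument("--task_name", default=None, type=str)
args = parser.parse_args([])  # simulate running without --task_name

task_name = args.task_name.lower()
# AttributeError: 'NoneType' object has no attribute 'lower'
```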

By-he commented 4 years ago

Thank you very much for your answer; that problem has been resolved. But there is a new problem: TypeError: _isdir: path should be string, bytes or os.PathLike, not NoneType, raised at line 555: tokenizer = BertTokenizer.from_pretrained(args.bert_model, do_lower_case=args.do_lower_case). I look forward to your reply, thank you.

yao8839836 commented 4 years ago

@By-he

Have you used all the args in the following command? It's another NoneType error, which means some required inputs are None.

```
python run_bert_triple_classifier.py \
  --task_name kg \
  --do_train \
  --do_eval \
  --do_predict \
  --data_dir ./data/WN11 \
  --bert_model bert-base-uncased \
  --max_seq_length 20 \
  --train_batch_size 32 \
  --learning_rate 5e-5 \
  --num_train_epochs 3.0 \
  --output_dir ./output_WN11/ \
  --gradient_accumulation_steps 1 \
  --eval_batch_size 512
```
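If, for example, --bert_model is missing, args.bert_model is None, and from_pretrained presumably begins by checking whether its argument names a local directory; that check is where the reported TypeError comes from. A minimal sketch of just that failing check:

```python
import os

# Sketch: from_pretrained first tests whether the model name/path points
# at a local directory. With a None argument, that check itself raises.
os.path.isdir(None)
# TypeError: _isdir: path should be string, bytes or os.PathLike, not NoneType
```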

By-he commented 4 years ago

First of all, I am very sorry; I am still a beginner, so I had missed many details. The experiment now runs successfully. Thank you for taking the time to reply, and I wish you all the best in your research.

yao8839836 commented 4 years ago

@By-he

Thank you.