Dear authors,

I currently want to use PhoBERT ("PhoBERT: Pre-trained language models for Vietnamese") as the pre-trained BERT language model for this repo. PhoBERT was built and deployed in the huggingface/transformers library.
As far as I know, self-attentive-parser uses pytorch_pretrained_bert to load the BERT model. I tried changing the code of the function get_bert in parse_nk.py to use PhoBERT, but I get this error:
Traceback (most recent call last):
  File "src/main.py", line 612, in <module>
    main()
  File "src/main.py", line 608, in main
    args.callback(args)
  File "src/main.py", line 564, in <lambda>
    subparser.set_defaults(callback=lambda args: run_train(args, hparams))
  File "src/main.py", line 312, in run_train
    _, loss = parser.parse_batch(subbatch_sentences, subbatch_trees)
  File "/home/kynh/codes/self-attentive-parser/src/parse_nk.py", line 1026, in parse_batch
    features_packed = features.masked_select(all_word_end_mask.to(torch.bool).unsqueeze(-1)).reshape(-1, features.shape[-1])
AttributeError: 'str' object has no attribute 'masked_select'
I read a paper that used PhoBERT for training with this repo, so I am fairly sure this can be done, but I do not know how to do it.
Any solution? Thanks!
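For anyone hitting the same AttributeError: a plausible explanation (an assumption on my part, since the modified get_bert code is not shown) is that huggingface/transformers models return a dict-like ModelOutput, while code written for pytorch_pretrained_bert expects plain tensors or tuples; unpacking the output like a tuple then yields a string key rather than a tensor. A minimal sketch of the gotcha, using an ordinary dict as a stand-in so it runs without torch or transformers:

```python
# Stand-in for a transformers ModelOutput, which is dict-like
# (an assumption about the cause; not taken from the repo's code).
outputs = {"last_hidden_state": [[0.0] * 8]}

# Iterating/unpacking a dict-like object yields its KEYS, not its values,
# so `features` silently becomes the string "last_hidden_state" ...
features, = outputs
assert features == "last_hidden_state"  # a str -> .masked_select() would raise AttributeError

# ... whereas indexing the field explicitly gives the actual hidden states:
features = outputs["last_hidden_state"]
assert features == [[0.0] * 8]
```

So after switching to transformers, the hidden states likely need to be taken explicitly from the model output (e.g. via its last_hidden_state field or index 0) before the masked_select call in parse_nk.py.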