-
If I run the code with the default arguments (and use data.txt from the repository), I get the following message:
```
Traceback (most recent call last):
File "C:/Users/matej/git/xlnet-Pytorch/main.p…
-
When I do "python run_bert.py --do_data", I met this problem. My pytorch version is 1.5.0+cuda10.1
`OSError: [Errno 22] Invalid argument: 'D:\\Desktop\\BERT+XLNET Multi-label\\pybert\\output\\log\\…
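A frequent cause of Errno 22 on Windows is a character such as ':' in the generated log filename (for example from a timestamp). The following is only a minimal sketch of sanitizing such a name before opening the file; the timestamped name and the `sanitize_filename` helper are illustrative assumptions, not code from the repository:
```python
import re
from datetime import datetime
from pathlib import Path

def sanitize_filename(name: str) -> str:
    # Replace characters that Windows rejects in filenames (e.g. ':' from timestamps).
    return re.sub(r'[<>:"/\\|?*]', '-', name)

log_dir = Path("pybert/output/log")
log_dir.mkdir(parents=True, exist_ok=True)

# Hypothetical timestamped log name; '%H:%M:%S' would produce a ':' that Windows rejects.
raw_name = f"bert-{datetime.now().strftime('%Y-%m-%d %H:%M:%S')}.log"
log_path = log_dir / sanitize_filename(raw_name)

with open(log_path, "w") as f:
    f.write("logging started\n")
```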
-
If I want to train your model on a text classification task, I think the input has only one sentence (segment) per batch. In other words, is the input just [A, CLS] without B and SEP?
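For what it's worth, a minimal sketch of inspecting how a single-sentence input is encoded, assuming the Hugging Face XLNetTokenizer and the xlnet-base-cased checkpoint are available; in that tokenizer's convention the special tokens are appended at the end, so there is no segment B, but SEP and CLS still appear:
```python
from transformers import XLNetTokenizer

# Assumption: the Hugging Face "xlnet-base-cased" checkpoint is available or downloadable.
tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")

encoded = tokenizer("This movie was great.")
# Show the token sequence actually fed to the model: [...tokens of A..., '<sep>', '<cls>']
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
# Segment IDs for the single-sentence case (no segment B present).
print(encoded["token_type_ids"])
```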
-
Hello,
I want to run the text classification task on our own data:
1- In BERT the data is formatted as (id, label, etc.). I understood that for XLNet, the data needs to be formatted in the s…
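For reference, a minimal sketch of turning (id, text, label) rows into model-ready features with the Hugging Face tokenizer; the CSV layout, column names, and max_length value are assumptions for illustration, not the repository's actual format:
```python
import csv
from transformers import XLNetTokenizer

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
MAX_LEN = 128  # assumed maximum sequence length

features = []
with open("train.csv", newline="", encoding="utf-8") as f:
    # Assumed columns: id, text, label
    for row in csv.DictReader(f):
        enc = tokenizer(
            row["text"],
            truncation=True,
            max_length=MAX_LEN,
            padding="max_length",
        )
        features.append({
            "input_ids": enc["input_ids"],
            "attention_mask": enc["attention_mask"],
            "label": int(row["label"]),
        })
```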
-
I wonder if there is a vocabulary file for XLNet, so that, given a sentence, I could generate input_ids from this vocab myself instead of getting them from prepro_utils.encode_ids().
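The pre-trained XLNet checkpoints ship a SentencePiece model (spiece.model) rather than a plain vocab file. A minimal sketch of encoding a sentence to IDs directly with the sentencepiece library; the checkpoint path is a placeholder, and this skips the extra text preprocessing that prepro_utils.encode_ids() performs:
```python
import sentencepiece as spm

# Load the SentencePiece model shipped with the pre-trained checkpoint
# (path is a placeholder for wherever your checkpoint lives).
sp = spm.SentencePieceProcessor()
sp.Load("xlnet_cased_L-24_H-1024_A-16/spiece.model")

sentence = "XLNet uses a SentencePiece vocabulary."
pieces = sp.EncodeAsPieces(sentence)  # subword pieces
input_ids = sp.EncodeAsIds(sentence)  # corresponding vocabulary IDs
print(pieces)
print(input_ids)
```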
-
I'm using the "Custom usage of XLNet" section. I noticed that I was never asked to provide a directory where the actual weights of the pre-trained model are loaded from. I was asked only for a path to…
-
*In this paper, we propose the Attention on Attention (AoA) module, an extension to conventional attention mechanisms, to address the irrelevant attention issue. Furthermore, we propose AoANet fo…
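For context, a minimal PyTorch sketch of the gating idea this excerpt describes: the conventional attention result and the query are combined into an "information" vector and a sigmoid gate, which are multiplied element-wise. The layer layout and dimension names are my reading of the paper, not its released code:
```python
import torch
import torch.nn as nn

class AttentionOnAttention(nn.Module):
    """Sketch of AoA gating applied on top of a conventional attention result."""

    def __init__(self, dim: int):
        super().__init__()
        # Both transforms see the concatenation of the query and the attended vector.
        self.info = nn.Linear(2 * dim, dim)  # "information" vector
        self.gate = nn.Linear(2 * dim, dim)  # attention gate

    def forward(self, query: torch.Tensor, attended: torch.Tensor) -> torch.Tensor:
        x = torch.cat([query, attended], dim=-1)
        i = self.info(x)
        g = torch.sigmoid(self.gate(x))
        return g * i  # the gate suppresses irrelevant attention results

# Usage: gate the output of any attention layer with its query.
aoa = AttentionOnAttention(dim=512)
q = torch.randn(4, 10, 512)      # queries
v_hat = torch.randn(4, 10, 512)  # conventional attention output for those queries
out = aoa(q, v_hat)              # same shape: (4, 10, 512)
```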
-
Hi,
Can XLNet be applied to sequence labelling problems, for example Named Entity Recognition?
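A minimal sketch of what sequence labelling on top of XLNet could look like: classify each per-token hidden state. The Hugging Face checkpoint name and the label count are placeholders, and subword pieces would still need to be aligned with word-level tags:
```python
import torch.nn as nn
from transformers import XLNetModel, XLNetTokenizer

NUM_LABELS = 9  # e.g. the size of a BIO tag set; placeholder value

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
backbone = XLNetModel.from_pretrained("xlnet-base-cased")
classifier = nn.Linear(backbone.config.d_model, NUM_LABELS)

inputs = tokenizer("Barack Obama visited Berlin", return_tensors="pt")
hidden = backbone(**inputs).last_hidden_state  # (batch, seq_len, hidden)
logits = classifier(hidden)                    # (batch, seq_len, num_labels)
predicted_tags = logits.argmax(dim=-1)         # one tag per (sub)token
```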
-
@dirkneuhaeuser Thanks for making the world a better place; your classifier is extremely helpful for natural language understanding.
Unfortunately, 91% accuracy is still not really great for widespre…
-
The RoBERTa model is said to improve performance over BERT by 2~20%; verify whether the improvement actually shows up when applied in practice.
> References
> - RoBERTa: A Robustly Optimized BERT Pretraining Approach paper: https://arxiv.org/abs/1907.11692
> - BERT, RoBERTa, Di…
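
A minimal sketch of one way to check this: fine-tune the same classification setup with both backbones and compare the resulting metrics. The checkpoint names are the standard Hugging Face ones; everything else is placeholder scaffolding:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Swap only the checkpoint name to compare the two backbones under identical settings.
for checkpoint in ["bert-base-uncased", "roberta-base"]:
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
    # ... fine-tune and evaluate with the same data, optimizer, and schedule,
    # then compare the metrics to see whether the reported 2~20% gain holds.
```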