CAMeL-Lab / CAMeLBERT
Code and models for "The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models". EACL 2021, WANLP.
https://aclanthology.org/2021.wanlp-1.10
MIT License · 43 stars · 10 forks
Issues

| #  | Title                                                                                                         | Author     | Status | Age          | Comments |
|----|---------------------------------------------------------------------------------------------------------------|------------|--------|--------------|----------|
| #6 | Max length problem with bert-base-arabic-camelbert-mix-pos-msa                                                | dearden    | open   | 8 months ago | 2        |
| #5 | Loading dediacritic tool fails due to Emoji library dependency, and tokenizer model_max_length seems incorrect | ghost      | open   | 2 years ago  | 0        |
| #4 | Question regarding pretraining task                                                                           | ghaddarAbs | closed | 2 years ago  | 2        |
| #3 | How many epochs for pretraining?                                                                              | ghaddarAbs | closed | 2 years ago  | 2        |
| #2 | Same result for different tokenizers                                                                          | GRIGORR    | closed | 2 years ago  | 2        |
| #1 | Can you provide the splits?                                                                                   | moussaKam  | closed | 3 years ago  | 1        |