helboukkouri / character-bert
Main repository for "CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters"
Apache License 2.0 · 195 stars · 47 forks
Issues
#27  Error when Downloading Pretrained models  (mauriciotoro, opened 5 months ago, 0 comments)
#26  BUG in the downloader  (Sequential-circuits, closed 12 months ago, 1 comment)
#25  tarfile.ReadError: not an lzma file  (thibault-roux, closed 12 months ago, 2 comments)
#24  Small error in the Readme  (thibault-roux, closed 1 year ago, 1 comment)
#23  Does this support mask filling task?  (ZonglinL, closed 12 months ago, 1 comment)
#22  How to finetuning the model  (CsAbdulelah, closed 12 months ago, 1 comment)
#21  MLM head pre-trained weights of character-bert  (alexlimh, closed 12 months ago, 1 comment)
#20  CharacterBERT in information retrieval task  (ArvinZhuang, closed 2 years ago, 1 comment)
#19  MEDNLI noisy text  (loubnabnl, closed 2 years ago, 2 comments)
#18  Word-level padding vs Character-level padding  (IstiaqAnsari, closed 12 months ago, 3 comments)
#17  Multilingual version  (Mrs-Hudson, closed 2 years ago, 1 comment)
#16  Printing character level vectors  (ozturkoktay, closed 12 months ago, 2 comments)
#15  Pretrain character bert with new data  (steveguang, closed 3 years ago, 1 comment)
#14  CharacterBertTokenizer support for latin characters  (MatheusNtg, closed 3 years ago, 3 comments)
#13  How do I use word embeddings?  (steveguang, closed 3 years ago, 1 comment)
#12  Fine tuning with Trainer from huggingface  (MatheusNtg, closed 3 years ago, 8 comments)
#11  unroling of the word representations for the loss function  (XMaster96, closed 3 years ago, 1 comment)
#10  Access hidden layers of character-bert  (vguptai, closed 3 years ago, 1 comment)
#9   Multi-Label Support  (thiagosantos1, closed 3 years ago, 3 comments)
#8   CharBERT out of date with latest Transformers? + using charBERT to output probability of sequence of characters?  (bhargavvader, closed 3 years ago, 1 comment)
#7   CharBERT will not generate vocab.txt?  (Believer215, closed 3 years ago, 1 comment)
#6   Huggingface implementation  (Tahlor, closed 12 months ago, 3 comments)
#5   Which layer should I use if I only want to embed char  (zmddzf, closed 3 years ago, 3 comments)
#4   How do I pre-train CharacterBERT?  (gianfilippo, closed 3 years ago, 4 comments)
#3   pre-trained models for bert-multilingual-base-uncased  (adeepH, closed 3 years ago, 1 comment)
#2   Generating word embeddings  (mlaugharn, closed 3 years ago, 1 comment)
#1   CharacterCNN mask not use ?  (Shiro-LK, closed 3 years ago, 2 comments)