utterworks / fast-bert

Super easy library for BERT based NLP models

Support for multi-label and multi-class text classification using DistilBERT #60

Open emtropyml opened 5 years ago

emtropyml commented 5 years ago

How can I use DistilBERT for multi-label classification to build a fast, deployable model?

kaushaltrivedi commented 5 years ago

Just set the model_type parameter to distilbert and the pretrained weights name to distilbert-base-uncased.
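
A minimal sketch of that configuration, following the usage pattern in the fast-bert README (the paths, file names, and label columns below are placeholders to adapt to your project):

```python
import logging
import torch

from fast_bert.data_cls import BertDataBunch
from fast_bert.learner_cls import BertLearner
from fast_bert.metrics import accuracy_multilabel

logger = logging.getLogger()
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Placeholder paths and file names -- adjust to your project layout.
databunch = BertDataBunch(
    'data/', 'labels/',
    tokenizer='distilbert-base-uncased',       # pretrained weights name
    train_file='train.csv',
    val_file='val.csv',
    label_file='labels.csv',
    text_col='text',
    label_col=['toxic', 'obscene', 'insult'],  # one column per label for multi-label
    batch_size_per_gpu=16,
    max_seq_length=256,
    multi_gpu=False,
    multi_label=True,
    model_type='distilbert')                   # as suggested above

learner = BertLearner.from_pretrained_model(
    databunch,
    pretrained_path='distilbert-base-uncased',
    metrics=[{'name': 'accuracy', 'function': accuracy_multilabel}],
    device=device,
    logger=logger,
    output_dir='output/',
    multi_gpu=False,
    is_fp16=False,
    multi_label=True)

learner.fit(epochs=4, lr=6e-5, schedule_type='warmup_linear')
```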

mohammedayub44 commented 4 years ago

@kaushaltrivedi How about the multilingual model, "DistilmBERT"? (I want to run this for different languages.)

Is the following config correct? model_type='DistilmBERT' and tokenizer='distilbert-base-multilingual-cased'

When I check databunch.tokenizer, it shows transformers.tokenization_distilbert.DistilBertTokenizer.

Thanks!
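
For reference, following the maintainer's earlier answer, model_type selects the architecture rather than a specific checkpoint, so it would presumably stay 'distilbert' (lowercase) while the multilingual checkpoint name is passed as the tokenizer and pretrained path. A minimal sketch under that assumption:

```python
from fast_bert.data_cls import BertDataBunch

# Assumption based on the answer above: model_type stays 'distilbert';
# only the checkpoint name changes to the multilingual weights.
databunch = BertDataBunch(
    'data/', 'labels/',
    tokenizer='distilbert-base-multilingual-cased',
    train_file='train.csv',
    val_file='val.csv',
    label_file='labels.csv',
    text_col='text',
    label_col='label',
    multi_label=False,
    model_type='distilbert')

# Seeing DistilBertTokenizer from databunch.tokenizer is expected here:
# 'distilbert-base-multilingual-cased' is itself a DistilBERT checkpoint.
print(type(databunch.tokenizer))
```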