Hello! We have several documents that can help you get started! First of all, the quicktour, and the free course of the HF ecosystem may help you out.
What about my code above? Is this the correct way of doing things?
@pratikchhapolika Hi Pratik, yes, you can use most models for sequence classification. You can do the following:
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("name_of_base_model")
model = AutoModelForSequenceClassification.from_pretrained("name_of_base_model")
# name_of_base_model can be bert-base-cased, albert-base-v2, roberta-large, etc.
The full list is here. You can then use the model and fine-tune it on the desired classification task (e.g. GLUE / SuperGLUE).
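To make the fine-tuning step concrete, here is a minimal sketch using the Trainer API. The bert-base-cased checkpoint, the GLUE SST-2 dataset, and the hyperparameters below are illustrative assumptions, not the only valid choices:
# Minimal fine-tuning sketch; checkpoint, dataset, and hyperparameters are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)
dataset = load_dataset("glue", "sst2")  # single-sentence binary classification task
def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True, padding="max_length")
encoded = dataset.map(tokenize, batched=True)
args = TrainingArguments(output_dir="out", num_train_epochs=1,
                         per_device_train_batch_size=16)
trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"], eval_dataset=encoded["validation"])
trainer.train()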
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I want to solve a Multiclass-Multilabel (MLMC) classification problem using the Conv-BERT model. The steps I have taken are:
I downloaded the Conv-BERT model from this link: https://huggingface.co/YituTech/conv-bert-base (YituTech/conv-bert-base)
I want to understand: can we call any classification module from Hugging Face and pass any pre-trained model to it, like RoBERTa, Conv-BERT, and so on (as in the example above)? Or is it mandatory to use a Conv-BERT classification pre-trained model?
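For reference, AutoModelForSequenceClassification accepts any supported checkpoint, including YituTech/conv-bert-base, so a separate Conv-BERT-specific classification model is not required. A minimal sketch of the multi-label setup follows; the num_labels value is an assumption about the task, and problem_type="multi_label_classification" switches the loss to BCEWithLogitsLoss:
# Minimal multi-label sketch with Conv-BERT; num_labels is an assumed task parameter.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("YituTech/conv-bert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "YituTech/conv-bert-base",
    num_labels=5,  # assumption: replace with your label count
    problem_type="multi_label_classification",  # uses BCEWithLogitsLoss
)
inputs = tokenizer("example text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.sigmoid(logits)  # independent per-label probabilities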