Open juliensalinas opened 2 years ago
Lots of different configurations to consider. :)
Thank you again for reporting.
Actually, this model is not fully supported. It uses the BartForSequenceClassification architecture, but we don't support the additional classification head at the moment.
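For context, the missing piece is small: in transformers, BartForSequenceClassification just adds a pooling head (dense → tanh → projection, applied to the final EOS token's hidden state) on top of the base seq2seq model. A minimal NumPy sketch of what that head computes, with illustrative weights and dimensions (the real implementation is the BartClassificationHead module):

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, num_labels = 1024, 3  # bart-large hidden size; label count is illustrative

# Illustrative random weights standing in for the trained head parameters.
W_dense = rng.standard_normal((hidden_size, hidden_size)) * 0.02
W_out = rng.standard_normal((hidden_size, num_labels)) * 0.02

def classification_head(eos_hidden: np.ndarray) -> np.ndarray:
    # dense -> tanh -> out_proj over the decoder's final EOS hidden state,
    # mirroring the structure of transformers' BartClassificationHead.
    return np.tanh(eos_hidden @ W_dense) @ W_out

# One (batch, hidden_size) EOS hidden state per sequence -> (batch, num_labels) logits.
logits = classification_head(rng.standard_normal((2, hidden_size)))
print(logits.shape)  # (2, 3)
```

Supporting these models would mainly mean exporting those extra head weights and applying them to the decoder output after the forward pass.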
Thanks for investigating @guillaumekln !
I am seconding this. It would be great to support BERT-like encoder-only models with a classification head. More specifically, if we could use pre-trained parsers like these: https://ufal.mff.cuni.cz/udpipe/2/models, it would make them easier to integrate into pipelines.
This issue is for BART, the sequence-to-sequence model. I created another issue for encoder-only models like BERT: #1008.
Hello again,
I'm trying to convert this adaptation of Bart Large MNLI: https://huggingface.co/joeddav/bart-large-mnli-yahoo-answers
The conversion fails with the following error (the base Bart Large MNLI model converts fine):
Thanks in advance!