-
Hello 👋
I tried `ContextualWordEmbsAug` with the `xlm-roberta-base` model, but it does not seem to be supported. I needed it to do augmentation on a language that is not available in the `bert-base…
-
### Describe the issue
Greetings,
Are there any plans to release instructions, or at least the dataset format, so we can fine-tune the `llmlingua-2-xlm-roberta-large-meetingbank` or the base `xlm-…
-
Hello,
I added some similar code (adding an XLM-RoBERTa tokenizer and encoder description) to use xlm-roberta-base from Hugging Face instead of the BERT encoder model. The problem is that when I try to train on xlm…
-
I plan to implement two transformer models and choose the one that performs best in evaluation.
T5 (Text-to-Text Transfer Transformer) is a powerful and flexible Transformer-based languag…
-
While compiling models like [HuggingFace protectai/xlm-roberta-base-language-detection-onnx](https://huggingface.co/protectai/xlm-roberta-base-language-detection-onnx) or [mistralai/Mistral-7B-v0.1](h…
-
In reflection-classification/models/README.md, the link in the gdown snippet should be updated.
Original line:
`gdown https://drive.google.com/uc?id=1Sv0OpLyi13HA5miRxbi5HNGmziJ0yQwt -O xlm-roberta…
-
### Motivation.
As vllm supports more and more models and features, they require different attention, scheduler, executor, and input/output processors. These modules are becoming increasingly com…
-
Hello, I'd like to fine-tune on my own dataset. Could you provide a tutorial? Thank you.
-
Hi, thanks for the great example on training RoBERTa with long attention.
I followed this example: https://github.com/allenai/longformer/blob/master/scripts/convert_model_to_long.ipynb
I was able to s…
-
This has been a long-standing request.
It recently resurfaced with #1534 .
#1342 is also an example of how to perform classification, but the HF library is needed.
Technically this is not very difficult sinc…