-
Cool package.
I wanted to try this with newer and better models.
-
Are BERT models supported?
-
Hi there. Thanks for the great library!
I have one issue regarding the use of BERT-based models. I trained several models by fine-tuning them on my custom dataset (roberta, luke, deberta, xlm-rober…
-
### Feature request
I would like to request that BetterTransformer not be deprecated.
### Motivation
I rely heavily on BetterTransformer for accelerating RoBERTa and BERT models.…
-
Confirm valid implementation
References:
> Loss? Loss is:
> Total span extraction loss is the sum of a Cross-Entropy for the start and end positions.
https://huggingface.co/transformers/v4…
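To check the quoted formula concretely, here is a minimal NumPy sketch of a span-extraction loss computed exactly as described: a cross-entropy over start-position logits plus a cross-entropy over end-position logits. The function names are illustrative, not part of any library API (note that some implementations average the two terms instead of summing them; this sketch follows the quote literally).

```python
import numpy as np

def cross_entropy(logits, target):
    """Negative log-softmax probability of the target index (numerically stable)."""
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[target]

def span_extraction_loss(start_logits, end_logits, start_pos, end_pos):
    """Total span loss = CE over start positions + CE over end positions."""
    return cross_entropy(start_logits, start_pos) + cross_entropy(end_logits, end_pos)

# Example: uniform logits over 4 tokens give CE = log(4) per term.
loss = span_extraction_loss(np.zeros(4), np.zeros(4), start_pos=1, end_pos=2)
```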
-
### Describe the issue
Hello,
see [also this discussion](https://github.com/microsoft/onnxruntime/discussions/22427). I'm opening this one because I think it's an issue, as sifting through previous issues…
-
**Describe the bug**
I tried to optimize a BERT model with bert_ptq_cpu.json, but it produced 7 output models.
Is there any way, or a config change, to get only one output model?
```
[2024-10-25 10:54:59,1…
-
Hi,
I want to take a sentence-transformer model (say, XLM-R) and extend its context length using RoPE. How can I do this? Could you provide code for it?
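As a rough illustration of the idea behind the question: RoPE rotates each pair of embedding dimensions by a position-dependent angle, and a common way to extend context length is position interpolation, i.e. dividing positions by a scale factor so longer sequences map back into the trained range. The sketch below is NumPy-only and purely illustrative; XLM-R itself uses learned absolute position embeddings, so actually swapping in RoPE would require architectural changes and further training. All function names here are hypothetical.

```python
import numpy as np

def rope_angles(seq_len, dim, base=10000.0, scale=1.0):
    # Position interpolation: divide positions by `scale` so a longer
    # context is squeezed into the originally trained position range.
    pos = np.arange(seq_len)[:, None] / scale          # (seq_len, 1)
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))  # (dim/2,)
    return pos * inv_freq[None, :]                     # (seq_len, dim/2)

def apply_rope(x, scale=1.0):
    # x: (seq_len, dim) with even dim; rotate pairs (x0,x1), (x2,x3), ...
    seq_len, dim = x.shape
    ang = rope_angles(seq_len, dim, scale=scale)
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because each pair is a pure 2-D rotation, the transform preserves vector norms, and position 0 is left unrotated regardless of the scale factor.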
-
This issue will be used to track compilation failures for migraphx models on CPU and GPU. Compile failures for each model should have a link to an issue with a smaller reproducer in the notes column.
…
-
```
    privacy_engine = PrivacyEngine(
  File "/home/idris/.local/lib/python3.10/site-packages/private_transformers/privacy_engine.py", line 176, in __init__
    raise ValueError(
ValueError: Model type …
```