-
Research and evaluate different LLMs (e.g., BERT, RoBERTa, XLNet) for their suitability in the bioinformatics domain.
-> Research and document the strengths and weaknesses of each model. Crea…
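A minimal sketch of one way such a comparison could start, assuming the Hugging Face `transformers` library; the checkpoint names and the biomedical sample sentence are illustrative placeholders, not part of the original task:
```
# Illustrative only: load each candidate checkpoint and compare basic
# properties (parameter count, tokenization of a biomedical sentence).
from transformers import AutoModel, AutoTokenizer

candidates = ["bert-base-uncased", "roberta-base", "xlnet-base-cased"]
sample = "The BRCA1 mutation increases the risk of breast cancer."

for name in candidates:
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters, {len(tokenizer.tokenize(sample))} tokens for the sample")
```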
-
Hello, I had previously finished fine-tuning my model, and merging the models also went without problems, but this week when using it I suddenly found that calling it through either FlagEmbedding or Huggingface produces:
File "/opt/conda/lib/python3.8/site-packages/FlagEmbedding/flag_reranker.py", line 158, in __init__
self.tokenizer …
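For context, a minimal sketch of the kind of call that reaches `flag_reranker.py`'s `__init__` (where the traceback above stops at the tokenizer), assuming the FlagEmbedding `FlagReranker` API; the model path is a placeholder for the merged checkpoint directory:
```
# Minimal sketch; "path/to/merged-reranker" is a placeholder directory.
from FlagEmbedding import FlagReranker

# __init__ loads the tokenizer from the model directory, which is where the
# traceback above is raised.
reranker = FlagReranker("path/to/merged-reranker", use_fp16=True)
score = reranker.compute_score(["what is a reranker?", "A reranker scores query-passage pairs."])
print(score)
```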
-
Hi,
I am trying to replicate the Natural Language Inference (NLI) task example to train a cross-encoder on a 15-label dataset.
When using the `distilroberta-base` I get the following error on tr…
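A minimal sketch of the setup being described, assuming the `sentence-transformers` CrossEncoder API; the training pairs and integer labels are placeholders, and `num_labels=15` mirrors the 15-label dataset mentioned above:
```
# Minimal sketch; the training pairs and labels below are placeholders.
from torch.utils.data import DataLoader
from sentence_transformers import InputExample
from sentence_transformers.cross_encoder import CrossEncoder

model = CrossEncoder("distilroberta-base", num_labels=15)  # 15-way classification head

train_samples = [
    InputExample(texts=["premise sentence", "hypothesis sentence"], label=3),
    InputExample(texts=["another premise", "another hypothesis"], label=7),
]
train_dataloader = DataLoader(train_samples, shuffle=True, batch_size=2)

model.fit(train_dataloader=train_dataloader, epochs=1, warmup_steps=10)
```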
-
Hi,
I am trying to set up sem2vec-BERT based on the instructions given in the `README`. I am facing the following issues while fine-tuning the RoBERTa model by running `python src/fine_tune.py`:
```
…
-
### **I am trying to deploy and run inference with the XLM-RoBERTa model on TRT-LLM.**
I followed the example guide for BERT and built the engine: (https://github.com/NVIDIA/TensorRT-LLM/tree/main/examples/be…
-
Hello, while recently replicating this project I found that the 'reberta target' model no longer exists on the Hugging Face website. Can the missing model be replaced with another model in this project?
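As a general illustration (not taken from this project's code), a minimal sketch of pointing the `transformers` Auto classes at a replacement checkpoint; `roberta-base` here is only a stand-in for whatever substitute model is chosen:
```
# Minimal sketch; "roberta-base" stands in for the chosen replacement checkpoint.
from transformers import AutoModel, AutoTokenizer

replacement = "roberta-base"
tokenizer = AutoTokenizer.from_pretrained(replacement)
model = AutoModel.from_pretrained(replacement)

inputs = tokenizer("a quick smoke-test sentence", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```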
-
Hey! Thanks for this work, brilliant!
I'm trying to run your model on a MacBook M1 (no CUDA) with the following code:
```
from accord_nlp.text_classification.ner.ner_model import NERModel
model = NE…
-
### Describe the issue
Following [issue 155](https://github.com/microsoft/LLMLingua/issues/155), I'm trying to reproduce the results of the official [llmlingua-2-xlm-roberta-large-meetingbank](…
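A minimal sketch of the usual LLMLingua-2 entry point for that checkpoint, assuming the `llmlingua` package's `PromptCompressor`; the prompt text and compression rate are placeholders:
```
# Minimal sketch; the prompt and the 0.33 rate are placeholders.
from llmlingua import PromptCompressor

compressor = PromptCompressor(
    model_name="microsoft/llmlingua-2-xlm-roberta-large-meetingbank",
    use_llmlingua2=True,  # use the LLMLingua-2 token-classification compressor
)
result = compressor.compress_prompt("a long meeting transcript goes here ...", rate=0.33)
print(result["compressed_prompt"])
```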
-
### Describe the issue
GPU RAM gets exhausted during inference (after a certain number of calls); memory usage keeps increasing seemingly at random.
### To reproduce
### Reproduction instructions
```
sess_optio…
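# (Not from the original report.) A minimal sketch of a session configured to
# cap CUDA memory arena growth; "model.onnx" and the 2 GiB limit are
# placeholders, and the provider options are standard onnxruntime ones.
import onnxruntime as ort

sess_options = ort.SessionOptions()
providers = [(
    "CUDAExecutionProvider",
    {"gpu_mem_limit": 2 * 1024 ** 3, "arena_extend_strategy": "kSameAsRequested"},
)]
session = ort.InferenceSession("model.onnx", sess_options, providers=providers)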
-
## Information
The problem arises in chapter:
* [ ] Introduction
* [x] Text Classification
* [ ] Transformer Anatomy
* [ ] Multilingual Named Entity Recognition
* [ ] Text Generation
* [ ] …