-
Hi,
I want to take a sentence-transformer model (say, XLM-R) and extend its context length using RoPE. How can I do this? Can you provide code for it?
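Not an official recipe, but the core operation can be sketched. Note that XLM-R uses learned absolute position embeddings, so adopting RoPE means patching its attention layers and fine-tuning afterwards; the function name and `scale` parameter below are illustrative assumptions, not sentence-transformers API. A minimal NumPy sketch of rotary embeddings with position interpolation (scaling positions down so a model trained on short contexts can address longer ones):

```python
import numpy as np

def rotary_embed(x, base=10000.0, scale=1.0):
    """Apply rotary position embeddings to x of shape (seq_len, dim), dim even.

    scale > 1 performs position interpolation: position p is treated as p/scale,
    squeezing a longer sequence into the trained positional range.
    """
    seq_len, dim = x.shape
    half = dim // 2
    # Per-dimension rotation frequencies, as in the RoPE formulation
    inv_freq = 1.0 / (base ** (np.arange(half) / half))
    pos = np.arange(seq_len)[:, None] / scale        # (seq_len, 1)
    angles = pos * inv_freq[None, :]                 # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    # Rotate each (x1, x2) coordinate pair by its position-dependent angle
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)
```

With `scale=2.0`, positions `0..2N-1` are mapped onto the trained range `0..N-1`, which is the position-interpolation trick commonly used for context extension; in practice you would apply this to the query/key projections inside each attention layer, not to the final sentence embedding.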
-
In `lavis/models/blip_models/blip_rel_det.py`
```python
tokenizer = StageBertTokenizer.from_pretrained(
"/public/home/lirj2/projects/LAVIS_GITM/data/bert-base-uncased",
loc…
```
-
This issue will be used to track compilation failures for migraphx models on CPU and GPU. Compile failures for each model should have a link to an issue with a smaller reproducer in the notes column.
…
-
Now with the introduction of embeddings: https://github.com/vllm-project/vllm/pull/3734, are there plans on the roadmap to support BERT models?
-
To train an mm_grounding_dino, we need to load two pre-trained models: BERT and Swin.
To fine-tune an mm_grounding_dino on my dataset, I need to load a pre-trained MM_Grounding_DINO and the con…
-
**Describe the bug**
Downloaded https://huggingface.co/google-bert/bert-base-uncased/blob/main/tf_model.h5 to my local machine and ran `modelscan --show-skipped -p ./models/bert-base-uncased/tf_model.h5` …
-
Research and evaluate different language models (e.g., BERT, RoBERTa, XLNet) for their suitability in the bioinformatics domain.
-> Research and document the strengths and weaknesses of each model. Crea…
-
#### Description
We aim to simplify the deployment process of our FastAPI application, which serves as an interface to our BERT and FastText models for SMS classification. The current setup process…
-
Hi,
I also encountered this issue; the error was:
RuntimeError: Error(s) in loading state_dict for Tagger:
Unexpected key(s) in state_dict: "bert.embeddings.position_ids".
and I finally…
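For context, this error typically comes from a checkpoint saved with an older transformers release, where `position_ids` was a persistent buffer; recent releases no longer register it, so the key can simply be dropped before loading. A common workaround (not an official fix; the checkpoint path and `model` object in the comments are placeholders):

```python
def strip_stale_keys(state_dict, stale=("bert.embeddings.position_ids",)):
    """Drop buffer keys that newer transformers versions no longer expect."""
    return {k: v for k, v in state_dict.items() if k not in stale}

# Typical usage (illustrative):
#   state_dict = torch.load("tagger.ckpt", map_location="cpu")
#   model.load_state_dict(strip_stale_keys(state_dict))
# Alternatively, model.load_state_dict(state_dict, strict=False)
# ignores unexpected keys instead of raising.
```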
-
Hello!
I've noticed that #269 introduces support for BERT-based model merging. I've tried it out on a few models I fancy, and I've run into some issues.
### My Config
```yaml
models:
- mo…