-
If a segment contains more than ~1000 tokens, the model returns an error. We need to either fail with a graceful error message or split the segment into multiple smaller segments, process each one, and recombine the results.
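A minimal sketch of the split-and-recombine option, assuming the segment is already tokenized into a list; `process` is a placeholder for whatever model call is being made, and splitting at fixed token boundaries is the simplest policy (it ignores sentence breaks):

```python
def chunk_tokens(tokens, max_tokens=1000):
    """Split a token list into pieces no longer than max_tokens."""
    return [tokens[i:i + max_tokens] for i in range(0, len(tokens), max_tokens)]

def process_long_segment(tokens, process, max_tokens=1000):
    """Run `process` (a hypothetical model call) on each chunk, then
    concatenate the per-chunk outputs back into one result."""
    combined = []
    for chunk in chunk_tokens(tokens, max_tokens):
        combined.extend(process(chunk))
    return combined
```

A smarter splitter would cut at sentence boundaries so each chunk stays coherent, but the fixed-size version above is enough to stay under the token limit.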
-
### Short description of the problem
I was trying to load a model larger than my GPU's memory using `load_checkpoint_and_dispatch`, but ran into an error.
### System Info
```Shell
accelerate version: 0…
-
When I followed the README to set up the initial environment, I was able to successfully deploy local_tools to 127.0.0.1:8079. Only the API keys for weather and openai were filled in, and th…
-
### System Info
- `transformers` version: 4.21.1
- Platform: Linux-5.15.0-56-generic-x86_64-with-glibc2.31
- Python version: 3.9.7
- Huggingface_hub version: 0.13.3
- PyTorch version (GPU?): 1.…
-
There should be at least 3, if not 10, automatic retries when this happens:
```
2023-06-08 16:31:00,036 - silnlp.common.environment - INFO - Uploading MT/experiments/FT-Ingush/NLLB_13_CHE_ING_3/val.trg.txt…
-
### The bug
When I try to launch face detection, whichever model I use, I get the following error:
```bash
immich_machine_learning | [03/23/24 22:46:05] INFO Setting 'buffalo_l' execution p…
-
In this issue, we will identify a suitable model or API for translating the reviews from English to a language L and then back-translating them from L to English.
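Whatever model or API is chosen, the round trip itself is a thin wrapper; in this sketch `to_l` and `to_en` are placeholders for the eventual translation calls (string in, string out):

```python
def back_translate(reviews, to_l, to_en):
    """English -> L -> English round trip for a list of reviews.

    `to_l` and `to_en` stand in for whichever translation model or
    API is selected; each maps one string to its translation.
    """
    return [to_en(to_l(review)) for review in reviews]
```

Comparing each back-translated review against the original gives a rough quality check on the chosen model.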
-
## 🐛 Bug
When calling the `load_from_checkpoint` function to load a model from a checkpoint, the `hparams.yml` file located in the parent folder does not get taken into account. For example, the `p…
-
### System Info
```shell
torch = 2.0.1+cu118
transformers = from source (4.32.dev0)
optimum = from source (1.11.3dev0)
```
### Who can help?
_No response_
### Information
- [X] The official exa…
-
**Description**
Add support for multilingual machine translation LLMs (MMTLLM)
**Additional Context**
There are some MMTLLMs designed to translate text, such as https://huggingface.co/facebook…