-
### System Info
```shell
transformers[torch]==4.33.2
onnxruntime 1:
/Users/goodspeed/Downloads/transformers.js-main/scripts/myenv/lib/python3.11/site-packages/transformers/models/llama/modeling_…
```
-
LoRA is loaded but not applied. Full logs are attached as a file below.
Related issue: #370
**lora_down|lora_up** [flux_lora.log](https://github.com/user-attachments/files/17118078/flux_lora.log…
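For what it's worth, here is a minimal sketch of how one can check whether a loaded LoRA actually changes the output, assuming a diffusers `FluxPipeline` setup (the model id and LoRA path are placeholders and may not match the reporter's stack):

```python
# Minimal sketch: check whether a loaded LoRA actually changes the output.
# Assumes the diffusers FluxPipeline API; the model id and LoRA path are placeholders.
import numpy as np
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

prompt = "a photo of a cat"
steps = 8  # few steps, just for a quick before/after comparison

generator = torch.Generator("cuda").manual_seed(0)
base = pipe(prompt, generator=generator, num_inference_steps=steps).images[0]

# Load the LoRA under an explicit adapter name so it can be inspected.
pipe.load_lora_weights("path/to/flux_lora.safetensors", adapter_name="my_lora")
print(pipe.get_active_adapters())  # should list "my_lora" if the LoRA was attached

generator = torch.Generator("cuda").manual_seed(0)
with_lora = pipe(prompt, generator=generator, num_inference_steps=steps).images[0]

# If the two images are identical, the LoRA weights were loaded but not applied.
print("identical:", np.array_equal(np.array(base), np.array(with_lora)))
```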
-
Downloaded transformers and updated it to the latest version. Downloaded all of the model files from Hugging Face, also installed the latest PyTorch via `pip install --upgrade torch`, and went to the [NVIDIA toolkit archive](https://developer.nvidia.com/cuda-toolkit-archive) to download toolkit version 12.4.1. Finally, ran in the terminal:
pip3 in…
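After an install like the one described above, a quick sanity check is to confirm that the upgraded PyTorch build actually sees the CUDA toolkit (a minimal sketch; the printed versions depend on the wheels that were installed):

```python
# Sanity check after upgrading PyTorch and installing the CUDA toolkit:
# confirm the installed versions and that a GPU is visible.
import torch

print("torch version:", torch.__version__)
print("built with CUDA:", torch.version.cuda)        # e.g. "12.4" if a cu124 wheel was installed
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```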
-
If for some reason the LLM service fails to respond, jobs might get stuck forever.
See logs:
```bash
linto_llm-gateway.1.1qeqpejy2j4n@linagora-linto-bm-02 | 25/11/2024 15:46:13 http_server I…
```
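A common way to avoid this kind of hang is to put an explicit timeout and failure path around the call to the LLM service so the job can be marked as failed instead of waiting forever. Below is a minimal sketch, not the gateway's actual code; the URL, payload shape, and timeout values are placeholders:

```python
# Minimal sketch: bound the call to the LLM service so a non-responding backend
# fails the job instead of leaving it stuck forever.
# The URL, payload shape, and timeout values are placeholders.
import requests

LLM_URL = "http://llm-service:8000/v1/completions"

def call_llm(payload: dict, connect_timeout: float = 5.0, read_timeout: float = 120.0) -> dict:
    try:
        resp = requests.post(LLM_URL, json=payload, timeout=(connect_timeout, read_timeout))
        resp.raise_for_status()
        return resp.json()
    except (requests.Timeout, requests.ConnectionError) as exc:
        # Surface the failure so the job queue can mark the job failed or retry it.
        raise RuntimeError(f"LLM service did not respond: {exc}") from exc
```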
-
Here's the error:
```
python functioncall.py --query "I need the current stock price of Tesla (TSLA)"
…
```
-
### Feature request
It seems there is no config for DeBERTa v1/v2/v3 as a decoder (while there are configs for BERT/RoBERTa and similar models)... This is needed in order to perform TSDAE unsupervised…
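For context, this is the decoder configuration that already exists for BERT in transformers and that the request asks to extend to DeBERTa (a minimal sketch of the existing API; DeBERTa currently has no equivalent `is_decoder`/cross-attention path):

```python
# Existing pattern in transformers: BERT can be configured as a decoder,
# which is what TSDAE-style encoder-decoder training relies on.
# There is currently no equivalent path for DeBERTa v1/v2/v3.
from transformers import BertConfig, BertLMHeadModel

config = BertConfig.from_pretrained(
    "bert-base-uncased",
    is_decoder=True,            # run self-attention causally
    add_cross_attention=True,   # attend over the encoder's hidden states
)
decoder = BertLMHeadModel.from_pretrained("bert-base-uncased", config=config)
print(decoder.config.is_decoder)  # True
```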
-
FETCH DATA from: /data/flux/ComfyUI/custom_nodes/ComfyUI-Manager/extension-node-map.json [DONE]
got prompt
/root/anaconda3/envs/comfyui/lib/python3.12/site-packages/torch/functional.py:513: UserWarn…
-
Hello, I want to do some benchmarking using OpenRLHF in a memory-constrained environment (1-2 nodes, each with one A30 GPU, 24 GB each). Thus, I have had to use HF models other than the ones used in the …
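As an aside, one way to check whether a substitute model fits in 24 GB before launching a run is to count its parameters on the meta device (a minimal sketch; the model name is a placeholder and the estimate covers weights only, not gradients, optimizer state, or activations):

```python
# Rough sketch: estimate the weight memory of a candidate HF model without
# allocating it, by instantiating it on the meta device.
# The model name is a placeholder; this counts weights only.
import torch
from transformers import AutoConfig, AutoModelForCausalLM

name = "Qwen/Qwen2-0.5B-Instruct"  # placeholder substitute model
config = AutoConfig.from_pretrained(name)
with torch.device("meta"):
    model = AutoModelForCausalLM.from_config(config)

params = sum(p.numel() for p in model.parameters())
print(f"{params / 1e9:.2f}B params ≈ {params * 2 / 1e9:.1f} GB in bf16 (weights only)")
```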
-
Not sure if this is supposed to work on Forge in the first place, but when trying Magic Prompt I get this error (I have more than enough spare VRAM):
```
WARNING:dynamicprompts.generators.magicpro…
-
Environment: 4×4090, python=3.10.12, Ubuntu
The error is as follows:
User: 请描述图片内容 ("Please describe the content of the image")
Exception in thread Thread-7 (generate):
Traceback (most recent call last):
File "/home/user/anaconda3/envs/ms-swift/lib/python3.10/threading…