-
When installing Xinference with Docker, you can mount `-v /.cache/huggingface:/root/.cache/huggingface` to change the default location of Hugging Face models. After installing with pip, however, setting `HF_HOME` has no effect: a `huggingface` directory is still created under `XINFERENCE_HOME` and models are downloaded into it. How can I point the Hugging Face model directory at a specific location?
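For reference, a minimal stdlib sketch of the cache-directory precedence the Hugging Face docs describe (`HF_HUB_CACHE`, then `HF_HOME/hub`, then `~/.cache/huggingface/hub`); `resolve_hf_cache` and the example path are hypothetical, not part of any library:

```python
import os

def resolve_hf_cache(env):
    """Mimic (by assumption) huggingface_hub's cache-directory precedence."""
    if env.get("HF_HUB_CACHE"):
        return env["HF_HUB_CACHE"]
    if env.get("HF_HOME"):
        return os.path.join(env["HF_HOME"], "hub")
    return os.path.join(os.path.expanduser("~"), ".cache", "huggingface", "hub")

print(resolve_hf_cache({"HF_HOME": "/data/hf-cache"}))  # → /data/hf-cache/hub (on POSIX)
```

Whichever variable you choose, it has to be in the environment before the downloading process starts; libraries typically resolve the cache path at import time, so exporting it afterwards has no effect.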
-
python run_baseline_refactor.py
error:
**python: can't open file 'run_baseline_refactor.py': [Errno 2] No such file or directory**
This Python file doesn't exist; I think it's still run_baseline_lm…
-
Currently, there are two problems:
- `eval.batch_size` is used for spinning up multiple environments (see [code](https://github.com/huggingface/lerobot/blob/main/lerobot/common/envs/factory.py#L51)…
-
-
I tried to use distil-whisper-v3 in stable-ts, and it works.
However, "distil-large-v2" cannot be used.
Other models can't be used either (e.g. kotoba-whisper, "kotoba-tech/k…
-
### What version of Bun is running?
1.0.20
### What platform is your computer?
Darwin 23.2.0 arm64 arm
### What steps can reproduce the bug?
```ts
import { pipeline } from "@xenova/tra…
```
-
### Motivation
As titled. @lvhan028 @lzhangzz @grimoire
blog: https://research.nvidia.com/publication/2024-06_nemotron-4-340b
tech report: https://d1qx31qr3h6wln.cloudfront.net/publications/Nem…
-
Following the onnx opt tutorial, I ran into a version mismatch between the huggingface-hub and transformers packages:
```
File "C:\Users\jogo\.conda\envs\ryzenai-transformers\lib\site-package…
```
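Before reinstalling anything, a quick stdlib check can confirm which versions are actually installed in the environment; `report_versions` is a hypothetical helper, not part of either package:

```python
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Return {package: installed version or None} without importing the packages."""
    result = {}
    for name in packages:
        try:
            result[name] = version(name)
        except PackageNotFoundError:
            result[name] = None
    return result

print(report_versions(["transformers", "huggingface-hub"]))
```

Comparing the output against the versions the tutorial was written for usually makes the mismatch obvious.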
-
### Describe the bug
The `huggingface-cli` fails to download the `microsoft/phi-3-mini-4k-instruct-onnx` model because the `.incomplete` file of the `.onnx` data file is missing.
I assume the fi…
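My reading (an assumption, not verified against the CLI source) is that the downloader stages each file as `<name>.incomplete` and renames it into place when the transfer finishes, so resuming with the sidecar missing fails exactly like this. A sketch of that staging pattern with a hypothetical `finalize_download` helper:

```python
import os
import tempfile

def finalize_download(target_path):
    """Rename the staged '<target>.incomplete' sidecar into place.

    Raises FileNotFoundError when the sidecar is missing -- by assumption,
    analogous to the failure reported above.
    """
    staged = target_path + ".incomplete"
    os.replace(staged, target_path)  # atomic rename on POSIX
    return target_path

# Demo: stage some bytes, then finalize.
workdir = tempfile.mkdtemp()
target = os.path.join(workdir, "model.onnx.data")
with open(target + ".incomplete", "wb") as f:
    f.write(b"\x00" * 16)  # stand-in for the downloaded payload
finalize_download(target)
print(os.path.exists(target))  # → True
```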
-
When loading the model, it seems that the function `get_font()` in ixc_utils.py is run. However, some errors occurred:
```python
def get_font():
    truetype_url = 'https://huggingface.co/internlm/inte…
```
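Assuming `get_font()` fetches the TrueType file from that URL on every model load, caching the file locally would sidestep the network error on subsequent loads; `fetch_font` and its cache path are hypothetical:

```python
import os
import urllib.request

def fetch_font(url, cache_path):
    """Download the font once and reuse the local copy afterwards."""
    if not os.path.exists(cache_path):
        os.makedirs(os.path.dirname(cache_path) or ".", exist_ok=True)
        # This is the only step that needs network access.
        urllib.request.urlretrieve(url, cache_path)
    return cache_path
```

If the host is unreachable, pre-placing the font file at `cache_path` (downloaded from another machine) lets model loading proceed offline.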