marella / ctransformers

Python bindings for the Transformer models implemented in C/C++ using the GGML library.
MIT License

CTransformers doesn't store the model in the right location #143

Open Yanni8 opened 11 months ago

Yanni8 commented 11 months ago

You can change the location where Hugging Face models are stored (when using the transformers library) by setting the TRANSFORMERS_CACHE environment variable. The default location is the directory ~/.cache/huggingface/.
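For example, a minimal sketch of how this works with transformers (/data/hf-cache is just a placeholder path):

```python
import os

# Must be set before transformers is imported, because the default
# cache path is resolved at import time. "/data/hf-cache" is a
# placeholder path.
os.environ["TRANSFORMERS_CACHE"] = "/data/hf-cache"

from transformers import AutoModel

# The weights now download into /data/hf-cache instead of
# ~/.cache/huggingface/.
model = AutoModel.from_pretrained("bert-base-uncased")
```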

If I use CTransformers, setting the TRANSFORMERS_CACHE environment variable won't change the download location of the files.

It would be nice if CTransformers also used TRANSFORMERS_CACHE or a similar environment variable to define the download location of the LLM.

I would create a PR if this feature doesn't already exist.

B0rner commented 5 months ago

Same problem here.

The Hugging Face version of transformers also supports a cache_dir parameter in several methods, like .from_pretrained.
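For reference, a minimal sketch of what that looks like in transformers (the path is a placeholder):

```python
from transformers import AutoModel

# cache_dir overrides the default cache location for this call only;
# "/data/hf-cache" is a placeholder path.
model = AutoModel.from_pretrained("bert-base-uncased", cache_dir="/data/hf-cache")
```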
It seems there is no way to set the model cache for ctransformers. This is a problem/show-stopper, especially if you use it inside container systems, where the main storage is not allocated to the home directories.
Any idea for a workaround?
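One possible workaround (an untested sketch, not an official ctransformers feature): download the model file explicitly with huggingface_hub, which does accept a cache_dir, and pass the resulting local path to ctransformers. The repo id, filename, and path below are placeholders.

```python
from huggingface_hub import hf_hub_download
from ctransformers import AutoModelForCausalLM

# Download the GGML file into an explicit cache directory.
# Repo id, filename, and "/data/hf-cache" are placeholders.
model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-GGML",
    filename="llama-2-7b.ggmlv3.q4_0.bin",
    cache_dir="/data/hf-cache",
)

# ctransformers accepts a local file path in place of a repo id,
# so the default download location is bypassed entirely.
llm = AutoModelForCausalLM.from_pretrained(model_path, model_type="llama")
```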