-
pip install -r requirements.txt
Collecting accelerate==0.24.1 (from -r requirements.txt (line 1))
Downloading accelerate-0.24.1-py3-none-any.whl.metadata (18 kB)
Collecting aiohttp==3.9.0 (fr…
-
Hi folks!
Grounding DINO is now available in the Transformers library, enabling easy inference in a few lines of code.
Here's how to use it:
```python
from transformers import AutoProcessor,…
```
-
I'd like to propose a new feature for Ollama: the ability to access attention matrices and/or the KV-Cache during model inference. This functionality is similar to what's available in the Hugging Face…
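To make the request concrete, here is a minimal pure-Python sketch of the kind of KV-cache access being asked for. The class and method names (`KVCache`, `append`, `at_layer`) are hypothetical illustrations, not part of Ollama's or Hugging Face's actual APIs:

```python
# Hypothetical sketch of exposing a KV-cache during inference.
# All names here are illustrative, not a real Ollama interface.

class KVCache:
    def __init__(self, num_layers):
        # One (keys, values) list pair per transformer layer.
        self.keys = [[] for _ in range(num_layers)]
        self.values = [[] for _ in range(num_layers)]

    def append(self, layer, k, v):
        # Called once per generated token, per layer, during decoding.
        self.keys[layer].append(k)
        self.values[layer].append(v)

    def at_layer(self, layer):
        # The requested capability: read access to cached keys/values.
        return self.keys[layer], self.values[layer]

cache = KVCache(num_layers=2)
cache.append(0, [0.1, 0.2], [0.3, 0.4])  # token 1, layer 0
cache.append(0, [0.5, 0.6], [0.7, 0.8])  # token 2, layer 0
keys, values = cache.at_layer(0)
```

In Hugging Face transformers, the analogous data is surfaced through `output_attentions=True` and the returned `past_key_values`; the sketch above only illustrates the shape of the interface being requested.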
-
### Feature request
Hi @xenova,
I would like to request the creation of a lightweight node package, `tokenizers.js`. It would be great if developers had the flexibility to use `tokenizers.js` instead …
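For illustration, the minimal surface such a tokenizer-only package would need is just `encode`/`decode` over a vocabulary. The requested package is for Node, but the shape of the API can be sketched in Python (whitespace splitting and the `[UNK]` convention are assumptions for brevity):

```python
# Hypothetical sketch of a minimal tokenizer-only API surface.

class WordTokenizer:
    def __init__(self, vocab):
        self.vocab = vocab                               # token -> id
        self.inv = {i: t for t, i in vocab.items()}      # id -> token
        self.unk = vocab.get("[UNK]", 0)

    def encode(self, text):
        # Unknown words map to the [UNK] id.
        return [self.vocab.get(w, self.unk) for w in text.split()]

    def decode(self, ids):
        return " ".join(self.inv.get(i, "[UNK]") for i in ids)

tok = WordTokenizer({"[UNK]": 0, "hello": 1, "world": 2})
assert tok.encode("hello world") == [1, 2]
assert tok.decode([1, 2]) == "hello world"
```

A real implementation would of course use the trained subword merges and normalizers from `tokenizer.json`; the point of the sketch is only how small the public API could be.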
-
### Feature request
Many prominent transformers (e.g. BERT) in the transformers library have dedicated implementations for SequenceClassification. I believe implementing these could potentially s…
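For context, these heads are thin: typically a pooling step over the base model's hidden states followed by a linear classifier. A minimal pure-Python sketch, with mean pooling and shapes assumed purely for illustration:

```python
# Sketch of what a SequenceClassification head adds on top of a base model:
# pool per-token hidden states, then apply a linear layer to get logits.

def mean_pool(hidden_states):
    # hidden_states: list of per-token vectors, each of length hidden_dim
    n = len(hidden_states)
    dim = len(hidden_states[0])
    return [sum(tok[d] for tok in hidden_states) / n for d in range(dim)]

def classify(hidden_states, weight, bias):
    # weight: num_labels x hidden_dim rows, bias: num_labels entries
    pooled = mean_pool(hidden_states)
    return [sum(w * x for w, x in zip(row, pooled)) + b
            for row, b in zip(weight, bias)]

hidden = [[1.0, 0.0], [0.0, 1.0]]        # 2 tokens, hidden_dim = 2
logits = classify(hidden,
                  weight=[[1.0, -1.0], [0.5, 0.5]],
                  bias=[0.0, 0.1])
```

(`BertForSequenceClassification` actually pools the `[CLS]` token through a dense+tanh layer rather than mean pooling; mean pooling here just keeps the sketch short.)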
-
I have already downloaded the model, but I get an error saying the config.json file cannot be found. Could anyone tell me whether this is a path problem or something else?
2024-10-09 23:13:19 | ERROR | stderr | OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files…
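Since the traceback shows the library falling back to huggingface.co, the first thing to check is that the local directory actually contains config.json; when it is missing, `from_pretrained` treats the argument as a Hub repo name and tries to download it, which fails offline with exactly this OSError. A small stdlib check (the path is hypothetical):

```python
from pathlib import Path

def check_model_dir(model_dir):
    # from_pretrained(local_dir) expects config.json (plus the weight files)
    # directly inside the directory; if config.json is absent, transformers
    # falls back to resolving the name on huggingface.co.
    path = Path(model_dir)
    return path.is_dir() and (path / "config.json").is_file()

# Example with a hypothetical path:
# check_model_dir("/data/models/my-model")
```

Passing `local_files_only=True` to `from_pretrained` also makes such path problems fail fast instead of attempting a network connection.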
-
Hello, VENumML team!
I came across VENumML, and I’m very impressed by its vision for enabling Privacy-Preserving Machine Learning (PPML) with Fully Homomorphic Encryption (FHE). I believe this …
-
`pip list | grep "transformers"`
which gives:
sentence-transformers 3.0.1
transformers 4.45.0.dev0
----------------------------------------------
Entering pytho…
-
First of all, thank you for developing torchtune. This has been very helpful for our group with limited GPU credits. I'm impressed by its capabilities, particularly its memory efficiency. I've noticed…
-
Hello, thanks for your awesome aligner!
Unfortunately, a recent change in huggingface_hub has broken your code: https://github.com/huggingface/huggingface_hub/releases/tag/v0.26.0
The cached_d…
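One way to keep the aligner working across hub versions is to gate the code path on the installed huggingface_hub version, since the breaking release is v0.26.0 per the link above (the replacement API, `hf_hub_download`, takes a `repo_id` and `filename` rather than a URL). A small stdlib helper, assuming plain `major.minor.patch` version strings:

```python
def parse_version(v):
    # Assumes a plain "major.minor.patch" string; pre-release suffixes
    # like ".dev0" are not handled in this sketch.
    return tuple(int(part) for part in v.split(".")[:3])

def has_cached_download(hub_version):
    # Per the v0.26.0 release notes linked above, only older installs
    # still expose cached_download.
    return parse_version(hub_version) < (0, 26, 0)

# Example: choose the legacy or the new download path.
# if has_cached_download(huggingface_hub.__version__): ... else: ...
```

The longer-term fix is to migrate the call sites to `hf_hub_download` and drop the compatibility branch once older hub versions no longer need to be supported.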