-
### Question
```python
# Load model directly
from transformers import AutoProcessor, AutoModelForCausalLM

processor = AutoProcessor.from_pretrained("liuhaotian/llava-lcs558k-scienceqa-vicuna-13b-v1.3")
mo…
```
-
Hello, sadly the installer does not work for me. I really want to try the project, but I always get the same error, whether it is a fresh Docker install or a regular one.
huggingface-hub>=0.23.2, (venv) D:\au…
-
`---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
19
…
-
### System Info
Automatic publicPath is not supported in this browser
https://stackoverflow.com/questions/68115467/error-automatic-publicpath-is-not-supported-in-this-browser-when-i-am-running-m…
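For reference, the usual fix discussed in that Stack Overflow thread is to set `output.publicPath` explicitly, since webpack 5's automatic public path detection relies on `document.currentScript`, which some browsers and environments do not support. A minimal sketch of such a configuration (assuming a standard `webpack.config.js`; the exact value depends on where assets are served from):

```javascript
// webpack.config.js — minimal sketch, not a drop-in fix for every setup.
// Setting publicPath explicitly disables webpack 5's automatic detection,
// which raises "Automatic publicPath is not supported in this browser"
// when document.currentScript is unavailable.
const config = {
  output: {
    // An explicit value (e.g. "" or "/") bypasses runtime auto-detection.
    publicPath: "",
  },
};

module.exports = config;
```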
-
### Is your feature request related to a problem? Please describe.
The HuggingFace Hub provides an elegant Python [client](https://huggingface.co/docs/huggingface_hub/index) to allow users to control…
-
Hi,
Could you give me some insight into whether it is possible to plug `inltk` into the Hugging Face Transformers library?
-
### System Info
version: 2.17.2
browser: brave
JS library: vite
### Environment/Platform
- [X] Website/web-app
- [ ] Browser extension
- [ ] Server-side (e.g., Node.js, Deno, Bun)
- [ ] Desktop …
-
Hi,
Congrats on this work! I discovered it from the paper page: https://huggingface.co/papers/2408.15881 (feel free to claim the paper in case you're one of the authors, so that it appears at your …
-
`@huggingface/transformers` has been announced as available on npm; we might try to upgrade, since we are currently using `transformers-branch:v3-one_commit` to leverage `webgpu` support.
#### Some no…
-
In `performance_optimization/prompt_reuse.py`, the current method of storing the cached prompt does not correctly discard the KV cache for the last token (and instead follows the same caching recipe a…
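A hedged sketch of the idea behind the fix: before storing the prompt's KV cache, crop the last token's entry from every layer's keys and values, so that generation re-computes attention for the final prompt token instead of serving it from the cache. The helper below is hypothetical and uses plain nested lists standing in for per-layer key/value tensors indexed by sequence position; with real `transformers` caches, the same trimming is done along the sequence dimension (e.g. via `DynamicCache.crop`).

```python
# Hypothetical sketch: trim the last sequence position from a toy KV cache
# so the final prompt token is re-computed rather than read from the cache.
# Each layer's cache is a (keys, values) pair; keys/values are plain lists
# indexed by sequence position, standing in for tensors.

def crop_last_token(kv_cache):
    """Return a copy of the cache with the last sequence position removed
    from every layer's keys and values."""
    return [(keys[:-1], values[:-1]) for keys, values in kv_cache]

# Toy two-layer cache over a 3-token prompt.
cache = [
    (["k0", "k1", "k2"], ["v0", "v1", "v2"]),
    (["K0", "K1", "K2"], ["V0", "V1", "V2"]),
]

cropped = crop_last_token(cache)
```

The cropped cache now covers only the first two prompt tokens, so the model's next forward pass must attend over the final token afresh, which is what the prompt-reuse recipe requires.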