-
Hugging Face Hub login successful
Used the gemma-2-27b LLM for testing:
cargo run --release -- -m "google/gemma-2-27b-it" -c
Finished release [optimized] target(s) in 0.03s
Running `target/re…
-
I am getting the following error:
`from huggingface_hub import CommitOperationAdd, SpaceHardware, SpaceStage`
`ImportError: cannot import name 'CommitOperationAdd' from 'huggingface_hub' (/Users/con…`
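This `ImportError` usually means the installed `huggingface_hub` is older than the code expects. A quick, non-crashing diagnostic (the helper name is illustrative) is to probe for the symbol before importing it:

```python
import importlib
import importlib.util

def has_import(module_name: str, attr: str) -> bool:
    """Return True if `attr` can be imported from `module_name`."""
    if importlib.util.find_spec(module_name) is None:
        return False  # package not installed at all
    return hasattr(importlib.import_module(module_name), attr)

# If the symbol is missing, upgrading the package is the usual fix:
if not has_import("huggingface_hub", "CommitOperationAdd"):
    print("CommitOperationAdd missing -- try: pip install -U huggingface_hub")
```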
-
### System Info
```
- `transformers` version: 4.40.0
- Platform: Linux-6.5.0-26-generic-x86_64-with-glibc2.35
- Python version: 3.10.13
- Huggingface_hub version: 0.23.1
- Safetensors version: 0…
```
-
Today, when I use `model = UniDepthV2.from_pretrained("lpiccinelli/unidepth-v2-vitl14")`, the problem below occurs. Does this happen because every time I want to use the UniDepth pretrained model the…
-
### System Info
- peft 0.12.0
- `transformers` version: 4.43.3
- Platform: Linux-5.15.0-113-generic-x86_64-with-glibc2.35
- Python version: 3.12.4
- Huggingface_hub version: 0.24.2
- Safetensors…
-
### Feature request
Implement a `v1/models` endpoint, like the OpenAI API, to list available local **LoRAs**. This depends on #199.
There is also a hurdle: a user may have multiple base models and mu…
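As a sketch of what the response could look like (field names follow the OpenAI `/v1/models` shape; the `parent` field tying a LoRA to its base model is an assumption, not an existing API, and the data model is illustrative):

```python
import time

def list_models_response(base_models: dict[str, list[str]]) -> dict:
    """Build an OpenAI-style /v1/models payload.

    base_models maps each base model id to the names of its local LoRAs
    (illustrative data model -- a real server would scan its LoRA directory).
    """
    now = int(time.time())
    data = []
    for base, loras in base_models.items():
        data.append({"id": base, "object": "model", "created": now,
                     "owned_by": "local"})
        for lora in loras:
            data.append({"id": lora, "object": "model", "created": now,
                         "owned_by": "local", "parent": base})  # 'parent' is assumed
    return {"object": "list", "data": data}
```

Listing every LoRA as its own model id, with a pointer back to its base, keeps the multiple-base-models case unambiguous for clients.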
-
Opening an issue as per @osanseviero's [suggestion on Twitter](https://twitter.com/osanseviero/status/1488512516162572290).
Issue imported from https://github.com/pyannote/pyannote-audio/issues/835
…
-
Hello @kjsman,
this is more of a feature proposal than an actual issue. Instead of requiring the user to download and open the tar file containing the weights and the vocabulary from your huggingface …
-
**Is your feature request related to a problem? Please describe.**
For deployment of Docker containers in data centres, models need to be cached locally, either copied manually/via scripts or auto-downloaded by c…
-
* Name of dataset: brWaC (Brazilian Portuguese Web as Corpus)
* URL of dataset: https://www.inf.ufrgs.br/pln/wiki/index.php?title=BrWaC
* License of dataset: not specified
* Short description of da…