-
### Feature request
[Stay on topic with Classifier-Free Guidance](https://arxiv.org/abs/2306.17806)
CFG brings non-trivial improvements across many standard benchmarks.
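The core of CFG at sampling time is a simple logit combination: the model is run once with the full prompt (conditional) and once without it (unconditional), and the two logit vectors are mixed. A minimal sketch, assuming plain Python lists of logits and a hypothetical `cfg_logits` helper (the paper's guidance formula, not any existing library API):

```python
def cfg_logits(cond_logits, uncond_logits, guidance_scale=1.5):
    """Classifier-Free Guidance combination for next-token logits:
    l = l_uncond + gamma * (l_cond - l_uncond).
    gamma = 1.0 recovers ordinary conditional sampling; gamma > 1.0
    pushes the distribution further toward the prompt-conditioned one.
    """
    return [u + guidance_scale * (c - u)
            for c, u in zip(cond_logits, uncond_logits)]
```

With `guidance_scale=1.0` the output equals the conditional logits, which is a useful sanity check when wiring this into a sampling loop.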
### Motivation
The response q…
-
### System Info
Python 3.10.12
peft @ git+https://github.com/huggingface/peft.git@25dec602f306d52b6cc078ec8353ba6eac249097
transformers @ git+https://github.com/huggingface/transformers.git@8a0ed…
-
# Software version
- [Umi-OCR_Rapid_dev_20231114.7z](https://github.com/hiroi-sora/Umi-OCR_v2/releases/download/dev%2F20231114/Umi-OCR_Rapid_dev_20231114.7z)
Runtime environment
- Ubuntu 20.04
- wine-8.0.2
# As shown in the image
…
-
### System Info
HF-TGI server running on Kubernetes, I executed `text-generation-launcher --env` inside the pod:
```
2023-07-12T12:58:48.739266Z INFO text_generation_launcher: Runtime environment:…
-
### System Info
OS version: Ubuntu 22.04
Model being used: Qwen/Qwen2-72B-Instruct
Hardware being used: 4x 40GB A100
Deployment specificities: Running via docker using the `latest` tag as of 06/26…
-
### System Info
model=meta-llama/Llama-2-7b-chat-hf
docker run -d --gpus all \
--shm-size 1g -e HUGGING_FACE_HUB_TOKEN=$token -p 8080:80 \
-v $volume:/data ghcr.io/huggingface/text-generation-in…
-
### Issue you'd like to raise.
I'd like to use Hugging Face's Chat UI frontend with LangChain.
https://github.com/huggingface/chat-ui
But it looks like the Chat UI is only available through H…
-
### Feature request
Similar to Text Generation Inference (TGI) for LLMs, Hugging Face created an inference server for text embedding models called Text Embeddings Inference (TEI).
See: https://githu…
-
### System Info
The full command line used that causes issues:
```
docker run -it --rm -p 8080:80 --gpus all --name tgi \
-v /dev/shm/models:/models --shm-size 2g -e CUDA_LAUNCH_BLOCKING=1 \
…
-
I encounter an error when using the latest version of the `text-generation` library (0.7.0).
When using the `generate` method, TGI v1.4.4 returns a JSON dictionary, but the method is trying to extract the …