-
### Presentation of the new feature
Logits processors in outlines.processors support nearly every inference engine, offering a "write once, run anywhere" implementation of business logic.
Curren…
lapp0 updated 3 months ago
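For context on the pattern described above, here is a minimal logits-processor sketch. It is written against the Hugging Face transformers `LogitsProcessor` interface; the class name and the exact base class that outlines.processors uses are assumptions, the point is only that the same masking logic can be reused across engines that accept a logits-processor callback.

```python
import torch
from transformers import LogitsProcessor


class AllowedTokensProcessor(LogitsProcessor):
    """Hypothetical example: mask every token except an allowed set."""

    def __init__(self, allowed_token_ids):
        self.allowed = torch.tensor(sorted(allowed_token_ids))

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor:
        # Start from -inf everywhere, then restore the scores of the allowed ids.
        masked = torch.full_like(scores, float("-inf"))
        masked[:, self.allowed] = scores[:, self.allowed]
        return masked
```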
-
Hi, it would be awesome to have Mermaid support. I'm not sure whether this would be helpful to others, but I can look into adding support in the future (unless someone else is working on this sooner).
drbh updated 4 months ago
-
## Describe the bug
There is a synchronization issue at the launch of the Pod with the current images:
* all of the containers become `Ready`:
```
flan-t5-small-gpu-predictor-00001-deployment-6768c5…
```
-
### The Feature
Write a custom class that we can use to fake a 'local HF model' and instead call hosted TGI endpoints (a rough sketch of the idea is included below).
These eval libraries all support only local HF models.
https://github.c…
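A rough sketch of what such a wrapper could look like, assuming the eval harness only needs a text-in/text-out method. The class and method names (`RemoteTGIModel`, `generate_text`) are hypothetical placeholders for whatever interface the harness expects; TGI's non-streaming `/generate` route and its JSON shape are the real part.

```python
import requests


class RemoteTGIModel:
    """Looks like a local text-generation model but forwards prompts to a hosted TGI endpoint."""

    def __init__(self, base_url: str, timeout: float = 60.0):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout

    def generate_text(self, prompt: str, max_new_tokens: int = 256) -> str:
        # TGI's non-streaming /generate route takes {"inputs": ..., "parameters": {...}}
        # and returns {"generated_text": ...}.
        resp = requests.post(
            f"{self.base_url}/generate",
            json={"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}},
            timeout=self.timeout,
        )
        resp.raise_for_status()
        return resp.json()["generated_text"]


# Usage (hypothetical endpoint):
# model = RemoteTGIModel("http://localhost:8080")
# print(model.generate_text("Explain KV caching in one sentence."))
```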
-
I have a fine-tuned Llama 2 7B chat model which I am deploying to an endpoint using the DJL container. After deploying, when I tested the model, the output quality had degraded (the output seems to be…
ghost updated 4 months ago
-
### System Info
I am testing TGI tool calling, but the error keeps occurring. Can you check it? (A sketch of the request shape is included at the end of this report.)
### Information
- [X] Docker
- [ ] The CLI directly
### Tasks
- [X] An offici…
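For reference, a sketch of the kind of request a TGI tool-call test typically involves, using the OpenAI-compatible `/v1/chat/completions` route that TGI exposes. The `get_weather` tool and the endpoint URL are made up for illustration; the actual failing request may differ.

```python
from openai import OpenAI

# Point the OpenAI client at the local TGI server (hypothetical URL).
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="tgi",  # TGI serves a single model, so a placeholder name is accepted
    messages=[{"role": "user", "content": "What is the weather in Paris?"}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message.tool_calls)
```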
-
This tool currently supports the HF TGI container and the DJL DeepSpeed container on SageMaker. Both use the same payload format, but other containers might need a different one in the future (one way to abstract this is sketched below).
Goal:…
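One way the payload-format question could be handled is a small formatter interface with one implementation per container family, so a new container only needs a new formatter class. Everything below is a hypothetical sketch, not the tool's current API.

```python
from abc import ABC, abstractmethod


class PayloadFormatter(ABC):
    """Builds a container-specific request body from a prompt and generation params."""

    @abstractmethod
    def format(self, prompt: str, params: dict) -> dict:
        ...


class TGIStylePayloadFormatter(PayloadFormatter):
    def format(self, prompt: str, params: dict) -> dict:
        # The HF TGI and DJL DeepSpeed containers currently share this shape.
        return {"inputs": prompt, "parameters": params}


FORMATTERS = {
    "huggingface-tgi": TGIStylePayloadFormatter(),
    "djl-deepspeed": TGIStylePayloadFormatter(),
    # A future container with a different format would register its own class here.
}


def build_payload(container: str, prompt: str, params: dict) -> dict:
    return FORMATTERS[container].format(prompt, params)
```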
-
### The Feature
How we currently get this list: https://github.com/BerriAI/litellm/blob/main/cookbook/get_hf_models.py
This should be auto-updated: https://github.com/BerriAI/litellm/tree/main/lit…
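A sketch of one way the list could be refreshed automatically, for example from a scheduled CI job that queries the public Hugging Face Hub API and rewrites a file. The output filename and the `pipeline_tag` filter are assumptions about what the litellm list actually needs.

```python
import json

import requests

# Query the public Hub API for text-generation models, most downloaded first.
resp = requests.get(
    "https://huggingface.co/api/models",
    params={"pipeline_tag": "text-generation", "sort": "downloads", "limit": 1000},
    timeout=30,
)
resp.raise_for_status()
model_ids = [m["id"] for m in resp.json()]

# Write the refreshed list where the repo expects it (path is an assumption).
with open("huggingface_models.json", "w") as f:
    json.dump(model_ids, f, indent=2)
```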
-
I am trying to use llm-vscode with a locally deployed Text Generation Inference (TGI) server, but I keep getting the following error:
_Error decoding response body: expected value at line 1 column 1…
-
I have noticed that, when using the vLLM worker, the Mistral MoE model sometimes stops generating mid-output. Is this happening just for me, or for anyone else? I am using the instruct version of Mixtral 8x7B, i.e. mistral…