-
### Which component is this bug for?
Langchain Instrumentation
### Description
Databricks supports the OpenAI Client for querying LLM models (foundation and external models). I am using it with L…
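For context, a minimal sketch of the kind of call this refers to, assuming the OpenAI Python client is pointed at a Databricks serving endpoint (the workspace URL, token variable, and endpoint name below are placeholders):
```python
import os
from openai import OpenAI

# Point the OpenAI client at the Databricks serving endpoints (placeholder workspace URL).
client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<workspace-host>/serving-endpoints",
)

response = client.chat.completions.create(
    model="my-endpoint-name",  # placeholder serving endpoint name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```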
-
### Describe the bug
I am using Langfuse to monitor a LangChain Chatbot using Ollama LLAMA3 model.
An error is caught and logged by langfuse/callback/langchain.py:
```
2024-05-25 18:35:21,554 DEBUG…
```
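The setup is roughly the following (a sketch only; the model name and Langfuse keys are placeholders read from the environment):
```python
from langfuse.callback import CallbackHandler
from langchain_community.chat_models import ChatOllama

handler = CallbackHandler()       # expects LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST in the env
llm = ChatOllama(model="llama3")  # local Ollama LLAMA3 model

# The Langfuse callback traces this call; the DEBUG error above is logged during that tracing.
response = llm.invoke("Hello", config={"callbacks": [handler]})
print(response.content)
```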
-
### Describe the bug.
raise ValueError(f"Model not found in the model list, uid: {model_uid}")
ValueError: [address=127.0.0.1:18140, pid=771] Model not found in the model list, uid: 3dc22fb0-740c-…
-
### System Info
1.4.1 or 1.4.3 via docker:
```
docker run -d --gpus '"device=0,1"' --shm-size 12g -v $HOME/.cache/huggingface/hub/:/data -p 5002:80 ghcr.io/huggingface/text-generation-inference…
```
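A minimal sketch of how the server started above can be queried, assuming text-generation-inference is reachable on the mapped port 5002 (the prompt is arbitrary):
```python
from huggingface_hub import InferenceClient

# Point the client at the locally running text-generation-inference container.
client = InferenceClient(model="http://localhost:5002")
output = client.text_generation("What is deep learning?", max_new_tokens=50)
print(output)
```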
-
What's the name of the Python package? Or do you have to clone this repository to be able to use the MistralAI Python client?
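For what it's worth, the client appears to be published on PyPI as `mistralai` (`pip install mistralai`), so cloning the repository shouldn't be necessary. A minimal usage sketch, assuming a recent 1.x version of the package (older versions expose a different `MistralClient` interface):
```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
response = client.chat.complete(
    model="mistral-small-latest",  # any available model id
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```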
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain.js documentation with the integrated search.
- [X] I used the GitHub search to find a …
-
I currently have the following chain, and I was wondering if it is possible to stream the LLM response. I have checked the docs and tried using chain.stream, but it doesn't work; however, if I use c…
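As a point of comparison, here is a minimal LCEL sketch where `chain.stream` does yield tokens incrementally, assuming a Python chain built as prompt | model | parser with a model that supports streaming (the model below is just an example):
```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI  # any chat model with streaming support

prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
llm = ChatOpenAI(model="gpt-4o-mini", streaming=True)
chain = prompt | llm | StrOutputParser()

# stream() yields partial output chunks as the model generates them.
for chunk in chain.stream({"question": "What is LangChain?"}):
    print(chunk, end="", flush=True)
```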
-
I encountered unexpected behavior when running the following command:
`./parallel -m ./models/llama_7b/llama-2-7b/ggml-model-f16.gguf -t 1 -ngl 100 -c 4096 -b 512 -s 1 -np 8 -ns 128 -n 100 -c…`
-
**Is your feature request related to a problem? Please describe.**
The current version of FlutterFlow (4.1) is incompatible with our package.
**Describe the solution you'd like**
We want to create a …
-
### Describe the bug
When I make a call to the server with the OpenAI example code, the response comes back using the default chat template. I also see the following warning message in the console:
```
…
```
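For reference, the call in question is along these lines (a sketch assuming an OpenAI-compatible server; host, port, and model id are placeholders):
```python
from openai import OpenAI

# Local OpenAI-compatible server; the API key is typically ignored by such servers.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
response = client.chat.completions.create(
    model="my-local-model",  # placeholder model id served by the backend
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```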