-
### Discussed in https://github.com/langchain-ai/langchain/discussions/22882
Originally posted by **rvasa779** June 13, 2024
### Checked other resources
- [X] I added a very descriptive tit…
-
I'm using Xinference and changed `.env` and `settings.yaml`,
but starting app.py still fails with an error like this:
Exception while fetching openai_chat models: HTTPConnectionPool(host='localhost', port=11434)…
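A quick way to spot what is going wrong here: port 11434 is Ollama's default, so the traceback suggests the app never picked up the Xinference endpoint from `.env`/`settings.yaml`. A minimal sketch of such a sanity check (the helper name, the `OPENAI_API_BASE` fallback, and Xinference's default port 9997 are assumptions, not part of the original report):

```python
import os
from urllib.parse import urlparse

# Ollama's default port; seeing it in the traceback means the configured
# Xinference base URL is not being read.
OLLAMA_DEFAULT_PORT = 11434

def check_api_base(api_base: str) -> str:
    """Return api_base if it looks usable, otherwise raise with a hint."""
    parsed = urlparse(api_base)
    if parsed.port == OLLAMA_DEFAULT_PORT:
        raise ValueError(
            f"{api_base} points at Ollama's default port; "
            "the settings.yaml/.env change is probably not being read."
        )
    return api_base

# Xinference typically serves an OpenAI-compatible API on port 9997.
print(check_api_base(os.environ.get("OPENAI_API_BASE", "http://localhost:9997/v1")))
```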
-
### Do you need to file an issue?
- [x] I have searched the existing issues and this bug is not already filed.
- [x] My model is hosted on OpenAI or Azure. If not, please look at the "model provid…
-
### The Feature
It's undocumented, but `2024-08-01-preview` supports `"stream_options": {"include_usage": true}`.
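Assuming the field behaves like the standard Chat Completions `stream_options`, a streaming request body would carry it like this (a sketch only; message content is a placeholder):

```python
import json

# Streaming Chat Completions request body that asks the server to append a
# final chunk containing token usage, per the report above.
payload = {
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    # Undocumented on Azure's 2024-08-01-preview:
    "stream_options": {"include_usage": True},
}
print(json.dumps(payload, indent=2))
```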
Example:
```sh
export AZURE_OPENAI_AD_TOKEN=$(az account get-access-token --s…
```
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…
-
- [ ] I have checked the [documentation](https://docs.ragas.io/) and related resources and couldn't resolve my bug.
**Describe the bug**
Further request for LLamaIndex support regarding Azure OpenAI…
-
I run `chat-ui` with the `chat-ui-db` Docker image and would like to connect it to my Azure OpenAI API endpoint.
I have set up the `env.local` file as stated in your docs and bound it with the docker …
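Whatever client configuration chat-ui ends up using has to target Azure's deployment-scoped URL shape, which differs from the plain OpenAI one. A sketch of that shape (the resource, deployment, and api-version values below are placeholders, not from the original post):

```python
# Hypothetical values; substitute your Azure resource and deployment names.
resource = "my-resource"
deployment = "gpt-4o"
api_version = "2024-02-01"

# Azure OpenAI chat-completions endpoints follow this deployment-scoped shape:
url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version={api_version}"
)
print(url)
```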
-
I noticed that the URL in this crate is hardcoded, so it currently only supports each AI service provider's official API endpoints.
Is there a plan to make the endpoint configurable? For…
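The request above boils down to a common pattern: keep the official endpoint as the default but let callers override it (for a proxy or an OpenAI-compatible gateway). A sketch of that pattern, in Python rather than the crate's Rust, with illustrative names:

```python
from dataclasses import dataclass

# Default to the official endpoint, but allow an override.
OFFICIAL_BASE_URL = "https://api.openai.com/v1"

@dataclass
class Client:
    api_key: str
    base_url: str = OFFICIAL_BASE_URL

    def url_for(self, path: str) -> str:
        # Join base and path without doubling slashes.
        return f"{self.base_url.rstrip('/')}/{path.lstrip('/')}"

default_client = Client(api_key="sk-placeholder")
proxy_client = Client(api_key="sk-placeholder", base_url="http://localhost:8080/v1")
print(default_client.url_for("chat/completions"))
print(proxy_client.url_for("chat/completions"))
```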
-
### Feature Description
Hi @dosu and @logan-markewich,
We have an Azure model; when I pass our model name with `llm = OpenAI(temperature=0.1, model="azure-gpt-4o")`, I get this error:
ValueError: Unk…