-
### What feature would you like to be added?
How can Magentic-One be used with local LLMs or Ollama?
### Why is this needed?
This will enable users to use Magentic-One with open-source LLMs other than …
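One possible direction, since Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, is to point AutoGen's OpenAI-style chat client at that endpoint. The sketch below is assumption-laden: the constructor arguments (`base_url`, `api_key`, `model_info`), the model name, and the capability flags are not confirmed for Magentic-One specifically.

```python
# Sketch: pointing AutoGen's OpenAI-style client at a local Ollama server.
# Assumes Ollama is running locally and serving "llama3" at its
# OpenAI-compatible endpoint (http://localhost:11434/v1).
from autogen_ext.models.openai import OpenAIChatCompletionClient

client = OpenAIChatCompletionClient(
    model="llama3",                        # any model pulled into Ollama
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",                      # placeholder; Ollama ignores the key
    model_info={                           # local models need explicit capability flags
        "vision": False,
        "function_calling": True,
        "json_output": True,
        "family": "unknown",
    },
)
```

If something like this works, the client could then be handed to the Magentic-One agents in place of the default OpenAI-backed client.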
-
### Confirm this is an issue with the Python library and not an underlying OpenAI API
- [X] This is an issue with the Python library
### Describe the bug
When running the `openai migrate` command, …
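For context, `openai migrate` is the SDK's CLI codemod that rewrites pre-1.0 call sites into the current client-based API. Roughly, it is expected to turn code like the commented block below into the client-based form; this is a simplified sketch, not the tool's exact output.

```python
# Pre-1.0 style that `openai migrate` is meant to rewrite:
#   import openai
#   openai.api_key = "sk-..."
#   resp = openai.ChatCompletion.create(model="gpt-4", messages=messages)
#   text = resp["choices"][0]["message"]["content"]

# Post-migration (>= 1.0) client-based style:
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)
text = resp.choices[0].message.content
```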
-
**Why**
Currently we can use Azure with OpenAI models only. However, Azure has a vast selection of other models, and those can be deployed only as serverless APIs.
**Description**
So would it be possible…
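For context, models deployed as serverless APIs can already be called directly with the `azure-ai-inference` package; a rough sketch of what such a call looks like follows, where the endpoint URL and key are placeholders and the open question is how to wire this into the existing integration.

```python
# Sketch: calling a non-OpenAI model deployed as an Azure serverless API.
# The endpoint and key below are placeholders for an actual serverless deployment.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-deployment>.<region>.models.ai.azure.com",
    credential=AzureKeyCredential("<your-api-key>"),
)
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize this issue in one sentence."),
    ],
)
print(response.choices[0].message.content)
```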
-
### Issue
Aider appears not to work when I pass an OpenAI organization ID.
I can confirm that my OpenAI key and org ID are correct, since other AI tools work correctly with them.
I've tried multiple G…
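Independent of the details above, one quick way to rule out the key/org pair itself is the plain `openai` client, which accepts the organization either as a constructor argument or via the `OPENAI_ORG_ID` environment variable; the model name below is just an example.

```python
# Minimal check that the API key + organization ID pair works on its own,
# independent of Aider. OPENAI_API_KEY / OPENAI_ORG_ID can also be set as
# environment variables instead of being passed explicitly.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",        # placeholder
    organization="org-...",  # the organization ID in question
)
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```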
-
### Describe the feature or improvement you're requesting
### Description
Some integrations with LLM providers require setting custom headers in API requests. For example, I am using [Helicone](http…
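For reference, at the `openai` client level this is already possible through `default_headers` (plus `base_url` for proxies); the proxy URL and header name below follow Helicone's documented setup and are assumptions for any other provider.

```python
# Sketch: routing requests through an LLM proxy that needs custom headers.
# The proxy URL and header name are Helicone's documented values; other
# providers will differ.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                       # regular OpenAI key
    base_url="https://oai.helicone.ai/v1",  # proxy endpoint
    default_headers={
        "Helicone-Auth": "Bearer <helicone-api-key>",
    },
)
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
```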
-
### Bug Description
The agent node "Sequential Crew" no longer recognizes my OpenAI-compatible LLM (Llama 3).
In the previous version I did not face this issue; I tried a few other agents and got the same error…
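A quick way to narrow this down is to confirm the OpenAI-compatible endpoint still responds outside the agent node; in the sketch below the endpoint URL, key, and model name are placeholders for the local Llama 3 setup.

```python
# Quick check that the OpenAI-compatible Llama 3 endpoint is reachable and
# lists the expected model, independent of the Sequential Crew node.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

print([m.id for m in client.models.list()])  # should include the Llama 3 model

resp = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hi"}],
)
print(resp.choices[0].message.content)
```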
-
-
Hi! I have a small enhancement suggestion for Houdini. Houdini already has support for ChatGPT, but today a lot of online AI services provide "OpenAI-like APIs", basically APIs almost identical t…
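To make the suggestion concrete: "OpenAI-like" typically means only the base URL and API key change while the request and response shapes stay the same, so Houdini would mainly need to expose those two settings. A hedged sketch with a hypothetical provider URL:

```python
# Sketch of why "OpenAI-like APIs" are cheap to support: the same client code
# works once the base URL and key are configurable. The provider URL and model
# name below are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # hypothetical OpenAI-like service
    api_key="<provider-api-key>",
)
resp = client.chat.completions.create(
    model="provider-model-name",
    messages=[{"role": "user", "content": "Hello from Houdini"}],
)
print(resp.choices[0].message.content)
```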
-
### Please search before asking
- [X] I searched in the [issues](https://github.com/yetone/openai-translator/issues) and found nothing similar.
### Please read README
- [X] I have read the trouble…
-
There are multiple open PRs looking to add inference adapters for services that offer OpenAI-compatible APIs.
* databricks - https://github.com/meta-llama/llama-stack/pull/83
* sambanova - https:/…