Open nkkko opened 1 month ago
/bounty $20
Comment /attempt #10 with your implementation plan, then include /claim #10 in the PR body to claim the bounty.
If no one is assigned to the issue, feel free to tackle it, without confirmation from us, after registering your attempt. In the event that multiple PRs are made from different people, we will generally accept the one with the cleanest code.
Please respect others by working on PRs that you are allowed to submit attempts to.
e.g. If you reached the limit of active attempts, please wait for the ability to do so before submitting a new PR.
If you can not submit an attempt, you will not receive your payout.
Thank you for contributing to daytonaio/devcontainer-generator!
| Attempt | Started (GMT+0) | Solution |
|---|---|---|
| 🟢 @madman024 | Oct 10, 2024, 7:16:37 PM | WIP |
/attempt #10
Hey there! 👋 Here's my plan for adding multi-LLM support to the project: Adding Multiple LLM Providers - Implementation Plan. What I'll do:
- Create a simple but flexible system to swap between different AI providers (OpenAI, Anthropic, Groq, etc.)
- Update our config to handle different provider settings
- Refactor the existing Azure OpenAI code to use this new system
- Add proper error handling for when providers act up
Main changes needed:
- New provider system managing the different LLMs
- Updates to our .env setup
- Tweaks to the database to track which provider was used
- Some UI updates to show/select providers
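The provider system described above could be sketched roughly like this. Note this is a hypothetical outline: the class names, the `complete` method, and the provider registry are my assumptions, not the project's actual API.

```python
# Hypothetical sketch of the provider-swap layer; names are illustrative.
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface every provider adapter implements."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class AzureOpenAIProvider(LLMProvider):
    def __init__(self, api_key: str, endpoint: str):
        self.api_key = api_key
        self.endpoint = endpoint

    def complete(self, prompt: str) -> str:
        # A real implementation would call the Azure OpenAI SDK here.
        raise NotImplementedError


class AnthropicProvider(LLMProvider):
    def __init__(self, api_key: str):
        self.api_key = api_key

    def complete(self, prompt: str) -> str:
        # A real implementation would call the Anthropic SDK here.
        raise NotImplementedError


# Registry mapping the provider name (e.g. from .env) to its adapter class.
PROVIDERS = {
    "azure": AzureOpenAIProvider,
    "anthropic": AnthropicProvider,
}


def get_provider(name: str, **kwargs) -> LLMProvider:
    """Look up and construct a provider, failing loudly on unknown names."""
    try:
        return PROVIDERS[name](**kwargs)
    except KeyError:
        raise ValueError(f"Unknown LLM provider: {name!r}") from None
```

Adding a new provider (Groq, Google, etc.) would then only mean writing one adapter class and registering it in `PROVIDERS`, without touching the calling code.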
What do you think? Does this seem right?
@madman024 sounds high level but sure you can try
@nkkko ok sure i am working on it
Is your feature request related to a problem? Please describe. The current implementation supports only Azure OpenAI as the LLM provider. It lacks the flexibility to support other LLM providers such as OpenAI directly, Anthropic, Google, or Groq. This would open opportunities for an easy future migration to a more efficient provider.
Describe the solution you'd like
Update the .env configuration file to support specifying alternative LLM providers. Modify main.py to read and initialize the appropriate LLM client based on the selected provider from the .env file.

Additional context
Update .env to include environment variables for configuring alternative LLM providers.
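As a rough illustration of the .env-driven selection described above, `main.py` could branch on a single variable. The variable names (`LLM_PROVIDER`, the `*_API_KEY` keys) are assumptions for the sketch, not the repo's actual .env schema.

```python
# Hypothetical sketch: picking the LLM client config from environment
# variables, as a .env loader (e.g. python-dotenv) would populate them.
import os


def init_llm_client() -> dict:
    """Return a minimal client config for the provider named in the env."""
    provider = os.getenv("LLM_PROVIDER", "azure").lower()

    if provider == "azure":
        return {
            "provider": "azure",
            "api_key": os.environ["AZURE_OPENAI_API_KEY"],
            "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        }
    if provider == "openai":
        return {"provider": "openai", "api_key": os.environ["OPENAI_API_KEY"]}
    if provider == "anthropic":
        return {
            "provider": "anthropic",
            "api_key": os.environ["ANTHROPIC_API_KEY"],
        }
    raise ValueError(f"Unsupported LLM provider: {provider!r}")
```

Defaulting to `"azure"` when `LLM_PROVIDER` is unset would keep existing deployments working unchanged.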