All-Hands-AI / OpenHands

🙌 OpenHands: Code Less, Make More
https://all-hands.dev

Request to Add Qwen2.5 as a Supported Provider in OpenHands #5201

Closed: AhmedProab closed this issue 6 days ago

AhmedProab commented 6 days ago

What problem or use case are you trying to solve?
I am looking to integrate Qwen2.5 (the open large language model family from the Qwen team) into OpenHands to expand the set of AI providers it supports through LiteLLM. Adding Qwen2.5 would give users an additional model option and increase flexibility.

Describe the UX of the solution you'd like
Users should be able to select Qwen2.5 within the LiteLLM provider options, just like other supported models (e.g., OpenAI or Claude). Once added, it should integrate into existing workflows without additional configuration complexity.

Do you have thoughts on the technical implementation?
The Qwen2.5 repository is available on GitHub: https://github.com/QwenLM/Qwen2.5. It includes comprehensive documentation and examples for integration. I believe the team can review the APIs and implement a connector similar to the current provider integrations.
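
For illustration only, here is a minimal sketch of how a Qwen2.5 model served behind an OpenAI-compatible endpoint (for example via vLLM) could already be reached through LiteLLM, which OpenHands uses for its LLM backends. The model name, port, and API key below are placeholders, and this is not a confirmed OpenHands integration:

```python
# Minimal sketch, not an official OpenHands connector. Assumes Qwen2.5 is
# already being served behind an OpenAI-compatible API (e.g. via vLLM).
from litellm import completion

response = completion(
    model="openai/Qwen/Qwen2.5-7B-Instruct",  # "openai/" prefix = OpenAI-compatible provider
    api_base="http://localhost:8000/v1",      # placeholder: local inference server
    api_key="EMPTY",                          # placeholder: local servers often accept any key
    messages=[{"role": "user", "content": "Write hello world in Python."}],
)
print(response.choices[0].message.content)
```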

Describe alternatives you've considered
No alternative solution is currently available for integrating Qwen2.5, but the existing LiteLLM support for similar models can serve as a template for the implementation.

Additional context
This enhancement aligns with OpenHands' mission to support diverse AI models. Adding Qwen2.5 would strengthen OpenHands' position as a flexible, multi-model AI tool. Please let me know if further technical details or contributions are needed.

neubig commented 6 days ago

Hi @AhmedProab, thanks for the feedback! You can already select arbitrary LMs by following the docs here: https://docs.all-hands.dev/modules/usage/llms

I will note that while we have high hopes for Qwen2.5, we haven't yet confirmed that it actually performs well as a backend for OpenHands.
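
For reference, a minimal sketch of what that can look like with a self-hosted Qwen2.5 endpoint under the approach in the linked doc; the serving command, model name, and URL are illustrative placeholders rather than a tested recipe:

```python
# Illustrative placeholders only, assuming Qwen2.5 is exposed through an
# OpenAI-compatible server (for example vLLM):
#
#   vllm serve Qwen/Qwen2.5-7B-Instruct --port 8000
#
# Using LiteLLM's "openai/" prefix for OpenAI-compatible endpoints, the model
# can then be referenced in the LLM settings described in the doc above as:
MODEL = "openai/Qwen/Qwen2.5-7B-Instruct"
BASE_URL = "http://localhost:8000/v1"
API_KEY = "EMPTY"  # local OpenAI-compatible servers usually accept any key
```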