tsinggggg opened 2 weeks ago
@microsoft-github-policy-service agree
Hi @Hk669, could you please review this PR? I saw you reviewed the latest addition of a non-OpenAI client, Bedrock. I am happy to follow up with a PR against the main branch later as well. Thank you!
@tsinggggg thank you for your contribution!
Given that we are close to the release of v0.4, would you like to create a community package for a Watsonx client following the v0.4 ChatCompletionClient protocol? You can find existing implementations here: https://github.com/microsoft/autogen/tree/main/python/packages/autogen-ext/src/autogen_ext/models.
For guidance on how to create a community package: https://microsoft.github.io/autogen/dev/user-guide/extensions-user-guide/index.html
Once you create one you can submit a PR to our documentation so we can index it.
The reason we ask for a community package is that it is not feasible for us to maintain clients for APIs that we don't have access to.
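For illustration, here is a minimal sketch of the shape a v0.4-style chat completion client might take. Note this is a simplified stand-in, not the real ChatCompletionClient ABC from autogen-core (which has more methods, e.g. for streaming and token counting); the class names, parameters, and the echoed response below are assumptions for demonstration only, and a real package would call the watsonx.ai inference API.

```python
# Illustrative stand-in for a v0.4-style chat-completion client.
# NOTE: not the real autogen-core ChatCompletionClient ABC; names and
# shapes here are assumed, and the watsonx.ai API call is stubbed out.
from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class LLMMessage:
    role: str      # e.g. "system", "user", "assistant"
    content: str


@dataclass
class CreateResult:
    content: str   # the model's reply text


class ChatCompletionClient(Protocol):
    """Simplified shape of the protocol a community package implements."""

    def create(self, messages: List[LLMMessage]) -> CreateResult: ...


class WatsonxChatCompletionClient:
    """Hypothetical client wrapping a watsonx.ai model endpoint."""

    def __init__(self, model_id: str, api_key: str) -> None:
        self.model_id = model_id
        self.api_key = api_key

    def create(self, messages: List[LLMMessage]) -> CreateResult:
        # A real implementation would POST the message list to the
        # watsonx.ai inference API; here we echo for demonstration.
        last = messages[-1].content
        return CreateResult(content=f"[{self.model_id}] echo: {last}")


# Usage: the client satisfies the (simplified) protocol, so agent code
# can depend on ChatCompletionClient rather than any concrete vendor.
client: ChatCompletionClient = WatsonxChatCompletionClient(
    model_id="ibm/granite-13b-chat-v2", api_key="dummy"
)
result = client.create([LLMMessage(role="user", content="hello")])
print(result.content)
```

The point of the protocol is exactly what is described above: agents program against the abstract client, so a community-maintained watsonx package can slot in without AutoGen itself having to maintain the vendor integration.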
Hi @ekzhu , thanks a lot for the suggestions!
I created an extension for using watsonx with AutoGen 0.4 and released it to PyPI here: https://pypi.org/project/autogen-watsonx-client/
Here is the PR for the doc update on the main branch, if you could review it: https://github.com/microsoft/autogen/pull/4130. Thank you!
Attention: Patch coverage is 8.60927% with 138 lines in your changes missing coverage. Please review.
Project coverage is 29.05%. Comparing base (8a8fcd8) to head (a966cae). Report is 3 commits behind head on 0.2.
Why are these changes needed?
To enhance support for non-OpenAI models in AutoGen by adding watsonx.ai into the mix, covering both the IBM Granite series of models and other non-IBM models hosted on watsonx.ai.
Related issue number
Checks