microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/
Creative Commons Attribution 4.0 International

[Feature Request]: support cloudflare AI gateway #2786

Open weifang74 opened 4 months ago

weifang74 commented 4 months ago

Is your feature request related to a problem? Please describe.

Cloudflare now provides a competitively priced AI service package, but it cannot be used with autogen. I tried to create a configuration like the llama one, but autogen appends /chat/completions to the endpoint I provided, which makes the function call fail.


Describe the solution you'd like

Access the Cloudflare AI service through the gateway, just like other non-OpenAI LLM vendors.

Additional context

No response

jtoy commented 4 months ago

@weifang74 can you talk more about your use case and how it's helpful? I haven't used this service much before, but it looks interesting.

rafaelpierre commented 4 months ago

@weifang74 perhaps this can be achieved with CustomModelClient?

e.g. https://microsoft.github.io/autogen/docs/notebooks/agentchat_custom_model/

weifang74 commented 4 months ago

> @weifang74 perhaps this can be achieved with CustomModelClient?
>
> e.g. https://microsoft.github.io/autogen/docs/notebooks/agentchat_custom_model/

Great, I will do some tests later.

weifang74 commented 4 months ago

> @weifang74 perhaps this can be achieved with CustomModelClient?
>
> e.g. https://microsoft.github.io/autogen/docs/notebooks/agentchat_custom_model/

Cloudflare AI gateway can be accessed in this format:

```sh
curl -X POST https://gateway.ai.cloudflare.com/v1/8e26##############/lx/groq/chat/completions \
  --header 'content-type: application/json' \
  --header 'Authorization: Bearer GROQ_TOKEN' \
  --data '{"model": "mixtral-8x7b-32768", "messages": [{"role": "user", "content": "What is Cloudflare?"}]}'
```

or like this:

```sh
curl -X POST https://gateway.ai.cloudflare.com/v1/8e###############/lx/workers-ai/@cf/meta/llama-3-8b-instruct \
  --header 'Authorization: Bearer CF_TOKEN' \
  --header 'Content-Type: application/json' \
  --data '{"prompt": "What is Cloudflare?"}'
```

I hope it can be accessed in autogen like other LLM endpoints.
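For reference, the first curl call above translates to plain Python like this. This is a hedged sketch, not part of the original thread: `build_gateway_url` and `chat` are hypothetical helper names, the account and gateway IDs are redacted placeholders exactly as in the curl examples, and the URL shape is taken directly from those examples.

```python
# Sketch: calling the gateway's OpenAI-compatible chat endpoint from Python,
# mirroring the first curl example above. Uses only the standard library.
import json
import urllib.request


def build_gateway_url(account_id: str, gateway: str, provider: str) -> str:
    # Path shape taken from the curl examples:
    # https://gateway.ai.cloudflare.com/v1/<account_id>/<gateway>/<provider>
    return f"https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway}/{provider}"


def chat(url: str, token: str, model: str, prompt: str) -> str:
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        url + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Return the first completion's text, OpenAI-response style.
    return data["choices"][0]["message"]["content"]
```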

weifang74 commented 4 months ago

I tried it, and it works! I wrote a custom class named CFGatewayClient, bound it to an assistant, and got the response below:

assistant-pro (to user_proxy):

Here is a Python program that says "Hello Cloudflare!":

```python
# filename: hello_cloudflare.py
print("Hello Cloudflare!")
```

This program uses the print() function to output the string "Hello Cloudflare!" to the console.

To run this program, save it to a file named hello_cloudflare.py and execute it using Python:

```
$ python hello_cloudflare.py
Hello Cloudflare!
```

This will output the message to the console.

TERMINATE
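The thread doesn't include the CFGatewayClient source. Below is a hypothetical sketch of what such a class might look like, following the custom model client protocol from the linked notebook (methods `create`, `message_retrieval`, `cost`, and `get_usage`). The config keys, env-var name, and wrapping details are assumptions for illustration, not the author's actual code.

```python
# Hypothetical sketch of a gateway-backed custom model client.
# CFGatewayClient's internals, config keys, and the GROQ_TOKEN env var
# are illustrative assumptions; only the URL shape comes from the thread.
import json
import os
import urllib.request
from types import SimpleNamespace


class CFGatewayClient:
    """Routes OpenAI-style chat completions through a Cloudflare AI Gateway."""

    def __init__(self, config, **kwargs):
        # base_url: the gateway path up to the provider segment, e.g.
        # https://gateway.ai.cloudflare.com/v1/<account_id>/<gateway>/groq
        self.base_url = config["base_url"].rstrip("/")
        self.model = config["model"]
        self.api_key = config.get("api_key") or os.environ.get("GROQ_TOKEN", "")

    def create(self, params):
        body = json.dumps(
            {"model": self.model, "messages": params["messages"]}
        ).encode()
        req = urllib.request.Request(
            f"{self.base_url}/chat/completions",
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        # Wrap the raw JSON so it exposes attribute-style access.
        response = SimpleNamespace(model=data.get("model", self.model))
        response.choices = [
            SimpleNamespace(message=SimpleNamespace(**c["message"]))
            for c in data["choices"]
        ]
        return response

    def message_retrieval(self, response):
        # Return the text of each completion choice.
        return [choice.message.content for choice in response.choices]

    def cost(self, response):
        return 0.0  # gateway-side pricing is not tracked in this sketch

    @staticmethod
    def get_usage(response):
        return {}  # usage accounting omitted in this sketch
```

The client would then be registered on the agent the same way as in the custom-model notebook, with the gateway URL supplied as `base_url` in the config list entry.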
