jackmpcollins / magentic

Seamlessly integrate LLMs as Python functions
https://magentic.dev/
MIT License
1.92k stars 95 forks

How can we add an extra header to LitellmChatModel? #260

Open ashinwz opened 1 month ago

ashinwz commented 1 month ago

@prompt( "Say hello", model=LitellmChatModel( model = "gpt-4o", api_base = "", custom_llm_provider="azureopen", extra_header= { "Api-Key": "xxxx", "Content-Type": "application/json" } ), ) def say_hello_litellm() -> str: ...

I would like to use a format like the one above. Currently, litellm itself supports passing extra headers:

```python
response = completion(
    model="gpt-4o",
    messages=[{"content": "Hello, how are you? say something", "role": "user"}],
    api_base="",
    # litellm will use openai.ChatCompletion to make the request
    custom_llm_provider="azure",
    extra_headers={
        "Api-Key": "xxx",
        "Content-Type": "application/json",
    },
)
```

jackmpcollins commented 1 month ago

Hi @ashinwz, currently there isn't a way to do this in magentic. What do you put in the extra headers?

I would accept a PR to add this as an init parameter for the LitellmChatModel. For an example of that see PR https://github.com/jackmpcollins/magentic/pull/185 or PR https://github.com/jackmpcollins/magentic/pull/221
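To illustrate the kind of change such a PR might make, here is a minimal sketch of threading an `extra_headers` init parameter from a chat-model wrapper down into the keyword arguments for the underlying litellm completion call. The class and method names below are hypothetical stand-ins, not magentic's actual internals:

```python
# Hypothetical sketch only: a stand-in class showing how an `extra_headers`
# init parameter could be stored on the model and forwarded to the
# litellm completion call. Names are illustrative, not magentic's real API.

class SketchLitellmChatModel:
    """Stand-in for LitellmChatModel with an extra_headers parameter."""

    def __init__(self, model, api_base=None, custom_llm_provider=None,
                 extra_headers=None):
        self.model = model
        self.api_base = api_base
        self.custom_llm_provider = custom_llm_provider
        self.extra_headers = extra_headers

    def _completion_kwargs(self):
        # Build the kwargs that would be passed to litellm.completion().
        # Only forward optional values when they were provided, so the
        # default behavior is unchanged.
        kwargs = {"model": self.model}
        if self.api_base is not None:
            kwargs["api_base"] = self.api_base
        if self.custom_llm_provider is not None:
            kwargs["custom_llm_provider"] = self.custom_llm_provider
        if self.extra_headers is not None:
            kwargs["extra_headers"] = self.extra_headers
        return kwargs


model = SketchLitellmChatModel(
    model="gpt-4o",
    custom_llm_provider="azure",
    extra_headers={"Api-Key": "xxxx", "Content-Type": "application/json"},
)
print(model._completion_kwargs())
```

The design choice here mirrors the litellm snippet earlier in the thread: `extra_headers` is passed straight through as a keyword argument, so litellm handles attaching the headers to the HTTP request.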