Closed · ColinLeverger closed this issue 1 year ago
I have deployed an OpenLLM server on a managed service that is protected by an auth Bearer token:
```shell
curl -X 'POST' 'https://themodel.url/v1/generate' \
  -H 'Authorization: Bearer Sh6Kh4[ . . . super long bearer token ]W18UiWuzsz+0r+U' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{ "prompt": "What is the difference between a pigeon", "llm_config": { "use[ . . . and so on] } }'
```
Curl works like a charm.
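For comparison, the same authenticated request can be sketched in Python with the standard library's `urllib` (the URL and token below are the placeholders from the report, and the `llm_config` payload, elided in the original curl command, is left as an empty placeholder):

```python
import json
import urllib.request

# Placeholders: substitute your real endpoint and full Bearer token.
SERVER_URL = "https://themodel.url/v1/generate"
AUTH_TOKEN = "Sh6Kh4...W18UiWuzsz+0r+U"  # truncated in the original report

payload = {
    "prompt": "What is the difference between a pigeon",
    "llm_config": {},  # elided in the original curl command
}

# Build the POST request with the same headers curl sends.
request = urllib.request.Request(
    SERVER_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {AUTH_TOKEN}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Actually sending it requires a live server:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp))
```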
In LangChain, I try to create my new llm as such:
```python
llm = OpenLLM(server_url="https://themodel.url/v1/generate", temperature=0.2)
```
After digging through the repo, I did not find any option for passing the Bearer token or for modifying the headers of a remote OpenLLM session.
Modifying the headers would enable the use of a self-deployed LLM without any auth proxy.
This is now supported with #605; you can export the auth token via the `OPENLLM_AUTH_TOKEN` environment variable.
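A minimal sketch of the envvar approach, assuming the client reads `OPENLLM_AUTH_TOKEN` from the environment and attaches it as a Bearer header (the `auth_header` helper below is illustrative, not the library's actual code):

```python
import os

# Export the token before creating the client, e.g. in the shell:
#   export OPENLLM_AUTH_TOKEN="Sh6Kh4...W18UiWuzsz+0r+U"
os.environ["OPENLLM_AUTH_TOKEN"] = "Sh6Kh4...W18UiWuzsz+0r+U"  # placeholder

def auth_header() -> dict:
    """Illustrative helper: build the Authorization header a client
    could attach, based on the exported OPENLLM_AUTH_TOKEN."""
    token = os.environ.get("OPENLLM_AUTH_TOKEN")
    if not token:
        raise RuntimeError("OPENLLM_AUTH_TOKEN is not set")
    return {"Authorization": f"Bearer {token}"}

# With the variable exported, the LangChain wrapper is created as before:
# llm = OpenLLM(server_url="https://themodel.url/v1/generate", temperature=0.2)
```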