VapiAI / advanced-concepts-custom-llm

This is a sample repo showing how to use Vapi with your custom LLM.

Add support for Bearer Auth or providing the key as part of the request to the server (as you would do with OpenAI / OpenRouter etc.) #1

Closed syberkitten closed 5 months ago

syberkitten commented 5 months ago

Since the idea is to call an external LLM, the user should be able to provide an API key in Vapi itself so that the request can be authenticated.

Otherwise the custom LLM endpoint is left completely unprotected.

I was also unable to find any documentation on how to set up a custom LLM call in Vapi itself, thus:

  1. How to provide the host for the external LLM (Custom LLM)
  2. How to provide an API key for that call?
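The protection requested above amounts to the custom LLM server checking the incoming `Authorization: Bearer` header against a configured key before serving a completion. A minimal sketch (the function name `check_bearer_auth` and the key value are illustrative, not part of this repo):

```python
import hmac

def check_bearer_auth(headers: dict, expected_key: str) -> bool:
    """Return True only if the request carries 'Authorization: Bearer <expected_key>'."""
    auth = headers.get("Authorization", "")
    prefix = "Bearer "
    if not auth.startswith(prefix):
        return False
    token = auth[len(prefix):]
    # hmac.compare_digest compares in constant time, avoiding timing side channels
    return hmac.compare_digest(token, expected_key)

# Reject a missing or wrong key, accept the configured one
print(check_bearer_auth({}, "sk-secret"))                                     # False
print(check_bearer_auth({"Authorization": "Bearer sk-secret"}, "sk-secret"))  # True
```

In a real server this check would run before the request handler touches the model, returning HTTP 401 on failure.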
sahilsuman933 commented 5 months ago
  1. You will need to go to the assistant section, select custom-llm as your provider, and then add your host URL.

  2. You must go here: https://dashboard.vapi.ai/keys and add your API key under the CustomLLM section.
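Once the key is saved in the dashboard, a request arriving at the custom LLM server can be expected to look like an OpenAI-compatible chat completion call with the key in a Bearer header (an assumption based on the OpenAI convention; the model name, key value, and body fields below are illustrative):

```python
import json

# Hypothetical incoming request as seen by the custom LLM server;
# the Authorization header carries the key configured in the Vapi dashboard.
incoming_headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer sk-your-custom-llm-key",  # illustrative key
}
incoming_body = json.dumps({
    "model": "my-custom-model",  # assumption: whatever model name your server accepts
    "messages": [{"role": "user", "content": "Hello"}],
})

# Extracting the key server-side for validation:
key = incoming_headers["Authorization"].removeprefix("Bearer ")
print(key)  # sk-your-custom-llm-key
```

The server would then compare `key` against its own configured secret before responding.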