eyurtsev / kor

LLM(😽)
https://eyurtsev.github.io/kor/
MIT License

Integration with Openrouter #293

Closed singhtanmay6735 closed 3 weeks ago

singhtanmay6735 commented 4 months ago

Hi,

I'm looking to integrate the OpenRouter API, which provides a unified interface for various LLMs, allowing us to choose any LLM for extraction. Based on their documentation, it appears that we can use OpenRouter with OpenAI's client API as follows:

from os import getenv

from openai import OpenAI

# Get the API key from the environment variable OPENROUTER_API_KEY
client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key=getenv("OPENROUTER_API_KEY"),
)

completion = client.chat.completions.create(
  extra_headers={
    "HTTP-Referer": "YOUR_SITE_URL",  # Optional, for including your app on openrouter.ai rankings.
    "X-Title": "YOUR_APP_NAME",  # Optional, shows in rankings on openrouter.ai.
  },
  model="openai/gpt-3.5-turbo",
  messages=[
    {
      "role": "user",
      "content": "Say this is a test",
    },
  ],
)
print(completion.choices[0].message.content)

I've attempted this integration several times but keep encountering issues. Could someone please provide guidance or assistance?

References:

OpenRouter Documentation - https://openrouter.ai/docs#models
OpenRouter Models - https://openrouter.ai/models/

Thanks in advance!

eyurtsev commented 3 weeks ago

Kor works with LangChain chat models and LLMs, not with the OpenAI client directly.

Use one of these: https://python.langchain.com/v0.2/docs/integrations/chat/

or create a custom one: https://python.langchain.com/v0.2/docs/how_to/custom_chat_model/
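Since OpenRouter exposes an OpenAI-compatible endpoint, one way to follow this advice is to point LangChain's ChatOpenAI at OpenRouter's base URL and hand that model to kor. The sketch below is untested against OpenRouter and assumes the kor and langchain-openai packages are installed; the schema, model id, and site/app values are illustrative placeholders, not part of this thread.

from os import getenv

from kor.extraction import create_extraction_chain
from kor.nodes import Object, Text
from langchain_openai import ChatOpenAI

# OpenRouter's API is OpenAI-compatible, so ChatOpenAI can target it
# by overriding base_url; model ids come from https://openrouter.ai/models/.
OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

llm = ChatOpenAI(
    model="openai/gpt-3.5-turbo",
    base_url=OPENROUTER_BASE_URL,
    api_key=getenv("OPENROUTER_API_KEY"),
)

# A minimal example schema (hypothetical, just to make the sketch runnable).
schema = Object(
    id="person",
    description="Personal information mentioned in the text",
    attributes=[Text(id="name", description="The person's name")],
    examples=[("Alice went home.", [{"name": "Alice"}])],
)

chain = create_extraction_chain(llm, schema)
# Invoking the chain on input text would then route the extraction
# request through OpenRouter to the chosen model.

The only OpenRouter-specific pieces are the base URL, the API key, and the provider-prefixed model id; the kor side is unchanged from its documented usage.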