emirsahin1 / llm-axe

A simple, intuitive toolkit for quickly implementing LLM powered applications.
MIT License

OpenAI LLMs Issues #5

Closed mdwoicke closed 1 month ago

mdwoicke commented 1 month ago

Local LLMs work as expected. I used the example to plug in an OpenAI LLM, but I'm getting errors. Do you happen to have a working OpenAI example you can provide? Thanks!

emirsahin1 commented 1 month ago

Hi. Yes, any LLM will work as long as you give llm-axe a way to communicate with it. This is done by writing a custom LLM class that exposes an `ask` method.

Here is a basic working example using the OpenAI API:


```python
from llm_axe import Agent, AgentType
from openai import OpenAI

# Our custom LLM class
class MyCustomLLM:

    def __init__(self):
        self.client = OpenAI(api_key="API_KEY")  # replace with your key

    # llm-axe calls ask() with a list of prompt dicts; format and
    # temperature are part of the expected signature
    def ask(self, prompts: list, format: str = "", temperature: float = 0.8):
        response = self.client.chat.completions.create(
            model="gpt-3.5-turbo", messages=prompts, temperature=temperature
        )
        return response.choices[0].message.content

# Example usage:
llm = MyCustomLLM()
fc = Agent(llm, agent_type=AgentType.GENERIC_RESPONDER)
print(fc.ask("Hi how are you today?"))
```
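If you want the custom class to actually honor the `format` and `temperature` arguments that llm-axe passes to `ask`, here is one way to sketch it. This assumes the OpenAI v1 Python client, and it assumes `format="json"` is what llm-axe sends when an agent wants structured output; the `build_request_kwargs` helper name is mine, not part of either library:

```python
def build_request_kwargs(model, prompts, format="", temperature=0.8):
    """Translate llm-axe's ask() arguments into OpenAI chat-completion kwargs."""
    kwargs = {"model": model, "messages": prompts, "temperature": temperature}
    if format == "json":
        # Ask the API for a JSON object when llm-axe requests JSON output
        kwargs["response_format"] = {"type": "json_object"}
    return kwargs


class MyCustomLLM:
    def __init__(self):
        from openai import OpenAI
        self.client = OpenAI(api_key="API_KEY")  # replace with your key

    def ask(self, prompts: list, format: str = "", temperature: float = 0.8):
        kwargs = build_request_kwargs("gpt-3.5-turbo", prompts, format, temperature)
        return self.client.chat.completions.create(**kwargs).choices[0].message.content
```

Keeping the kwargs-building logic in a separate function makes it easy to test without a network call.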

Let me know if that helps or not. Thanks.