uripeled2 / llm-client-sdk

SDK for using LLM
MIT License

Add chat function in BaseLLMClient #30

Closed. uripeled2 closed this issue 11 months ago

uripeled2 commented 1 year ago

We can add an abstract method to `BaseLLMClient`, something like `def chat_completion(messages, **kwargs) -> list[str]`. We then need to implement it in the different clients. We can define a message scheme similar to what LangChain did with `AIMessage`, `HumanMessage` and `SystemMessage`. Here are the OpenAI docs for this kind of endpoint: https://platform.openai.com/docs/api-reference/chat
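
A rough sketch of what this could look like (only `BaseLLMClient` and the `chat_completion` signature come from the proposal above; the message classes are illustrative and just mimic LangChain's naming):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class BaseMessage:
    """Base class for the chat message scheme (naming mimics LangChain)."""
    content: str


class SystemMessage(BaseMessage):
    """Instructions that steer the model's behaviour."""


class HumanMessage(BaseMessage):
    """A message written by the end user."""


class AIMessage(BaseMessage):
    """A message previously returned by the model."""


class BaseLLMClient(ABC):
    @abstractmethod
    def chat_completion(self, messages: list[BaseMessage], **kwargs) -> list[str]:
        """Return one or more completions for the given conversation."""
        ...
```

A concrete OpenAI client could then map `SystemMessage`, `HumanMessage` and `AIMessage` to the `system`, `user` and `assistant` roles expected by the chat completions endpoint linked above.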