radicalxdev / kai-ai-backend

This is the Kai Teaching Assistant AI repo.
MIT License

Kai Chat model dependency #22

Closed mikhailocampo closed 2 weeks ago

mikhailocampo commented 1 month ago

Kai Chat's core.py file is hardcoded to use Gemini 1.0 Pro. Although it works, a more robust solution is to create a model-agnostic system:

def executor(user_name: str, user_query: str, messages: list[Message], k=10):

    # build chat context from the last k messages
    chat_context = [
        ChatMessage(
            role=message.role,
            type=message.type,
            text=message.payload.text
        ) for message in messages[-k:]
    ]

    prompt = build_prompt()

    # model is hardcoded here
    llm = VertexAI(model_name="gemini-1.0-pro")

    chain = prompt | llm

    response = chain.invoke({"chat_history": chat_context, "user_name": user_name, "user_query": user_query})

    return response

We need a way to specify the model within metadata.json and then set the model accordingly. For example, developers could test the performance of Gemini 1.5 Flash or Gemini 1.5 Ultra against Gemini 1.0 Pro for Kai Chat.
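One possible shape for this, sketched under assumptions: a `model_name` key in metadata.json (that key name is hypothetical, not something the repo defines yet) and a small registry mapping names to model constructors. The registry entries here are placeholder lambdas rather than real `VertexAI` instances so the sketch runs without cloud credentials; in Kai they would construct the actual LangChain model objects.

```python
import json
from pathlib import Path

# Hypothetical registry mapping metadata names to model constructors.
# In Kai these would build real LangChain models (e.g. VertexAI);
# placeholder lambdas are used here so the sketch runs standalone.
MODEL_REGISTRY = {
    "gemini-1.0-pro": lambda: "VertexAI(model_name='gemini-1.0-pro')",
    "gemini-1.5-flash": lambda: "VertexAI(model_name='gemini-1.5-flash')",
}

def load_model_name(metadata_path: str, default: str = "gemini-1.0-pro") -> str:
    """Read the model name from metadata.json, falling back to a default."""
    path = Path(metadata_path)
    if not path.exists():
        return default
    metadata = json.loads(path.read_text())
    return metadata.get("model_name", default)

def get_llm(model_name: str):
    """Look up and construct the model, failing loudly on unknown names."""
    try:
        return MODEL_REGISTRY[model_name]()
    except KeyError:
        raise ValueError(
            f"Unknown model '{model_name}'; known models: {sorted(MODEL_REGISTRY)}"
        )
```

The executor would then call `get_llm(load_model_name("metadata.json"))` instead of constructing `VertexAI` inline, so swapping models is a config change rather than a code change.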

A working solution could then be adapted to other features like tools, so this is a great starting point.