Closed westlongtime closed 5 months ago
`LLM_API_FUNCTION` can be any LLM API function that takes `msg: list` and `shrink_idx: int`, and outputs `llm_result: str` and `usage: dict`. Here `msg` is a prompt (OpenAI format by default), and `shrink_idx: int` is the index at which the LLM should reduce the length of the prompt in case of context overflow.
You can write your custom LLM function by extending https://github.com/Holmeswww/AgentKit/blob/main/src/agentkit/llm_api/base.py
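As a sketch, a custom function for a local model just has to honor that signature. Everything below is an assumption for illustration: `run_local_model` is a placeholder for your own local inference call (e.g. a transformers pipeline or a llama.cpp binding), and `MAX_PROMPT_CHARS` stands in for whatever context limit your model has; neither is part of AgentKit.

```python
MAX_PROMPT_CHARS = 2000  # assumed context budget for the local model


def run_local_model(prompt: str) -> str:
    """Placeholder: replace with your local model's generate() call."""
    return "stub completion for: " + prompt[:40]


def query_local_llm(msg: list, shrink_idx: int):
    """Takes msg:list and shrink_idx:int, returns (llm_result:str, usage:dict).

    On overflow, messages are dropped starting at shrink_idx until the
    prompt fits, mirroring the shrink_idx contract described above.
    """
    msg = list(msg)  # avoid mutating the caller's message list

    def render(messages: list) -> str:
        # Flatten OpenAI-style messages into a single prompt string.
        return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

    # Shrink the prompt at shrink_idx until it fits the model's context.
    while len(render(msg)) > MAX_PROMPT_CHARS and shrink_idx < len(msg):
        msg.pop(shrink_idx)

    prompt = render(msg)
    llm_result = run_local_model(prompt)
    usage = {
        "prompt_tokens": len(prompt.split()),
        "completion_tokens": len(llm_result.split()),
    }
    return llm_result, usage
```

You could then pass `query_local_llm` wherever AgentKit expects an `LLM_API_FUNCTION`, or wrap the same logic in a subclass of the base class linked above.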
How do I build AgentKit without using an API? I want to load my local LLM.