TheLion-ai / Chattum


Implement bot settings backend #37

Open A-Huli opened 1 year ago

AleksanderObuchowski commented 11 months ago

We want users to be able to select different LLMs and their parameters.

Generally, we want to do this similarly to tools, so see https://github.com/TheLion-ai/Chattum/tree/43-implement-tools-backend for reference.

Create an LLMTemplate class. The class should store user_variables (currently in backend/pydantic_models/tools.py); see ToolTemplate in the tools branch for reference and PostTool for an example. These variables tell the user what they need to specify if they want to use the LLM.

The class should also store name (the name of the LLM) and description properties (similarly to ToolTemplate).

The class should have an __init__(self, user_variables) method that sets the user variables (same as in ToolTemplate).

The class should have a create_llm() method that creates the model using the user variables.

The class should have a template property, similarly to ToolTemplate, that returns the template of the LLM. A rough sketch of the whole class follows below.
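A rough sketch of how such a class could look, mirroring the ToolTemplate pattern. The UserVariable model and the OpenAI subclass below are illustrative assumptions (including the use of LangChain's ChatOpenAI), not a fixed interface:

```python
from typing import Optional

from langchain.chat_models import ChatOpenAI
from pydantic import BaseModel


class UserVariable(BaseModel):
    """A single value the user must provide, e.g. an API key or temperature."""

    name: str
    description: str
    value: Optional[str] = None


class LLMTemplate:
    """Base template describing an LLM that a bot can be configured to use."""

    name: str = "base_llm"
    description: str = "Base LLM template"
    user_variables: list[UserVariable] = []

    def __init__(self, user_variables: list[UserVariable]) -> None:
        # Overwrite the default variables with the user-supplied ones.
        self.user_variables = user_variables

    def create_llm(self):
        """Instantiate the actual model object from the user variables."""
        raise NotImplementedError

    @property
    def template(self) -> dict:
        """Serializable description of this LLM, sent to the frontend."""
        return {
            "name": self.name,
            "description": self.description,
            "user_variables": [v.dict() for v in self.user_variables],
        }


class OpenAILLMTemplate(LLMTemplate):
    """Example concrete template; assumes LangChain wraps the model."""

    name = "OpenAI"
    description = "OpenAI chat models (e.g. gpt-3.5-turbo)"
    user_variables = [
        UserVariable(name="api_key", description="OpenAI API key"),
        UserVariable(name="model_name", description="Model to use", value="gpt-3.5-turbo"),
        UserVariable(name="temperature", description="Sampling temperature", value="0.7"),
    ]

    def create_llm(self) -> ChatOpenAI:
        variables = {v.name: v.value for v in self.user_variables}
        return ChatOpenAI(
            openai_api_key=variables["api_key"],
            model_name=variables["model_name"],
            temperature=float(variables["temperature"]),
        )
```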

Then we want to have a /{username}/bots/{bot_id}/model/available_models endpoint, similar to backend/app/routes/tools.py, that tells the frontend which models are available and what their parameters are (see the sketch below).
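A minimal sketch of that route, assuming a FastAPI router like the existing tools routes and the LLMTemplate classes from the sketch above (the available_llms registry is hypothetical):

```python
from fastapi import APIRouter

router = APIRouter()

# Hypothetical registry of the supported LLM template classes
# (e.g. the OpenAILLMTemplate from the sketch above).
available_llms = [OpenAILLMTemplate]


@router.get("/{username}/bots/{bot_id}/model/available_models")
def get_available_models(username: str, bot_id: str) -> list[dict]:
    """Return the template of every supported LLM so the frontend can render a settings form."""
    return [llm(llm.user_variables).template for llm in available_llms]
```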

Then we want to have a PUT /{username}/bots/{bot_id}/model endpoint that accepts the model name and user variables; the model name and variables should be stored in the database in the bots collection under the model field.
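A sketch of the PUT route under the same assumptions; the bots_collection handle and the PutModelRequest body are hypothetical stand-ins for however the existing routes talk to MongoDB, and UserVariable comes from the class sketch above:

```python
from bson import ObjectId
from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter()


class PutModelRequest(BaseModel):
    """Hypothetical request body: the chosen model and its filled-in variables."""

    name: str
    user_variables: list[UserVariable]


@router.put("/{username}/bots/{bot_id}/model")
def put_model(username: str, bot_id: str, request: PutModelRequest) -> dict:
    """Store the chosen model name and variables under the bot's `model` field."""
    # bots_collection is assumed to be the pymongo collection backing the bots data.
    bots_collection.update_one(
        {"_id": ObjectId(bot_id)},
        {"$set": {"model": request.dict()}},
    )
    return {"message": "Model updated"}
```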

Then we want to have a GET /{username}/bots/{bot_id}/model/ endpoint, similar to backend/app/routes/tools.py, that returns the model name and user variables.
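And a matching GET route, again assuming FastAPI and the same hypothetical bots_collection as above:

```python
from typing import Optional

from bson import ObjectId
from fastapi import APIRouter

router = APIRouter()


@router.get("/{username}/bots/{bot_id}/model")
def get_model(username: str, bot_id: str) -> Optional[dict]:
    """Return the stored model name and user variables for this bot, or None if unset."""
    bot = bots_collection.find_one({"_id": ObjectId(bot_id)})
    return bot.get("model") if bot else None
```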

The models that we want to support:

These models have to be checked manually for the variables they accept; also take into account things such as API keys.