zmedelis / bosquet

Tooling to build LLM applications: prompt templating and composition, agents, LLM memory, and other instruments for builders of AI applications.
https://zmedelis.github.io/bosquet/
Eclipse Public License 1.0

Support Chat and Completion modes #25

Closed zmedelis closed 1 year ago

zmedelis commented 1 year ago

Must haves

Implementation

Possible interface

(chat llm-service context user-message)

context holds the initial system prompt, or whatever else is needed/allowed by the current chained prompt definition:

(def context
  {:bosquet.conversation/system
   "The following is a friendly conversation between a human and an AI.
    The AI is talkative and provides lots of specific details from its context. If the
    AI does not know the answer to a question, it truthfully says it does not know."})

As conversation turns are made, history is kept under :bosquet.conversation/history in the same growing context. History is recorded in ChatML (example):

[{"role": "system", "content": "You are a helpful assistant."},
 {"role": "user", "content": "Who won the world series in 2020?"},
 {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
 {"role": "user", "content": "Where was it played?"}]

Possible shape of calls and data


(def context (chat oai context "What is 2+2?"))
=>
{:bosquet.conversation/system "The following is a friendly conversation between a human and an AI. 
 The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, 
 it truthfully says it does not know."

 :bosquet.conversation/history
 [{:role    :user
   :content "What is 2+2?"}
  {:role    :assistant
   :content "4"}]

 :bosquet.conversation/assistant-last-response "4"}

This call creates a new context where the conversation history is recorded under :bosquet.conversation/history. A new call with a new user message appends to it (a sketch of this threading follows the next example):

(chat oai context "Where was Albert Einstein born?" :bosquet.conversation/system :bosquet.conversation/history)
=>
{:bosquet.conversation/system "The following is a friendly conversation between a human and an AI. 
 The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, 
 it truthfully says it does not know."

 :bosquet.conversation/history
 [{:role :user 
   :content "What is 2+2?"}
  {:role :assistant 
   :content "4"}
  {:role :user 
   :content "Where was Albert Einstein born?"}
  {:role :assistant 
   :content "He was born in Germany."}]

 :bosquet.conversation/assistant-last-response
 "He was born in Germany."}

Memory

Chat will quickly outgrow the LLM context window, so memory must be added:

(chat llm-service context memory user-message)

where the memory parameter specifies what kind of memory to use. TODO: spec it in a separate issue.

Memory output will be added to the context under :bosquet.conversation/memory.

When Bosquet executes chat, it should send memory to the LLM; if memory is not present, it should send the history data.
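
A sketch of that selection rule (messages-to-send is a hypothetical name):

;; Prefer the memory output when present; otherwise fall back to full history.
(defn messages-to-send
  [context]
  (or (:bosquet.conversation/memory context)
      (:bosquet.conversation/history context)))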