zmedelis / bosquet

Tooling to build LLM applications: prompt templating and composition, agents, LLM memory, and other instruments for builders of AI applications.
https://zmedelis.github.io/bosquet/
Eclipse Public License 1.0
277 stars 18 forks

support other models #8

Closed — behrica closed this issue 1 year ago

behrica commented 1 year ago

I think we should add something to make it easy to use non-OpenAI models. The main reason is that both OpenAI and Azure OpenAI are not really open, and are paid services.

Ideally we should be able to keep the same code (using bosquet) and easily "swap" the model.

behrica commented 1 year ago

I think a good way to solve this in bosquet is to allow "passing" custom complete functions in a template:

{{text}}

What is the name of the licence?
{% llm-generate
   model=testtextdavanci003
   impl=azure
   complete-fn=bosquet.openai/create-completion
%}
behrica commented 1 year ago

Doing this probably means that we no longer need "impl", as different functions could be created: one for OpenAI and another for Azure OpenAI. This might help to fix #9

behrica commented 1 year ago

I tried this out and it looks very clean to me:

Having 2 functions:

(defn azure-openai-create-completion
  "Create completion (not chat) for `prompt` based on model `params` and invocation `opts`"
  [prompt params opts]
  (-> (api/create-completion
       (assoc params :prompt prompt)
       (assoc opts :impl :azure))
      :choices first :text))

(defn openai-create-completion
  "Create completion (not chat) for `prompt` based on model `params` and invocation `opts`"
  [prompt params opts]
  (-> (api/create-completion
       (assoc params :prompt prompt)
       (dissoc opts :impl))
      :choices first :text))

which then allows switching in the templates:

  "
What is the name of the licence ?
{% llm-generate  
   model=testtextdavanci003 
   complete-fn=bosquet.openai/openai-create-completion
%}"

vs

  "
{% llm-generate  
   model=testtextdavanci003 
   complete-fn=bosquet.openai/azure-openai-create-completion
%}"

Then we could start to add more models, or the user could do it himself, just by implementing a fn with this signature:
[prompt params opts], returning a String
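
A minimal sketch of a user-supplied fn matching that contract (the name and canned response are made up; it ignores params and opts entirely, which is enough to test template wiring without calling any real LLM):

```clojure
;; Hypothetical example of a user-supplied complete fn conforming to
;; the [prompt params opts] -> String signature described above.
(defn echo-complete
  "Return a canned 'completion' that just echoes the prompt."
  [prompt params opts]
  (str "ECHO: " prompt))
```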

behrica commented 1 year ago

So "impl" is just an internal flag now; the end user will not see or use it anymore.

zmedelis commented 1 year ago

Adding other models would be great. Maybe, as a first step, a new library is needed — just like https://github.com/wkok/openai-clojure but supporting other models. Once this is in place, Bosquet can use it as a dependency.

behrica commented 1 year ago

I am not sure if a "new library first" is the right step. This would assume that we could build a single library which abstracts over all LLMs.

But LLMs are very different and have different APIs.

I believe that bosquet should do "something" so that a user can plug in his own function. The reason is that bosquet will only ever support a very small subset of "all possible operations of an LLM". Start with "completion", where

"text" + params go in, and text comes back. The user of bosquet would then decide whether to use another library or just make HTTP calls to a model completion API himself.
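
As a sketch of what such a user-plugged fn might look like, assuming clj-http and cheshire for the HTTP call and JSON handling — the endpoint URL and the request/response shapes here are made-up illustrations, not any real model's API:

```clojure
(ns my.custom-complete
  (:require [clj-http.client :as http]    ; assumed HTTP client
            [cheshire.core :as json]))    ; assumed JSON library

;; Hypothetical: POST the prompt to a local completion endpoint and
;; return the completion text. Endpoint and JSON shape are made up.
(defn local-model-complete
  [prompt params opts]
  (-> (http/post "http://localhost:8000/v1/completions"
                 {:body         (json/generate-string
                                 (assoc params :prompt prompt))
                  :content-type :json
                  :as           :json})
      :body
      :text))
```

Any fn with the same [prompt params opts] -> String shape would work, regardless of what it does internally.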

behrica commented 1 year ago

Once PR #15 is merged, we can then easily allow using either "keywords" or a user-given fn as the ":impl":

(def azure-open-ai-config
  {:impl        (fn [prompt args] "the completion")
   :parameter-1 "my-model-key-if-any"})

This would allow a user of bosquet to plug in his own function. Maybe we could even stop supporting the keywords and instead provide functions for OpenAI, so that a user would use a config like this:

(def azure-open-ai-config
  {:impl bosquet.complete/complete-azure-openai})

This is then a first step to support other models and would close this concrete issue.
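
The dispatch inside bosquet could then be a simple check on :impl — a sketch under the assumption that the two completion fns from the earlier comment exist; the `complete` name and the keyword cases are illustrative, not bosquet's actual API:

```clojure
;; Sketch: accept :impl as either a user-supplied fn or one of the
;; built-in keywords. Names are illustrative, not bosquet's API.
(defn complete
  [prompt {:keys [impl] :as config}]
  (cond
    (fn? impl)       (impl prompt config)
    (= impl :openai) (openai-create-completion prompt config {})
    (= impl :azure)  (azure-openai-create-completion prompt config {:impl :azure})
    :else            (throw (ex-info "Unknown :impl" {:impl impl}))))
```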

zmedelis commented 1 year ago

> Then we could start to add more models, or the user could do himself. Just implementing a fn with this signature: [prompt params opts] returning String

That's good. We can then build on top of it, maybe adding extra tags wrapping some of the config and the specifics of model calls.