open-build / BabbleBeaver

Micro AI for multiple LLM switching, preparing datasets, training models, and deploying them in isolated environments using Docker
https://collab.buildly.io
GNU General Public License v3.0

Create a scalable way for devs to add different integrations to proprietary LLMs #6

Closed lxy009 closed 2 months ago

lxy009 commented 5 months ago

The idea here is to create enough abstraction so that a dev could add a connection to Anthropic or Cohere if they so choose.

In the README, we have stated the intent for devs to use Git submodules, so the actual integration layers can live in other repositories. This codebase therefore needs to provide a consistent usage pattern together with the abstraction layer over the LLM.

Right now this lives in the AIConfigurator class, but the integrations with OpenAI and Gemini are hardcoded.
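A minimal sketch of what such an abstraction could look like. The method name `send_prompt`, the `EchoProvider` stand-in, and the registry shape of `AIConfigurator` are all assumptions for illustration; only the class name `AIConfigurator` and the idea of swappable providers come from this issue.

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Interface each provider integration (OpenAI, Gemini, Anthropic, ...) would implement."""

    @abstractmethod
    def send_prompt(self, prompt: str) -> str:
        """Send a prompt to the underlying LLM and return its reply."""


class EchoProvider(LLMProvider):
    """Stand-in provider used here purely for illustration."""

    def send_prompt(self, prompt: str) -> str:
        return f"echo: {prompt}"


class AIConfigurator:
    """Registry mapping provider names to integrations, instead of hardcoding them."""

    def __init__(self) -> None:
        self._providers: dict[str, LLMProvider] = {}

    def register(self, name: str, provider: LLMProvider) -> None:
        self._providers[name] = provider

    def get(self, name: str) -> LLMProvider:
        try:
            return self._providers[name]
        except KeyError:
            raise ValueError(f"No provider registered under '{name}'")
```

With this shape, an integration living in a separate submodule repository only needs to subclass `LLMProvider` and register itself; the chatbot code never imports a concrete provider directly.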

This task is scoped to a basic chatbot implementation.

lxy009 commented 2 months ago

For those working on this, we should consider the recent changes that keep the conversation history "in memory", because every LLM provider has a different token limit on how much context can be pushed. It might make sense to make this token limit a required configuration parameter whenever someone builds an LLM integration.
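One way this could look, sketched under assumptions: `ProviderConfig`, `max_tokens`, and the whitespace word count used as a crude token estimate are all hypothetical names and choices, not part of the codebase.

```python
from dataclasses import dataclass


@dataclass
class ProviderConfig:
    """Configuration every integration must supply; max_tokens is required."""
    name: str
    max_tokens: int  # provider-specific limit on prompt + history size


def trim_history(history: list[str], config: ProviderConfig) -> list[str]:
    """Drop oldest messages until the estimated token count fits the provider limit.

    Uses whitespace word count as a stand-in token estimate; a real
    integration would use the provider's own tokenizer.
    """
    def estimate(msgs: list[str]) -> int:
        return sum(len(m.split()) for m in msgs)

    trimmed = list(history)
    while trimmed and estimate(trimmed) > config.max_tokens:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed
```

Making `max_tokens` a required field means an integration cannot be registered without declaring its limit, so the in-memory history can be trimmed per provider rather than with one global cap.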