Adding compatibility with Ollama gives users flexibility in which LLM they use to ask about JBrowse 2 and their config files -- this comes with some caveats:
- We may need to write three different system prompts, one per language model, each tuned to work best with that model. Presently the system prompt is written to work best with OpenAI.
- We will need to add another "switch case" for Ollama so the bot can route requests there when the user selects it.
- We need to be transparent that the effectiveness of the bot depends on the model being used -- users may get worse responses from other language models. (Perhaps this is something we could address in the system prompts: require the models to provide a citation or referral for every response, so users have a higher degree of certainty about the answers they get.)
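The caveats above could be sketched together as a small prompt-selection layer. Everything below (the provider names, prompt text, and `build_system_prompt` helper) is illustrative, not the actual chatbot code -- a minimal sketch of how per-provider prompts, the Ollama "switch case", and a citation requirement might fit together:

```python
# Illustrative sketch: per-provider system prompts plus a provider
# "switch case". Names and prompt text are assumptions, not real code.

BASE_PROMPT = (
    "You are an assistant that answers questions about JBrowse 2 "
    "and its configuration files."
)

# Hypothetical per-provider tuning of the system prompt.
SYSTEM_PROMPTS = {
    "openai": BASE_PROMPT,
    "anthropic": BASE_PROMPT + " Think step by step before answering.",
    "ollama": BASE_PROMPT + " Keep answers short and concrete.",
}

# Instruction nudging every model to cite its sources, so users can
# judge the reliability of a response regardless of the backing model.
CITATION_SUFFIX = (
    " For every claim, cite the JBrowse 2 documentation page or "
    "configuration schema entry it is based on."
)

def build_system_prompt(provider: str, require_citations: bool = True) -> str:
    """Select the provider-specific prompt (the 'switch case' over providers)."""
    try:
        prompt = SYSTEM_PROMPTS[provider]
    except KeyError:
        raise ValueError(f"Unsupported provider: {provider!r}")
    return prompt + CITATION_SUFFIX if require_citations else prompt
```

With this shape, adding Ollama is one new dictionary entry plus whatever client construction the provider needs (e.g. LangChain's `ChatOllama` from the integration linked below), and the citation requirement is applied uniformly across providers.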
https://python.langchain.com/docs/integrations/providers/ollama/
https://github.com/ollama/ollama