Closed: mjtechguy closed this issue 12 months ago
I already have a plan to support local LLMs via the dialoqbase UI. It will be included in future updates :)
Of course you did. :) You have always been a step ahead. Thanks for all your efforts sir!
Released :)
Would it be possible to add a variable to target a different OpenAI API Compatible Endpoint?
Basically, there are numerous ways to host private models that expose OpenAI-compatible API endpoints, so we could chat with private LLMs without relying on third-party services.
Maybe something like adding
OPENAI_API_ENDPOINT="http://localhost:11434"
in the .env file. Thanks!
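For illustration, a minimal sketch of what this could look like in a Node/TypeScript backend, assuming the official openai SDK (v4+) is used and a hypothetical OPENAI_API_ENDPOINT variable is read from the environment; the variable name and fallback values here are just examples, not dialoqbase's actual configuration:

```ts
import OpenAI from "openai";

// Hypothetical: point the client at a local OpenAI-compatible server
// (e.g. Ollama on http://localhost:11434) when OPENAI_API_ENDPOINT is set,
// otherwise fall back to the official OpenAI API.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY ?? "sk-local", // many local servers ignore the key
  baseURL: process.env.OPENAI_API_ENDPOINT ?? "https://api.openai.com/v1",
});

async function main() {
  // The chat call itself is unchanged; only the endpoint differs.
  const completion = await client.chat.completions.create({
    model: "gpt-3.5-turbo", // or whatever model name the local server exposes
    messages: [{ role: "user", content: "Hello from a private LLM!" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

Because the SDK only needs a different base URL, the rest of the chat logic would stay the same regardless of which OpenAI-compatible host is configured.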