briefercloud / briefer

Dashboards and notebooks in a single place. Create powerful and flexible dashboards using code, or build beautiful Notion-like notebooks and share them with your team.
https://briefer.cloud
GNU Affero General Public License v3.0

[feature] Allow Using OpenAI-Compatible LLM API #215

Open d8rt8v opened 1 week ago

d8rt8v commented 1 week ago

Currently, the llm object in ai/api/llms.py is instantiated via LangChain's ChatOpenAI, which is hard-wired to OpenAI's official endpoint.

To make the AI features in Briefer OSS more flexible and let users plug in other OpenAI-compatible LLM APIs (such as hosted or self-hosted models that expose an OpenAI-compatible interface), we could allow specifying a custom base_url when constructing ChatOpenAI.

llm = ChatOpenAI(
    temperature=0,
    verbose=False,
    openai_api_key=openai_api_key,
    model_name=model_id if model_id else config("OPENAI_DEFAULT_MODEL_NAME"),
)

To make the endpoint customizable, accept an optional base_url, map it to a variable in .env, and pass it through to ChatOpenAI so users can point Briefer at their own OpenAI-compatible endpoint. Note that ChatOpenAI takes the base URL directly as base_url (named openai_api_base in older LangChain releases) rather than through a generic other_args dict. This would look something like:

llm = ChatOpenAI(
    temperature=0,
    verbose=False,
    openai_api_key=openai_api_key,
    model_name=model_id if model_id else config("OPENAI_DEFAULT_MODEL_NAME"),
    # only override the endpoint when the user configured one
    **({"base_url": custom_base_url} if custom_base_url else {}),
)