sobelio / llm-chain

`llm-chain` is a powerful Rust crate for building chains in large language models, allowing you to summarise text and complete complex tasks.
https://llm-chain.xyz
MIT License

openai: add support for setting the base url #267

Closed. danbev closed this 5 months ago.

danbev commented 5 months ago

This commit adds support for setting the base URL for the OpenAI client. This is useful for testing and for using the client with a different provider.

The motivation for this came from wanting to try out Perplexity's API using llm-chain-openai. For this I needed to set the base URL to the one provided by Perplexity, but I could not find a way to do that with the current implementation.
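Under the hood, llm-chain-openai builds on the async-openai crate, so the change essentially amounts to passing a custom API base through to that client's configuration. The sketch below is only an illustration of that idea, not the actual patch: the helper name openai_client is made up here, and it assumes async-openai's OpenAIConfig type.

use async_openai::{config::OpenAIConfig, Client};

// Hypothetical helper: build an async-openai client, honoring an optional
// base-url override (for example, a value read from OPENAI_API_BASE_URL).
fn openai_client(api_key: &str, base_url: Option<&str>) -> Client<OpenAIConfig> {
    let mut config = OpenAIConfig::new().with_api_key(api_key);
    if let Some(url) = base_url {
        // Point the client at a different OpenAI-compatible endpoint,
        // such as https://api.perplexity.ai.
        config = config.with_api_base(url);
    }
    Client::with_config(config)
}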

With the change in this commit, an example can be written like this:

use llm_chain::options;
use llm_chain::options::ModelRef;
use llm_chain::{executor, parameters, prompt};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Select the Perplexity model for this run.
    let opts = options!(
         Model: ModelRef::from_model_name("pplx-70b-online")
    );
    // The executor picks up OPENAI_API_KEY and OPENAI_API_BASE_URL from the environment.
    let exec = executor!(chatgpt, opts.clone())?;
    let query = "What is the capital of Sweden?";
    println!("Query: {query}\n");
    let res = prompt!("", query,).run(&parameters!(), &exec).await?;
    println!("Perplixity AI:\n{res}");
    Ok(())
}

And the following environment variables need to be set:

export OPENAI_API_KEY=<Perplexity-API-Key>
export OPENAI_API_BASE_URL=https://api.perplexity.ai
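
With those two variables set, the example above sends its requests to Perplexity's OpenAI-compatible endpoint instead of the default OpenAI one.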