This commit adds support for setting the base URL for the OpenAI client. This is useful for testing and for using the client with a different provider.
The motivation for this came from wanting to try out Perplexity's API using llm-chain-openai. For this I needed to set the base URL to the one provided by Perplexity, but I could not find a way to do that with the current implementation.
With the change in this commit, an example can be written like this:
use llm_chain::options;
use llm_chain::options::ModelRef;
use llm_chain::{executor, parameters, prompt};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let opts = options!(
        Model: ModelRef::from_model_name("pplx-70b-online")
    );
    let exec = executor!(chatgpt, opts.clone())?;
    let query = "What is the capital of Sweden?";
    println!("Query: {query}\n");
    let res = prompt!("", query,).run(&parameters!(), &exec).await?;
    println!("Perplexity AI:\n{res}");
    Ok(())
}
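As a rough illustration of the resolution logic involved, a client with a configurable base URL typically falls back to the provider's default endpoint when no override is supplied. The sketch below is hypothetical (the variable name and helper are illustrative, not necessarily what llm-chain-openai uses):

```rust
use std::env;

// Hypothetical sketch: pick an explicit base-URL override if one was
// provided, otherwise fall back to the default OpenAI endpoint.
// (Names here are illustrative assumptions, not llm-chain-openai's API.)
fn resolve_api_base(overridden: Option<String>) -> String {
    overridden.unwrap_or_else(|| "https://api.openai.com/v1".to_string())
}

fn main() {
    // In practice the override might come from an environment variable.
    let from_env = env::var("OPENAI_API_BASE_URL").ok();
    println!("Using API base: {}", resolve_api_base(from_env));
}
```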
And the following environment variables need to be set: