I would like to propose re-exporting `async_openai` so that the client configuration can be changed from the outside, e.g.:
```rust
use llm_chain_openai::async_openai::{config::OpenAIConfig, Client};

#[tokio::main(flavor = "current_thread")]
async fn main() {
    let cfg: OpenAIConfig = OpenAIConfig::new()
        .with_api_base("https://openrouter.ai/api/v1")
        .with_api_key("sk-xxx");
    let open_router_ai_client = Client::with_config(cfg);
    let model = "text-embedding-ada-002";
    let embeddings =
        llm_chain_openai::embeddings::Embeddings::for_client(open_router_ai_client, model);
}
```
FYI: https://openrouter.ai/ is a drop-in replacement for OpenAI with a free tier that makes it easier to get started.
I'm not sure whether this needs a proper `impl` or whether just re-exporting `async_openai` is enough, but as a quick win I chose to expose it first; we can add a proper `impl` later if needed.
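For reference, the re-export side of the proposal could be as small as a single line in `llm-chain-openai`'s `lib.rs` (a sketch only; the `Embeddings::for_client` constructor in the example above is the part that would still need to be added):

```rust
// llm-chain-openai/src/lib.rs
// Re-export the underlying async_openai crate so downstream users can build
// a custom Client (e.g. with a different api_base) and hand it to this crate.
pub use async_openai;
```

With that in place, downstream code can name `llm_chain_openai::async_openai::Client` without adding `async-openai` as a direct dependency, and version mismatches between the two crates are avoided.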
Thanks