sobelio / llm-chain

`llm-chain` is a powerful Rust crate for building chains of large language model prompts, allowing you to summarize text and complete complex tasks.
https://llm-chain.xyz
MIT License

feat: expose async_openai #252

Closed katopz closed 10 months ago

katopz commented 10 months ago

I would like to propose exposing `async_openai` so that the client configuration can be changed from the outside, e.g.:

use llm_chain_openai::async_openai::{config::OpenAIConfig, Client};

#[tokio::main(flavor = "current_thread")]
async fn main() {
    // Point the underlying async_openai client at a custom,
    // OpenAI-compatible endpoint instead of api.openai.com.
    let cfg: OpenAIConfig = OpenAIConfig::new()
        .with_api_base("https://openrouter.ai/api/v1")
        .with_api_key("sk-xxx");
    let open_router_ai_client = Client::with_config(cfg);
    let model = "text-embedding-ada-002";

    // Build llm-chain embeddings on top of the custom client.
    let _embeddings =
        llm_chain_openai::embeddings::Embeddings::for_client(open_router_ai_client, model);
}

FYI: https://openrouter.ai/ is a drop-in replacement for OpenAI with a free tier that makes it easier to get started. I'm not sure whether this needs a proper implementation or whether just exposing `async_openai` is enough, but as a quick win I chose to expose it first; a proper implementation can follow later if needed.
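For reference, the crate-side change this proposal implies is small. A minimal sketch of what `llm-chain-openai`'s `lib.rs` might add (the exact module layout here is an assumption, not the crate's actual source):

```rust
// lib.rs of llm-chain-openai (hypothetical sketch).
// Re-export the async_openai dependency so downstream users can
// construct and configure a Client themselves, as in the example above.
pub use async_openai;
```

With this one-line re-export, downstream code can reach `llm_chain_openai::async_openai::{config::OpenAIConfig, Client}` without adding `async_openai` as a direct dependency, and without the two crates' versions drifting apart.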

Thanks