sobelio / llm-chain

`llm-chain` is a powerful Rust crate for building chains in large language models, allowing you to summarise text and complete complex tasks.
https://llm-chain.xyz
MIT License

Executor returned by llm_chain_openai::chatgpt::Executor::for_client() does not implement llm_chain::traits::Executor ([E0277]) #118

Closed by AnthonyMichaelTDM 1 year ago

AnthonyMichaelTDM commented 1 year ago

So, in my use case, I need an OpenAI executor that uses an organization ID read from an environment variable. Looking at the source code for the llm_chain_openai crate (as the documentation is kinda lacking), it seems that llm_chain_openai::chatgpt::Executor::for_client() would let me do that, by letting me specify the async_openai client to use.

However, while this works

use async_openai::Client;
use lazy_static::lazy_static;
use llm_chain_openai::chatgpt::{Executor, Model, PerInvocation};

lazy_static! {
    pub static ref OPENAI_EXECUTOR: Executor = Executor::for_client(
        {
            let org_id = std::env::var("OPENAI_ORG_ID").unwrap_or_else(|_| "".to_string());
            if org_id.is_empty() {
                Client::new()
            } else {
                Client::new().with_org_id(org_id)
            }
        },
        Some(PerInvocation::new().for_model(Model::ChatGPT3_5Turbo)),
    );
}

it doesn't give me an Executor that implements the llm_chain::traits::Executor trait, which means I can't actually use it for anything useful.

Is this intended behavior? If so, why? And if not, what can I/we do about it?

here is the specific error:

rustc: the trait bound `OPENAI_EXECUTOR: llm_chain::traits::Executor` is not satisfied
the trait `llm_chain::traits::Executor` is implemented for `llm_chain_openai::chatgpt::Executor` [E0277]
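As an aside, note that the error names the static's own type (`OPENAI_EXECUTOR`), not the crate's `Executor`: lazy_static generates a hidden wrapper struct for each static that only Derefs to the inner value, and the wrapper itself doesn't implement the inner type's traits. A minimal, self-contained sketch of that pattern (toy stand-in types, not the real llm-chain API):

```rust
use std::ops::Deref;

// Toy trait standing in for llm_chain::traits::Executor.
trait Executor {
    fn execute(&self) -> &'static str;
}

// Toy type standing in for llm_chain_openai::chatgpt::Executor.
struct ChatGptExecutor;

impl Executor for ChatGptExecutor {
    fn execute(&self) -> &'static str {
        "ran"
    }
}

// lazy_static generates a hidden wrapper struct like this for the
// static; it only Derefs to the inner type and does not itself
// implement the inner type's traits.
struct LazyWrapper(ChatGptExecutor);

impl Deref for LazyWrapper {
    type Target = ChatGptExecutor;
    fn deref(&self) -> &ChatGptExecutor {
        &self.0
    }
}

fn run<E: Executor>(executor: &E) -> &'static str {
    executor.execute()
}

fn main() {
    let wrapper = LazyWrapper(ChatGptExecutor);
    // run(&wrapper) would fail with E0277: `LazyWrapper: Executor` is not satisfied.
    // Explicitly dereferencing reaches the inner type, which does implement it:
    assert_eq!(run(&*wrapper), "ran");
    println!("ok");
}
```

So in a bound like `E: Executor`, the static may need an explicit `&*OPENAI_EXECUTOR` to reach the inner executor; whether that is the whole story here depends on the call site.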

AnthonyMichaelTDM commented 1 year ago

in terms of potential solutions, I think the easiest is to either:

I could probably make a PR for any one of those solutions; the real question is which one, if any, is most in line with the goals of this project.

AnthonyMichaelTDM commented 1 year ago

A more in-depth review of the options:

// builder-style setter: replace the executor's client with a caller-supplied one
pub fn with_client(mut self, client: async_openai::Client) -> Self {
    self.client = Arc::new(client);
    self
}
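To show the shape of that builder-style option with a self-contained sketch (stand-in `Client`/`Executor` types, not the real crate API):

```rust
use std::sync::Arc;

// Stand-ins for async_openai::Client and the chatgpt Executor, just
// to demonstrate the proposed builder method's ergonomics.
#[derive(Debug)]
struct Client {
    org_id: String,
}

struct Executor {
    client: Arc<Client>,
}

impl Executor {
    fn new() -> Self {
        Executor {
            client: Arc::new(Client { org_id: String::new() }),
        }
    }

    // Proposed method: consume self and swap in a caller-supplied client.
    fn with_client(mut self, client: Client) -> Self {
        self.client = Arc::new(client);
        self
    }
}

fn main() {
    let exec = Executor::new().with_client(Client { org_id: "org-123".into() });
    assert_eq!(exec.client.org_id, "org-123");
    println!("ok");
}
```

The upside of this shape is that it composes with any existing constructor instead of replacing one.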

Summary:

Realistically, unless you're planning a major release soon, option 1 is too destructive to existing functionality, so that leaves options 2 and 3.

If the for_client function is currently operating as intended, then I should do option 3; otherwise option 2, because this is a bug.

So, final question: is the for_client function currently operating as intended?

AnthonyMichaelTDM commented 1 year ago

Option 4 (probably the best; I'll make a PR with this tomorrow if I don't get any feedback):

change the for_client function to something like this:

    pub fn for_client(
        client: async_openai::Client,
        per_invocation_options: Option<PerInvocation>,
    ) -> Self {
        use llm_chain::traits::Executor as _;
        // build with default options, then swap in the caller's client
        let mut exec = Self::new_with_options(None, per_invocation_options).unwrap();
        exec.client = Arc::new(client);
        exec
    }

This maintains the documented behavior; unless I'm missing something, it's probably the best option.
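The end-to-end flow of that fix can be sketched with self-contained stand-in types (not the real async_openai/llm-chain APIs): construct the executor with default options, then replace its client with the caller-supplied one, such as a client carrying an org ID from the environment.

```rust
use std::sync::Arc;

// Stand-in for async_openai::Client with an optional org id.
struct Client {
    org_id: Option<String>,
}

impl Client {
    fn new() -> Self {
        Client { org_id: None }
    }
    fn with_org_id(mut self, id: String) -> Self {
        self.org_id = Some(id);
        self
    }
}

struct Executor {
    client: Arc<Client>,
}

impl Executor {
    fn new_with_options() -> Result<Self, String> {
        Ok(Executor {
            client: Arc::new(Client::new()),
        })
    }

    // Revised for_client: build with defaults, then swap in the client.
    fn for_client(client: Client) -> Self {
        let mut exec = Self::new_with_options().unwrap();
        exec.client = Arc::new(client);
        exec
    }
}

fn main() {
    // Mirrors the org-id handling from the original report.
    let org_id = std::env::var("OPENAI_ORG_ID").unwrap_or_default();
    let client = if org_id.is_empty() {
        Client::new()
    } else {
        Client::new().with_org_id(org_id)
    };
    let exec = Executor::for_client(client);
    // The executor ends up holding exactly the client we configured.
    assert!(exec.client.org_id.is_none() || !exec.client.org_id.as_ref().unwrap().is_empty());
    println!("ok");
}
```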