Abraxas-365 / langchain-rust

🦜️🔗LangChain for Rust, the easiest way to write LLM-based programs in Rust

Add the ability to add a prompt to ConversationalRetrieverChainBuilder #132

Closed: rxdiscovery closed this issue 4 months ago

rxdiscovery commented 4 months ago

Hello! First of all, I would like to thank you for the enormous effort you put in for the Rust community.

Concerning my request: I wanted to create a chain with ConversationalRetrieverChainBuilder, but before starting the conversation with the LLM, I wanted to add an initialization prompt.

But apparently that method ( .prompt() ) only exists on LLMChainBuilder.

I tried adding the prompt during chain.invoke(), with:

let input_variables = prompt_args! {  ... };

but it doesn't work; the LLM doesn't take it into account, as if I hadn't passed it any parameters.
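
For illustration, a hypothetical version of that attempt might look like the sketch below. The "system_prompt" key is purely illustrative; because the chain's built-in prompt template has no matching placeholder, the value never reaches the LLM:

// Hypothetical attempt: pass the "system" text as an extra prompt variable.
// The built-in prompt template has no placeholder for it, so it is silently ignored.
let input_variables = prompt_args! {
    "question" => "Hi",
    "system_prompt" => "You are a helpful assistant",
};
let result = chain.invoke(input_variables).await;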

Thank you in advance for your help

Abraxas-365 commented 4 months ago

Hi @rxdiscovery, right now the only way to do this is by building the ConversationalRetrieverChain from custom chains. This weekend I will add a way to pass the prompt in the builder, so you will be able to supply your own prompts. Sorry for the inconvenience.
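
Until then, the custom-prompt half of that workaround is the existing LLMChainBuilder .prompt() path mentioned above. Below is a minimal sketch, assuming the import paths from the repository's README examples (wiring the resulting LLMChain into ConversationalRetrieverChain is omitted; that is the part the upcoming builder change removes):

use langchain_rust::{
    chain::{Chain, LLMChainBuilder},
    fmt_message, fmt_template,
    llm::openai::{OpenAI, OpenAIModel},
    message_formatter,
    prompt::HumanMessagePromptTemplate,
    prompt_args,
    schemas::messages::Message,
    template_jinja2,
};

#[tokio::main]
async fn main() {
    let llm = OpenAI::default().with_model(OpenAIModel::Gpt35.to_string());

    // The "initialization" prompt lives in the LLMChain itself.
    let prompt = message_formatter![
        fmt_message!(Message::new_system_message("You are a helpful assistant")),
        fmt_template!(HumanMessagePromptTemplate::new(template_jinja2!(
            "Question: {{question}}\nHelpful Answer:",
            "question"
        ))),
    ];

    let chain = LLMChainBuilder::new()
        .prompt(prompt)
        .llm(llm)
        .build()
        .expect("Error building LLMChain");

    let result = chain.invoke(prompt_args! { "question" => "Hi" }).await;
    if let Ok(result) = result {
        println!("Result: {:?}", result);
    }
}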

rxdiscovery commented 4 months ago

@Abraxas-365

Thank you for your response and your responsiveness. There is no inconvenience; everything you do is on your own free time. Thank you very much for everything. :1st_place_medal:

Abraxas-365 commented 4 months ago

Hi @rxdiscovery, I just made the PR for this issue: #134

// Imports assume the crate layout used in the repository's examples
// (module paths may differ slightly across versions).
use futures_util::StreamExt; // for `stream.next()`
use langchain_rust::{
    chain::{Chain, ConversationalRetrieverChainBuilder},
    fmt_message, fmt_template,
    llm::openai::{OpenAI, OpenAIModel},
    memory::SimpleMemory,
    message_formatter,
    prompt::HumanMessagePromptTemplate,
    prompt_args,
    schemas::messages::Message,
    template_jinja2,
};

#[tokio::main]
async fn main() {
    let llm = OpenAI::default().with_model(OpenAIModel::Gpt35.to_string());

    let prompt = message_formatter![
        fmt_message!(Message::new_system_message("You are a helpful assistant")),
        fmt_template!(HumanMessagePromptTemplate::new(template_jinja2!(
            "Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.

{{context}}

Question: {{question}}
Helpful Answer:
",
            "context",
            "question"
        ))),
    ];

    let chain = ConversationalRetrieverChainBuilder::new()
        .llm(llm)
        .rephrase_question(true)
        // RetrieverMock is a stand-in type implementing the Retriever trait.
        .retriever(RetrieverMock {})
        .memory(SimpleMemory::new().into())
        // If you want to use the default prompt, remove the .prompt() call.
        // Keep in mind that if you change the prompt, this chain needs the {{context}} variable.
        .prompt(prompt)
        .build()
        .expect("Error building ConversationalChain");

    let input_variables = prompt_args! {
        "question" => "Hi",
    };

    let result = chain.invoke(input_variables).await;
    if let Ok(result) = result {
        println!("Result: {:?}", result);
    }

    let input_variables = prompt_args! {
        "question" => "Which is luis Favorite Food",
    };

    // If you want to stream the answer instead:
    let mut stream = chain.stream(input_variables).await.unwrap();
    while let Some(result) = stream.next().await {
        match result {
            Ok(data) => data.to_stdout().unwrap(),
            Err(e) => {
                println!("Error: {:?}", e);
            }
        }
    }
}
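
RetrieverMock in the snippet above is just a stand-in. A minimal sketch of such a mock is below, assuming the Retriever trait exposes an async get_relevant_documents method returning a Vec<Document>; the exact trait path, method signature, and error type are assumptions and may differ between crate versions:

use std::error::Error;

use async_trait::async_trait;
use langchain_rust::schemas::{Document, Retriever};

struct RetrieverMock {}

#[async_trait]
impl Retriever for RetrieverMock {
    // Assumed signature; check the crate's Retriever trait for the exact one.
    async fn get_relevant_documents(
        &self,
        _question: &str,
    ) -> Result<Vec<Document>, Box<dyn Error>> {
        // Return a canned document as the "retrieved" context.
        Ok(vec![Document::new(
            "Example context document; replace with real retrieved content.",
        )])
    }
}
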
Abraxas-365 commented 4 months ago

I will release the new version later today or tomorrow; I will try to complete some more issues first.

rxdiscovery commented 4 months ago

Thank you so much @Abraxas-365, I'm going to test it now :+1:

Abraxas-365 commented 4 months ago

@rxdiscovery let me know if you have any issues. It's not released yet, but you can test it by pulling the crate straight from GitHub (see the snippet below).
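
For example, a Cargo.toml git dependency pointing at this repository (pin a branch, rev, or tag as needed):

[dependencies]
# Pull the unreleased code straight from GitHub instead of crates.io.
langchain-rust = { git = "https://github.com/Abraxas-365/langchain-rust" }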

rxdiscovery commented 4 months ago

@Abraxas-365 it works wonderfully :+1: