pepperoni21 / ollama-rs

A Rust library for interacting with the Ollama API.

feat: Store chat history #32

Closed: ushinnary closed this 6 months ago

ushinnary commented 6 months ago

This adds a feature to store chat history on the library's side. The user sends only one message, and the library pre-fills the request with the full history from the stored messages.

Example usage (Discord bot):

lazy_static! {
    // Shared client that keeps up to 30 stored messages per conversation history
    static ref OLLAMA: Mutex<Ollama> = Mutex::new(Ollama::new_default_with_history(30));
}

// ...
let mut ollama = OLLAMA.lock().await;

ollama
    .send_chat_messages_with_history(
        ChatMessageRequest::new(
            env::var("OLLAMA_MODEL").unwrap_or("llama2".to_string()),
            vec![ChatMessage::user(message)],
        ),
        user_id, // or any id you provide as the history_id
    )
    .await

I hope you find this feature useful. Open to improvements, thanks.
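
For a self-contained picture of the same flow outside the bot, here is a minimal sketch of two turns against one history id. This is an illustration only: the module paths, the use of a String history id, the model name, and the simplified error handling (unwrap) are my assumptions, and it presupposes a tokio runtime with the crate's chat-history feature enabled.

use ollama_rs::{
    generation::chat::{request::ChatMessageRequest, ChatMessage},
    Ollama,
};

#[tokio::main]
async fn main() {
    // Client that stores up to 30 chat messages per history id
    let mut ollama = Ollama::new_default_with_history(30);
    let history_id = "user-1".to_string();

    // First turn: only the user's message is sent explicitly
    ollama
        .send_chat_messages_with_history(
            ChatMessageRequest::new(
                "llama2".to_string(),
                vec![ChatMessage::user("Hello, my name is Alice.".to_string())],
            ),
            history_id.clone(),
        )
        .await
        .unwrap();

    // Second turn: the library pre-fills the request with the stored
    // history for this id, so the model can refer back to the first turn
    ollama
        .send_chat_messages_with_history(
            ChatMessageRequest::new(
                "llama2".to_string(),
                vec![ChatMessage::user("What is my name?".to_string())],
            ),
            history_id,
        )
        .await
        .unwrap();
}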

pepperoni21 commented 6 months ago

Hey, that's an interesting feature. I will review the code in detail later in the day. Some of the imports are unused when the feature is not enabled; can you make them optional so that the check succeeds?
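
For reference, the usual way to handle this is to put the same cfg attribute on the imports as on the code that uses them. A minimal sketch, assuming the Cargo feature is named chat-history (the actual feature name and the helper below are illustrative, not the crate's code):

// Only pull in history-related items when the feature is enabled,
// so builds without the feature don't warn about unused imports.
#[cfg(feature = "chat-history")]
use std::collections::HashMap;

#[cfg(feature = "chat-history")]
fn store_message(history: &mut HashMap<String, Vec<String>>, id: String, message: String) {
    history.entry(id).or_default().push(message);
}

With that, both `cargo check` and `cargo check --features chat-history` pass without unused-import warnings.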

ushinnary commented 6 months ago

Yep, once I'm home I'll fix it.

ushinnary commented 6 months ago

I've just pushed a commit that removes an unused import I previously overlooked. My apologies for any inconvenience this might have caused.

pepperoni21 commented 6 months ago

Great, can you just provide an example script inside the examples folder?

ushinnary commented 6 months ago

@pepperoni21 Yep, here it is. Tell me if I need to provide anything else. You can also contact me on Discord; I have the same username there :)