pepperoni21 / ollama-rs

A Rust library allowing to interact with the Ollama API.
MIT License
367 stars · 47 forks

Error if running in daemonized application on macOS #35

Closed fameowner99 closed 3 months ago

fameowner99 commented 3 months ago

Hi! I use the daemonize crate to spawn another process, and when I try to init Ollama I get an error:

`Err("error sending request for url (http://127.0.0.1:11434/api/generate): error trying to connect: tcp connect error: Bad file descriptor (os error 9)")`

Here is a code example:

```rust
let stdout = File::create(app_data_path.join("llama-daemon.out")).unwrap();
let stderr = File::create(app_data_path.join("llama-daemon.err")).unwrap();

let daemonize = Daemonize::new()
    .pid_file(app_data_path.join("llama-daemon.pid"))
    .stdout(stdout)  // Redirect stdout to `llama-daemon.out`.
    .stderr(stderr); // Redirect stderr to `llama-daemon.err`.

match daemonize.start() {
    Ok(_) => {
        // By default it will connect to localhost:11434
        let ollama = routes::Ollama::default();

        let model = "llama2:latest".to_string();
        let prompt = "Why is the sky blue?".to_string();

        println!("My daemon pid is {}", process::id());

        let res = ollama.generate(GenerationRequest::new(model, prompt)).await;

        println!("{:?}", res);
        if let Ok(res) = res {
            println!("{}", res.response);
        } else {
            println!("ERROR");
        }
    }
    Err(e) => {
        println!("Encountered an error: {}", e);
        Err(e).unwrap()
    }
}
```

Without daemonize everything works. Looking for a solution to fix it.

fameowner99 commented 3 months ago

Found the solution: the Tokio runtime (`#[tokio::main]`) must not be started before daemonizing.
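In other words, fork first, then create the runtime. A minimal sketch of that ordering, assuming the `daemonize` and `tokio` crates (the pid-file path and the async body are illustrative, not from the original): replace `#[tokio::main]` with a plain `main` that builds the runtime explicitly *after* `daemonize.start()` succeeds, so the runtime's internal file descriptors (epoll/kqueue, connection sockets) are created inside the daemonized process rather than being invalidated by the fork.

```rust
use daemonize::Daemonize;

// Note: no #[tokio::main] here -- the runtime is built after daemonizing.
fn main() {
    let daemonize = Daemonize::new().pid_file("/tmp/llama-daemon.pid");

    match daemonize.start() {
        Ok(_) => {
            // Only now, inside the daemonized process, create the runtime.
            let rt = tokio::runtime::Runtime::new().expect("failed to build Tokio runtime");
            rt.block_on(async {
                // Async work (e.g. the ollama.generate(...) call) goes here.
            });
        }
        Err(e) => eprintln!("Error daemonizing: {}", e),
    }
}
```

The same principle applies to any resource that wraps a file descriptor: create it after the fork, or it will not survive into the daemon.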