Open alescdb opened 9 months ago
First, thanks for your work :)

I'm trying to silence the llama.cpp output and keep only the answer. I've closed stderr temporarily while loading the model (this is not a nice approach, but it works):

```rust
unsafe { libc::close(libc::STDERR_FILENO); }
let llama = LLama::new(model, &options);
unsafe {
    let wr = "w".as_ptr() as *const c_char;
    let fd = libc::fdopen(libc::STDERR_FILENO, wr);
    libc::dup2(fd as i32, libc::STDERR_FILENO);
}
```

But when I call `predict` I still get an unwanted output `count 0`. Maybe you can change it to `log::debug!("count {}", reverse_count);`?
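One caveat with the snippet above: `fdopen` returns a `FILE*`, so casting that pointer to an `i32` and passing it to `dup2` will not restore the original stream. A minimal sketch of a save-and-restore variant, assuming a Unix-like target (`with_stderr_silenced` is a hypothetical helper, not part of the library; the model-loading call is only indicated in a comment, and the C functions are declared directly here instead of going through the `libc` crate so the example is self-contained):

```rust
use std::ffi::CString;
use std::os::raw::{c_char, c_int};

// In the issue's code these come from the `libc` crate; declared
// directly here so the sketch compiles on its own.
extern "C" {
    fn dup(fd: c_int) -> c_int;
    fn dup2(oldfd: c_int, newfd: c_int) -> c_int;
    fn open(path: *const c_char, oflag: c_int, ...) -> c_int;
    fn close(fd: c_int) -> c_int;
}

const STDERR_FILENO: c_int = 2;
const O_WRONLY: c_int = 1;

/// Run `f` with stderr redirected to /dev/null, then restore stderr.
fn with_stderr_silenced<T>(f: impl FnOnce() -> T) -> T {
    unsafe {
        // Keep a duplicate of the real stderr so it can be restored.
        let saved = dup(STDERR_FILENO);
        let devnull = CString::new("/dev/null").unwrap();
        let null_fd = open(devnull.as_ptr(), O_WRONLY);
        dup2(null_fd, STDERR_FILENO); // stderr now points at /dev/null
        close(null_fd);
        let result = f(); // e.g. LLama::new(model, &options)
        dup2(saved, STDERR_FILENO); // put the real stderr back
        close(saved);
        result
    }
}

fn main() {
    let answer = with_stderr_silenced(|| {
        eprintln!("noisy llama.cpp loading output"); // swallowed by /dev/null
        42
    });
    assert_eq!(answer, 42);
    eprintln!("stderr works again: {}", answer); // visible once restored
}
```

The advantage over close-then-reopen is that the original stderr (which may be a pipe or a redirected file, not necessarily a terminal) comes back exactly as it was.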
I will add logging instead of `println!`, that will be better, thanks!