pythops / tenere

🤖 TUI interface for LLMs written in Rust
https://crates.io/crates/tenere
GNU General Public License v3.0

Fails to build from source #28

Closed: grouzen closed this 1 week ago

grouzen commented 1 week ago

I'm trying to build it from source but cargo build --release fails with the error:

  --> /var/tmp/portage/dev-rust/tenere-0.11.1/work/cargo_home/gentoo/time-0.3.34/src/format_description/parse/mod.rs:83:9
   |
83 |     let items = format_items
   |         ^^^^^
...
86 |     Ok(items.into())
   |              ---- type must be known at this point
   |
help: consider giving `items` an explicit type, where the placeholders `_` are specified
   |
83 |     let items: Box<_> = format_items

Looking at the dependency tree (cargo tree), it is clear that the culprit is bat version 0.24. The problem is that no newer version of bat exists.
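
For context, this looks like the widely reported inference breakage in time 0.3.34 under Rust 1.80, which added new FromIterator impls for Box<str> and made the unannotated collect() ambiguous. A simplified sketch of the pattern and the annotation fix (illustrative only, not the actual time source):

// Before Rust 1.80, only `String` implemented `FromIterator<char>` among
// the types convertible into `Box<str>`, so the compiler could infer the
// binding through the `.into()` call below. Rust 1.80 added
// `impl FromIterator<char> for Box<str>`, giving two candidates and
// triggering E0282 on an unannotated `collect()`.
fn consume(s: Box<str>) {
    println!("{s}");
}

fn main() {
    // The fix matches the compiler's suggestion in the error above:
    // give the binding an explicit type.
    let items: Box<str> = "abc".chars().collect();
    consume(items.into()); // identity conversion, Box<str> -> Box<str>
}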

pythops commented 1 week ago

Yeah, something strange happened! I had the same issue but didn't dig into it very deeply. I'll look into it tomorrow.

grouzen commented 1 week ago

Great! Thanks for the lightning-fast answer!

pythops commented 1 week ago

That's the issue: https://github.com/uttarayan21/ansi-to-tui/pull/49. Once it's merged, I can update the dependencies and the build will work again.
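
In the meantime, a possible local workaround (untested here, and assuming the dependents' version requirements allow a newer 0.3.x of time) is to bump the transitive time dependency in the lockfile, since the inference failure was fixed in later time releases:

cargo update -p time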

pythops commented 1 week ago

Fixed

grouzen commented 1 week ago

Thank you! Waiting for the fix to be released

pythops commented 1 week ago

You can use the binaries on the releases page. This fix does not bring any new functionality, it just fixes building from source, so I am not planning to do a release for it.

grouzen commented 1 week ago

@pythops I'm writing an ebuild for Gentoo, and for that I need to fetch the source code. But I understand your point. I don't mind waiting until the next release is cut. Thanks for the fix!

pythops commented 1 week ago

Ah okay, in that case give me till tomorrow. I will do some refactoring and then I can release.

pythops commented 1 week ago

I have just released a new version. Let me know if it works for you

grouzen commented 1 week ago

Thank you so much! It compiles now. I have another issue: I need help getting it to work with Ollama. I've added the Ollama configuration that I copied from the README, but when I run tenere without arguments, it asks for the OpenAI key anyway. I'm wondering what I'm missing here.

$ tenere
Can not find the openai api key
You need to define one wether in the configuration file or as an environment variable
$ cat ~/.config/tenere/config.toml 
[ollama]
url = "http://localhost:11434/api/chat"
model = "llama3.1:8b"

pythops commented 1 week ago

You need to specify that you want to use ollama, and for that you need to add the llm key to the config:

llm = "ollama"

[ollama]
url = "http://localhost:11434/api/chat"
model = "llama3.1:8b"
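
If it still complains after that, a quick sanity check (generic, assuming a default local Ollama install) is to confirm the server is answering on the configured URL:

curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1:8b",
  "messages": [{"role": "user", "content": "hello"}]
}'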

grouzen commented 1 week ago

Thank you! It works like a charm!

pythops commented 1 week ago

Nice, glad that it works for you :)