Closed grouzen closed 1 week ago
Yeah, something strange happened! I had the same issue but didn't dig into it very deeply. I'll look into it tomorrow.
Great! Thanks for the lightning-fast answer!
That's the issue: https://github.com/uttarayan21/ansi-to-tui/pull/49. Once it's merged, I can update the dependencies and the build will work again.
Fixed
Thank you! Waiting for the fix to be released
You can use the binaries on the release page. This fix does not bring any new functionality, it just fixes building from source, so I am not planning to do a release for it.
@pythops I'm writing an ebuild for Gentoo, and for that I need to fetch the source code. But I understand your point. I don't mind waiting until the next release is cut. Thanks for the fix!
Ah okay, in that case give me till tomorrow; I will do some refactoring and then I can release.
I have just released a new version. Let me know if it works for you
Thank you so much! It compiles now.
I have another issue - I need help getting it to work with Ollama. I've added the Ollama configuration that I copied from the README, but when I run tenere without arguments, it still asks for the OpenAI API key. I'm wondering what I'm missing here.
```
$ tenere
Can not find the openai api key
You need to define one wether in the configuration file or as an environment variable
```

```
$ cat ~/.config/tenere/config.toml
[ollama]
url = "http://localhost:11434/api/chat"
model = "llama3.1:8b"
```
You need to specify that you want to use ollama, and for that you need to add the `llm` key to the config:

```toml
llm = "ollama"

[ollama]
url = "http://localhost:11434/api/chat"
model = "llama3.1:8b"
```
Thank you! It works like a charm!
Nice, glad that it works for you :)
I'm trying to build it from source, but `cargo build --release` fails with an error. Looking at the dependency tree (`cargo tree`), it is clear that the culprit is `bat` version `0.24`. The problem is that no newer version of `bat` exists.