smjure opened this issue 5 months ago
When you say it doesn't work, what happens?
Sorry. Here it goes:
To get this working using `llama3.1` in Ollama with a custom `api_base`, you can use the `openai` method, since the Ollama server has an OpenAI-compatible API:
default_model = "openai/llama3.1"
[[models]]
name = "openai/llama3.1"
api_base = "http://192.168.1.145:11434/v1"
api_key = "test_or_anything_should_be_fine"
Change the `api_base` to wherever your Ollama server is and make sure it ends in `/v1`. The `api_key` can't be empty but can be anything.
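If it's unclear whether your server is exposing that API, a quick check from Python should confirm it before pointing elia at it. This is just a sketch: it assumes the `openai` Python package is installed and reuses the host/port and model from the config above.

```python
# Sketch: verify the Ollama server answers OpenAI-style chat requests.
# Host/port and model name are the ones from the config above; adjust as needed.
from openai import OpenAI

client = OpenAI(
    base_url="http://192.168.1.145:11434/v1",   # must end in /v1
    api_key="test_or_anything_should_be_fine",  # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```

If this prints a reply, the same `api_base` and `api_key` values should work in elia's config.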
Using the documented ollama method didn't work for me with a custom api_base
Hey there! Thanks for the awesome app!!
I have a problem with defining `api_base`: I have the `llama3` model working well on a different port, e.g. 12345 instead of the default 11434, so I modified/defined the `~/.config/elia/config.toml` file as follows, i.e. I simply added:
api_base = "http://localhost:12345"
and with it, elia does not work when the model is selected (`ctrl+o`). If I remove this line, it works fine with my defined `llama3` model. Could this be fixed to work with a custom `api_base`? FYI, I installed/cloned the latest `elia` version.
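As a side note, a quick way to confirm the server is actually reachable on the custom port is to list the local models over Ollama's REST API. A minimal sketch using only the standard library, assuming port 12345 as above:

```python
# Sketch: confirm an Ollama server is listening on the custom port by
# listing locally available models via GET /api/tags.
import json
import urllib.request

BASE = "http://localhost:12345"  # custom port, instead of the default 11434

with urllib.request.urlopen(f"{BASE}/api/tags") as resp:
    data = json.load(resp)

print([m["name"] for m in data.get("models", [])])
```

If that fails, the problem is the server or port rather than elia's `api_base` handling.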