darrenburns / elia

A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
Apache License 2.0

Can't run local model. #65

Closed: Ji-Shao closed this issue 4 months ago

Ji-Shao commented 6 months ago

According to the README, to run a local model: "The location of the configuration file is noted at the bottom of the options window (ctrl+o)."

"ctrl+o" didn't work on Linux. How can I place the config file?

Another issue: if you don't export OPENAI_API_KEY, Elia won't start at all, even if you only want to use a local model.

$ elia
Traceback (most recent call last):
  File "/home/terx/.local/bin/elia", line 5, in <module>
    from elia_chat.main import cli
  File "/home/terx/.local/share/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/main.py", line 10, in <module>
    from elia_chat.app import Elia
  File "/home/terx/.local/share/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/app.py", line 8, in <module>
    from elia_chat.models import EliaContext
  File "/home/terx/.local/share/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/models.py", line 9, in <module>
    from elia_chat.widgets.chat_options import GPTModel, MODEL_MAPPING, DEFAULT_MODEL
  File "/home/terx/.local/share/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/widgets/chat_options.py", line 44, in <module>
    model=ChatOpenAI(
  File "/home/terx/.local/share/pipx/venvs/elia-chat/lib/python3.10/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
__root__
  Did not find openai_api_key, please add an environment variable OPENAI_API_KEY which contains it, or pass openai_api_key as a named parameter. (type=value_error)
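Presumably this version would at least start if the variable is set to any placeholder value, since the error only complains that the key is missing, not that it is invalid (untested on my side, and a dummy key is obviously no use for real OpenAI calls):

# Untested workaround guess: satisfy the presence check with a dummy value
# so the ChatOpenAI constructor stops failing at import time.
# Actual OpenAI requests would still fail with this placeholder key.
export OPENAI_API_KEY="sk-placeholder"
elia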

darrenburns commented 6 months ago

You're using an old version of Elia - hopefully this helps: https://github.com/darrenburns/elia/issues/56#issuecomment-2132438352