In `mods --settings`, you can edit the values to point to a locally hosted OpenAI-compatible API like the one provided by LM Studio or llamafile.
Example:

```yaml
apis:
  openai:
    base-url: http://localhost:1234/v1
    api-key: abc123
```
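For what it's worth, here's a minimal sketch of how that plays out with LM Studio (the prompt is just an example; 1234 is LM Studio's default local-server port):

```sh
# Start LM Studio's local server (it listens on http://localhost:1234 by default),
# then run mods as usual; with the base-url above, requests that would normally
# go to api.openai.com are sent to the local server instead:
mods "summarize this project in two sentences"
```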
I'm not sure yet, but it might be possible to add llamafile or LM Studio as additional `apis` entries in the settings.
I can confirm this works now. Running a llamafile works with a few settings tweaks:
```yaml
llamafile:
  base-url: http://localhost:8383/
  models:
    mixtral:
      aliases: ["llamafile", "mixtral"]
      max-input-chars: 98000
      fallback:
```
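To tie it together, a rough sketch of the workflow under those settings. The llamafile filename here is hypothetical, `--port` is passed through to the embedded llama.cpp server (exact flags can vary by llamafile version), and I'm assuming mods' `--model` flag resolves the alias from the config:

```sh
# Serve the model locally on the port the config expects
# (filename is an example; --port goes to the embedded llama.cpp server):
./mixtral-8x7b-instruct.llamafile --port 8383

# In another shell, point mods at it via one of the configured aliases:
mods --model llamafile "write a commit message for these changes"
```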
This is the worst kind of issue: a feature request based on a newly released related tool. I'm sorry. 🫣
With the recent release of llamafile, it'd be really cool to be able to run this completely locally without sending data up to OpenAI.
Edit: This seems similar in spirit to #162. 👍