charmbracelet / mods

AI on the command line

Support Local Llamafile binary/server? #168

Closed · anoldguy closed this 9 months ago

anoldguy commented 1 year ago

This is the worst kind of issue: a feature request based on a newly released related tool. I'm sorry. 🫣

With the recent release of llamafile, it'd be really cool to be able to run this completely locally without sending data up to OpenAI.

Edit: This seems similar in spirit to #162. 👍

garyblankenship commented 10 months ago

In mods --settings, you can edit the values to point to a locally hosted OpenAI-compatible API like the one provided by LM Studio or llamafile.

example:

apis:
  openai:
    base-url: http://localhost:1234/v1
    api-key: abc123

I'm not sure yet, but it might be possible to add llamafile, LM Studio, etc. as additional APIs in the settings.
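
A separate entry might look something like this (just a sketch; the key name, port, and model name are illustrative and need to match whatever your local server actually exposes):

apis:
  llamafile:
    base-url: http://localhost:8080/v1  # point at wherever your local server is listening
    models:
      mixtral:
        aliases: ["llamafile"]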

anoldguy commented 9 months ago

I can confirm this works now. Running a llamafile works with a few settings tweaks:

apis:
  llamafile:
    base-url: http://localhost:8383/
    models:
      mixtral:
        aliases: ["llamafile", "mixtral"]
        max-input-chars: 98000
        fallback:
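
For anyone else trying this: with that entry in place, you should be able to pick the model by its alias, e.g. `mods --api llamafile -m mixtral "say hi"` (flags per mods --help), and requests go to the local llamafile server instead of OpenAI.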