danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
https://danielmiessler.com/p/fabric-origin-story
MIT License

[Bug]: Local Models not loading from Windows to WSL #489

Closed johnaiii closed 2 months ago

johnaiii commented 4 months ago

What happened?

I was installing fabric, and when I ran fabric --listmodels the local models came back empty, even though Ollama is running on Windows 11:

:~/fabric$ fabric --listmodels
GPT Models:

Local Models:

Claude Models:

Google Models:

Version check

Relevant log output

No response

Relevant screenshots (optional)

No response
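
A first thing to verify is whether the Windows-side Ollama is reachable from WSL2 at all. A minimal check, assuming Ollama's default port 11434 and default WSL2 NAT networking (where the Windows host is WSL2's default gateway):

# find the Windows host IP as seen from inside WSL2
WINDOWS_HOST=$(ip route show default | awk '{print $3}')
# /api/tags is Ollama's model-listing endpoint; a refused connection
# usually means Ollama is bound to 127.0.0.1 only. Setting
# OLLAMA_HOST=0.0.0.0 on the Windows side makes it listen on all interfaces.
curl http://$WINDOWS_HOST:11434/api/tags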

ClumsyWoodsman3 commented 4 months ago

Getting the same issue running LM-Studio, and fabric through WSL.

ClumsyWoodsman3 commented 4 months ago

I got models to list by manually adding the API key to the .env config.

svisnaw commented 4 months ago

I came across the same issue. There was an Ollama instance running in WSL, and I couldn't figure out why it wasn't sending prompts to LM-Studio.

@danielmiessler - I see you are holding most updates for after the Go version is released; I'm not sure how that will change the setup.

From what I can tell, having OPENAI_API_KEY set is required by the OpenAI Python package.
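
For example, even with a local server that ignores authentication, any non-empty value unblocks the client (a sketch; the value itself is arbitrary here, though LM-Studio specifically expects lm-studio, as noted below):

# without some OPENAI_API_KEY set, the OpenAI Python client raises an
# error before any request is sent, even to a local server
export OPENAI_API_KEY="placeholder"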

Below is the proposed update to the documentation.

  1. Now you are up and running! You can test by running the help.
# Making sure the paths are set up correctly
fabric --help

[!NOTE] If you're using the server functions, fabric-api and fabric-webui need to be run in distinct terminal windows.

Using a local OpenAI API compatible server

If you want to use fabric with OpenAI API compatible inference servers, such as FastChat, Helmholtz Blablador, LM Studio and others, simply export the following environment variables:
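
# point the client at your own server; the host, port, and scheme here
# are placeholders for wherever your inference server actually listens
export OPENAI_BASE_URL=http://YOUR-SERVER:PORT/v1/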

And if your server needs authentication tokens, like Blablador does, you export the token the same way you would with OpenAI:
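
export OPENAI_API_KEY="YOUR TOKEN"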

If using LM-Studio, you must set your API key to lm-studio:
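
export OPENAI_API_KEY="lm-studio"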

[!NOTE] When running in WSL2, your fabric CLI client must be pointed at your computer's network IP address, not https://localhost or https://127.0.0.1. For example: export OPENAI_BASE_URL=https://192.168.0.10:1234/v1/
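
To confirm the address works before pointing fabric at it, probing the server's model listing from WSL is a quick check (a sketch, reusing the example address above; /v1/models is the standard OpenAI-compatible listing endpoint):

# use the same scheme (http or https) that your server actually serves;
# this should return a JSON list of the hosted models
curl https://192.168.0.10:1234/v1/models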

Using the fabric client

Once you have it all set up, here's how to use it.

  1. Check out the options fabric -h
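
For example, running a pattern over inline text:

fabric --text "An example sentence to summarize." --pattern summarize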
snafu4 commented 4 months ago

Thanks! I could never get this working.

vo5tr0 commented 4 months ago

Same problem here with OpenAI models. fabric --listmodels gives me just an empty result. Trying to summarize something with

fabric --text "test" --pattern summarize

results in:

Error: Connection error.
Connection error.

Environment: Windows 10 with WSL2 + Ubuntu 22.04.4 LTS. ~/.config/fabric/.env exists and the API key is configured:

 cat ~/.config/fabric/.env
YOUTUBE_API_KEY=XXXXXXXX
OPENAI_API_KEY=XXXXXXXX

I also tried setting the API key with export OPENAI_API_KEY="YOUR TOKEN" and even export OPENAI_BASE_URL="https://api.openai.com/v1".
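
To check whether the connection error is network-level rather than a fabric problem, a direct call from inside WSL to the standard OpenAI models endpoint works as a probe (a sketch, using the key from the .env above):

# a JSON model list means network and key are fine;
# a timeout points at WSL networking or a proxy instead
curl -s https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"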