danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
https://danielmiessler.com/p/fabric-origin-story
MIT License

[Feature request]: Add Ollama as a default question #268

Open iplayfast opened 6 months ago

iplayfast commented 6 months ago

What do you need?

I don't like using API keys (just one more hurdle to jump), so I'd rather use local models as much as possible. It's not clear from the readme how to use Ollama, so two enhancements, I guess:

  1. When running `fabric --setup`, ask first whether to use a local model. If [Y]es, then narrow in on the popular choices (Ollama being #1 on my list).
  2. Update the readme to explain how to use Ollama.
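The local-first setup flow requested above could be sketched roughly as follows. This is a hypothetical illustration, not fabric's actual setup code; the function names and the provider list are assumptions:

```python
# Hypothetical sketch of a local-first setup prompt; not fabric's actual code.
POPULAR_LOCAL_PROVIDERS = ["Ollama"]  # assumed list, Ollama first per the request

def choose_provider(use_local_answer: str) -> str:
    """Return a provider name, preferring local models if the user said yes."""
    if use_local_answer.strip().lower() in ("", "y", "yes"):
        # Narrow in on popular local choices, Ollama being #1.
        return POPULAR_LOCAL_PROVIDERS[0]
    return "OpenAI"  # fall back to a hosted provider that needs an API key

def run_setup() -> str:
    """Interactive entry point: ask the local-first question, default Yes."""
    answer = input("Use a local model first? [Y/n] ")
    return choose_provider(answer)
```

The point of the sketch is only the ordering of the questions: local-or-not comes first, and an API key is requested only on the hosted branch.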
YorkyPoo commented 5 months ago

Yeah, I think I'm stuck here too. I was finally able to get everything installed, but I still can't test whether anything works. I don't know if I need to purchase GPT-4 access. I tried using claude-3-opus-20240229 and gpt-3.5-turbo, but both tell me I've exceeded my quota. I'm also not sure when I need to be running Docker, how to set up Ollama, what all the commands are for PowerShell, or how to create shortcuts for them.

blade1981m commented 3 months ago

I 100% agree that the readme needs to be updated with this information. I consider myself somewhat tech-savvy and am more than willing to watch some YouTube videos or read some guides online, but even then I struggled to get this working with Ollama. I did eventually get there, though, so I'll share my steps here in case it helps others:

Step 1: Install Ollama. I did this with `sudo curl -fsSL https://ollama.com/install.sh | sh`

Step 2: Run Ollama with a model specified. I did this with `ollama run mixtral`, then Ctrl+C to exit Ollama.

Step 3 (optional): Verify fabric can see the model with `fabric --listmodels`. This gave me the following output:

GPT Models:

Local Models:
mixtral:latest

Claude Models:

Google Models:

Step 4: Set fabric to use that model as the default: `fabric --changeDefaultModel mixtral:latest`

Alternatively to Step 4, you can just pass that model name into any fabric command with `-m mixtral:latest` if you only want to use Ollama in some cases.
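The local models that `fabric --listmodels` shows under "Local Models" are whatever a running Ollama server has pulled. You can check that directly through Ollama's HTTP API, which serves a model list at `/api/tags` on its default port 11434. A minimal Python sketch, with the JSON parsing split out from the network call (the server address is Ollama's documented default; adjust if yours differs):

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body of Ollama's /api/tags response."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Ask a running Ollama server which models it has pulled."""
    with urlopen(f"{base_url}/api/tags") as resp:
        return model_names(resp.read().decode())
```

If `list_local_models()` returns an empty list (or the connection is refused), fabric's "Local Models" section will be empty too, which usually means the model was never pulled or the Ollama service isn't running.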