Open iplayfast opened 6 months ago
Yeah, I think I'm stuck here too. I was finally able to get everything installed, but I still can't test whether anything works. I don't know if I need to pay for GPT-4 access. I tried using claude-3-opus-20240229 and gpt-3.5-turbo, but both tell me I've exceeded my quota. I'm not sure when I need to be running Docker or how to set up Ollama. I also need to find out what all the commands are for PowerShell and how to create shortcuts for them.
I 100% agree that the readme needs to be updated with this information. I consider myself somewhat tech-savvy and am more than willing to watch some YouTube videos or read some guides online, but even then I struggled to get this working with Ollama. I did eventually get there, though, so I'll share my steps here in case it helps others:
Step 1: Install Ollama - I did this with curl -fsSL https://ollama.com/install.sh | sh (putting sudo in front of curl doesn't do anything useful; the install script invokes sudo itself when it needs root)
Step 2: Run Ollama with a model specified - I did this with ollama run mixtral,
then Ctrl+C to exit the interactive prompt (the Ollama server itself keeps running in the background)
Step 3 (optional): Verify fabric can see the model - fabric --listmodels
This gave me the following output:
GPT Models:
Local Models:
mixtral:latest
Claude Models:
Google Models:
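If you ever want to script against that listing (e.g. to grab whatever local models fabric can see), here's a small sketch. The parsing is based purely on the output format shown above (section headers ending in "Models:", one model name per line underneath); that format is an assumption, not a documented interface.

```shell
# Pull the names under "Local Models:" out of fabric --listmodels-style
# output. Assumes the section-header format shown above.
list_local_models() {
  awk '/^Local Models:/ {grab=1; next}
       /Models:$/       {grab=0}
       grab && NF       {print $1}'
}

# Demo with the exact output I got above:
printf 'GPT Models:\nLocal Models:\nmixtral:latest\nClaude Models:\n' \
  | list_local_models
# prints: mixtral:latest
```

In real use you'd pipe the actual command in: fabric --listmodels | list_local_models.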
Step 4: Set fabric to use that model as the default - fabric --changeDefaultModel mixtral:latest
Alternative to Step 4: if you only want to use Ollama some of the time, you can instead pass the model name to any individual fabric command with -m mixtral:latest.
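If you find yourself typing -m mixtral:latest a lot, a tiny shell wrapper saves the repetition. This is just a sketch: the fabric_local name and the FABRIC_LOCAL_MODEL variable are my own invention, and the stub at the bottom only echoes the command so you can see what would run without actually invoking fabric.

```shell
# Hypothetical convenience wrapper: always pass a local model to fabric,
# overridable via $FABRIC_LOCAL_MODEL. Name and variable are made up
# for this sketch.
fabric_local() {
  fabric -m "${FABRIC_LOCAL_MODEL:-mixtral:latest}" "$@"
}

# Dry-run demo: stub out fabric so the call just prints what it would execute
fabric() { echo "fabric $*"; }
fabric_local --pattern summarize
# prints: fabric -m mixtral:latest --pattern summarize
```

Drop the function (minus the stub) into your .bashrc or PowerShell profile equivalent and you get a one-word command for local-model runs.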
What do you need?
I don't like using API keys (just one more hurdle to jump), so I'd rather just use local models as much as possible. It's not clear from the readme how to use Ollama, so two enhancements, I guess.