Open johnaiii opened 1 month ago
Without the details of your setup, helping is quite a challenge.
I installed Ollama and Fabric on WSL/Ubuntu-22.04 (Windows11), ran into the same issue and found out OLLAMA_HOST was missing or incorrect. ksylvan provided the solution in https://github.com/danielmiessler/fabric/discussions/271#discussioncomment-8968990. These steps fixed it for me:
Add this to `~/.bashrc` (used by the bash shell):

```shell
export OLLAMA_HOST=$(ip route | grep default | awk "{print \$3}")
```

Then reload the shell configuration and verify:

```shell
source ~/.bashrc
fabric --listmodels
```

```
...
Local Models:
llava:latest
llama3:latest
...
```
Depending on your setup, this may or may not help in your situation.
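For anyone puzzled by the `awk` part of that export: it pulls the third field out of the `default` route line, which under WSL2 is the Windows host's address, i.e. where Ollama is actually listening. A minimal sketch of that parsing logic against a hypothetical sample line (your gateway address will differ):

```shell
# Hypothetical sample of `ip route` output on WSL2:
sample='default via 172.29.160.1 dev eth0'

# Same pipeline as in ~/.bashrc, applied to the sample line:
gateway=$(printf '%s\n' "$sample" | grep default | awk '{print $3}')
echo "$gateway"   # prints 172.29.160.1 for this sample
```

On a real system you would feed `ip route` itself into the pipeline, as in the `~/.bashrc` line above.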
Hi, what is the difference between a Google API key and a YouTube API key? Are they the same thing, both from the Google Cloud console? Or is it the Gemini API key? Google Models shows up empty when I run `--listmodels`, while the lists for OpenAI, Claude, and Ollama models are populated.
> Hi, What is the difference between Google API Key and YouTube API key. Are they both the same and from Google cloud console? Or is it Gemini API Key? I get Google Models showing empty while doing --listmodels. I get the list for OpenAI, Claude and Ollama models populated
The Google and YouTube API keys are separate. The Google API key is the AI key for Gemini; the YouTube API key gives you access to the YouTube features of Fabric, such as easy downloading of public transcripts.
What happened?
I ran `fabric --listmodels`, and Local Models came back empty even though Ollama is running:
```
GPT Models:
gpt-3.5-turbo
gpt-3.5-turbo-0125
gpt-3.5-turbo-0301
gpt-3.5-turbo-0613
gpt-3.5-turbo-1106
gpt-3.5-turbo-16k
gpt-3.5-turbo-16k-0613
gpt-3.5-turbo-instruct
gpt-3.5-turbo-instruct-0914

Local Models:

Claude Models:

Google Models:
```
I don't want to pay for an OpenAI API key. Is there a solution?
Version check
Relevant log output
No response
Relevant screenshots (optional)
No response