Closed: defconhaya closed this issue 6 months ago.
OK, I think this was an error. I fixed the code. Can you try again? git pull, then pipx upgrade fabric, and let me know if it works.
I just did a pull and pipx upgrade and still have the same error
What happens if you run fabric --listmodels?
@xssdoctor My PR #315 fixed some OLLAMA issues for Windows. @danielmiessler asked for you to look it over and approve it - I tested it on Linux, MacOS and Windows (both native and WSL) as well.
What happens if you run fabric --listmodels?
Here's what I got:
fabric --remoteOllamaServer http://192.168.100.231:11434 --listmodels
OpenAI API key not set. Skipping GPT models.
OpenAI API key not set. Skipping GPT models.
GPT Models:
Local Models:
Claude Models:
claude-3-opus-20240229
claude-3-sonnet-20240229
claude-3-haiku-20240307
claude-2.1
@MonOttawa Can you apply my fix from #315 to installer/client/cli/utils.py and see if that fixes it? I suspect it might.
@ksylvan Thanks for your help. I can now see the ollama models with your fix.
fabric --remoteOllamaServer http://192.168.100.231:11434 --listmodels
OpenAI API key not set. Skipping GPT models.
OpenAI API key not set. Skipping GPT models.
GPT Models:
Local Models:
qwen:14b
qwen:72b
qwen:7b
qwen:latest
However, when I execute this command:
pbpaste | fabric --pattern summarize
I see:
OpenAI API key not set. Skipping GPT models. Error: Connection error. Connection error.
What am I doing wrong?
I believe you need to set the model with the -m parameter (or set a new default model).
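For example, combining the flags already shown in this thread (qwen:7b is just one of the models from the listing above, used here as an illustration):

```shell
# Select a local Ollama model explicitly with -m; without it, fabric
# falls back to the (unset) OpenAI default and fails with a connection error.
pbpaste | fabric --remoteOllamaServer http://192.168.100.231:11434 \
  --pattern summarize -m qwen:7b
```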
@snafu4 You're the best! Thanks! Works like a charm.
@xssdoctor @danielmiessler please merge #315 - it fixes all cases where people are trying to use --remoteOllamaServer
and @MonOttawa just confirmed it too.
OK, done. Let me know if it works for all of you.
Still same error. I've done a clean install with latest commits, both in WSL and native windows.
curl http://localhost:11434
Ollama is running
fabric --remoteOllamaServer http://localhost:11434 --listmodels
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401
Is there any config file where I need to add something?
If you're running on 11434 you don't have to use that flag. Just run fabric -m with your ollama model.
Also remember to pipx upgrade fabric every time you git pull
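The update sequence as a sketch (the clone path ~/fabric is an assumption; adjust it to wherever you checked out the repo):

```shell
cd ~/fabric             # your local clone of the fabric repo (assumed path)
git pull                # fetch the latest commits, including the recent fix
pipx upgrade fabric     # rebuild the installed CLI from the updated source
fabric --listmodels     # sanity check: local models should now appear
```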
I did a fresh install on macOS.
ollama list
NAME                    ID            SIZE    MODIFIED
codellama:7b-code       fc84f39375bc  3.8 GB  2 months ago
codellama:7b-instruct   8fdf8f752f6e  3.8 GB  2 months ago
codellama:latest        8fdf8f752f6e  3.8 GB  4 weeks ago
llama2:latest           78e26419b446  3.8 GB  2 months ago
llava:latest            8dd30f6b0cb1  4.7 GB  2 months ago
mistral:latest          61e88e884507  4.1 GB  2 months ago
fabric --listmodels
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401
OK, question: is there a chance you have an incorrect API key in your .config/fabric/.env? It's OK to have no key, but if there is a key that is incorrect, you could get that error.
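A quick way to check for a leftover placeholder key (a sketch assuming a POSIX shell; the .env path is the one mentioned above):

```shell
# Inspect fabric's env file; a leftover dummy OPENAI_API_KEY here triggers
# the 401 from api.openai.com even when you only want to use Ollama.
ENV_FILE="$HOME/.config/fabric/.env"
if [ -f "$ENV_FILE" ]; then
  cat "$ENV_FILE"
  # Blank out the key (or delete the line); sed keeps a .bak backup.
  sed -i.bak 's/^OPENAI_API_KEY=.*/OPENAI_API_KEY=/' "$ENV_FILE"
fi
```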
Solved: there was a dummy key in .env file. Thanks a lot !
I can confirm it works now on Windows, WSL, and Mac
I’ve confirmed the latest pull with the latest merges works on Linux, Mac, Windows, and WSL with Ollama
Great!
I have cleaned out the directories I know about, uninstalled and reinstalled fabric but still get this. I feel like I am missing the correct file to delete/remake or edit but I guess I don't know where it is. I'm running on ubuntu.
I found it. Working now.
What is your question?
After reading the documentation, I am still not clear how to get ollama working. I've tried running this:
fabric --pattern explain_code --model codellama:latest
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401