Closed: richardstevenhack closed this issue 4 months ago.
If you set the provider to ollama, it will use ollama, never phind. If the provider name is incorrect, tgpt will simply tell you that.
If your model name is incorrect, you will get an error.
I think it's redundant to list ollama models from tgpt when you can simply list them with ollama list.
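For a concrete check, something like the following should be enough. This is a minimal sketch: qwen2:latest is just the model name used in this thread, so substitute whatever ollama list shows on your machine, and AI_PROVIDER is the environment variable mentioned in this issue.

```sh
# List the locally installed models; this is the same inventory
# tgpt would draw from, so no separate listing inside tgpt is needed.
ollama list

# Select the provider via the AI_PROVIDER environment variable
# and pass one of the listed model names explicitly.
AI_PROVIDER=ollama tgpt --model qwen2:latest "What is the name of your model?"
```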
So I just assume everything is OK? Interesting approach... I still think being able to list models from within tgpt might be useful, but OK, I accept the answer.
You can ask the AI. lol
I did.
Oh yeah, my bad.
I have set AI_PROVIDER=ollama.
I then asked a question such as: tgpt --model qwen2:latest "What is the name of your model?"
I can't tell if the answer is coming from the default Phind model or the qwen2 model Ollama is running.
How can I tell whether the model I told Ollama to use is actually the model responding? I can't assume it's just working.
UPDATE: I asked it who created it; it told me Alibaba and eventually gave me its name, Qwen. So it is working.
However, I would suggest that some way of confirming this should be an option; the manual check I used is sketched below.
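For anyone wanting the same reassurance, here is the check written out as a sketch. It assumes AI_PROVIDER is honored as in my setup, and the model names are only examples, so use names from your own ollama list output. Asking different models the same identity question is only a heuristic, since models can answer such questions inaccurately, but getting different answers from different models does confirm the switch is taking effect.

```sh
# Heuristic check: ask each candidate model who made it and compare.
# Differing answers confirm that the --model selection is working.
AI_PROVIDER=ollama tgpt --model qwen2:latest "Who created you?"
AI_PROVIDER=ollama tgpt --model llama3:latest "Who created you?"
```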
Also, can one list the available and/or installed Ollama models from the TerminalGPT interface?