aandrew-me / tgpt

AI Chatbots in terminal without needing API keys

How do you determine what model is actually answering the question? #271

Closed: richardstevenhack closed this issue 4 months ago

richardstevenhack commented 4 months ago

I have set the AI_PROVIDER=ollama.

I then did a question such as: tgpt --model qwen2:latest "What is the name of your model?"
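For reference, this is roughly the full sequence (assuming AI_PROVIDER is exported in the same shell session):

```sh
# Route tgpt through the local Ollama provider instead of the default (phind)
export AI_PROVIDER=ollama

# Query the locally installed qwen2 model
tgpt --model qwen2:latest "What is the name of your model?"
```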

I can't tell if the answer is coming from the default Phind model or the qwen2 model Ollama is running.

How can I tell if the model I told Ollama to use is actually the model responding? I can't just assume it's working.

UPDATE: I asked it who created it; it told me Alibaba and eventually gave its name, Qwen. So it is working.

However, I would suggest adding an option to confirm which model is actually responding.
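In the meantime, here is a sketch of one possible check from the Ollama side (assuming a standard local install; ollama ps reports what the server currently has loaded):

```sh
# In a second terminal, while tgpt is waiting on a response,
# show which model(s) the local Ollama server has loaded into memory
ollama ps
```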

Also, can one list the available and/or installed Ollama models from the TerminalGPT interface?

aandrew-me commented 4 months ago

If you set the provider to ollama, it will be ollama, never phind. If the provider name is incorrect, it will just tell you that. If your model name is incorrect, you will get an error. I think it's redundant to list Ollama models from tgpt when you can simply list them with ollama list.
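For example, assuming a standard local Ollama install listening on its default port:

```sh
# List the models installed locally with the ollama CLI
ollama list

# Or query the Ollama server's REST API directly
curl http://localhost:11434/api/tags
```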

richardstevenhack commented 4 months ago

So I just assume everything is OK? Interesting approach... I still think being able to list models from tgpt might be useful, but OK, I accept the answer.

k14lb3 commented 2 months ago

You can ask the AI. lol

[screenshot]

richardstevenhack commented 2 months ago

I did.

k14lb3 commented 1 month ago

Oh yeah, my bad.