Darkdriller / PowerToys-Run-LocalLLm

Use a local LLM from PowerToys Run
MIT License

Ability to choose model from PowerToys run #3

Open · otravers opened this issue 2 months ago

otravers commented 2 months ago

Thanks for this great plugin! It would be nice if it were possible to choose the model from the command line, with something like this:

llm > llama3 > my llm prompt
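Conceptually, the plugin could split off anything before the first ">" as the model name and treat the rest as the prompt. A minimal sketch of what I mean (hypothetical helper, not the plugin's actual code):

```csharp
using System;

// Hypothetical sketch only, assuming "llm" is the action keyword and the
// plugin receives everything typed after it as a single query string.
public static class QueryParser
{
    // "llama3 > my llm prompt" -> ("llama3", "my llm prompt")
    // "my llm prompt"          -> (null,     "my llm prompt")
    public static (string? Model, string Prompt) Parse(string query)
    {
        var parts = query.Split('>', 2, StringSplitOptions.TrimEntries);
        if (parts.Length == 2 && parts[0].Length > 0)
        {
            return (parts[0], parts[1]);   // explicit model given
        }
        return (null, query.Trim());       // no model prefix, use the default
    }
}
```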

Darkdriller commented 2 months ago

Hi, thanks for the feedback! I really appreciate it.

I implemented the ability to choose the model; can you try it out and give feedback? The initialisation will take some time, so you might initially get a timeout, but subsequent queries will work perfectly. Here is the build: LocalLLM-v1.0.2.zip

I am currently loading all the models from the Ollama Library and parsing the webpage to generate the list of models supported by Ollama; I was unable to find a faster option.
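Roughly, the lookup works along these lines (a sketch of the approach only, not the actual plugin code; it assumes each model shows up as a /library/&lt;name&gt; link on https://ollama.com/library, which could change):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

// Hypothetical sketch: scrape the Ollama library page for model names.
public static class OllamaLibrary
{
    public static async Task<List<string>> FetchModelNamesAsync()
    {
        using var http = new HttpClient();
        string html = await http.GetStringAsync("https://ollama.com/library");

        // Collect the distinct model names referenced by /library/<name> links.
        return Regex.Matches(html, "href=\"/library/([a-z0-9._-]+)\"")
                    .Select(m => m.Groups[1].Value)
                    .Distinct()
                    .ToList();
    }
}
```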

otravers commented 2 months ago

Wow, didn't expect such rapid action! It seems to work (hopefully the answer is not hallucinated!): [screenshot]

Could you specify the model in use next to the "LLM Response" header?

Darkdriller commented 2 months ago

Hi, I was actually thinking of implementing this some time ago but, to be honest, had completely forgotten about it. So when this issue came in I already had the exact changes mapped out, and they weren't much. You raising this issue helped a lot because it brought me back to the project. Anyway, here is the updated version. Try it out, and I will release it in a day or two after doing some testing.

LocalLLM-v1.0.3.zip

otravers commented 2 months ago

This is looking good! [screenshot]

Further improvements for consideration:

  • Could you autosuggest the engines already installed in Ollama once "llm > " has been input?
  • Can you make the response pane bigger to accommodate longer responses, and make sure there's text wrapping?
  • Is there a way (spinning hourglass or similar) you could indicate that the prompt has been sent and is being processed by Ollama?

otravers commented 2 months ago

I am currently loading all the models from Ollama Library and parsing the webpage to generate the list of models supported by Ollama. I was unable to find a faster option

Have you looked into the list_models endpoint? https://github.com/ollama/ollama/blob/main/docs%2Fapi.md#list-local-models

You can see which models are already loaded in memory from this: https://github.com/ollama/ollama/blob/main/docs%2Fapi.md#list-running-models
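For example, something along these lines (a sketch only; it assumes Ollama is listening on its default port, 11434, and the class/method names are just placeholders) would return the locally installed models:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Hypothetical sketch: query Ollama's local REST API instead of scraping the site.
public static class OllamaApi
{
    // GET /api/tags -> models installed locally
    // GET /api/ps   -> models currently loaded in memory
    public static async Task<List<string>> GetInstalledModelsAsync()
    {
        using var http = new HttpClient();
        string json = await http.GetStringAsync("http://localhost:11434/api/tags");

        using var doc = JsonDocument.Parse(json);
        return doc.RootElement.GetProperty("models")
                  .EnumerateArray()
                  .Select(m => m.GetProperty("name").GetString()!)
                  .ToList();
    }
}
```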

Darkdriller commented 2 months ago

This is looking good! [screenshot]

Further improvements for consideration:

  • Could you autosuggest the engines already installed in Ollama once "llm > " has been input?
  • Can you make the response pane bigger to accommodate longer responses, and make sure there's text wrapping?
  • Is there a way (spinning hourglass or similar) you could indicate that the prompt has been sent and is being processed by Ollama?

Hi, if it's possible, could you open up a separate issue for each of your suggestions so I can look at them and open them up for others to possibly work on? I was planning on the auto-suggestions feature next.

Darkdriller commented 2 months ago

I am currently loading all the models from Ollama Library and parsing the webpage to generate the list of models supported by Ollama. I was unable to find a faster option

Have you looked into the list_models endpoint? https://github.com/ollama/ollama/blob/main/docs%2Fapi.md#list-local-models

You can see which models are already loaded in memory from this: https://github.com/ollama/ollama/blob/main/docs%2Fapi.md#list-running-models

I did see them, but they don't give the global models, just the models installed in your local Ollama. I thought global model support would be good, but I might change it to local model support if you believe that would help.

otravers commented 2 months ago

Hi, if it's possible, could you open up a separate issue for each of your suggestions so I can look at them and open them up for others to possibly work on? I was planning on the auto-suggestions feature next.

Done: https://github.com/Darkdriller/PowerToys-Run-LocalLLm/issues/4 https://github.com/Darkdriller/PowerToys-Run-LocalLLm/issues/5 https://github.com/Darkdriller/PowerToys-Run-LocalLLm/issues/6

Darkdriller commented 2 months ago

Hi, if it's possible, could you open up a separate issue for each of your suggestions so I can look at them and open them up for others to possibly work on? I was planning on the auto-suggestions feature next.

Done: #4 #5 #6

Thanks for the feedback; I will try to add this functionality soon.