Closed: naripok closed this 2 months ago
Got this up and running with the OpenAI API quite easily. Big +1 for this.
Also, just as a heads up: from what I've seen, dropping the temperature and top_p will yield better code results with most models (this conceptually makes a fair bit of sense, but YMMV, especially with code fine-tuned models 🤷).
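For reference, a minimal sketch of what those settings look like in an OpenAI-compatible chat request; the helper name and the exact values are illustrative, not from the plugin:

```lua
-- Hypothetical helper: build an OpenAI-compatible chat request body
-- with conservative sampling settings for code generation.
-- `temperature` and `top_p` are standard API parameters.
local function code_request_body(model, messages)
  return vim.json.encode({
    model = model,
    messages = messages,
    temperature = 0.2, -- lower temperature -> more deterministic token choices
    top_p = 0.9,       -- tighter nucleus sampling; tune per model
  })
end
```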
> Got this up and running with the OpenAI API quite easily. Big +1 for this.

I was envisioning putting each of these configs into its own user command too, so I could go :ChatOllama, :ChatClaudeOpus, :ChatLammaGroq, as needed. They all more or less share the same API interface, and it would be really nice to have all the LLMs at my fingertips, so I could use the best (and cheapest) one for the task at any given moment.
Also, for fine-tunes: :ChatOllamaLlama3Code, :ChatOllamaLlama3LongContext, ... (a rough sketch of the idea is below).
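A minimal sketch of those per-provider commands; the command names, model ids, and the re-setup-then-open pattern are assumptions, so wire them to whatever model-switching API the plugin actually exposes:

```lua
-- Hypothetical per-provider user commands for a gen.nvim-style plugin.
local configs = {
  ChatOllama     = { model = "llama3" },          -- local Ollama
  ChatClaudeOpus = { model = "claude-3-opus" },   -- via a proxy endpoint
  ChatLammaGroq  = { model = "llama3-70b-8192" }, -- groq's hosted Llama 3
}

for name, cfg in pairs(configs) do
  vim.api.nvim_create_user_command(name, function()
    require("gen").setup(cfg) -- apply the provider config...
    vim.cmd("Gen Chat")       -- ...then start a chat with it
  end, { desc = "Chat using " .. cfg.model })
end
```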
Just wanted to say thank you for this change, and the sample. While there are other plugins that support multiple providers, something about the flow of this one clicks for me.
I took a slightly different path than separate commands: I iterated on the original sample and build the command dynamically, based on the model set on each prompt. This approach defaults prompts to Ollama, but lets me create some that use groq or openai, and keeps the standard keymappings and Gen menu. A sample is available at this gist if anyone is looking to do something similar; the sketch below gives the general shape.
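A rough sketch of that dynamic routing, assuming gen.nvim's `command` option accepts a function (as in the plugin's README) and that `options.model` carries the model set on the active prompt; the `provider/model` naming convention, endpoint wiring, and GROQ_API_KEY env var are assumptions, and response parsing may need adjusting per plugin version:

```lua
require("gen").setup({
  model = "llama3", -- default: served by local Ollama
  command = function(options)
    -- Prompts can set e.g. model = "groq/llama3-70b-8192"; anything
    -- without a provider prefix falls through to local Ollama.
    local provider = options.model:match("^(%w+)/")
    if provider == "groq" then
      return "curl --silent --no-buffer -X POST"
          .. " https://api.groq.com/openai/v1/chat/completions"
          .. " -H 'Content-Type: application/json'"
          .. " -H 'Authorization: Bearer " .. (os.getenv("GROQ_API_KEY") or "") .. "'"
          .. " -d $body"
    end
    return "curl --silent --no-buffer -X POST http://"
        .. options.host .. ":" .. options.port .. "/api/chat -d $body"
  end,
})
```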
I think this should be added to the README; I was searching for an issue but found a gem!
Hey David!
All good? :)
Do you take PRs?
And is integration with cloud providers out of scope here?
I wanted to test with Llama3 70b, and since I can't run it locally and groq.com provides fast and (so far) free endpoints for it, I thought I'd integrate it here. I know there are other plugins doing the cloud-provider thing, but I like the ergonomics of yours better.
Here is a plugin config example for using it with groq:
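Something along these lines, as a minimal sketch: it assumes gen.nvim's `command` hook (with the `$body` placeholder the plugin substitutes) and groq's OpenAI-compatible chat endpoint, and the model id and GROQ_API_KEY env var are illustrative:

```lua
require("gen").setup({
  model = "llama3-70b-8192", -- groq's hosted Llama 3 70B
  command = function()
    -- $body is the placeholder the plugin fills with the JSON request body.
    return "curl --silent --no-buffer -X POST"
        .. " https://api.groq.com/openai/v1/chat/completions"
        .. " -H 'Content-Type: application/json'"
        .. " -H 'Authorization: Bearer " .. (os.getenv("GROQ_API_KEY") or "") .. "'"
        .. " -d $body"
  end,
})
```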
Anyway, if this is out of scope, just let me know and I'll close the PR. Or you can close it yourself. :)
Cheers!