nvms / wingman

Your pair programming wingman. Supports OpenAI, Anthropic, or any LLM on your local inference server.
https://marketplace.visualstudio.com/items?itemName=nvms.ai-wingman
ISC License

New Goinfer provider #20

Closed synw closed 8 months ago

synw commented 12 months ago

I started to add support for a Goinfer provider here: https://github.com/synw/wingman

https://github.com/nvms/wingman/compare/main...synw:wingman:main

nvms commented 12 months ago

Nice! Really excited to try this.

synw commented 12 months ago

I added experimental support for templating and the context window (#13, #17), configured in the settings for now. Please check it out and tell me what you think. Here are the steps to try it out:

Get Goinfer (Linux):

```shell
wget https://github.com/synw/goinfer/releases/download/0.8.0/goinfer
chmod +x goinfer  # make the downloaded binary executable
# or compile from source, cf. the docs
```

Create a models directory somewhere and download a gguf model:

```shell
mkdir models
cd models
wget https://huggingface.co/TheBloke/CodeLlama-7B-Instruct-GGUF/resolve/main/codellama-7b-instruct.Q4_K_M.gguf
```

Configure Goinfer:

```shell
mkdir tasks  # internally required
./goinfer -localconf
```

Edit the generated goinfer.config.json file to add the absolute path to your models directory.
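As a rough sketch, the relevant part of goinfer.config.json might look like this (the key names and values below are assumptions for illustration; check the generated file for the exact keys):

```json
{
  "models_dir": "/home/me/models",
  "api_key": "generated-by-localconf"
}
```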

Now clone https://github.com/synw/wingman and install the dependencies, then run `code .`, press F5, and configure the extension settings to use the Goinfer provider.

Now set the API key to the one found in goinfer.config.json, and it should work after a restart.
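For illustration only, the workspace settings might look something like the sketch below. Every setting name, the URL, and the port here are placeholders I made up; the real keys come from the extension's contributed configuration:

```json
{
  "wingman.provider": "goinfer",
  "wingman.apiBaseUrl": "http://localhost:5143",
  "wingman.apiKey": "<key from goinfer.config.json>"
}
```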

Note: this setup is given for example/convenience; it will not give good results with the base CodeLlama Instruct model (efficient fine-tunes are still pending). I had much better results with even smaller models, but this one is easy to run because the gguf file is directly available for download.

synw commented 8 months ago

This provider is now deprecated.