Open w1gs opened 2 days ago
Nice! Will test this in a bit. It would be great to add setup instructions for this, just added #26.
Awesome. Where would be the best place to add the instructions to set that up?
It should probably appear in a new window when VS Code starts up, similar to the "Welcome" page. We're also open to alternatives.
This PR adds the Ollama integration. Two new settings were added for Ollama (endpoint and model). Instead of using the Ollama node library, a fetch request is made directly to the configured endpoint. A local instance of the Ollama API can be started with `ollama serve`. The `OLLAMA_ORIGINS=*` environment variable needs to be set to allow the extension to make requests to Ollama.
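For reviewers, here is a rough sketch of the fetch-based approach the description refers to: a direct POST to Ollama's `/api/generate` endpoint built from the two new settings. The setting shape (`endpoint`, `model`) and the helper names are assumptions for illustration, not the PR's actual code.

```typescript
// Sketch only: calling a local Ollama instance with fetch instead of the
// Ollama node library. Setting names and helpers here are hypothetical.

interface OllamaSettings {
  endpoint: string; // e.g. "http://localhost:11434" (Ollama's default port)
  model: string;    // name of a locally pulled model
}

// Build the URL and JSON body for Ollama's /api/generate endpoint.
function buildGenerateRequest(settings: OllamaSettings, prompt: string) {
  return {
    url: `${settings.endpoint}/api/generate`,
    body: { model: settings.model, prompt, stream: false },
  };
}

// Send the request; requires `ollama serve` running with OLLAMA_ORIGINS set
// so the extension's origin is allowed.
async function generate(settings: OllamaSettings, prompt: string): Promise<string> {
  const { url, body } = buildGenerateRequest(settings, prompt);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // non-streaming responses carry the text in `response`
}
```

With `stream: false` the whole completion arrives in one JSON object, which keeps the extension code simple at the cost of no incremental output.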