leona / helix-gpt

Code assistant language server for Helix with support for Copilot/OpenAI/Codeium/Ollama
MIT License
285 stars · 19 forks

add ollama provider #42

Closed kyfanc closed 3 months ago

kyfanc commented 4 months ago

This PR adds basic support for Ollama.

Prompts are copied from the OpenAI provider.
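For context, a completion provider along these lines would typically build a request for Ollama's `/api/generate` endpoint. The sketch below is illustrative only, not the PR's actual code: `buildOllamaRequest` and the chosen `options` values are assumptions, though the endpoint, `model`, `prompt`, `stream`, and `options` fields match Ollama's documented API.

```typescript
// Hypothetical sketch of building an Ollama /api/generate request body.
// Names and option values here are illustrative, not taken from this PR.
interface OllamaGenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
  options?: { temperature?: number };
}

function buildOllamaRequest(
  prompt: string,
  model = "codellama", // assumed default; the model pulled in the test steps
): OllamaGenerateRequest {
  return { model, prompt, stream: false, options: { temperature: 0.2 } };
}

// The body would then be POSTed to the local Ollama server, e.g.:
// await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildOllamaRequest("def add(a, b):")),
// });
```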

## Testing

  1. install ollama
  2. launch ollama
  3. `ollama pull codellama`
  4. modify helix `languages.toml`:
    
    [[language]]
    name = "go"
    language-servers = ["gopls", "gpt"]

    [language-server.gpt]
    command = "bun"
    args = [
      "--inspect=0.0.0.0:6499",
      "run",
      "helix-gpt/src/app.ts",
      "--handler", "ollama",
      "--logFile", "helix-gpt.log",
    ]



## Notes
I am new to coding with LLMs and just wanted to play around with Helix and a locally hosted Ollama.
Since I don't have access to other providers for comparison, and not much experience with prompt engineering, this is just something that seems to work. Please help test it out and let me know what is missing.

## Discussion
- for users without strong hardware, some actions on larger files may take too long and trigger Helix's `Async job failed: request 8 timed out` error
- the prompt and parameters may need some more fine-tuning
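One possible mitigation for the timeout on larger files (not implemented in this PR, just a hedged sketch) would be to cap how much file context is sent to the model. `truncateContext` and the character limit below are hypothetical names and values:

```typescript
// Hypothetical mitigation sketch: cap the context sent to the model so that
// slow local hardware is less likely to hit Helix's request timeout.
// MAX_CONTEXT_CHARS is an illustrative limit, not a value from this PR.
const MAX_CONTEXT_CHARS = 4000;

function truncateContext(contents: string, max = MAX_CONTEXT_CHARS): string {
  if (contents.length <= max) return contents;
  // Keep the tail of the buffer, which is usually nearest the cursor
  // for completion requests.
  return contents.slice(-max);
}
```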
leona commented 3 months ago

Thanks a lot for the PR, merged.