twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Improve documentation for FIM #229

Closed: rburgst closed this issue 2 months ago

rburgst commented 2 months ago

Is your feature request related to a problem? Please describe. I don't know how to properly set up twinny to get FIM to work.

Describe the solution you'd like: a clear explanation of what I have to configure to use Ollama (ideally with codegemma, llama3, or deepseek).

Describe alternatives you've considered: I checked the existing issues and the docs.

Additional context: I'm currently trying with the settings below, but they don't work.

[screenshots of the current twinny settings attached]
rjmacarthy commented 2 months ago

Hello, please try /api/generate as the FIM path for Ollama. Here are some example configurations:

https://github.com/rjmacarthy/twinny/blob/main/docs/providers.md
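
For reference, a minimal sketch of such a provider entry is below. The field names and the model tag are illustrative assumptions, not verified twinny setting keys; the parts taken from this thread are the Ollama provider and the /api/generate FIM path.

```typescript
// Minimal sketch of an Ollama FIM provider entry (assumed field names, not
// verified twinny setting keys). Only the provider and the FIM path come from
// the comment above; everything else is an illustrative default.
const ollamaFimProvider = {
  label: "Ollama FIM",        // hypothetical display name
  provider: "ollama",
  type: "fim",                // fill-in-the-middle completions
  hostname: "localhost",      // host where the Ollama server runs
  port: 11434,                // Ollama's default port
  apiPath: "/api/generate",   // the FIM path suggested in the comment above
  model: "codegemma:2b",      // example only; any FIM-capable Ollama model
};
```

The key detail is the path: /api/generate is Ollama's plain completion endpoint, which is what FIM completions need, while chat providers are configured with a different path (see the linked providers.md for examples).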

rjmacarthy commented 2 months ago

Executive decision: individual configuration issues will receive an answer and will be considered closed after 24 hours with no reply. Many thanks.