twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License
3.11k stars 165 forks

FIM Template 'Codeqwen' ignores 'File Context Enabled = true' #358

Closed AndrewRocky closed 2 weeks ago

AndrewRocky commented 3 weeks ago

Describe the bug The FIM template codeqwen doesn't send file context.

To Reproduce Select a provider with the FIM template codeqwen - no file context appears in the prompt (verified on the LLM server).

Select a provider with the FIM template starcoder2 - file context is present.

Expected behavior The FIM template codeqwen should add file context to the prompt.
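For illustration, here is a minimal sketch of what a file-context-aware codeqwen FIM prompt builder could look like. This is not twinny's actual implementation; the `FimContext` interface and `buildCodeqwenPrompt` function are hypothetical, and the only assumptions taken from the report are the Qwen FIM markers (`<|fim_prefix|>`, `<|fim_suffix|>`, `<|fim_middle|>`) and the idea that context from neighboring files should be prepended before the prefix:

```typescript
// Hypothetical shape of the data a FIM template receives.
interface FimContext {
  prefix: string;       // code before the cursor
  suffix: string;       // code after the cursor
  fileContext?: string; // snippets from other open files (the missing piece in this bug)
}

// Sketch: prepend file context (when enabled) ahead of the FIM markers,
// so the model sees surrounding project code before the fill-in-the-middle request.
function buildCodeqwenPrompt({ prefix, suffix, fileContext }: FimContext): string {
  const context = fileContext ? `${fileContext}\n` : "";
  return `${context}<|fim_prefix|>${prefix}<|fim_suffix|>${suffix}<|fim_middle|>`;
}
```

With this shape, the reported bug corresponds to `fileContext` never being passed through for the codeqwen template, while the starcoder2 template does pass it.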

Logs from the LLM server

 INFO     PROMPT=
<|fim_prefix|>-- test<|fim_suffix|><|fim_middle|>

Logging

      Original completion: s for the 'with' statement
      Formatted completion: s for the 'with' statement
      Max Lines: 30
      Use file context: true
      Completed lines count 1
      Using custom FIM template fim.bhs?: false
workbench.desktop.main.js:sourcemap:617 [Extension Host] [twinny] ***Twinny Stream Debug***
    Streaming response from llm_server.local:42069.
    Request body:
    {
  "prompt": "<|fim_prefix|>-- test<|fim_suffix|><|fim_middle|>",
  "stream": true,
  "temperature": 0.2,
  "max_tokens": 64
}

API Provider oobabooga

Chat or Auto Complete? FIM (auto-complete, autocomplete (for searchability))

Model Name qwen2.5-coder-7B

Version Extension version: v3.17.30

AndrewRocky commented 3 weeks ago

I'll try to create an MR for this

AndrewRocky commented 3 weeks ago

#359 should fix this (if it works)

rjmacarthy commented 2 weeks ago

Merged, working well too, thanks!