twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

LM Studio supports "Multi Mode Session", must specify the model name #217

Closed · marcmcd closed this issue 2 months ago

marcmcd commented 2 months ago

(I tried to create a pull request against the development branch but it failed since I'm not a collaborator.)

For FIM auto-completion requests to the LM Studio provider, the model name must be provided when multiple models are loaded in an LM Studio "Multi Mode Session". If the model name is not provided, LM Studio responds with an error, e.g.

[ERROR] Multiple models loaded. Please specify a model to use. Currently 
loaded models: second-state/StarCoder2-3B-GGUF/starcoder2-3b-Q3_K_L.gguf, 
TheBloke/CodeLlama-7B-Instruct-GGUF/codellama-7b-instruct.Q8_0.gguf

Therefore the function createStreamRequestBodyFim must be modified to also return the model field for the ApiProviders.LMStudio case:

case ApiProviders.LMStudio:
      return {
        model: options.model, // LM Studio supports "Multi Mode Session", must specify the model name
        prompt,
        stream: true,
        temperature: options.temperature,
        n_predict: options.numPredictFim
      }
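For illustration, a minimal self-contained sketch of how the surrounding function might look with this change applied. The `ApiProviders` enum values, the options shape, and the default branch are assumptions for the sake of a runnable example; only the LM Studio case above is from the actual report:

```typescript
// Hypothetical option shape, based on the fields used in the snippet above.
interface StreamOptions {
  model: string
  temperature: number
  numPredictFim: number
}

// Hypothetical provider enum; only LMStudio is taken from the report.
enum ApiProviders {
  LMStudio = "lmstudio",
  Other = "other"
}

// Builds the FIM request body. For LM Studio the model field is included,
// so a "Multi Mode Session" with several loaded models knows which one to use.
function createStreamRequestBodyFim(
  provider: ApiProviders,
  prompt: string,
  options: StreamOptions
): Record<string, unknown> {
  switch (provider) {
    case ApiProviders.LMStudio:
      return {
        model: options.model, // required when multiple models are loaded
        prompt,
        stream: true,
        temperature: options.temperature,
        n_predict: options.numPredictFim
      }
    default:
      return {
        prompt,
        stream: true,
        temperature: options.temperature,
        n_predict: options.numPredictFim
      }
  }
}
```

With this, a request built for LM Studio carries `model: "<loaded model path>"`, which avoids the "Multiple models loaded" error shown above.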

This was tested with an LM Studio "Multi Mode Session" containing multiple loaded models, as in the error message above.

rjmacarthy commented 2 months ago

Hey, thanks for this. Could you please open a PR against main? I will take care of the rest code-wise. There are also some docs (docs/providers.md) for LM Studio which could be updated with this information.

Many thanks.

rjmacarthy commented 2 months ago

I just released 3.11.18, which adds an option to pass the model name to the LM Studio API for multi-model support.

Many thanks,

marcmcd commented 2 months ago

Thanks for a great VSCode extension