Closed: BananaAcid closed this issue 9 months ago.
@BananaAcid despite the yellow underline indicating an error (we will fix this; I realize it's extremely misleading), you should still be able to use that model.
If that still isn't working, let me know, as that would be a much different bug!
@sestinj could you elaborate on how the model line above should look and how model_roles.default should look?
@BananaAcid model_roles is no longer necessary and won't have an effect; it is just a leftover from old config.json files.
Otherwise, your config above looks perfect. If you choose one of the models listed in the dropdown (those that do not get a yellow underline), we map those options onto their respective tag names for Ollama. If you give an option that is not in the dropdown, we pass that exact string to Ollama, so as long as you use a valid tag name, you can ignore the yellow underline and it should work.
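For instance, a config.json entry whose model string is not in the dropdown would be passed straight through to Ollama. This is an illustrative sketch, assuming Continue's `models` array format; the tag name below is a placeholder, not taken from the thread:

```json
{
  "models": [
    {
      "title": "Mistral 7B (local)",
      "provider": "ollama",
      "model": "mistral:7b-instruct"
    }
  ]
}
```

As long as the `model` string matches a tag shown by `ollama list`, the yellow underline can be ignored.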
Obviously a bit confusing; it's on my mind to reorganize this very soon.
Perfect, yes it works.
Reloading the dropdown does not seem to work, but restarting VSCode after each change to config.json does the trick.
@sestinj when you get around to working on this, it would be handy if it were possible to make a non-standard ollama model be the default (rather than continue always defaulting to GPT-4).
@pjv do you mean that when you reload the window it should remember which model you were using last, or something else?
And is this the pre-release or the main version you are using?
> do you mean that when you reload the window it should remember which model you were using last, or something else?
Yeah, remembering would be great. Alternatively, the ability to specify a default via settings (GUI or JSON), like I guess used to be possible with config.json, would be fine too.
> And is this the pre-release or the main version you are using?
I’m using v0.7.58. I’ve configured a few recent Ollama models via config.json. They are working fine, but regardless of what order I put them in config.json, whenever the window is reloaded or a new one opened, Continue always defaults to GPT-4.
@pjv Ok great, I've made the change to persist this info (as well as saving any current session history). It will be available later today in v0.7.60 (pre-release).
Before submitting your bug report
Relevant environment info
Description
There does not seem to be a way to use any model that is not predefined in Continue.
I tried adding the following to the config.json:
I added the others in the same way as well. Using the model codellama-7b works.

To reproduce
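The exact snippet from the original report is not preserved in this thread; a representative attempt (an illustrative sketch, assuming Continue's `models` array format, with a placeholder tag name) might look like:

```json
{
  "models": [
    {
      "title": "CodeLlama 13B",
      "provider": "ollama",
      "model": "codellama:13b"
    }
  ]
}
```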
Log output
No response