Closed · justyns closed this 6 months ago
I was just thinking about this too, would be great to have! You can use the `editor.filterBox` UI to allow people to easily switch.
@zefhemel how do you feel about a plug updating SETTINGS automatically? I'm thinking of having a command that opens a filterBox with the configured llms, then updates `ai.selectedModel` in SETTINGS to whatever the user selects.
That would be nice, but "restringifying" the YAML settings in place is likely to restructure the settings and remove comments, which users may not like.
I ended up going a different route and using `clientStore.set` and `clientStore.get` to set `ai.selectedTextModel` when it is changed. AFAICT this means it will be a client-specific setting like dark mode, and I don't have to worry about figuring out how to auto-update SETTINGS or a new config file.
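As a rough sketch (not the plug's actual code), a model-switching command built on those two syscalls might look like this; the import path, command name, and hard-coded model list are assumptions for illustration:

```typescript
// Hypothetical SilverBullet plug command: show a filterBox of models,
// then persist the user's choice per-client via clientStore.
import { clientStore, editor } from "$sb/syscalls.ts";

export async function selectModelCommand() {
  // Assumed: the list of configured model names would come from SETTINGS
  const models = ["gpt-4", "gpt-3.5-turbo", "ollama/llama2"];
  const selected = await editor.filterBox(
    "Select model",
    models.map((name) => ({ name })),
    "Choose which LLM to use for AI commands",
  );
  if (selected) {
    // Stored client-side, like the dark-mode toggle, so SETTINGS is untouched
    await clientStore.set("ai.selectedTextModel", selected.name);
  }
}

export async function getSelectedModel(): Promise<string | undefined> {
  return await clientStore.get("ai.selectedTextModel");
}
```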
Right now we can only configure a single model in the SETTINGS page. We can change that to be an array of models instead, something like this:
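One possible shape for that SETTINGS block (the key names here are illustrative, not a final schema):

```yaml
ai:
  textModels:
    - name: gpt4
      provider: openai
      modelName: gpt-4
    - name: local-llama
      provider: ollama
      baseUrl: http://localhost:11434
      modelName: llama2
```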
`models` probably isn't the right name, since each API could support multiple models (e.g. gpt-3 vs gpt-4). But part of this is so that I can support template prompts with frontmatter, where each templated prompt can override the model it uses. Something like creating a new space script in my example above could go to openai/gpt4, but something like summarizing a personal note would go to a locally-hosted llm instead for privacy reasons.
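For example, such a template prompt's frontmatter override might look like this (the field names are assumptions, not a settled format):

```yaml
---
tags: template
aiprompt:
  description: "Summarize this note"
  aiModel: local-llama
---
```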