continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains
https://docs.continue.dev/
Apache License 2.0

OpenAI Prediction #2837

Closed RomneyDa closed 1 week ago

RomneyDa commented 1 week ago

Description

In the first commit, I added a predictionContent property to the completion options that only supported a string, to keep it simple, and could later be deprecated in favor of prediction.

Then I decided there might be a use case for the array-of-text-items prediction format, so I did the full implementation in the second commit.
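For reference, the OpenAI API accepts prediction content either as a plain string or as an array of text items. A minimal sketch of the two shapes (the `Prediction` and `TextPart` type names here are illustrative, not the identifiers used in this PR):

```typescript
// Sketch of the two prediction content shapes the OpenAI API accepts.
// Type names are hypothetical; they mirror the API reference, not this PR's code.
type TextPart = { type: "text"; text: string };

interface Prediction {
  type: "content";
  content: string | TextPart[];
}

// Simple string form (what the first commit supported)
const stringPrediction: Prediction = {
  type: "content",
  content: "const x = 1;",
};

// Array-of-text-items form (what the second commit adds)
const arrayPrediction: Prediction = {
  type: "content",
  content: [
    { type: "text", text: "const x = 1;" },
    { type: "text", text: "const y = 2;" },
  ],
};
```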

Also, I ended up removing the config_schema definition. It's misleading because in practice no one would ever declare prediction in their config. In the future, if prediction became widespread, it could be promoted to the same level as prompt, where instead of being passed through options it would be a separate function parameter.

Because of this, I chose to not add prediction to the docs config reference.

See https://platform.openai.com/docs/api-reference/chat/create#chat-create-prediction
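Per that API reference, the prediction lands alongside model and messages in the Chat Completions request body. A hedged sketch of what the request shape looks like (`buildBody` is a hypothetical helper and the model name is an assumption, neither is from this PR):

```typescript
// Hypothetical sketch of a Chat Completions request body carrying a
// predicted output, following the OpenAI API reference linked above.
interface ChatRequestBody {
  model: string;
  messages: { role: string; content: string }[];
  prediction?: { type: "content"; content: string };
}

// buildBody is illustrative only: it packs the current file contents in
// as the prediction, which is the typical use case for predicted outputs.
function buildBody(fileContents: string): ChatRequestBody {
  return {
    model: "gpt-4o", // assumed model name for the example
    messages: [
      { role: "user", content: "Apply the requested edit to this file." },
    ],
    prediction: { type: "content", content: fileContents },
  };
}
```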

Let me know your thoughts on this

Checklist

Usage

// First check the model is OpenAI-compatible
if (provider instanceof OpenAI) {
  options = {
    ...options,
    prediction: {
      type: "content",
      content: "file string",
    },
  };
}
This could be added to the base LLM instead, but it's OpenAI-specific for now.