⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains.
Description
In the first commit, I added a `predictionContent` property to the completion options. It only supported a string, to keep things simple, and could later be deprecated in favor of `prediction`. Then I decided there might be a use case for the array-of-text-items prediction format, so I did the full implementation in the second commit.

I also ended up removing the `config_schema` definition. It's misleading because in practice no one would ever declare `prediction` in their config. In the future, if prediction becomes widespread, it could be promoted to the same level as `prompt`, where instead of being passed in `options` it's a separate function parameter. For the same reason, I chose not to add `prediction` to the docs config reference.

See https://platform.openai.com/docs/api-reference/chat/create#chat-create-prediction

Let me know your thoughts on this.
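To illustrate the two accepted shapes, here is a minimal sketch of how the option could map onto OpenAI's `prediction` request field (`{ type: "content", content: ... }` per the API reference linked above). The type and function names are illustrative, not the actual names in this PR:

```typescript
// Illustrative types: the prediction option accepts either a plain string
// or an array of text items, mirroring OpenAI's predicted-outputs formats.
type PredictionTextItem = { type: "text"; text: string };
type Prediction = string | PredictionTextItem[];

// Hypothetical helper: wrap the option into the body shape OpenAI expects.
// Both forms are passed through as-is under `content`.
function toOpenAIPrediction(prediction: Prediction) {
  return { type: "content" as const, content: prediction };
}

// Simple string form:
const fromString = toOpenAIPrediction("const count = 1;");
// Array-of-text-items form:
const fromItems = toOpenAIPrediction([{ type: "text", text: "const count = 1;" }]);
```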
Checklist
- [x] The base branch of this PR is `dev`, rather than `main`
Usage
This could be added to the base LLM class instead, but it is OpenAI-specific for now.
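As a usage sketch, here is how a caller might pass a prediction through completion options when building an OpenAI chat request body. The `CompletionOptions` shape and `buildChatBody` helper are hypothetical; only the `prediction` field itself follows the OpenAI API:

```typescript
// Hypothetical options shape: prediction lives alongside other per-request options.
interface CompletionOptions {
  model: string;
  prediction?: {
    type: "content";
    content: string | { type: "text"; text: string }[];
  };
}

// Illustrative helper: only include `prediction` when it is set, since this
// stays OpenAI-specific and other providers may reject unknown fields.
function buildChatBody(
  messages: { role: string; content: string }[],
  options: CompletionOptions
) {
  const body: Record<string, unknown> = { model: options.model, messages };
  if (options.prediction) {
    body.prediction = options.prediction;
  }
  return body;
}

const body = buildChatBody(
  [{ role: "user", content: "Rename x to count in this file." }],
  {
    model: "gpt-4o",
    prediction: { type: "content", content: "const count = 1;" },
  }
);
```

Keeping the field optional also means existing callers that never set `prediction` send exactly the same request body as before.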