Closed malomarrec closed 1 year ago
Partial answer: Site config already supports using OpenAI for completions. The frontend needs to send slightly different params, for example:
```ts
const OPENAI_DEFAULT_CHAT_COMPLETION_PARAMETERS: Omit<CompletionParameters, 'messages'> = {
    temperature: 0.2,
    maxTokensToSample: SOLUTION_TOKEN_LENGTH,
}
```
(That change needs to be made in code currently.)
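To make the intent concrete, here is a hypothetical sketch of how the frontend could select provider-specific defaults. The `CompletionParameters` shape, the `SOLUTION_TOKEN_LENGTH` value, the Anthropic defaults, and the `defaultParameters` helper are all illustrative assumptions, not the actual Sourcegraph code:

```typescript
// Hypothetical stand-ins for the real client types.
interface CompletionParameters {
    messages: { speaker: 'human' | 'assistant'; text: string }[]
    temperature: number
    maxTokensToSample: number
}

const SOLUTION_TOKEN_LENGTH = 1000 // illustrative value

const OPENAI_DEFAULT_CHAT_COMPLETION_PARAMETERS: Omit<CompletionParameters, 'messages'> = {
    temperature: 0.2,
    maxTokensToSample: SOLUTION_TOKEN_LENGTH,
}

// Sketch: pick defaults based on the configured completions provider.
function defaultParameters(provider: 'openai' | 'anthropic'): Omit<CompletionParameters, 'messages'> {
    switch (provider) {
        case 'openai':
            return OPENAI_DEFAULT_CHAT_COMPLETION_PARAMETERS
        case 'anthropic':
            // Illustrative values only.
            return { temperature: 0.5, maxTokensToSample: SOLUTION_TOKEN_LENGTH }
    }
}
```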
The three parts of this issue, and their current status, with my thoughts:
→ The site config keys `completions` and `embeddings` (documented here) already allow customers to use their own instance. Not sure if we need to do anything else.
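For reference, a sketch of what such a site config entry might look like. Field and model names here are assumptions based on the documented `completions` and `embeddings` keys; check the site config docs for the exact schema:

```jsonc
{
  // Sketch only: consult the site configuration documentation
  // for the authoritative field names and supported models.
  "completions": {
    "provider": "openai",
    "accessToken": "<your OpenAI API key>",
    "chatModel": "gpt-4",
    "completionModel": "gpt-3.5-turbo"
  },
  "embeddings": {
    "provider": "openai",
    "accessToken": "<your OpenAI API key>",
    "model": "text-embedding-ada-002"
  }
}
```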
→ We already support `openai` as a provider, albeit undocumented. The code might need some cleaning.
Why?
→ a few thoughts:
Next steps:
Update:
@vdavid
Hello, cool work you all are doing here. Will this feature be supported for non-enterprise users? I would like to link up cody vscode extension to gpt-4.
Not sure yet! We're working on it and thinking about it. Supporting multiple LLMs is tricky, because you want to tweak prompts for each one. But we get that folks want to use GPT-4 (and others). We're trying to find a good solution here :)
Problem
We've had requests from several customers and prospects about using their own OpenAI or Anthropic contract for Cody. We've reviewed this and it's cleared from a legal standpoint, provided we add a clause to the order form.
Most recently, we heard this from https://github.com/sourcegraph/accounts/issues/2525. They have an existing contract with OpenAI and want to plug:
into Cody.
We need to investigate how to do this. This has multiple dimensions:
TODO
Customers
Important talking point for: