Open ambilykk opened 3 weeks ago
While calling the LLM through the OpenAI Node.js SDK, we encountered a Bad Request (400) error.
Code Used

```js
const capiClient = new OpenAI({
  baseURL: "https://api.githubcopilot.com/",
  apiKey: tokenForUser,
  headers: { "Copilot-Integration-Id": "copilot-chat" },
});
console.log("capiclient request");
const response = await capiClient.chat.completions.create({
  stream: false,
  model: "gpt-4o",
  messages: [{ role: "user", content: "What is GitHub Copilot" }],
});
```
Error Message
Work-around

Once we replaced the SDK call with `fetch` and hardcoded the `Copilot-Integration-Id`, it started working.
```js
const copilotResponse = await fetch(
  "https://api.githubcopilot.com/chat/completions",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${tokenForUser}`,
      "Copilot-Integration-Id": "vscode-chat",
    },
    body: JSON.stringify({
      messages: [{ role: "user", content: "What is GitHub Copilot" }],
      max_tokens: 50,
      temperature: 0.5,
    }),
  }
);
```