Closed: Mathieu-R closed this issue 3 days ago
@Mathieu-R is the same `ChatOpenAI` model that you created here passed to LangChain?
Yes, you can even try it with the following simple example:
```ts
import { ChatOpenAI } from "@langchain/openai";
import { createHeaders, PORTKEY_GATEWAY_URL } from "portkey-ai";

const portKeyConf = {
  baseUrl: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: PORTKEY_KEY,
    VirtualKey: OPENAI_VKEY
  })
}

const chat = new ChatOpenAI({
  apiKey: "X",
  model: "gpt-3.5-turbo",
  configuration: portKeyConf
})

await chat.invoke("What is the meaning of life, universe and everything? Answer in a few words.")
```
I get the following error:

```
AuthenticationError: 401 Incorrect API key provided: X. You can find your API key at https://platform.openai.com/account/api-keys.
```
Hey! I was just looking into the issue.
See this implementation of `portKeyConf`:
```ts
const portKeyConf = {
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: <PORTKEY_API_KEY>,
    virtualKey: <OPENAI_V_KEY>,
    provider: "openai"
  })
}
```
4 points to notice here:
1. We need to set the provider as well, in this case `provider: "openai"`.
2. The key has to be `baseURL` and not `baseUrl`.
3. The key has to be `virtualKey` and not `VirtualKey`.
4. If you are using `virtualKey`, you may not use the `provider` key in the config object.
I hope this solves the issue; if not, feel free to message us. Happy to help. :)
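For reference, here is a sketch of the original snippet with the corrections above applied. The env-var names `PORTKEY_API_KEY` and `OPENAI_VIRTUAL_KEY` are placeholders, not names from this thread, and the `provider` key is omitted here on the reading of point 4 that it is unnecessary once a `virtualKey` is set:

```ts
import { ChatOpenAI } from "@langchain/openai";
import { createHeaders, PORTKEY_GATEWAY_URL } from "portkey-ai";

// Placeholder env vars: substitute your own Portkey API key and virtual key.
const PORTKEY_API_KEY = process.env.PORTKEY_API_KEY!;
const OPENAI_VIRTUAL_KEY = process.env.OPENAI_VIRTUAL_KEY!;

const portKeyConf = {
  baseURL: PORTKEY_GATEWAY_URL, // "baseURL", not "baseUrl"
  defaultHeaders: createHeaders({
    apiKey: PORTKEY_API_KEY,
    virtualKey: OPENAI_VIRTUAL_KEY // "virtualKey", not "VirtualKey"
    // "provider" omitted: point 4 suggests it is not needed alongside virtualKey
  })
};

const chat = new ChatOpenAI({
  apiKey: "X", // dummy value; the virtual key supplies the real provider key
  model: "gpt-3.5-turbo",
  configuration: portKeyConf
});

const res = await chat.invoke(
  "What is the meaning of life, universe and everything? Answer in a few words."
);
console.log(res.content);
```

This is a sketch under the assumptions stated above, not a verified drop-in replacement; it needs the `@langchain/openai` and `portkey-ai` packages installed and valid keys in the environment.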
Yes, it works. Sorry for the inconvenience, I misread the doc. Thanks!
Contact Details
No response
What Happened?
I'm trying to use PortKey with LangChainJS. As I understand it, I can create a virtual key for each LLM provider I want to use and pass this key to the `ChatOpenAI` constructor from LangChain. The model is then passed to LangChain. However, I get the following error:

```
Incorrect API key provided: X. You can find your API key at https://platform.openai.com/account/api-keys.
```

Moreover, it doesn't work with models other than OpenAI's. Finally, if I simply use the OpenAI API key instead of the virtual key, the request works but nothing is logged in the PortKey dashboard. I tried a simple request with the OpenAI Node library and it works: the request gets logged in PortKey, so my PortKey API key is correct.
What Should Have Happened?
This should pass the request through the gateway and log the LLM call in the PortKey dashboard.
Relevant Code Snippet
No response
Your Twitter/LinkedIn
https://www.linkedin.com/in/mathieu-rousseau-929044150/
Version
0.1.xx (Default)
Relevant log output
No response
Code of Conduct