Closed · ffreemt closed this 2 months ago
Not a silly question. `langchain-openai` uses the `openai` SDK under the hood, so everything will work normally. The same goes for any API wrapper, not just LangChain, as long as one of the two is true:

- it uses the `openai` library directly, or
- it makes its HTTP calls through `httpx` (so a wrapper built on raw `requests` would not work)

Here is a working example that uses this library to test `langchain-openai` with `ChatOpenAI`; it only slightly alters the normal chat completion example:
```python
from langchain_openai import ChatOpenAI
from pydantic.v1 import SecretStr

import openai_responses
from openai_responses import OpenAIMock


@openai_responses.mock()
def test_langchain_chat_openai_invoke(openai_mock: OpenAIMock):
    openai_mock.chat.completions.create.response = {
        "choices": [
            {
                "index": 0,
                "finish_reason": "stop",
                "message": {
                    "content": "J'adore la programmation.",
                    "role": "assistant",
                },
            }
        ]
    }

    llm = ChatOpenAI(
        name="My Custom Chatbot",
        model="gpt-4o",
        temperature=0,
        max_tokens=None,
        timeout=None,
        max_retries=2,
        api_key=SecretStr("sk-fake123"),
    )

    messages = [
        (
            "system",
            "You are a helpful assistant that translates English to French. Translate the user sentence.",
        ),
        ("human", "I love programming."),
    ]

    ai_msg = llm.invoke(messages)

    assert ai_msg.content == "J'adore la programmation."  # type: ignore
```
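For comparison, this is roughly what the unmodified chat completion example looks like when calling the `openai` client directly; the LangChain test above only swaps that client for `ChatOpenAI`. (A minimal sketch assuming the same `OpenAIMock` response API used above; the system prompt and reply text are just placeholders.)

```python
import openai

import openai_responses
from openai_responses import OpenAIMock


@openai_responses.mock()
def test_create_chat_completion(openai_mock: OpenAIMock):
    # Stub the response the mocked chat completions endpoint will return
    openai_mock.chat.completions.create.response = {
        "choices": [
            {
                "index": 0,
                "finish_reason": "stop",
                "message": {
                    "content": "J'adore la programmation.",
                    "role": "assistant",
                },
            }
        ]
    }

    # No real request ever leaves the test process, so a fake key is fine
    client = openai.Client(api_key="sk-fake123")
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Translate the user sentence to French."},
            {"role": "user", "content": "I love programming."},
        ],
    )

    assert completion.choices[0].message.content == "J'adore la programmation."
```

In both cases no real API key or network access is needed, since the request is intercepted before it reaches the OpenAI API.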
I'll add this to the `examples/` folder for others to see.
Cool, thanks very much.
Hello.
Maybe it's a silly question, but I had a hard time getting pytest to work with langchain_openai. Can your lib work with langchain_openai with some tweaking? Thanks.
I ran a quick test.
I tried this (replacing openai with langchain_openai), but got an error when running `test_create_assistant()`, probably for an obvious reason.