Closed shatealaboxiaowang closed 1 year ago
@shatealaboxiaowang I don't know for sure -- in theory it should work with any of the llms / chat models provided by langchain unless there's a bug in the kor code.
Have you tried this: https://python.langchain.com/en/latest/modules/models/chat/integrations/azure_chat_openai.html
Thanks for your reply. Due to local restrictions, I can only use Microsoft's AzureChatOpenAI interface. AzureChatOpenAI wraps ChatOpenAI, so it cannot be used directly with your code. Here is how I construct my llm:
llm = AzureChatOpenAI(
    openai_api_version="2023-03-15-preview",
    deployment_name=engine,
    streaming=True,
    temperature=llm_kws.pop("temperature", 0.7),
    top_p=llm_kws.pop("top_p", 0.95),
    max_tokens=llm_kws.pop("max_tokens", 1000),
    model_kwargs=llm_kws,
    max_retries=3,
)
but errors occur when running `chain = create_extraction_chain(llm, schema, encoder_or_encoder_class='json')`. Have you done similar work for AzureChatOpenAI?
thanks!
Could you include error information including the stack trace?
Kor is expected to work out of the box with langchain LLMs that follow the base language model API. Going to mark as resolved due to inactivity and lack of an error trace.
Thanks.
Thanks for your code. I see that the implementation uses ChatOpenAI; is there an implementation for AzureChatOpenAI? I look forward to your reply. Thank you again!