Closed Prikshit7766 closed 4 months ago
🤖
Hello @Prikshit7766,
I'm Dosu, a friendly bot here to assist you with your LangChain repository concerns. I'm here to help you solve bugs, answer your questions, and guide you on how to become a contributor. Rest assured, while we're waiting for a human maintainer, I'm here to help you out.
I'm currently reviewing your issue and will provide a detailed response shortly. Hang tight!
🤖
Thank you for bringing this compatibility issue to our attention. It seems that the recent changes in the OpenAI Python package (version 1.1.1) have indeed caused some issues with the LangChain library.
Based on the information you provided and the context from the LangChain repository, I can suggest the following modifications to the LangChain library to make it compatible with the OpenAI Python package version 1.1.1:
1. Update the error handling imports in the `langchain/llms/openai.py` file. The OpenAI Python package has restructured its error handling, and all error types are now available under `openai.OpenAIError`. You can replace the current error handling imports with this:

```python
errors = [openai.OpenAIError]
```
2. Update the `values["client"]` assignment in the `langchain/llms/openai.py` file. The OpenAI Python package has changed the way it initializes the API client. Instead of creating an instance of `openai.OpenAI` or `openai.AsyncOpenAI`, you should now create an instance of `openai.ApiClient`. Here's how you can do it:

```python
values["client"] = openai.ApiClient(
    api_key=values.get("openai_api_key"),
    timeout=values.get("request_timeout"),
    max_retries=values.get("max_retries"),
    organization=values.get("openai_organization"),
    base_url=values.get("openai_api_base") or None,
)
values["async_client"] = openai.AsyncApiClient(
    api_key=values.get("openai_api_key"),
    timeout=values.get("request_timeout"),
    max_retries=values.get("max_retries"),
    organization=values.get("openai_organization"),
    base_url=values.get("openai_api_base") or None,
)
```
Please note that these changes are based on the assumption that the OpenAI Python package has made these changes in version 1.1.1. You should verify these changes by checking the OpenAI Python package's documentation or source code.
As for the error message you're seeing, it's being produced because the `ChatCompletion` attribute is not found in the `openai` package. This is likely due to the package being an old version. The error message suggests upgrading the `openai` package using the command `pip install --upgrade openai`.
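If code has to survive both SDK generations while pins settle, a guarded import is one option. A minimal sketch, assuming the documented exception locations in each generation (the final `None` fallback is only there so the snippet runs even when the SDK is absent):

```python
# openai >= 1.0 re-exports its exceptions at the top level;
# openai < 1.0 kept them in the openai.error module.
try:
    from openai import OpenAIError          # new (>= 1.0) location
except ImportError:
    try:
        from openai.error import OpenAIError  # legacy (< 1.0) location
    except ImportError:
        OpenAIError = None  # SDK not installed at all

# Code can then do: except OpenAIError as e: ...  (guarding for None first)
```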
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
Can confirm, openai 1.1.1 breaks langchain completely. It seems the deprecated `openai.Completion` calls are the culprit.
As an ad hoc workaround, running `pip install "openai<1.0.0"` helps mitigate the issue.
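Before applying either pin, it helps to know which side of the breaking change an environment is on. A minimal sketch using the standard library (the helper names are ours, not LangChain's):

```python
from importlib.metadata import PackageNotFoundError, version
from typing import Optional

def openai_major() -> Optional[int]:
    """Major version of the installed openai SDK, or None if not installed."""
    try:
        return int(version("openai").split(".")[0])
    except PackageNotFoundError:
        return None

def is_pre_v1(ver: str) -> bool:
    """True for the legacy (< 1.0) API surface, e.g. '0.28.1'."""
    return int(ver.split(".")[0]) < 1

print(is_pre_v1("0.28.1"))  # → True
print(is_pre_v1("1.1.1"))   # → False
```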
I can confirm that the app we have built uses the `LLMChain` (Legacy) class and we have this issue... it's annoying, as we started building at the start of the LangChain project and this is causing havoc! Thanks @kmishra1204 for your advice; I will look at doing that and hope LC can patch this!
We all want compatibility with the latest version of the openai python library.
Running `pip install openai==0.28.1` resolved the issue.
You should use the LangChain extra to install OpenAI:
`pip install "langchain[openai]"`
This will pull a compatible version. (The quotes keep shells like zsh from interpreting the square brackets.)
I see that multiple newer versions of langchain have been released in the last few weeks, but it is not clear whether any of them provides full support for openai v1.x.
Is there a plan for langchain to fully migrate to it, or do the newer releases already take care of this?
Newer versions of LangChain indicate that OpenAI v1 is supported:

```toml
openai = {version = "<2", optional = true}
```
Thanks for your reply. Sorry if this is a silly question but where do I look for that?
I downloaded the source code of release v0.0.340 Latest (https://github.com/langchain-ai/langchain/releases/tag/v0.0.340) and checked the `poetry.lock` file, which still shows `openai = ["openai (>=0,<1)", "tiktoken (>=0.3.2,<0.6.0)"]` under `[package.extras]`.
The information in `poetry.lock` is not passed to the package metadata. The only information used is from `pyproject.toml`: https://github.com/langchain-ai/langchain/blob/v0.0.340/libs/langchain/pyproject.toml#L53
And yet, you are not going to be able to install this with conda, for example:

```
├─ langchain is installable and it requires
│  └─ openai >=0.0.0,<1.0.0 , which can be installed;
└─ openai >=1 is not installable because it conflicts with any installable versions previously reported.
```
... that's a dependency conflict... If you use a lib that depends on a certain version of a library, it's normal to reject another version. That's what dependency resolution is for. Otherwise you'll get issues at runtime, which this issue is exactly about.
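The failing solve above can be illustrated with a toy range check. This is a didactic sketch, not pip's or conda's actual resolver:

```python
# langchain pins openai >=0.0.0,<1.0.0 while the environment asks for openai >=1.

def parse(v: str):
    """Dotted version string -> comparable tuple of ints, e.g. '1.1.1' -> (1, 1, 1)."""
    return tuple(int(p) for p in v.split("."))

def satisfies(ver: str, lo: str, hi: str) -> bool:
    """True if lo <= ver < hi."""
    return parse(lo) <= parse(ver) < parse(hi)

candidates = ["0.28.1", "1.0.0", "1.1.1"]
# Versions acceptable to both constraints at once:
both = [v for v in candidates
        if satisfies(v, "0.0.0", "1.0.0")   # langchain's pin
        and parse(v) >= parse("1")]         # the user's openai >=1 request
print(both)  # → [] -- no version satisfies both, hence the conflict
```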
Should be fixed; if anyone still sees issues with the latest versions of `langchain-openai` and the openai python sdk, let us know.
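For anyone still migrating, the shape of the break is worth restating. A minimal sketch based on the documented 0.x vs 1.x surfaces (the helper below only compares call styles as strings, so it runs without the SDK installed):

```python
# openai < 1.0 (module-level call):
#     openai.Completion.create(model="text-davinci-003", prompt="...")
# openai >= 1.0 (explicit client object):
#     client = openai.OpenAI(api_key="...")
#     client.completions.create(model="...", prompt="...")

def call_style(major: int) -> str:
    """Which completion call style a given openai major version expects."""
    return "client.completions.create" if major >= 1 else "openai.Completion.create"

print(call_style(0))  # → openai.Completion.create
print(call_style(1))  # → client.completions.create
```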
Description
Compatibility issue with the Langchain library due to the recent changes in the OpenAI Python package (version 1.1.1). The Langchain library relies on certain structures and imports from the OpenAI package, which have been modified in the new version. Specifically, the issue seems to be related to the following changes:
In the LangChain code, the error handling imports in `langchain/llms/openai.py` at line 90 were based on the older structure of the OpenAI package. In the newer version, these imports have been restructured and are available in `openai._exceptions`.
In `langchain/llms/openai.py` at line 266, `values["client"] = openai.Completion` is no longer valid in the new version of OpenAI (version 1.1.1).
Also: ![image](https://github.com/langchain-ai/langchain/assets/101416953/de06a0e8-492b-488e-b09a-2189599370d6)
Note: To avoid the above error, users should downgrade the OpenAI package to version 0.28.1.
Who can help?
No response
Information
Related Components
Reproduction
```python
!pip install langchain openai

from langchain import OpenAI
import os

os.environ["OPENAI_API_KEY"] = "key"

llm = OpenAI(
    model_name="text-davinci-003",
    temperature=0.2,
    max_tokens=64,
    openai_api_key=os.environ["OPENAI_API_KEY"],
)
```
Expected behavior
Langchain should work without errors when using OpenAI version 1.1.1.