langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
92.5k stars 14.8k forks

Fail using langchain extractor with AzureOpenAI #11478

Closed geekpaolo closed 6 months ago

geekpaolo commented 11 months ago

System Info

Python: 3.11, LangChain: 0.0.309, OS: Windows 10

pip list: Package Version


aiofiles 23.2.1 aiohttp 3.8.5 aiosignal 1.3.1 anyio 3.7.1 async-timeout 4.0.3 asyncer 0.0.2 attrs 23.1.0 auth0-python 4.4.0 backoff 2.2.1 bcrypt 4.0.1 beautifulsoup4 4.12.2 bidict 0.22.1 certifi 2023.7.22 cffi 1.15.1 chardet 5.2.0 charset-normalizer 3.2.0 chroma-hnswlib 0.7.3 chromadb 0.4.10 click 8.1.7 colorama 0.4.6 coloredlogs 15.0.1 cryptography 41.0.3 dataclasses-json 0.5.14 Deprecated 1.2.14 django-environ 0.10.0 docx2txt 0.8 emoji 2.8.0 faiss-cpu 1.7.4 fastapi 0.97.0 fastapi-socketio 0.0.10 filelock 3.12.2 filetype 1.2.0 flatbuffers 23.5.26 frozenlist 1.4.0 fsspec 2023.6.0 google-search-results 2.4.2 googleapis-common-protos 1.60.0 gpt4all 1.0.9 greenlet 2.0.2 grpcio 1.57.0 h11 0.14.0 html2text 2020.1.16 httpcore 0.17.3 httptools 0.6.0 httpx 0.24.1 huggingface-hub 0.16.4 humanfriendly 10.0 idna 3.4 importlib-metadata 6.8.0 importlib-resources 6.0.1 Jinja2 3.1.2 joblib 1.3.2 jsonpatch 1.33 jsonpointer 2.4 langchain 0.0.309 langdetect 1.0.9 langsmith 0.0.43 lxml 4.9.3 markdownify 0.11.6 MarkupSafe 2.1.3 marshmallow 3.20.1 monotonic 1.6 mpmath 1.3.0 multidict 6.0.4 mypy-extensions 1.0.0 nest-asyncio 1.5.7 networkx 3.1 nltk 3.8.1 nodeenv 1.8.0 numexpr 2.8.5 numpy 1.25.2 onnxruntime 1.15.1 openai 0.28.1 openapi-schema-pydantic 1.2.4 opentelemetry-api 1.19.0 opentelemetry-exporter-otlp 1.19.0 opentelemetry-exporter-otlp-proto-common 1.19.0 opentelemetry-exporter-otlp-proto-grpc 1.19.0 opentelemetry-exporter-otlp-proto-http 1.19.0 opentelemetry-instrumentation 0.40b0 opentelemetry-proto 1.19.0 opentelemetry-sdk 1.19.0 opentelemetry-semantic-conventions 0.40b0 overrides 7.4.0 packaging 23.1 pandas 1.5.3 pdf2image 1.16.3 Pillow 10.0.0 pip 23.2.1 playwright 1.37.0 posthog 3.0.2 prisma 0.9.1 protobuf 4.24.1 pulsar-client 3.3.0 pycparser 2.21 pydantic 1.10.12 pyee 9.0.4 PyJWT 2.8.0 pypdf 3.15.5 PyPika 0.48.9 pyreadline3 3.4.1 python-dateutil 2.8.2 python-dotenv 1.0.0 python-engineio 4.5.1 python-graphql-client 0.4.3 python-iso639 2023.6.15 python-magic 0.4.27 
python-socketio 5.8.0 pytz 2023.3 PyYAML 6.0.1 regex 2023.8.8 requests 2.31.0 safetensors 0.3.2 scikit-learn 1.3.0 scipy 1.11.2 sentence-transformers 2.2.2 sentencepiece 0.1.99 setuptools 68.0.0 six 1.16.0 sniffio 1.3.0 soupsieve 2.5 SQLAlchemy 1.4.49 starlette 0.27.0 sympy 1.12 syncer 2.0.3 tabulate 0.9.0 tenacity 8.2.3 threadpoolctl 3.2.0 tiktoken 0.5.1 tokenizers 0.13.3 tomli 2.0.1 tomlkit 0.12.1 torch 2.0.1 torchvision 0.15.2 tqdm 4.66.1 transformers 4.31.0 typing_extensions 4.7.1 typing-inspect 0.9.0 tzdata 2023.3 unstructured 0.10.18 uptrace 1.19.0 urllib3 2.0.4 uvicorn 0.22.0 watchfiles 0.19.0 websockets 11.0.3 wheel 0.38.4 wrapt 1.15.0 yarl 1.9.2 zipp 3.16.2

Who can help?

@hwchase17 @agola11

Information

Related Components

Reproduction

from langchain.chat_models import AzureChatOpenAI
from langchain.chains import create_extraction_chain

# Load the OPENAI_API_KEY (and related Azure settings) from a .env file
import environ
DIR_COMMON = "./common"
env = environ.Env()
environ.Env.read_env(env_file=DIR_COMMON+"/.env_iveco")

# Schema
schema = {
    "properties": {
        "name": {"type": "string"},
        "height": {"type": "integer"},
        "hair_color": {"type": "string"},
    },
    "required": ["name", "height"],
}

# Input 
inp = """Alex is 5 feet tall. Claudia is 1 feet taller Alex and jumps higher than him. Claudia is a brunette and Alex is blonde."""

# Run chain
llm = AzureChatOpenAI(deployment_name="gpt4-datalab", model_name="gpt-4")
chain = create_extraction_chain(schema, llm, verbose=True)
print(chain.run(inp))
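(Not shown in the script above: with openai 0.28 / langchain 0.0.309, AzureChatOpenAI picks up its Azure connection settings from environment variables, so the `.env_iveco` file read above presumably provides something like the following. A minimal sketch; the variable names are the openai-python 0.28 conventions and the values are placeholders, not the issue author's real settings:

```python
import os

# Hypothetical contents of ./common/.env_iveco, expressed as the
# environment variables the openai 0.28 client reads:
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-key>"
# The API version turns out to matter for this bug: Azure API versions
# from before mid-2023 reject the `functions` request argument that
# create_extraction_chain sends.
os.environ["OPENAI_API_VERSION"] = "2023-09-01-preview"
```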

When I run it, I get:


> Entering new LLMChain chain...
Prompt after formatting:
Human: Extract and save the relevant entities mentionedin the following passage together with their properties.

Only extract the properties mentioned in the 'information_extraction' function.

If a property is not present and is not required in the function parameters, do not include it in the output.

Passage:
Alex is 5 feet tall. Claudia is 1 feet taller Alex and jumps higher than him. Claudia is a brunette and Alex is blonde.

Traceback (most recent call last):
  File "C:\Sviluppo\python\AI\Iveco\LangChain\extract_1.1.py", line 28, in <module>
    print(chain.run(inp))
          ^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chains\base.py", line 501, in run
    return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chains\base.py", line 306, in __call__
    raise e
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chains\base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chains\llm.py", line 93, in _call
    response = self.generate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chains\llm.py", line 103, in generate
    return self.llm.generate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\base.py", line 469, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\base.py", line 359, in generate
    raise e
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\base.py", line 349, in generate
    self._generate_with_cache(
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\base.py", line 501, in _generate_with_cache
    return self._generate(
           ^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\openai.py", line 345, in _generate
    response = self.completion_with_retry(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\openai.py", line 284, in completion_with_retry
    return _completion_with_retry(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\tenacity\__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\tenacity\__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\tenacity\__init__.py", line 314, in iter
    return fut.result()
           ^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\langchain\chat_models\openai.py", line 282, in _completion_with_retry
    return self.client.create(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\openai\api_resources\chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
                           ^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\openai\api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\openai\api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "C:\Users\vw603\Anaconda3\envs\python11\Lib\site-packages\openai\api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: Unrecognized request argument supplied: functions

Expected behavior


> Entering new LLMChain chain...
Prompt after formatting:
Human: Extract and save the relevant entities mentionedin the following passage together with their properties.

Only extract the properties mentioned in the 'information_extraction' function.

If a property is not present and is not required in the function parameters, do not include it in the output.

Passage:
Alex is 5 feet tall. Claudia is 1 feet taller Alex and jumps higher than him. Claudia is a brunette and Alex is blonde.

> Finished chain.
[{'name': 'Alex', 'height': 5, 'hair_color'
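For context, the expected result is a list with one dict per entity, containing only properties from the schema. A minimal stdlib-only sketch (a hypothetical helper, not part of langchain) that checks such a result against the schema above; the Claudia values are inferred from the passage, not taken from the truncated output:

```python
def validate_extraction(results, schema):
    """Check that each extracted record has the required keys, uses only
    known properties, and has roughly the right value types."""
    type_map = {"string": str, "integer": int}
    for record in results:
        for key in schema["required"]:
            assert key in record, f"missing required property: {key}"
        for key, value in record.items():
            assert key in schema["properties"], f"unknown property: {key}"
            expected_type = type_map[schema["properties"][key]["type"]]
            assert isinstance(value, expected_type), f"bad type for {key}"
    return True

schema = {
    "properties": {
        "name": {"type": "string"},
        "height": {"type": "integer"},
        "hair_color": {"type": "string"},
    },
    "required": ["name", "height"],
}

# The kind of output the chain is expected to return for the passage
# (Alex is 5 feet; Claudia is 1 foot taller, so 6):
expected = [
    {"name": "Alex", "height": 5, "hair_color": "blonde"},
    {"name": "Claudia", "height": 6, "hair_color": "brunette"},
]
print(validate_extraction(expected, schema))  # True
```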
dosubot[bot] commented 11 months ago

🤖

Hello,

The error message you're encountering, "openai.error.InvalidRequestError: Unrecognized request argument supplied: functions", suggests that the OpenAI API is receiving an unexpected argument named "functions". This error might be occurring in the create_openai_fn_chain function in your code. This function is trying to pass a "functions" argument to the OpenAI API, which might not be supported by the AzureOpenAI model you're using.

In your script, you're using the AzureChatOpenAI class from the LangChain framework. This class is designed to interact with a deployed model on Azure OpenAI. If you're using this class with the AzureOpenAI model, make sure that the function_call argument in the additional_kwargs parameter of the AIMessage object is correctly formatted and that the AzureOpenAI model supports the function you're trying to call.

Here is an example of how to use the AzureChatOpenAI class:

AzureChatOpenAI(
    deployment_name="35-turbo-dev",
    openai_api_version="2023-05-15",
)

In this example, 35-turbo-dev is the deployment name of the gpt-35-turbo model deployed on Azure OpenAI. The API version is set to 2023-05-15.

If the error persists, could you please provide the part of your code where you're making the request to the OpenAI API? This will help us better understand the issue and provide a more accurate solution.

Best, Dosu

Sources


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

GreGGus commented 10 months ago

Any plan to integrate this feature?

irazakharchenko commented 9 months ago

I have the same issue.

GreGGus commented 9 months ago

@irazakharchenko I made it work.

You need to change the version of the backend API used by the Azure LLM. See https://learn.microsoft.com/en-us/azure/ai-services/openai/reference — try '2023-09-01-preview', for example.
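To make this fix concrete, a sketch of the settings involved, reusing the deployment name from the original repro (adjust to your own Azure resource); the only substantive change versus the failing script is pinning `openai_api_version`:

```python
# Settings that reportedly make function calling work against Azure
# (this thread: API version 2023-09-01-preview; older versions reject
# the `functions` request argument):
azure_llm_kwargs = {
    "deployment_name": "gpt4-datalab",           # your Azure deployment
    "model_name": "gpt-4",
    "openai_api_version": "2023-09-01-preview",  # must support `functions`
}

# With real credentials configured, the repro script would then build the
# LLM and chain as (not executed here):
# llm = AzureChatOpenAI(**azure_llm_kwargs)
# chain = create_extraction_chain(schema, llm, verbose=True)
print(azure_llm_kwargs["openai_api_version"])
```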

irazakharchenko commented 9 months ago

I already use that version. I will look through my code; maybe I missed something.

I found a similar issue: https://github.com/langchain-ai/langchain/issues/8593

shoaibshowkatkhan commented 9 months ago

Any updates on this? I switched to AzureOpenAI function calling instead; can anyone confirm which of the two is better? I am getting the same error and have tried multiple things, even davinci, gpt-3.5, and gpt-3.5-turbo on AzureOpenAI.

GreGGus commented 9 months ago

LLM: gpt-35-turbo, langchain version: 0.0.316, openai_api_version: 2023-09-01-preview

It works for me

shoaibshowkatkhan commented 9 months ago

Can anyone confirm which is better for extraction: the langchain extractor or AzureOpenAI functions?