Closed geekpaolo closed 6 months ago
🤖
Hello,
The error message you're encountering, `openai.error.InvalidRequestError: Unrecognized request argument supplied: functions`, suggests that the OpenAI API is receiving an unexpected argument named `functions`. The likely source is the `create_openai_fn_chain` function in your code, which passes a `functions` argument to the OpenAI API; the Azure OpenAI model you're using might not support it.
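For context, here is a minimal sketch (illustrative names only, not taken from your code) of the extra request field that chains like `create_openai_fn_chain` attach to the chat-completions call. An Azure API version that predates function calling rejects the unknown `functions` key:

```python
# Hypothetical request body: the function schema below is made up for
# illustration. The point is that the chain adds a top-level "functions"
# key, which older Azure OpenAI API versions do not recognize.
request_body = {
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }
    ],
}

# It is this "functions" key that the error message complains about.
print("functions" in request_body)
```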
In your script, you're using the `AzureChatOpenAI` class from the LangChain framework. This class is designed to interact with a model deployed on Azure OpenAI. If you're using this class with an Azure OpenAI model, make sure that the `function_call` argument in the `additional_kwargs` parameter of the `AIMessage` object is correctly formatted and that the deployed model supports the function you're trying to call.
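As a rough illustration (the function name and arguments below are hypothetical), a well-formed `function_call` entry in `additional_kwargs` looks like this. Note that `arguments` is a JSON-encoded string, not a dict:

```python
import json

# Sketch of the "function_call" payload as it would appear in an
# AIMessage's additional_kwargs; names are illustrative.
additional_kwargs = {
    "function_call": {
        "name": "get_current_weather",                   # must match a declared function
        "arguments": json.dumps({"location": "Paris"}),  # JSON string, not a dict
    }
}

# Because "arguments" is a JSON string, it must be decoded before use.
parsed = json.loads(additional_kwargs["function_call"]["arguments"])
print(parsed["location"])  # Paris
```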
Here is an example of how to use the `AzureChatOpenAI` class:

```python
from langchain.chat_models import AzureChatOpenAI

llm = AzureChatOpenAI(
    deployment_name="35-turbo-dev",
    openai_api_version="2023-05-15",
)
```

In this example, `35-turbo-dev` is the deployment name of the `gpt-35-turbo` model deployed on Azure OpenAI, and the API version is set to `2023-05-15`.
If the error persists, could you please provide the part of your code where you're making the request to the OpenAI API? This will help us better understand the issue and provide a more accurate solution.
Best, Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Any plan to integrate this feature?
I have the same issue.
@irazakharchenko I made it work.
You need to change the version of the backend API used by the Azure LLM (see https://learn.microsoft.com/en-us/azure/ai-services/openai/reference). Try `2023-09-01-preview`, for example.
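To make the cutoff concrete: Azure OpenAI API versions are dated strings, and function calling was introduced in `2023-07-01-preview` (to the best of my knowledge), so any earlier version rejects the `functions` argument. A hypothetical helper to check a version string:

```python
from datetime import date

# Assumed cutoff: the API version in which Azure OpenAI added function
# calling. Versions before this date reject the "functions" argument.
FUNCTIONS_MIN_VERSION = date(2023, 7, 1)

def supports_function_calling(api_version: str) -> bool:
    # Strip the optional "-preview" suffix and parse the date portion.
    datestr = api_version.removesuffix("-preview")
    y, m, d = (int(p) for p in datestr.split("-"))
    return date(y, m, d) >= FUNCTIONS_MIN_VERSION

print(supports_function_calling("2023-05-15"))          # False: the failing version
print(supports_function_calling("2023-09-01-preview"))  # True: the working version
```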
I already use this version. I will look through my code; maybe I missed something.
I found a similar issue: https://github.com/langchain-ai/langchain/issues/8593.
Any updates on this? I switched to Azure OpenAI function calling instead of this; can anyone confirm which of the two is better? I am getting the same error and have tried multiple things, even davinci, gpt-3.5, and gpt-3.5-turbo from Azure OpenAI.
LLM: gpt-35-turbo, langchain version: 0.0.316, openai_api_version: 2023-09-01-preview
It works for me
Can anyone confirm which is better for extraction: the LangChain extractor or Azure OpenAI functions?
System Info
python: 3.11, langchain: 0.0.309, OS: Windows 10
pip list:

```
Package Version
aiofiles 23.2.1 aiohttp 3.8.5 aiosignal 1.3.1 anyio 3.7.1 async-timeout 4.0.3 asyncer 0.0.2 attrs 23.1.0 auth0-python 4.4.0 backoff 2.2.1 bcrypt 4.0.1 beautifulsoup4 4.12.2 bidict 0.22.1 certifi 2023.7.22 cffi 1.15.1 chardet 5.2.0 charset-normalizer 3.2.0 chroma-hnswlib 0.7.3 chromadb 0.4.10 click 8.1.7 colorama 0.4.6 coloredlogs 15.0.1 cryptography 41.0.3 dataclasses-json 0.5.14 Deprecated 1.2.14 django-environ 0.10.0 docx2txt 0.8 emoji 2.8.0 faiss-cpu 1.7.4 fastapi 0.97.0 fastapi-socketio 0.0.10 filelock 3.12.2 filetype 1.2.0 flatbuffers 23.5.26 frozenlist 1.4.0 fsspec 2023.6.0 google-search-results 2.4.2 googleapis-common-protos 1.60.0 gpt4all 1.0.9 greenlet 2.0.2 grpcio 1.57.0 h11 0.14.0 html2text 2020.1.16 httpcore 0.17.3 httptools 0.6.0 httpx 0.24.1 huggingface-hub 0.16.4 humanfriendly 10.0 idna 3.4 importlib-metadata 6.8.0 importlib-resources 6.0.1 Jinja2 3.1.2 joblib 1.3.2 jsonpatch 1.33 jsonpointer 2.4 langchain 0.0.309 langdetect 1.0.9 langsmith 0.0.43 lxml 4.9.3 markdownify 0.11.6 MarkupSafe 2.1.3 marshmallow 3.20.1 monotonic 1.6 mpmath 1.3.0 multidict 6.0.4 mypy-extensions 1.0.0 nest-asyncio 1.5.7 networkx 3.1 nltk 3.8.1 nodeenv 1.8.0 numexpr 2.8.5 numpy 1.25.2 onnxruntime 1.15.1 openai 0.28.1 openapi-schema-pydantic 1.2.4 opentelemetry-api 1.19.0 opentelemetry-exporter-otlp 1.19.0 opentelemetry-exporter-otlp-proto-common 1.19.0 opentelemetry-exporter-otlp-proto-grpc 1.19.0 opentelemetry-exporter-otlp-proto-http 1.19.0 opentelemetry-instrumentation 0.40b0 opentelemetry-proto 1.19.0 opentelemetry-sdk 1.19.0 opentelemetry-semantic-conventions 0.40b0 overrides 7.4.0 packaging 23.1 pandas 1.5.3 pdf2image 1.16.3 Pillow 10.0.0 pip 23.2.1 playwright 1.37.0 posthog 3.0.2 prisma 0.9.1 protobuf 4.24.1 pulsar-client 3.3.0 pycparser 2.21 pydantic 1.10.12 pyee 9.0.4 PyJWT 2.8.0 pypdf 3.15.5 PyPika 0.48.9 pyreadline3 3.4.1 python-dateutil 2.8.2 python-dotenv 1.0.0 python-engineio 4.5.1 python-graphql-client 0.4.3 python-iso639 2023.6.15 python-magic 0.4.27
python-socketio 5.8.0 pytz 2023.3 PyYAML 6.0.1 regex 2023.8.8 requests 2.31.0 safetensors 0.3.2 scikit-learn 1.3.0 scipy 1.11.2 sentence-transformers 2.2.2 sentencepiece 0.1.99 setuptools 68.0.0 six 1.16.0 sniffio 1.3.0 soupsieve 2.5 SQLAlchemy 1.4.49 starlette 0.27.0 sympy 1.12 syncer 2.0.3 tabulate 0.9.0 tenacity 8.2.3 threadpoolctl 3.2.0 tiktoken 0.5.1 tokenizers 0.13.3 tomli 2.0.1 tomlkit 0.12.1 torch 2.0.1 torchvision 0.15.2 tqdm 4.66.1 transformers 4.31.0 typing_extensions 4.7.1 typing-inspect 0.9.0 tzdata 2023.3 unstructured 0.10.18 uptrace 1.19.0 urllib3 2.0.4 uvicorn 0.22.0 watchfiles 0.19.0 websockets 11.0.3 wheel 0.38.4 wrapt 1.15.0 yarl 1.9.2 zipp 3.16.2
```
Who can help?
@hwchase17 @agola11
Reproduction
When I run it, I get:
Expected behavior