assafelovic / gpt-researcher

LLM based autonomous agent that does online comprehensive research on any given topic
https://gptr.dev
Apache License 2.0

Use azureopenai LLM Error #576

Open bat9527 opened 3 months ago

bat9527 commented 3 months ago

I am using azureopenai as the LLM and embedding provider, and the following error occurs when I run it:

EMBEDDING_PROVIDER = "azureopenai"
LLM_PROVIDER = "azureopenai"
AZURE_EMBEDDING_MODEL = "text-embedding-ada-002"

SMART_LLM_MODEL = "gpt-4o"
AZURE_OPENAI_ENDPOINT = "XXX"
AZURE_OPENAI_API_KEY = "XXX"
OPENAI_API_VERSION = "XXX" 
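
To isolate the failure, the embedding deployment can be checked standalone, outside GPT Researcher. This is only a sketch: it assumes langchain-openai is installed and that "text-embedding-ada-002" is the name of your Azure embedding deployment (on Azure the deployment name may differ from the model name).

import os
from langchain_openai import AzureOpenAIEmbeddings

# Assumes AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY and OPENAI_API_VERSION are
# already exported with the same values as in the config above.
embeddings = AzureOpenAIEmbeddings(
    # On Azure this must be the deployment name, which may differ from the
    # underlying model name "text-embedding-ada-002".
    azure_deployment=os.environ.get("AZURE_EMBEDDING_MODEL", "text-embedding-ada-002"),
)

# embed_documents() is the call that ultimately raises BadRequestError in the
# traceback below; if it also fails here, the problem lies with the Azure
# credentials/deployment rather than with GPT Researcher itself.
vectors = embeddings.embed_documents(["hello world"])
print(len(vectors[0]))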

ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/middleware/errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 373, in handle
    await self.app(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/starlette/routing.py", line 94, in app
    await func(session)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/fastapi/routing.py", line 348, in app
    await dependant.call(*values)
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/backend/server.py", line 53, in websocket_endpoint
    report = await manager.start_streaming(task, report_type, report_source, websocket)
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/backend/websocket_manager.py", line 57, in start_streaming
    report = await run_agent(task, report_type, report_source, websocket)
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/backend/websocket_manager.py", line 75, in run_agent
    report = await researcher.run()
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/backend/report_type/basic_report/basic_report.py", line 18, in run
    await researcher.conduct_research()
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/gpt_researcher/master/agent.py", line 96, in conduct_research
    context = await self.get_context_by_search(self.query)
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/gpt_researcher/master/agent.py", line 191, in get_context_by_search
    context = await asyncio.gather([self.process_sub_query(sub_query, scraped_data) for sub_query in sub_queries])
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/gpt_researcher/master/agent.py", line 210, in process_sub_query
    content = await self.get_similar_content_by_query(sub_query, scraped_data)
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/gpt_researcher/master/agent.py", line 267, in get_similar_content_by_query
    return context_compressor.get_context(query=query, max_results=8, cost_callback=self.add_costs)
  File "/Users/liwen/Downloads/gpt-res/gpt-researcher/gpt_researcher/context/compression.py", line 48, in get_context
    relevant_docs = compressed_docs.invoke(query)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain_core/retrievers.py", line 221, in invoke
    raise e
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain_core/retrievers.py", line 214, in invoke
    result = self._get_relevant_documents(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain/retrievers/document_compressors/base.py", line 39, in compress_documents
    documents = _transformer.compress_documents(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain/retrievers/document_compressors/embeddings_filter.py", line 73, in compress_documents
    embedded_documents = _get_embeddings_from_stateful_docs(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain_community/document_transformers/embeddings_redundant_filter.py", line 70, in _get_embeddings_from_stateful_docs
    embedded_documents = embeddings.embed_documents(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain_openai/embeddings/base.py", line 535, in embed_documents
    return self._get_len_safe_embeddings(texts, engine=engine)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/langchain_openai/embeddings/base.py", line 430, in _get_len_safe_embeddings
    response = self.client.create(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/resources/embeddings.py", line 114, in create
    return self._post(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Unsupported data type
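
The last frames of the traceback are inside the openai package, so the same request can also be reproduced with the openai SDK directly, bypassing LangChain. This is a sketch under the assumption that the endpoint, key, and API version environment variables above are set and that "text-embedding-ada-002" is the deployment name; a 400 "Unsupported data type" here as well would point at the Azure deployment or API version rather than at GPT Researcher's code path.

import os
from openai import AzureOpenAI

# Build the Azure client from the same environment variables used in the config above.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ["OPENAI_API_VERSION"],
)

# On Azure, "model" is the deployment name (assumed here; adjust to yours).
resp = client.embeddings.create(
    model="text-embedding-ada-002",
    input=["hello world"],
)
print(len(resp.data[0].embedding))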

bat9527 commented 3 months ago

I have an Azure OpenAI API key and an Azure embedding deployment, but I always get this error.

devikasharma97 commented 1 month ago

Hi, in the latest update the value that selects Azure OpenAI was changed from azureopenai to azure_openai.
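
For reference, a config matching that change would look roughly like the one below; only the provider values differ from the original report, and the remaining variable names are assumed to be unchanged.

EMBEDDING_PROVIDER = "azure_openai"
LLM_PROVIDER = "azure_openai"
AZURE_EMBEDDING_MODEL = "text-embedding-ada-002"
SMART_LLM_MODEL = "gpt-4o"
AZURE_OPENAI_ENDPOINT = "XXX"
AZURE_OPENAI_API_KEY = "XXX"
OPENAI_API_VERSION = "XXX"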

danieldekay commented 1 month ago

@bat9527, please try the current master branch. Does it work?

PS: We are also using AzureOpenAI, and I'd love for this to be stable.