assafelovic / gpt-researcher

LLM based autonomous agent that conducts local and web research on any topic and generates a comprehensive report with citations.
https://gptr.dev
Apache License 2.0
15.01k stars · 2.01k forks

Stuck on: Thinking about research questions for the task... #993

Open harisla7 opened 15 hours ago

harisla7 commented 15 hours ago

In PowerShell I get the error below.

```
Warning: Configuration not found at 'default'. Using default configuration. Do you mean 'default.json'?
⚠️ Error in reading JSON, attempting to repair JSON
Error using json_repair: the JSON object must be str, bytes or bytearray, not NoneType
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 27, in choose_agent
    response = await create_chat_completion(
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\utils\llm.py", line 60, in create_chat_completion
    response = await provider.get_chat_response(
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\llm_provider\generic\base.py", line 116, in get_chat_response
    output = await self.llm.ainvoke(messages)
  File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 307, in ainvoke
    llm_result = await self.agenerate_prompt(
  File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 796, in agenerate_prompt
    return await self.agenerate(
  File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 756, in agenerate
    raise exceptions[0]
  File "C:\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 924, in _agenerate_with_cache
    result = await self._agenerate(
  File "C:\Python312\Lib\site-packages\langchain_openai\chat_models\base.py", line 824, in _agenerate
    response = await self.async_client.create(**payload)
  File "C:\Python312\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create
    return await self._post(
  File "C:\Python312\Lib\site-packages\openai\_base_client.py", line 1839, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "C:\Python312\Lib\site-packages\openai\_base_client.py", line 1533, in request
    return await self._request(
  File "C:\Python312\Lib\site-packages\openai\_base_client.py", line 1634, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model gpt-4o-2024-08-06 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
```
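The root cause is the 404 at the bottom of that traceback: the API key in use cannot access the default smart model `gpt-4o-2024-08-06`. A possible workaround is to point gpt-researcher at a model your key can access via its `.env` file. A minimal sketch, assuming the `FAST_LLM`/`SMART_LLM` variable names from gpt-researcher's configuration docs (verify the exact names and provider prefix against your installed version):

```shell
# .env in the gpt-researcher root — override the default models
# if your key cannot access gpt-4o-2024-08-06.
OPENAI_API_KEY=sk-...            # your actual key
FAST_LLM=openai:gpt-4o-mini      # assumed variable name / model; adjust to one your key can use
SMART_LLM=openai:gpt-4o-mini
```

Restart the server after editing `.env` so the new configuration is picked up.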

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "C:\Python312\Lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 242, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\Python312\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
  File "C:\Python312\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\middleware\cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Python312\Lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\Python312\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Python312\Lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\Python312\Lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\Users\azureuser1\gpt-researcher\backend\server\server.py", line 110, in websocket_endpoint
    await handle_websocket_communication(websocket, manager)
  File "C:\Users\azureuser1\gpt-researcher\backend\server\server_utils.py", line 121, in handle_websocket_communication
    await handle_start_command(websocket, data, manager)
  File "C:\Users\azureuser1\gpt-researcher\backend\server\server_utils.py", line 28, in handle_start_command
    report = await manager.start_streaming(
  File "C:\Users\azureuser1\gpt-researcher\backend\server\websocket_manager.py", line 66, in start_streaming
    report = await run_agent(task, report_type, report_source, source_urls, tone, websocket, headers=headers, config_path=config_path)
  File "C:\Users\azureuser1\gpt-researcher\backend\server\websocket_manager.py", line 108, in run_agent
    report = await researcher.run()
  File "C:\Users\azureuser1\gpt-researcher\backend\report_type\basic_report\basic_report.py", line 41, in run
    await researcher.conduct_research()
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\agent.py", line 90, in conduct_research
    self.agent, self.role = await choose_agent(
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 44, in choose_agent
    return await handle_json_error(response)
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 55, in handle_json_error
    json_string = extract_json_with_regex(response)
  File "C:\Users\azureuser1\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 71, in extract_json_with_regex
    json_match = re.search(r"{.*?}", response, re.DOTALL)
  File "C:\Python312\Lib\re\__init__.py", line 177, in search
    return _compile(pattern, flags).search(string)
TypeError: expected string or bytes-like object, got 'NoneType'
INFO:     connection closed
```
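Note that this second `TypeError` only masks the real failure: because the chat-completion call raised, `response` is `None`, and the JSON-repair path passes that `None` straight into `re.search`, which requires a string. A minimal sketch of a defensive version of the extraction helper (the function name mirrors the traceback; the `None` guard is an illustrative addition, not the project's actual fix):

```python
import re

def extract_json_with_regex(response):
    """Pull the first {...} block out of an LLM response string.

    Guards against response being None (e.g. when the upstream
    chat-completion call failed), so the original LLM error is not
    masked by a TypeError from re.search.
    """
    if not isinstance(response, str):
        # Nothing to parse; let the caller surface the underlying LLM error.
        return None
    match = re.search(r"\{.*?\}", response, re.DOTALL)
    return match.group(0) if match else None
```

With a guard like this, the websocket handler would see the original `openai.NotFoundError` instead of the confusing `TypeError`.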

Desktop (please complete the following information):

(Screenshots attached: GPT Researcher Error 1–4)

ouarkainfo commented 4 hours ago

Same issue here; it does not work.