assafelovic / gpt-researcher

GPT based autonomous agent that does online comprehensive research on any given topic
https://gptr.dev
MIT License
12.98k stars 1.61k forks

Getting error message on model gpt-4o #588

Open AIBDALabs opened 2 weeks ago

AIBDALabs commented 2 weeks ago

I just completed the setup for your repo and set the API key for my account. But as soon as I clicked the research button after entering my prompt, I got the following error log.

Error choosing agent: Error code: 404 - {'error': {'message': 'The model gpt-4o does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Default Agent
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\middleware\errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 373, in handle
    await self.app(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\starlette\routing.py", line 94, in app
    await func(session)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\fastapi\routing.py", line 348, in app
    await dependant.call(**values)
  File "C:\Users\test\Desktop\gptr\gpt-researcher\backend\server.py", line 53, in websocket_endpoint
    report = await manager.start_streaming(task, report_type, report_source, websocket)
  File "C:\Users\test\Desktop\gptr\gpt-researcher\backend\websocket_manager.py", line 57, in start_streaming
    report = await run_agent(task, report_type, report_source, websocket)
  File "C:\Users\test\Desktop\gptr\gpt-researcher\backend\websocket_manager.py", line 75, in run_agent
    report = await researcher.run()
  File "C:\Users\test\Desktop\gptr\gpt-researcher\backend\report_type\basic_report\basic_report.py", line 18, in run
    await researcher.conduct_research()
  File "C:\Users\test\Desktop\gptr\gpt-researcher\gpt_researcher\master\agent.py", line 96, in conduct_research
    context = await self.get_context_by_search(self.query)
  File "C:\Users\test\Desktop\gptr\gpt-researcher\gpt_researcher\master\agent.py", line 177, in get_context_by_search
    sub_queries = await get_sub_queries(query=query, agent_role_prompt=self.role,
  File "C:\Users\test\Desktop\gptr\gpt-researcher\gpt_researcher\master\actions.py", line 100, in get_sub_queries
    response = await create_chat_completion(
  File "C:\Users\test\Desktop\gptr\gpt-researcher\gpt_researcher\utils\llm.py", line 93, in create_chat_completion
    response = await provider.get_chat_response(
  File "C:\Users\test\Desktop\gptr\gpt-researcher\gpt_researcher\llm_provider\openai\openai.py", line 61, in get_chat_response
    output = await self.llm.ainvoke(messages)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 191, in ainvoke
    llm_result = await self.agenerate_prompt(
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 609, in agenerate_prompt
    return await self.agenerate(
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 569, in agenerate
    raise exceptions[0]
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\langchain_core\language_models\chat_models.py", line 754, in _agenerate_with_cache
    result = await self._agenerate(
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\langchain_openai\chat_models\base.py", line 657, in _agenerate
    response = await self.async_client.create(messages=message_dicts, **params)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\resources\chat\completions.py", line 1214, in create
    return await self._post(
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1790, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1493, in request
    return await self._request(
  File "C:\Users\test\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1584, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model gpt-4o does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

I'm using a valid API key and my account is also active, yet it is still giving me this error.

assafelovic commented 2 weeks ago

It seems you do not have access to gpt-4o, or perhaps you haven't set your API key correctly?
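One way to confirm which side the problem is on is to list the models the key can actually see. The sketch below is an illustration using the public OpenAI `/v1/models` REST endpoint directly (it is not part of gpt-researcher's code); a `model_not_found` 404 like the one above usually means `gpt-4o` is simply absent from this list for the account.

```python
import json
import urllib.request

# Standard OpenAI endpoint for listing the models a key can access.
OPENAI_MODELS_URL = "https://api.openai.com/v1/models"


def accessible_models(api_key: str) -> list[str]:
    """Return the model IDs visible to this API key (makes a network call)."""
    req = urllib.request.Request(
        OPENAI_MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return sorted(model["id"] for model in payload["data"])


def has_access(model_ids: list[str], model: str) -> bool:
    """True if the target model appears in the key's model list."""
    return model in model_ids


# Usage (requires a real key in the environment):
#   import os
#   ids = accessible_models(os.environ["OPENAI_API_KEY"])
#   print("gpt-4o available:", has_access(ids, "gpt-4o"))
```

If `gpt-4o` is missing from the list even with a valid key, the fix is on the OpenAI account side (or switching gpt-researcher to a model the key does have), not in this repo.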

samiqazi commented 2 weeks ago

How can we change the model to gpt-3.5-turbo, please?
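For reference, gpt-researcher reads its model names from environment variables, so switching models is a matter of setting them before launching the server. The variable names below (`FAST_LLM_MODEL`, `SMART_LLM_MODEL`) are an assumption based on the config conventions of releases from this period; check `gpt_researcher/config/config.py` in your checkout to confirm the exact names your version expects.

```shell
# In .env, or exported in the shell before starting the server.
# Variable names are assumed from this era's config; verify against
# gpt_researcher/config/config.py in your local checkout.
export OPENAI_API_KEY="sk-..."
export FAST_LLM_MODEL="gpt-3.5-turbo"   # lighter tasks, e.g. generating sub-queries
export SMART_LLM_MODEL="gpt-3.5-turbo"  # heavier tasks, e.g. writing the report
```

With both variables pointing at a model the key can access, the `model_not_found` 404 above should no longer be triggered.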