assafelovic / gpt-researcher

LLM-based autonomous agent that conducts comprehensive online research on any given topic
https://gptr.dev
Apache License 2.0

gpt-researcher does not work on Docker #798

Closed: Octo8080X closed this issue 1 week ago

Octo8080X commented 2 weeks ago

Describe the bug

I am running into the error: expected string or bytes-like object, got 'NoneType'.

To Reproduce

Steps to reproduce the behavior:

  1. git clone https://github.com/assafelovic/gpt-researcher.git
  2. cd gpt-researcher
  3. set up .env
  4. docker compose up --build
  5. An error occurs and no response is returned.

Expected behavior

The research run completes and a report is returned instead of an error.

Additional context

In the web UI, when I submit a research query, the following error is printed to the console:

Traceback (most recent call last):
  File "/usr/src/app/gpt_researcher/master/actions.py", line 139, in choose_agent
    response = await create_chat_completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/gpt_researcher/utils/llm.py", line 61, in create_chat_completion
    response = await provider.get_chat_response(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/gpt_researcher/llm_provider/generic/base.py", line 97, in get_chat_response
    output = await self.llm.ainvoke(messages)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 297, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 786, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 746, in agenerate
    raise exceptions[0]
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 922, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 825, in _agenerate
    response = await self.async_client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1339, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1816, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1510, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1611, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 754, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 774, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 371, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 94, in app
    await func(session)
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 348, in app
    await dependant.call(**values)
  File "/usr/src/app/backend/server.py", line 89, in websocket_endpoint
    report = await manager.start_streaming(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/backend/websocket_manager.py", line 60, in start_streaming
    report = await run_agent(task, report_type, report_source, source_urls, tone, websocket, headers)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/backend/websocket_manager.py", line 97, in run_agent
    report = await researcher.run()
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/backend/report_type/basic_report/basic_report.py", line 41, in run
    await researcher.conduct_research()
  File "/usr/src/app/gpt_researcher/master/agent.py", line 111, in conduct_research
    self.agent, self.role = await choose_agent(
                            ^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/gpt_researcher/master/actions.py", line 157, in choose_agent
    return await handle_json_error(response)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/gpt_researcher/master/actions.py", line 168, in handle_json_error
    json_string = extract_json_with_regex(response)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/gpt_researcher/master/actions.py", line 184, in extract_json_with_regex
    json_match = re.search(r"{.*?}", response, re.DOTALL)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/re/__init__.py", line 176, in search
    return _compile(pattern, flags).search(string)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected string or bytes-like object, got 'NoneType'
INFO:     connection closed
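For what it's worth, the final TypeError looks like a secondary failure: the real error is the 404 above (the model gpt-4o does not exist or the configured key has no access to it), after which choose_agent's JSON error handler ends up passing a None response into re.search. A minimal sketch of a defensive guard, assuming the helper's shape shown in the traceback (the project's actual implementation may differ):

    import re

    def extract_json_with_regex(response):
        # Hypothetical guard: surface the upstream failure (here the 404)
        # instead of crashing on a None response.
        if response is None:
            raise ValueError(
                "LLM returned no response; check OPENAI_API_KEY and model access"
            )
        # Non-greedy match of the first {...} block, across newlines.
        json_match = re.search(r"{.*?}", response, re.DOTALL)
        return json_match.group(0) if json_match else None

With a guard like this, the misleading "expected string or bytes-like object" error would be replaced by one pointing at the actual configuration problem.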
assafelovic commented 2 weeks ago

@ElishaKay

ElishaKay commented 2 weeks ago

Welcome @Octo8080X

Looks like a .env issue at first glance.

Try this:

Add a .env file to the root folder with these values:

OPENAI_API_KEY=
TAVILY_API_KEY=

Get the keys from here:

https://app.tavily.com/sign-in
https://platform.openai.com/api-keys

Then restart docker with: docker compose up --build
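One way to confirm the keys actually reach the container is a quick check from inside it (a sketch; the compose service name depends on this repo's docker-compose.yml, so adjust as needed):

    import os

    # Run inside the container, e.g. via `docker compose exec <service> python`.
    # Prints whether each required key made it into the environment.
    for key in ("OPENAI_API_KEY", "TAVILY_API_KEY"):
        print(key, "is set" if os.getenv(key) else "is MISSING")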

May the force be with you 🙏

Octo8080X commented 2 weeks ago

@ElishaKay

Thank you

I tried again following the instructions and confirmed that it now works.

It was very helpful.

Octo8080X commented 1 week ago

Problem solved.