Significant-Gravitas / AutoGPT

AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
https://agpt.co
MIT License
166.45k stars · 44.06k forks

urllib.request.urlopen always gets HTTP Error 502: Bad Gateway #4159

Closed w2404 closed 12 months ago

w2404 commented 1 year ago

⚠️ Search for existing issues first ⚠️

Which Operating System are you using?

Linux

Which version of Auto-GPT are you using?

Master (branch)

GPT-3 or GPT-4?

GPT-3.5

Steps to reproduce 🕹

  1. Start an HTTP server: python -m http.server 8123
  2. Prepend the following code to /autogpt/commands/web_selenium.py:
    import urllib.request
    urllib.request.urlopen('http://localhost:8123')
  3. Run python -m autogpt

Current behavior 😯

2023-05-13 17:44:58.116572: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: SSE3 SSE4.1 SSE4.2 AVX AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-05-13 17:44:59.689705: I tensorflow/compiler/xla/stream_executor/cuda/cuda_gpu_executor.cc:996] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero. See more at https://github.com/torvalds/linux/blob/v6.0/Documentation/ABI/testing/sysfs-bus-pci#L344-L355
2023-05-13 17:44:59.691088: I tensorflow/compiler/xla/stream_executor/cuda/cuda_gpu_executor.cc:996] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero. See more at https://github.com/torvalds/linux/blob/v6.0/Documentation/ABI/testing/sysfs-bus-pci#L344-L355
2023-05-13 17:44:59.691273: I tensorflow/compiler/xla/stream_executor/cuda/cuda_gpu_executor.cc:996] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero. See more at https://github.com/torvalds/linux/blob/v6.0/Documentation/ABI/testing/sysfs-bus-pci#L344-L355
NEWS:  Welcome to Auto-GPT!
NEWS:  Below you'll find the latest Auto-GPT News and updates regarding features!
NEWS:  If you don't wish to see this message, you can run Auto-GPT with the --skip-news flag.
NEWS:  
NEWS:  Website and Documentation Site 📰📖
NEWS:  Check out https://agpt.co, the official news & updates site for Auto-GPT!
NEWS:  The documentation also has a place here, at https://docs.agpt.co
NEWS:  
NEWS:  🚀 v0.3.0 Release 🚀
NEWS:  Over a week and 275 pull requests have passed since v0.2.2, and we are happy to announce
NEWS:  the release of v0.3.0! From now on, we will be focusing on major improvements rather
NEWS:  than bugfixes, as we feel stability has reached a reasonable level. Most remaining
NEWS:  issues relate to limitations in prompt generation and the memory system, which will be
NEWS:  the focus of our efforts for the next release.
NEWS:  
NEWS:  Highlights and notable changes in this release:
NEWS:  
NEWS:  Plugin support 🔌
NEWS:  Auto-GPT now has support for plugins! With plugins, you can extend Auto-GPT's abilities,
NEWS:  adding support for third-party services and more.
NEWS:  See https://github.com/Significant-Gravitas/Auto-GPT-Plugins for instructions and available plugins.
NEWS:  
NEWS:  Changes to Docker configuration 🐋
NEWS:  The workdir has been changed from /home/appuser to /app.
NEWS:  Be sure to update any volume mounts accordingly!
NEWS:  
NEWS:  ⚠ Command `send_tweet` is DEPRECATED, and will be removed in v0.4.0 ⚠
NEWS:  Twitter functionality (and more) is now covered by plugins, see [Plugin support 🔌]
NEWS:  
WARNING:  You are running on `master` branch - this is not a supported branch.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/home/rwkv/Auto-GPT-proxy/autogpt/__main__.py", line 5, in <module>
    autogpt.cli.main()
  File "/usr/lib/python3.11/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/click/core.py", line 1635, in invoke
    rv = super().invoke(ctx)
         ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/click/decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rwkv/Auto-GPT-proxy/autogpt/cli.py", line 90, in main
    run_auto_gpt(
  File "/home/rwkv/Auto-GPT-proxy/autogpt/main.py", line 147, in run_auto_gpt
    command_registry.import_commands(command_category)
  File "/home/rwkv/Auto-GPT-proxy/autogpt/commands/command.py", line 110, in import_commands
    module = importlib.import_module(module_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/rwkv/Auto-GPT-proxy/autogpt/commands/web_selenium.py", line 7, in <module>
    urllib.request.urlopen('http://localhost:8080')
  File "/usr/lib/python3.11/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 525, in open
    response = meth(req, response)
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 634, in http_response
    response = self.parent.error(
               ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 563, in error
    return self._call_chain(*args)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 496, in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 502: Bad Gateway

Expected behavior 🤔

No error occurs.

Your prompt 📝

N/A

Your Logs 📒

N/A

w2404 commented 1 year ago

At first, my Auto-GPT was not able to use Selenium, so I moved the Selenium-related code into a separate web service, where I could debug Selenium on its own.

Within Auto-GPT's web_selenium.py, I then used urllib.request to call this web service.

Then I figured out that all the urllib requests were failing with 502 errors, and that this is why Selenium itself had failed in the first place, as Selenium also makes requests through urllib.

If I run the following code as a standalone script, no error occurs, as expected.

import urllib.request
urllib.request.urlopen('http://localhost:8123')

So what could be the reason that urllib requests fail with 502 errors inside Auto-GPT's web_selenium.py?
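One possibility (an assumption on my part, not confirmed in this thread): urllib.request honors the `http_proxy`/`HTTP_PROXY` environment variables by default, so if the Auto-GPT process is started with a proxy configured (for example in the shell environment or `.env`), requests to `localhost` get routed through that proxy, which can answer 502 Bad Gateway. A standalone script launched from a clean shell would not hit the proxy, which would explain the difference. A quick way to check, and to bypass any proxy for a single request:

```python
import urllib.request

# Show which proxies urllib picked up from the environment
# (http_proxy, https_proxy, no_proxy, ...). If this prints a
# non-empty dict inside Auto-GPT but an empty one in a standalone
# script, the proxy is the likely culprit.
print(urllib.request.getproxies())

# Build an opener with an empty ProxyHandler so requests go
# directly to the target host, ignoring any proxy environment
# variables.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
# opener.open('http://localhost:8123')  # only if the test server from step 1 is running
```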

w2404 commented 1 year ago

A workaround for my own problem is to replace the HTTP service with a raw socket service. A socket connection does work inside my Auto-GPT.

import json
import socket

def readnum(sock):
    """Read ASCII digits up to the first space and return them as an int."""
    s = b''
    ss = sock.recv(1)
    while ss != b' ':
        s += ss
        ss = sock.recv(1)
    return int(s)

def sendcmd(obj):
    HOST, PORT = "localhost", 12723
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((HOST, PORT))
        # Frame the request as b'<length> <json>'
        s = json.dumps(obj)
        print(len(s))
        s = f'{len(s)} {s}'.encode()
        sock.sendall(s)
        # Read the response, framed the same way
        size = readnum(sock)
        print('size', size)
        received = sock.recv(size).decode()
        return json.loads(received)
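For completeness, the receiving side of this framing might look like the sketch below (my own sketch, assuming the same `b'<length> <json>'` wire format; the function names are hypothetical). Note that `recv(n)` may return fewer than `n` bytes, so the read loops until the full payload has arrived:

```python
import json
import socket

def read_framed(sock):
    """Read one '<length> <json>' message and return the decoded object."""
    size = b''
    ch = sock.recv(1)
    while ch != b' ':          # accumulate ASCII digits up to the space
        size += ch
        ch = sock.recv(1)
    n = int(size)
    data = b''
    while len(data) < n:       # recv may return a short read
        data += sock.recv(n - len(data))
    return json.loads(data.decode())

def send_framed(sock, obj):
    """Send obj as a '<length> <json>' message."""
    payload = json.dumps(obj).encode()
    sock.sendall(str(len(payload)).encode() + b' ' + payload)
```

The pair can be exercised locally over a `socket.socketpair()` without starting a real server.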
ntindle commented 1 year ago

How are you getting a 502 locally??

w2404 commented 1 year ago

> How are you getting a 502 locally??

Yes, that is what I want to know.

github-actions[bot] commented 1 year ago

This issue has automatically been marked as stale because it has not had any activity in the last 50 days. You can unstale it by commenting or removing the label. Otherwise, this issue will be closed in 10 days.

github-actions[bot] commented 12 months ago

This issue was closed automatically because it has been stale for 10 days with no activity.