garyfeng opened this issue 1 year ago
Running in Docker with the following steps:
docker build -t auto-gpt .
(populate .env with the required API keys)
docker run --env-file ./.env -i -t auto-gpt
Log of the first chat, with some errors that seem to be related to a JSON parsing error in LocalCache. Needs some debugging.
Cost of this run is $0.002.
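The "Attempting to fix JSON by finding outermost brackets" messages in the log below suggest a repair strategy along these lines: slice the model output between the first `{` and the last `}` and try to parse that. This is a minimal illustrative sketch (the function name and fallback behavior are assumptions, not Auto-GPT's actual code), but it also shows why the log ends with "setting it to empty JSON now" when no valid object can be recovered:

```python
import json

def fix_json_by_outermost_brackets(text: str) -> dict:
    """Sketch of the 'outermost brackets' repair: keep only the span
    between the first '{' and the last '}' and try to parse it."""
    start = text.find("{")
    end = text.rfind("}")
    if start == -1 or end == -1 or end < start:
        # No JSON object found at all; fall back to an empty dict,
        # matching the log's "setting it to empty JSON now".
        return {}
    try:
        return json.loads(text[start:end + 1])
    except json.JSONDecodeError:
        # The bracketed span still is not valid JSON (e.g. the model
        # answered in prose, as it did above); give up gracefully.
        return {}
```

Note that when the model replies in plain prose, as it does after the human feedback below, there are no brackets to find, so the repair cannot succeed and the empty-dict fallback is what likely feeds into the later `'dict' object has no attribute 'replace'` error.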
C:\Users\garyfeng\Documents\GitHub\Auto-GPT>docker run --env-file ./.env -i -t auto-gpt
Welcome to Auto-GPT! Enter the name of your AI and its role below. Entering nothing will load defaults.
Name your AI: For example, 'Entrepreneur-GPT'
AI Name: gary-gpt
gary-gpt here! I am at your service.
Describe your AI's role: For example, 'an AI designed to autonomously develop and run businesses with the sole goal of increasing your net worth.'
gary-gpt is: AI designed to review text for fairness
Enter up to 5 goals for your AI: For example: Increase net worth, Grow Twitter Account, Develop and manage multiple businesses autonomously'
Enter nothing to load defaults, enter nothing when finished.
Goal 1: review input text for gender fairness
Goal 2: review input text for cultural sensitivity and warn against potential risks of cultural offensive languages
Goal 3: nothing
Goal 4:
Warning: The file 'auto-gpt.json' does not exist. Local memory would not be saved to a file.
Using memory of type: LocalCache
GARY-GPT THOUGHTS: As an AI designed to review text for fairness, I should start by evaluating the input text for gender fairness and potential risks of cultural offensive languages. I will first read the input text and determine the best approach for analysis.
REASONING: I need to determine the best approach for analyzing input text for gender fairness and cultural sensitivity before selecting a command to use.
PLAN:
- Read the input text
- Determine the best approach for analysis
- Select a command to use based on what is needed for the analysis
CRITICISM: I need to ensure that my approach is comprehensive, efficient and unbiased.
Attempting to fix JSON by finding outermost brackets
Apparently json was fixed.
NEXT ACTION: COMMAND = read_file ARGUMENTS = {'file': 'input_text.txt'}
Enter 'y' to authorise command, 'y -N' to run N continuous commands, 'n' to exit program, or enter feedback for gary-gpt...
Input:I think girls are not capable of playing rough sports
SYSTEM: Human feedback: I think girls are not capable of playing rough sports
Warning: Failed to parse AI output, attempting to fix.
If you see this warning frequently, it's likely that your prompt is confusing the AI. Try changing it up slightly.
Failed to fix AI output, telling the AI.
Error: Invalid JSON
Based on your input, I will analyze the text for gender fairness and potential risks of cultural offensive languages.
My initial observation is that your statement could be considered gender biased, as it implies that girls are less capable than boys in playing rough sports.
As an AI designed to ensure gender fairness, my role is to ensure that language is inclusive and does not perpetuate gender stereotypes.
Therefore, I would recommend using the "evaluate_code" command to analyze the text and provide suggestions for rephrasing it in a more gender-neutral and inclusive manner.
Please provide the full text, along with any additional context that may be useful.
Attempting to fix JSON by finding outermost brackets
Error: Invalid JSON, setting it to empty JSON now.
GARY-GPT THOUGHTS:
REASONING:
CRITICISM:
Attempting to fix JSON by finding outermost brackets
Error: Invalid JSON, setting it to empty JSON now.
NEXT ACTION: COMMAND = Error: ARGUMENTS = 'dict' object has no attribute 'replace'
Enter 'y' to authorise command, 'y -N' to run N continuous commands, 'n' to exit program, or enter feedback for gary-gpt...
Input:
and when I tried to exit, the following error occurred:
NEXT ACTION: COMMAND = Error: ARGUMENTS = 'dict' object has no attribute 'replace'
Enter 'y' to authorise command, 'y -N' to run N continuous commands, 'n' to exit program, or enter feedback for gary-gpt...
Input:exit
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 449, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 444, in _make_request
httplib_response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/http/client.py", line 1375, in getresponse
response.begin()
File "/usr/local/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/http/client.py", line 287, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 489, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 787, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 550, in increment
raise six.reraise(type(error), error, _stacktrace)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/packages/six.py", line 769, in reraise
raise value.with_traceback(tb)
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 449, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 444, in _make_request
httplib_response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/http/client.py", line 1375, in getresponse
response.begin()
File "/usr/local/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/http/client.py", line 287, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 516, in request_raw
result = _thread_context.session.request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 587, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 701, in send
r = adapter.send(request, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 547, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/app/main.py", line 423, in <module>
memory.add(memory_to_add)
File "/app/memory/local.py", line 61, in add
embedding = get_ada_embedding(text)
^^^^^^^^^^^^^^^^^^^^^^^
File "/app/memory/base.py", line 13, in get_ada_embedding
return openai.Embedding.create(input=[text], model="text-embedding-ada-002")["data"][0]["embedding"]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/api_resources/embedding.py", line 33, in create
response = super().create(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
response, _, api_key = requestor.request(
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 216, in request
result = self.request_raw(
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 528, in request_raw
raise error.APIConnectionError(
openai.error.APIConnectionError: Error communicating with OpenAI: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
C:\Users\garyfeng\Documents\GitHub\Auto-GPT>
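The traceback shows the crash on exit comes from `get_ada_embedding` in `memory/base.py`: the OpenAI embedding call has no retry, so a single `RemoteDisconnected` propagates up through `memory.add` and kills the program. A defensive wrapper along these lines could avoid that; this is a sketch of the general retry-with-backoff pattern, not Auto-GPT's actual fix (the function name is hypothetical, and `ConnectionError` stands in for `openai.error.APIConnectionError`):

```python
import time

def with_retries(fn, attempts=3, backoff=1.0):
    """Call fn(), retrying transient connection failures with
    exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * 2 ** attempt)
```

Wrapping the embedding call as `with_retries(lambda: openai.Embedding.create(...))` would make a one-off dropped connection non-fatal.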
Encountered the following error when building and running the Docker solution in Codespaces. It cannot resolve REDIS_HOST=host.docker.internal and therefore cannot connect to the Redis Docker container that is also running.
Workaround: do not use the Docker version of Auto-GPT; instead start it in Codespaces with plain python scripts/main.py, and update .env to set REDIS_HOST=localhost.
The proper solution seems to be to create a docker-compose configuration for this setup.
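A minimal docker-compose.yml sketch for such a configuration (the service names and settings here are assumptions, not an official Auto-GPT file): both containers join the same compose network, so REDIS_HOST can simply be set to the Redis service name instead of host.docker.internal.

```yaml
version: "3.9"
services:
  redis:
    image: redis/redis-stack-server:latest
    ports:
      - "6379:6379"
  auto-gpt:
    build: .
    env_file: .env
    environment:
      - REDIS_HOST=redis   # resolved via the compose network's DNS
    depends_on:
      - redis
    stdin_open: true       # equivalent of docker run -i
    tty: true              # equivalent of docker run -t
```

As an aside, host.docker.internal is not defined by default on Linux hosts (which is what Codespaces runs); if compose is not an option, running the container with `docker run --add-host=host.docker.internal:host-gateway ...` is another possible workaround on Docker 20.10+.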
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/redis/connection.py", line 698, in connect
sock = self.retry.call_with_retry(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/redis/retry.py", line 46, in call_with_retry
return do()
^^^^
File "/usr/local/lib/python3.11/site-packages/redis/connection.py", line 699, in <lambda>
lambda: self._connect(), lambda error: self.disconnect(error)
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/redis/connection.py", line 955, in _connect
for res in socket.getaddrinfo(
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/socket.py", line 962, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
socket.gaierror: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/main.py", line 336, in <module>
memory = get_memory(cfg, init=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/memory/__init__.py", line 36, in get_memory
memory = RedisMemory(cfg)
^^^^^^^^^^^^^^^^
File "/app/config.py", line 19, in __call__
cls._instances[cls] = super(
^^^^^^
File "/app/memory/redismem.py", line 48, in __init__
self.redis.flushall()
File "/usr/local/lib/python3.11/site-packages/redis/commands/core.py", line 909, in flushall
return self.execute_command("FLUSHALL", *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/redis/client.py", line 1255, in execute_command
conn = self.connection or pool.get_connection(command_name, **options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/redis/connection.py", line 1442, in get_connection
connection.connect()
File "/usr/local/lib/python3.11/site-packages/redis/connection.py", line 704, in connect
raise ConnectionError(self._error_message(e))
redis.exceptions.ConnectionError: Error -2 connecting to host.docker.internal:6379. Name or service not known.
To replicate, in Codespaces:
docker build -t autogpt .
docker run -d --name redis-stack-server -p 6379:6379 redis/redis-stack-server:latest
docker run --env-file ./.env -i -t autogpt
Set up Auto-GPT and the API keys, and validate that it runs.