Josh-XT / AGiXT

AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.
https://AGiXT.com
MIT License

Backend error: TypeError: expected string or bytes-like object #93

Closed by VRImage 1 year ago

VRImage commented 1 year ago

Provider: oobabooga

Starting task with objective: say hello.

Executing task 1: Develop a task list.

INFO:     127.0.0.1:54204 - "GET /api/agent/Agent-LLM/task/status HTTP/1.1" 200 OK
Exception in thread Thread-3 (run_task):
Traceback (most recent call last):
  File "/home/garlen/miniconda3/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/home/garlen/miniconda3/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/garlen/Agent-LLM/AgentLLM.py", line 235, in run_task
    result = self.execution_agent(task["task_name"], task["task_id"])
  File "/home/garlen/Agent-LLM/AgentLLM.py", line 222, in execution_agent
    return self.run(prompt)
  File "/home/garlen/Agent-LLM/AgentLLM.py", line 87, in run
    commands = re.findall(r"Commands:(.*)", self.response, re.MULTILINE)
  File "/home/garlen/miniconda3/lib/python3.10/re.py", line 240, in findall
    return _compile(pattern, flags).findall(string)
TypeError: expected string or bytes-like object
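The traceback shows re.findall receiving self.response when it is not a string (most likely None after a failed provider call). A minimal sketch of a defensive guard, assuming a hypothetical extract_commands helper rather than the repository's actual fix:

```python
import re

def extract_commands(response):
    """Guard against a non-string response (e.g. None from a failed
    provider call) so re.findall does not raise
    TypeError: expected string or bytes-like object."""
    if not isinstance(response, str):
        return []
    return re.findall(r"Commands:(.*)", response, re.MULTILINE)
```

With this guard, a None response simply yields an empty command list instead of crashing the worker thread.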

gururise commented 1 year ago

I am getting this same error when trying to run oobabooga:

AI_PROVIDER=oobabooga
AI_MODEL=default

oobabooga via: python server.py --model alpaca-7b --listen --no-stream

Running Agent-LLM locally.

Josh-XT commented 1 year ago

Just pushed out an update to the main branch that may fix this.

gururise commented 1 year ago

> Just pushed out an update to the main branch that may fix this.

Did a git pull, rebuilt the frontend, launched the backend, and got this error (I believe this is issue #94). I had not seen this problem in 1.08 or 1.09.

INFO:     127.0.0.1:55742 - "GET /api/agent/undefined/command HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 436, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/applications.py", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 92, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 147, in simple_response
    await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/routing.py", line 237, in app
    raw_response = await run_endpoint_function(
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/routing.py", line 163, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/gene/Downloads/Agent-LLM/app.py", line 137, in get_commands
    commands = Commands(agent_name)
  File "/home/gene/Downloads/Agent-LLM/Commands.py", line 10, in __init__
    self.commands = self.load_commands()
  File "/home/gene/Downloads/Agent-LLM/Commands.py", line 41, in load_commands
    command_class = getattr(module, module_name)()
  File "/home/gene/Downloads/Agent-LLM/commands/work_with_ai.py", line 13, in __init__
    name = f" AI Agent {agent.name}"
AttributeError: 'dict' object has no attribute 'name'

Unable to continue testing due to the above.
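The AttributeError above shows agent arriving as a dict, so attribute access (agent.name) fails. A minimal sketch of a tolerant accessor, assuming a hypothetical agent_display_name helper (not the repository's actual code), which accepts either a dict or an object:

```python
def agent_display_name(agent):
    """Handle `agent` being either a dict (as in the traceback) or an
    object with a .name attribute, falling back to 'unknown'."""
    if isinstance(agent, dict):
        name = agent.get("name", "unknown")
    else:
        name = getattr(agent, "name", "unknown")
    return f" AI Agent {name}"
```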

Josh-XT commented 1 year ago

Please try again, I pushed an update to fix that error.

gururise commented 1 year ago

> Please try again, I pushed an update to fix that error.

Sorry.. ;( Still getting the exact same error after pulling the latest git and rebuilding the front-end. EDIT: Actually, the error is the same but with slightly different line numbers:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 436, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/applications.py", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 92, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 147, in simple_response
    await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/routing.py", line 237, in app
    raw_response = await run_endpoint_function(
  File "/home/gene/Downloads/Agent-LLM/venv/lib/python3.10/site-packages/fastapi/routing.py", line 163, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/gene/Downloads/Agent-LLM/app.py", line 137, in get_commands
    commands = Commands(agent_name)
  File "/home/gene/Downloads/Agent-LLM/Commands.py", line 10, in __init__
    self.commands = self.load_commands()
  File "/home/gene/Downloads/Agent-LLM/Commands.py", line 41, in load_commands
    command_class = getattr(module, module_name)()
  File "/home/gene/Downloads/Agent-LLM/commands/work_with_ai.py", line 15, in __init__
    name = f" AI Agent {agent.name}"
AttributeError: 'dict' object has no attribute 'name'

gururise commented 1 year ago

The above error is now resolved after the latest commit! Now, the oobabooga error is:

Response: {'error': 'This app has no endpoint /api/textgen/.'}
INFO:     127.0.0.1:36314 - "GET /api/agent/Agent-LLM/task/status HTTP/1.1" 200 OK

Task Result:

{'error': 'This app has no endpoint /api/textgen/.'}

Response: {'error': 'This app has no endpoint /api/textgen/.'}

New Tasks:

[{'task_name': {'error': 'This app has no endpoint /api/textgen/.'}}]

Exception in thread Thread-2 (run_task):
Traceback (most recent call last):
  File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/gene/Downloads/Agent-LLM/AgentLLM.py", line 242, in run_task
    self.prioritization_agent()
  File "/home/gene/Downloads/Agent-LLM/AgentLLM.py", line 196, in prioritization_agent
    prompt = prompt.replace("{task_names}", ", ".join(task_names))
TypeError: sequence item 0: expected str instance, dict found
Agent-LLM
write a tweet about AI
INFO:     127.0.0.1:36314 - "GET /api/agent/Agent-LLM/task HTTP/1.1" 200 OK
Agent-LLM
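The final TypeError happens because str.join requires string items, but the task list here contains dicts like {'task_name': {...}} once the provider returns an error payload. A minimal sketch of a defensive join, assuming a hypothetical join_task_names helper rather than the repository's actual prioritization_agent code:

```python
def join_task_names(tasks):
    """Build the comma-separated task-name string defensively:
    pull task_name out of dict entries and stringify anything
    that is not already a str, so ", ".join cannot fail."""
    names = []
    for task in tasks:
        name = task.get("task_name") if isinstance(task, dict) else task
        names.append(name if isinstance(name, str) else str(name))
    return ", ".join(names)
```

With input like the log above, the error dict is stringified instead of crashing the thread, which at least surfaces the upstream endpoint problem in the prompt.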
Josh-XT commented 1 year ago

Updated the endpoint in #98; it looks like they changed it on Ooba's end.

gururise commented 1 year ago

> Updated endpoint in #98, looks like they changed it on Ooba's end.

So it looks like they have moved to a universal API endpoint and are ditching the gradio one on port 7860. Here's the API example code they provide: https://github.com/oobabooga/text-generation-webui/blob/main/api-example.py

The next problem is that their API runs on port 5000, which is the same port the FastAPI server runs on.

Edit: You also need to add --api to the ooba command line.
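Since both oobabooga's new API and the FastAPI backend default to port 5000, it can help to check for the clash before launching. A minimal sketch, assuming a hypothetical port_in_use helper (any TCP connect-based check would do):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port,
    e.g. oobabooga's API occupying port 5000 before the backend starts."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0
```

Running this for port 5000 before starting the backend makes the conflict visible immediately instead of failing at bind time.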

Josh-XT commented 1 year ago

> Updated endpoint in #98, looks like they changed it on Ooba's end.
>
> So it looks like they moved to a universal API endpoint now and are ditching the gradio one on port 7860. Here's the API example code they provide: https://github.com/oobabooga/text-generation-webui/blob/main/api-example.py
>
> The next problem is their API runs on port 5000, which is the same port that the FastAPI server runs.

Fantastic. That makes two different conflicts on port 5000, so I will plan to change our backend port.

Josh-XT commented 1 year ago

https://github.com/Josh-XT/Agent-LLM/releases/tag/v1.1.0

Updated the port to 7437, and the endpoint for ooba has been updated in the docs as well. Please let me know if you continue to have issues.

gururise commented 1 year ago

A few issues still remain. I submitted PR #101 to fix the remaining issues with oobabooga.