mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

(Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f81e6220f40>: Failed to establish a new connection: [Errno 111] Connection refused')) #905

Open TomiLikesToCode opened 1 year ago

TomiLikesToCode commented 1 year ago

LocalAI version: Latest at this time

Environment, CPU architecture, OS, and Version: Windows, 64-bit

Describe the bug
openai.error.APIConnectionError: Error communicating with OpenAI: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /v1/models (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f81e6220f40>: Failed to establish a new connection: [Errno 111] Connection refused')). This happens with AutoGPT and other platforms that try to connect to the LocalAI API.

To Reproduce

Expected behavior
Connect to the API normally.

Logs
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/app/autogpt/__main__.py", line 5, in <module>
    autogpt.app.cli.main()
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1666, in invoke
    rv = super().invoke(ctx)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/click/decorators.py", line 33, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/app/autogpt/app/cli.py", line 121, in main
    run_auto_gpt(
  File "/app/autogpt/app/main.py", line 70, in run_auto_gpt
    create_config(
  File "/app/autogpt/app/configurator.py", line 99, in create_config
    config.fast_llm = check_model(config.fast_llm, "fast_llm", config=config)
  File "/app/autogpt/app/configurator.py", line 176, in check_model
    models = api_manager.get_models(openai_credentials)
  File "/app/autogpt/llm/api_manager.py", line 107, in get_models
    all_models = openai.Model.list(**openai_credentials)["data"]
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/listable_api_resource.py", line 60, in list
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 288, in request
    result = self.request_raw(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 609, in request_raw
    raise error.APIConnectionError(
openai.error.APIConnectionError: Error communicating with OpenAI: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /v1/models (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f81e6220f40>: Failed to establish a new connection: [Errno 111] Connection refused'))
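Since the traceback ends in a plain "[Errno 111] Connection refused" on localhost:8080, the quickest check is whether anything is listening on that address at all. Below is a minimal, hypothetical diagnostic sketch (not from the report), assuming LocalAI is expected at http://localhost:8080; adjust the base URL if your setup differs.

```python
# Hypothetical connectivity check, assuming LocalAI is expected at
# http://localhost:8080 (the address AutoGPT tried to reach above).
import requests

LOCALAI_BASE = "http://localhost:8080"  # assumption: default LocalAI port

try:
    resp = requests.get(f"{LOCALAI_BASE}/v1/models", timeout=5)
    resp.raise_for_status()
    print("LocalAI is reachable, models:", resp.json())
except requests.exceptions.ConnectionError as err:
    # The "[Errno 111] Connection refused" from the report lands here when
    # nothing is listening on the port: service not started, wrong port, or
    # the client runs in a container where "localhost" is not the host machine.
    print("LocalAI is not reachable:", err)
```

If this check also fails, the problem is on the LocalAI side (process not started, bound to a different port, or reachable only under a Docker service name instead of localhost), not in AutoGPT itself.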

CrashTD commented 1 year ago

I get a quite similar, not to say basically identical, error, but under Linux running the AutoGPT Docker image, and without any reference to a failed connection. It happens after answering the "I want Auto-GPT to:" question.

NOTE: All files/directories created by this agent can be found inside its workspace at: /app/auto_gpt_workspace

CMOGPT has been created with the following details:
Name: CMOGPT
Role: a professional digital marketer AI that assists Solo entrepreneurs in growing their businesses by providing world-class expertise in solving marketing problems for SaaS, content products, agencies and more.
Goals:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/app/autogpt/__main__.py", line 5, in <module>
    autogpt.app.cli.main()
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1666, in invoke
    rv = super().invoke(ctx)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/click/decorators.py", line 33, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/app/autogpt/app/cli.py", line 121, in main
    run_auto_gpt(
  File "/app/autogpt/app/main.py", line 159, in run_auto_gpt
    memory = get_memory(config)
  File "/app/autogpt/memory/vector/__init__.py", line 82, in get_memory
    raise NotImplementedError(
NotImplementedError: The Redis memory backend has been rendered incompatible by work on the memory system, and has been removed temporarily.
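This second traceback is a different failure from the connection error above: AutoGPT's get_memory() rejects the Redis backend because it has been temporarily removed. A small pre-flight check is sketched below; it assumes the backend is still selected via the MEMORY_BACKEND environment variable from AutoGPT's .env template (an assumption to verify against the AutoGPT version in use).

```python
# Hypothetical pre-flight check: warn if AutoGPT is configured to use the
# temporarily removed Redis memory backend. MEMORY_BACKEND is assumed to be
# the selector variable from AutoGPT's .env template; verify for your version.
import os

backend = os.environ.get("MEMORY_BACKEND", "json_file")  # "json_file" default is an assumption
if backend == "redis":
    print("MEMORY_BACKEND=redis is currently rejected by AutoGPT's get_memory(); "
          "switch to another backend (e.g. the local JSON file one).")
else:
    print(f"Configured memory backend: {backend}")
```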

localai-bot commented 1 year ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

but... I can also be funny or helpful :smile_cat: and I can generally provide good tips, or point to places to look in the documentation or the code, based on what you wrote in the issue.

Don't engage in conversation with me; I don't support replying (yet)!

:warning::warning::warning::warning::warning:

The issue seems to be related to the OpenAI API connection error. It could have several causes, such as an incorrect API key, an incorrect API host or port, network issues, or a misconfigured LocalAI service. To troubleshoot, you can follow these steps:

  1. Check that the API key is correct and valid by verifying it on the OpenAI dashboard.
  2. Ensure that the API host and port are correctly set in the LocalAI configuration file or environment variables (see the sketch after this list).
  3. Check the network connectivity between the LocalAI service and the OpenAI API endpoint.
  4. Verify that the LocalAI service is properly configured and running.

If the issue persists, you can also try restarting the LocalAI service or reinstalling the OpenAI CLI package to ensure a fresh installation.
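For step 2, the sketch below shows one way to point the pre-1.0 openai Python client (the one visible in the traceback) at a LocalAI instance instead of api.openai.com. The base URL and the placeholder key are assumptions, not values taken from this issue; AutoGPT reads equivalent settings from its .env file.

```python
# Minimal sketch: point the pre-1.0 `openai` Python client (the one in the
# traceback) at a LocalAI instance. The base URL and key are assumptions;
# LocalAI does not require a real OpenAI key unless configured to check one.
import openai

openai.api_base = "http://localhost:8080/v1"  # LocalAI's OpenAI-compatible endpoint (assumed)
openai.api_key = "sk-local"                   # placeholder value

# The same call that failed in the report:
models = openai.Model.list()
print([m["id"] for m in models["data"]])
```

If openai.Model.list() succeeds here but AutoGPT still fails, the mismatch is most likely in how the base URL is passed into AutoGPT's configuration, for example a client running inside Docker that cannot reach the host's localhost.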

Sources: