Closed by agenceblink 5 days ago
Hello! Can you confirm that the ezLocalai container is running? It can take a while for that container to boot up.
Hi Josh, and thanks for your quick reply. Yes, I've noticed that the ezlocalai container can take a little time to start, but it's online and I can access http://0.0.0.0:8091.
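To double-check that both services are actually listening (and not just that the container exists), a quick stdlib-only port probe can help. This is a minimal sketch; the ports 8091 (ezlocalai API) and 8502 (demo UI) are the ones mentioned in this thread.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports mentioned in this thread: 8091 (ezlocalai API), 8502 (demo UI).
for port in (8091, 8502):
    print(port, "open" if port_open("127.0.0.1", port, timeout=1.0) else "closed")
```

If 8091 reports closed, the container is still booting (or crashed); `docker logs` on the ezlocalai container would be the next place to look.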
To give a bit more context: it's specifically the "Chat completion" step that seems to be causing the problem.
In fact, I've just created a Gpt4free agent by copying the parameters of the default one (which works), and I get the same problem:
Thanks in advance for your support - let me know if I can buy you a coffee :)
Hi Josh! I've done a couple of full reinstallations, the last one with `python start.py --agixt-auto-update true --with-ezlocalai true --gpu-layers 22`, but nothing seems to work and I still can't access port :8502. It's really frustrating because I'm SO excited about using AGiXT :/
Thanks in advance for your feedback 👍
UPDATE: As I'm very motivated, I didn't give up and ran lots of tests, particularly with api_url, but I couldn't get ezlocalai to work. I even created a project with Claude Sonnet, feeding it all the docs and GitHub pages, but unfortunately that didn't work either :/ 🙏
@agenceblink Hey, I'm hitting the same problem here: the ezlocalai demo is not accessible via port 8502. Not great, but at least I'm not the only one. The error happens only on my machine that has a GPU; on another machine without a GPU the demo is reachable. I need to reproduce the error, because I had moved on to other frameworks. I finally came back to AGiXT because it looked and felt the best, and I see the most potential in it. So great work here Josh-XT (:
I will perform a new install and see if I still run into the error.
@sx584 I've done a lot of installations in different ways, especially to get the hang of Docker Desktop, and I haven't been able to get the API to work. That said, I haven't tried without a GPU. In the meantime, Josh-XT's work is indeed impressive :)
Sorry for the late response here. I believe the solution is to set the agent's embedder to "default". There appears to be an issue with the way I am currently sending embeddings from ezLocalai and OpenAI to Chroma. In the latest release, the embedder has been forced to "default" on the back end. I will revisit third-party embedders in the near future, but the default one has been working great.
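For anyone on an older release, the workaround Josh describes amounts to one agent setting. This is a hedged sketch only: the `"embedder"` key comes from this thread, but the surrounding keys are illustrative assumptions, not the exact AGiXT agent settings schema.

```python
# Hedged sketch of the fix: force the agent's embedder to "default".
agent_settings = {
    "provider": "ezlocalai",  # assumed provider name from this thread
    "embedder": "default",    # the fix: avoid third-party embedders for now
}
print(agent_settings["embedder"])
```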
Description
Hi! I hope I'm not bothering you for nothing. I've installed the scripts on Docker Desktop, and my first try with "gpt4free" works just fine. However, as soon as I try "ezlocalai", I get the following error: `openai.APIConnectionError: Connection error`. I made sure not to put an API key in the .env file and left the backend input empty.
Am I missing something? Thanks in advance ;)
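An `openai.APIConnectionError` means the OpenAI client never reached the server at all, so it can be reproduced outside AGiXT by hitting ezlocalai's OpenAI-compatible endpoint directly. A stdlib-only sketch, assuming ezlocalai serves `/v1/chat/completions` on port 8091 as in this thread; the model name is a placeholder, not taken from the thread.

```python
import json
import urllib.request

def chat_payload(prompt: str, model: str = "phi-2") -> bytes:
    # Minimal OpenAI-compatible chat completion request body.
    return json.dumps({
        "model": model,  # placeholder model name, an assumption
        "messages": [{"role": "user", "content": prompt}],
    }).encode()

def try_chat(base_url: str, prompt: str):
    """POST a chat completion; return the parsed response, or None on failure."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/v1/chat/completions",
        data=chat_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.loads(resp.read())
    except OSError as exc:  # connection refused, timeout, HTTP errors
        print(f"chat completion failed: {exc}")
        return None

try_chat("http://localhost:8091", "Hello")
```

If this fails too, the problem is between the host and the container (networking, boot time, crash), not AGiXT's configuration.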