Open Namec999 opened 1 week ago
@Namec999
Hi, thanks for reaching out!
Please provide the container logs by running these commands in your shell (they will save the container logs to your current working directory):
docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
docker logs wrenai-wren-ai-service-1 >& wrenai-wren-ai-service.log && \
docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log
Please also provide the .env.ai file with the credential keys removed.
all the requested files are below:
.env.ai.txt .env.txt wrenai-ibis-server.log wrenai-wren-ai-service.log wrenai-wren-engine.log wrenai-wren-ui.log
@Namec999 I think the EMBEDDING_MODEL_DIMENSION
of your chosen embedding model nomic-embed-text
should be 768 instead of 8192. Reference: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5
Could you change that, restart the launcher, and try again?
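For reference, the change would look something like this hypothetical excerpt from .env.ai (EMBEDDING_MODEL_DIMENSION is the variable named in the thread; the EMBEDDING_MODEL variable name and value format are assumptions, so check your actual file):

```env
# Hypothetical .env.ai excerpt -- the dimension must match the embedding model's output
EMBEDDING_MODEL=nomic-embed-text
EMBEDDING_MODEL_DIMENSION=768   # nomic-embed-text produces 768-dim vectors, not 8192
```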
even when setting the dimension to 768, I still have the same issue, with the same errors
@Namec999 could you elaborate on what this sentence means?
for the embeddings I am using my local Ollama installation (Windows: the app, not the Docker container)
Also, I suppose you don't need to change the EMBEDDER_OLLAMA_URL;
the default should be fine. Also, did you already pull the embedding model by running ollama pull nomic-embed-text
first?
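A quick manual check, sketched against a locally running Ollama daemon (the port 11434 and the /api/embeddings endpoint are Ollama's documented defaults; run these interactively, not as a script):

```
# Pull the model and confirm Ollama serves embeddings locally
ollama pull nomic-embed-text
ollama list        # nomic-embed-text should appear in the output
curl http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello"}'
# the "embedding" array in the JSON response should have length 768
```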
yep, my Ollama is installed correctly, and I can reach the Ollama API locally using Postman and get my embeddings
Since Wren AI is running in Docker containers, using 192.168.x.x may not successfully reach Ollama.
The default URL doesn't mean Ollama is running in Docker. It lets Wren AI, which runs in Docker, resolve the URL to access Ollama on your local machine.
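On Docker Desktop for Windows, the usual way for a container to reach a service on the host is the special hostname host.docker.internal, so the default probably looks something like the sketch below (the variable name EMBEDDER_OLLAMA_URL comes from the thread; the value is an assumption based on Docker Desktop conventions and Ollama's default port):

```env
# Assumed default: host.docker.internal resolves to the host machine
# from inside a Docker Desktop container
EMBEDDER_OLLAMA_URL=http://host.docker.internal:11434
```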
I've been using my Ollama for months with other tools like Dify, and it works OK.
I also replaced the 192.168... URL with the default one mentioned in the env file.
@Namec999 or would you mind joining our Discord server so we can book a time for a quick chat, and I can help you look into your issue? There must be something wrong that I missed.
@Namec999 I've found the issue: there is a bug in wren-ai-service. There are two workaround solutions for you.
We'll fix this in the next release. Sorry for the inconvenience.
great news :-)
thank you for your support
@Namec999 We've released a new version, please try again!
Hey, great news!
Actually, when trying the link you provided for the Windows version (.zip), my system says it contains the Trojan:Script/Wacatac.H!ml virus!
But when downloading the file from the docs, it's OK; I just cannot tell whether it's the latest version or not.
best
tried downloading 0.7.1-rc.1,
but the wrenai-wren-ai-service-1 container keeps restarting,
with this console log:
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:5555 (Press CTRL+C to quit)
Traceback (most recent call last):
File "//src/force_deploy.py", line 7, in
@Namec999 the link I gave you is the latest version (0.7.1); 0.7.1-rc.1 is not.
yeah, the file downloaded from the docs should also be the latest version (0.7.1)
hello all,
I have a fresh installation of Wren AI 0.7,
using the OpenAI LLM provider with the Groq URL and key.
For the embeddings I am using my local Ollama installation (Windows: the app, not the Docker container).
Everything seems to be OK, and I can access the UI, but when asking for something the UI says: Failed to create asking task.
And I have this error in the wren-ui console log.
How do I fix this so I can try my first working experience?