Closed: Curiosity007 closed this issue 1 year ago
Hi @Curiosity007 , please click on your avatar in the top navigation bar and click "Clear Chat History". This will reset the database. Currently, because we lack migrations, when we make schema updates you should run this reset after pulling the latest changes.
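For reference, resetting like this ultimately just removes the SQLite file so the app recreates it with the new schema. A minimal sketch of such a reset, assuming the db path mentioned later in this thread (reset_database is a hypothetical helper, not the app's actual code):

```python
import os

DB_PATH = "./app/data/brainchulo.db"  # path mentioned in this thread

def reset_database(path: str = DB_PATH) -> bool:
    """Delete the SQLite file so the app recreates it on next start.

    Returns True if a file was removed, False if there was nothing to delete.
    """
    if os.path.exists(path):
        os.remove(path)
        return True
    return False
```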
So this was my first, fresh install in a new virtual environment, and I tried resetting the database, but somehow it is not working as intended.
And python main.py is also not working. I tried an older version of your repository, from before the CORS implementation, and it works well.
@Curiosity007 any chance you could join the discord so we can chat about it?
I'll try to start a fresh container image
By any chance, did you rebuild the image as such before running it?
docker compose up --build
No. I just used docker-compose up
Let me mention one additional detail. I am running this on WSL2, and port 5173 is somehow not accessible from my outside network, even though I opened it via netsh. So I am running ngrok over it and accessing the URL. This should not be the cause of the error, right? Also, ngrok only works for me with Docker.
But when I run python main.py, ngrok is not able to pick up the connection; it says connection refused. That means running the Python script does not create any service listening on port 5173.
Thanks for letting me know. I'll boot up Windows and give it a shot when HuggingFace fixes their network issues. Right now, building the container image doesn't work.
Yes, I just tried building the Docker image and saw the timeout error. Is it possible to add an option that checks for the sentence-transformers model in a local folder and skips the download if it is already present?
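Something like the following could implement that check. This is only a sketch; resolve_model_source is a hypothetical helper, and it relies on the fact that sentence_transformers.SentenceTransformer accepts either a hub model name or a local directory path:

```python
import os

def resolve_model_source(model_name: str, cache_dir: str) -> str:
    """Return the local saved copy if present, otherwise the hub model name.

    SentenceTransformer(...) accepts either form, so a caller can do:
        model = SentenceTransformer(resolve_model_source(name, cache_dir))
        model.save(os.path.join(cache_dir, name))  # cache for next time
    and only hit the network when no local copy exists yet.
    """
    local_path = os.path.join(cache_dir, model_name)
    return local_path if os.path.isdir(local_path) else model_name
```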
Thank you for the prompt action. I'll wait for your fixes.
@Curiosity007 I'm not able to reproduce your issue :/ What I can think of is:
1. Delete ./app/data/brainchulo.db, then refresh.
2. a. Start ooba.
   b. From the brainchulo root directory, type docker compose up --build, then navigate to http://127.0.0.1:5173. You may need to ensure it isn't already running by first typing docker compose down.
Working fine here too. However, I don't see the OPTIONS requests. Could it be that CORS is an issue because you're using ngrok?
That's a great point. @Curiosity007, if you use ngrok, make sure to add the URL to the list of allowed origins.
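To illustrate why the ngrok URL has to be listed: CORS allowlists do exact matching on scheme, host, and port. A plain-Python sketch of that matching rule (the set contents below are placeholders, not BrainChulo's actual list; the real enforcement happens in the server framework, not in app code like this):

```python
# Placeholder allowlist; in the app this would live in main.py's origins list.
ALLOWED_ORIGINS = {
    "http://localhost:5173",
    "https://example.ngrok.io",  # stand-in for your actual ngrok URL
}

def origin_allowed(origin: str) -> bool:
    """Exact match: scheme, host, and port must all agree."""
    return origin in ALLOWED_ORIGINS

print(origin_allowed("https://example.ngrok.io"))  # True
print(origin_allowed("http://example.ngrok.io"))   # False: wrong scheme
```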
The error is related to ngrok, I am sure of that now. The question is, how do I resolve it? In WSL2 I have run other Docker and Streamlit applications, and I can access their ports easily. However, I am unable to access this application on this Docker port. Is there any way to expose the app via Gradio or something similar?
So I am not using ngrok now. I used this command to get the local IP: ip addr show eth0
Got this:
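For what it's worth, the address can also be pulled out of that command's output programmatically. A sketch with a made-up sample (the interface details below are illustrative, not the actual output from this machine):

```python
import re
from typing import Optional

def parse_inet_addr(ip_addr_output: str) -> Optional[str]:
    """Extract the first IPv4 address from `ip addr show` output."""
    match = re.search(r"inet (\d{1,3}(?:\.\d{1,3}){3})/\d+", ip_addr_output)
    return match.group(1) if match else None

# Illustrative sample; not real output from this machine.
sample = (
    "2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500\n"
    "    inet 172.23.112.5/20 brd 172.23.127.255 scope global eth0\n"
)
print(parse_inet_addr(sample))  # → 172.23.112.5
```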
Added this line -
Now the chat window loads.
But when I hit any chat, I get this error -
If I do not include these lines in the origins list, the chat window will not load:
"http://localhost:7865",
"http://127.0.0.1:7865",
"http://0.0.0.0:7865",
This is the error in the console
backend_1 | requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=5000): Max retries exceeded with url: /api/v1/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f0d747a21a0>: Failed to establish a new connection: [Errno 111] Connection refused'))
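That "Connection refused" means nothing is listening at 0.0.0.0:5000 as seen from inside the container; from a container, 0.0.0.0/localhost resolve to the container itself, not the WSL2 or Windows host. A quick probe can confirm which host/port combination is actually reachable; this is a generic sketch, and the host and port in the example call are just illustrations:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check whether Ooba's API port is reachable from this process.
print(can_connect("127.0.0.1", 5000))
```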
These are the flags I am using in my ooba launch command -
--share --sdp-attention --wbits 4 --groupsize 128 --model_type llama --listen --no-stream --verbose --api
The error is related to CORS.
Maybe this is also another culprit -
The only change I made in docker-compose.yaml was to comment out network_mode: "host", because I was getting this error -
Looks like you should update 0.0.0.0 to localhost or 127.0.0.1? I think the issue now is that your setup cannot access Ooba. If you want to fast-track this issue, you should join our Discord so we can discuss.
Thank you. Yes, I had to update my .env file to:
OPENAI_API_KEY=<not needed when using VicunaLLM>
CHAT_API_URL=http://0.0.0.0:5000/api/v1/generate
MEMORIES_PATH=memories/
UPLOAD_PATH=uploads/
DOCUMENT_STORE_NAME=brainchulo_docs
CONVERSATION_STORE_NAME=brainchulo_convos
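For anyone curious what loading that file involves, here is a hand-rolled sketch. BrainChulo presumably uses python-dotenv or similar; this minimal parser is illustrative only and skips quoting and export rules:

```python
def load_env(text: str) -> dict:
    """Parse KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        # Split on the first '=' only, so values may themselves contain '='.
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

print(load_env("CHAT_API_URL=http://0.0.0.0:5000/api/v1/generate\nMEMORIES_PATH=memories/"))
```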
Also, I did one more thing: I uninstalled Docker completely, installed Docker 3 on WSL2, and ran the Docker build. Voila, I can access the chat.
Closing this issue.
I also installed Chrome inside WSL2 and accessed the app from there, so I was using the localhost IP and there was no more CORS issue.
I have one feature request: can the .env file accommodate the origin IP? Then, when I change the IP, I can update the .env file instead of editing main.py.
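That feature could look something like this; ALLOWED_ORIGINS is a hypothetical variable name, not one the project currently reads:

```python
import os

def allowed_origins(default: str = "http://localhost:5173") -> list:
    """Read a comma-separated origin list from a hypothetical ALLOWED_ORIGINS var.

    e.g. ALLOWED_ORIGINS=http://localhost:7865,http://172.23.112.5:5173
    """
    raw = os.environ.get("ALLOWED_ORIGINS", default)
    return [o.strip() for o in raw.split(",") if o.strip()]

print(allowed_origins())  # falls back to the default when the var is unset
```

main.py could then pass this list to its CORS setup instead of a hard-coded one.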
I can start the app via the Docker method, but uploading, conversations, etc. give errors.
So the Docker method is not fully working. However, I can see the frontend GUI.
The only change I made in docker-compose.yaml was to comment out network_mode: "host", because I was getting this error -
ERROR: for brainchulo_backend_1 "host" network_mode is incompatible with port_bindings
However, I actually want to use the python main.py variant, which gives me flexibility. When I run the script, I can see port 7865 is serving and I can view that webpage. However, when I open localhost:5173, I don't get any interface. How do I run the app using only main.py?