ChuloAI / BrainChulo

Harnessing the Memory Power of the Camelids
MIT License

Error - Both in Docker and while running Main.py #25

Closed Curiosity007 closed 1 year ago

Curiosity007 commented 1 year ago

I can start the app via the Docker method, but uploading files, conversations, etc. give errors:

backend_1   | INFO:     172.18.0.1:41944 - "OPTIONS /conversations HTTP/1.1" 400 Bad Request
backend_1   | INFO:     172.18.0.1:33130 - "POST /conversations/null/files HTTP/1.1" 422 Unprocessable Entity
backend_1   | INFO:     172.18.0.1:44614 - "OPTIONS /conversations/null/messages HTTP/1.1" 400 Bad Request

So the Docker method is not working. However, I can see the frontend GUI.

The only change I made in docker-compose.yaml was to comment out network_mode: "host", because I was getting this error:

ERROR: for brainchulo_backend_1 "host" network_mode is incompatible with port_bindings

However, I actually want to use the python main.py variant, which gives me more flexibility. When I run the script, I can see that port 7865 is up and I can see the webpage. But when I open localhost:5173, I get no interface. How do I run the app using only main.py?

iGavroche commented 1 year ago

Hi @Curiosity007 , please click on your avatar in the top navigation bar and click "Clear Chat History". This will reset the database. Currently, because we lack migrations, when we make schema updates you should run this reset after pulling the latest changes.

Curiosity007 commented 1 year ago

So this was my fresh, first install in a new virtual environment. I tried resetting the database, but somehow it is not working as intended.

And python main.py is also not working. I tried an older version of your repository, from before the CORS implementation, and it works well.

iGavroche commented 1 year ago

@Curiosity007 any chance you could join the discord so we can chat about it?

I'll try to start a fresh container image

iGavroche commented 1 year ago

By any chance, did you rebuild the image before running it, i.e. docker compose up --build?

Curiosity007 commented 1 year ago

No. I just used docker-compose up

Curiosity007 commented 1 year ago

Let me mention one additional detail. I am running this on WSL2, and somehow port 5173 is not accessible from my outside network, even though I opened it via netsh. So I am running ngrok over it and accessing the URL. This should not be the cause of the error, right? Also, ngrok only works with Docker.

But when I run python main.py, ngrok is not able to pick up the connection; it says connection refused. This means running the Python script does not create any service listening on port 5173.
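For reference, exposing a WSL2 port to the outside network usually needs a portproxy rule plus a firewall rule on the Windows side. A minimal sketch, run from an elevated PowerShell on the Windows host (the connectaddress below is a placeholder; substitute the WSL2 address reported by ip addr show eth0):

```shell
# Forward Windows port 5173 to the WSL2 instance (172.20.48.1 is a placeholder IP)
netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=5173 connectaddress=172.20.48.1 connectport=5173
# Allow inbound traffic on 5173 through the Windows firewall
netsh advfirewall firewall add rule name="WSL 5173" dir=in action=allow protocol=TCP localport=5173
```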

iGavroche commented 1 year ago

thanks for letting me know. I'll boot up Windows and give it a shot when HuggingFace fixes their network issues. Right now, building the container image doesn't work.

Curiosity007 commented 1 year ago

Yes. I just tried to build the Docker image and saw the timeout error. Is it possible to add an option that checks for the sentence-transformers model in a local folder and skips the download if it is already present?
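A minimal sketch of what I mean, assuming a hypothetical local cache layout (resolve_model_path and the models/ directory are illustrative, not BrainChulo's actual code):

```python
import os

def resolve_model_path(model_name, cache_dir="models"):
    """Return (path_or_name, needs_download): reuse a local copy when present."""
    local_path = os.path.join(cache_dir, model_name.replace("/", "_"))
    if os.path.isdir(local_path):
        return local_path, False  # cached copy found: skip the download
    return model_name, True       # not cached: fall back to downloading by name
```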

Thank you for the prompt action. I'll wait for your fixes.

iGavroche commented 1 year ago

@Curiosity007 I'm not able to reproduce your issue :/ What I can think of is:

1. There is an issue preventing the creation of the first conversation. It would be great to get more info from the web inspector. Try resetting from the UI; the first XHR requests should look like this:

[screenshot]

2. Is it possible that you do not have write permissions to the database? Try deleting ./app/data/brainchulo.db, then refresh.

3. I'm not sure what you're trying to do with ngrok, but if the ports are closed for WSL you need to enable them in your firewall rather than trying to work around the issue.

4. Pull the latest code and run it in Docker. All you need to do is:

   a. Start ooba.
   b. From the brainchulo root directory, type docker compose up --build, then navigate to http://127.0.0.1:5173. You may need to ensure it isn't already running by first typing docker compose down.

paolorechia commented 1 year ago

It's working fine here too. However, I don't see the OPTIONS requests. Could CORS be an issue because you're using ngrok?

[screenshot]

iGavroche commented 1 year ago

That's a great point. @Curiosity007, if you use ngrok, make sure to add the URL to the list of allowed origins.
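Conceptually, the backend only answers cross-origin requests whose Origin header matches its allowlist, which is why preflight OPTIONS requests come back as 400 for an unlisted ngrok URL. A minimal sketch of that check (the list entries, including the ngrok URL, are placeholders, not the actual main.py contents):

```python
# Example allowlist; the ngrok entry is a placeholder, not a real endpoint.
ALLOWED_ORIGINS = [
    "http://localhost:5173",
    "http://127.0.0.1:5173",
    "https://example.ngrok-free.app",
]

def is_allowed_origin(origin):
    """Mirror of what CORS middleware does: exact match against the allowlist."""
    return origin in ALLOWED_ORIGINS
```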

Curiosity007 commented 1 year ago

The error is related to ngrok, I am sure of that now. The question is, how do I resolve it? In WSL2 I have run other Docker and Streamlit applications and could access their ports easily. However, I cannot access this application's Docker port. Is there any way to expose the app via Gradio or something similar?

[screenshot]

Curiosity007 commented 1 year ago

So, I am not using ngrok now. I used this command to get the local IP: ip addr show eth0. I got this:

[screenshot]

Added this line:

[screenshot]

Now the chat window loaded.

[screenshot]

But when I send any chat message, I get this error:

[screenshot]

If I do not include these lines in the origins list, the chat window will not load:


    "http://localhost:7865",
    "http://127.0.0.1:7865",
    "http://0.0.0.0:7865",

This is the error in the console:

backend_1 | requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=5000): Max retries exceeded with url: /api/v1/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f0d747a21a0>: Failed to establish a new connection: [Errno 111] Connection refused'))
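The connection-refused error above means nothing is listening at that address from the backend's point of view. A quick TCP probe (a generic sketch, not part of BrainChulo) can confirm whether the Ooba API port is actually reachable:

```python
import socket

def can_reach(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```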

These are the flags I am using in my ooba launch command:

--share --sdp-attention --wbits 4 --groupsize 128 --model_type llama --listen --no-stream --verbose --api

The error is related to CORS.

Maybe this is also another culprit, as I mentioned earlier:

> The only change I made in docker-compose.yaml was to comment out network_mode: "host", because of the port_bindings error.

iGavroche commented 1 year ago

Looks like you should update 0.0.0.0 to localhost or 127.0.0.1. I think the issue now is that your setup cannot reach Ooba. If you want to fast-track this issue, join our Discord so we can discuss it.

Curiosity007 commented 1 year ago

Thank you. Yes, I had to update my .env file to:

OPENAI_API_KEY=<not needed when using VicunaLLM>
CHAT_API_URL=http://0.0.0.0:5000/api/v1/generate
MEMORIES_PATH=memories/
UPLOAD_PATH=uploads/
DOCUMENT_STORE_NAME=brainchulo_docs
CONVERSATION_STORE_NAME=brainchulo_convos

Also, I did one more thing: I uninstalled Docker entirely, installed Docker 3 on WSL2, ran the Docker build, and voilà, I can access the chat.

Closing this issue.

[screenshot]

I also installed Chrome in WSL2 and accessed the app from there, so I was using the localhost IP and there were no more CORS issues.

Curiosity007 commented 1 year ago

I have one feature request: can .env accommodate the origin IPs? Then, when I change the IP, I can update the .env file instead of editing main.py.
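For what it's worth, a minimal sketch of how that could look (ALLOWED_ORIGINS is a hypothetical variable name, not one the project currently defines):

```python
import os

def origins_from_env(default="http://localhost:5173"):
    """Parse a comma-separated ALLOWED_ORIGINS entry from the environment/.env."""
    raw = os.environ.get("ALLOWED_ORIGINS", default)
    return [o.strip() for o in raw.split(",") if o.strip()]
```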