c0sogi / LLMChat

A full-stack web UI implementation for large language models, such as ChatGPT or LLaMA.
MIT License

Fail to run api in docker #44

Closed: oppokui closed this issue 10 months ago

oppokui commented 10 months ago

When I run everything in Docker, the api container fails to build llama.cpp because cmake doesn't exist. Does it require the external gcc to be version 11?

>docker-compose -f docker-compose-local.yaml up api
[+] Running 3/0
 ✔ Container llmchat-cache-1  Running                                                                                                                              0.0s 
 ✔ Container llmchat-db-1     Running                                                                                                                              0.0s 
 ✔ Container llmchat-api-1    Created                                                                                                                              0.0s 
Attaching to llmchat-api-1, llmchat-cache-1, llmchat-db-1
llmchat-api-1    | None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
llmchat-api-1    | [2023-09-19 07:50:40,323] SQLAlchemy:CRITICAL - Current DB connection of LocalConfig: db/traffic@traffic_admin
llmchat-api-1    | INFO:     Started server process [1]
llmchat-api-1    | INFO:     Waiting for application startup.
llmchat-api-1    | [2023-09-19 07:50:41,191] ApiLogger:CRITICAL - ⚙️ Booting up...
llmchat-api-1    | [2023-09-19 07:50:41,191] ApiLogger:CRITICAL - MySQL DB connected!
llmchat-api-1    | [2023-09-19 07:50:41,195] ApiLogger:CRITICAL - Redis CACHE connected!
llmchat-api-1    | [2023-09-19 07:50:41,195] ApiLogger:CRITICAL - uvloop installed!
llmchat-api-1    | [2023-09-19 07:50:41,195] ApiLogger:CRITICAL - Llama CPP server monitoring started!
llmchat-api-1    | INFO:     Application startup complete.
llmchat-api-1    | INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
llmchat-api-1    | [2023-09-19 07:50:41,200] ApiLogger:ERROR - Llama CPP server is not available
llmchat-api-1    | [2023-09-19 07:50:41,200] ApiLogger:CRITICAL - Starting Llama CPP server
llmchat-api-1    | - Loaded .env file successfully.
llmchat-api-1    | - API_ENV: local
llmchat-api-1    | - DOCKER_MODE: True
llmchat-api-1    | - Parsing function for function calling: control_browser
llmchat-api-1    | - Parsing function for function calling: control_web_page
llmchat-api-1    | - Parsing function for function calling: web_search
llmchat-api-1    | - Parsing function for function calling: vectorstore_search
llmchat-api-1    | Using openai embeddings
llmchat-api-1    | 🦙 llama.cpp DLL not found, building it...
llmchat-api-1    | 🦙 Trying to build llama.cpp DLL: /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-cublas.sh
llmchat-api-1    | /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-cublas.sh: line 2: cd: /app/repositories/llama_cpp/vendor/llama.cpp: No such file or directory
llmchat-api-1    | /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-cublas.sh: line 6: cmake: command not found
llmchat-api-1    | /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-cublas.sh: line 7: cmake: command not found
llmchat-api-1    | cp: cannot stat '/app/repositories/llama_cpp/vendor/llama.cpp/build/bin/Release/libllama.so': No such file or directory
llmchat-api-1    | 🦙 Could not build llama.cpp DLL!
llmchat-api-1    | 🦙 Trying to build llama.cpp DLL: /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-default.sh
llmchat-api-1    | /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-default.sh: line 2: cd: /app/repositories/llama_cpp/vendor/llama.cpp: No such file or directory
llmchat-api-1    | /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-default.sh: line 6: cmake: command not found
llmchat-api-1    | /app/repositories/llama_cpp/llama_cpp/build-llama-cpp-default.sh: line 7: cmake: command not found
llmchat-api-1    | cp: cannot stat '/app/repositories/llama_cpp/vendor/llama.cpp/build/bin/Release/libllama.so': No such file or directory
llmchat-api-1    | 🦙 Could not build llama.cpp DLL!
llmchat-api-1    | [2023-09-19 07:50:41,256] ApiLogger:WARNING - 🦙 Could not import llama-cpp-python repository: 🦙 Could not build llama.cpp DLL!
llmchat-api-1    | ...trying to import installed llama-cpp package...
llmchat-api-1    | INFO:     10.101.7.43:42488 - "GET / HTTP/1.1" 304 Not Modified
llmchat-api-1    | INFO:     10.101.7.43:42488 - "GET /main.dart.js HTTP/1.1" 200 OK
llmchat-api-1    | INFO:     10.101.7.43:49942 - "GET / HTTP/1.1" 304 Not Modified
llmchat-api-1    | INFO:     10.101.7.43:49942 - "GET /main.dart.js HTTP/1.1" 304 Not Modified
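The log shows two independent causes for the build failure: the scripts `cd` into `/app/repositories/llama_cpp/vendor/llama.cpp`, which does not exist (typically an unfetched git submodule), and `cmake` is not installed in the image. A minimal sketch of a fix, assuming a Debian/Ubuntu-based api image; the package names and the submodule path are inferred from the log above, not taken from the project's actual Dockerfile:

```shell
# Hypothetical fix, run inside the api container (or baked into its
# Dockerfile as RUN steps). Paths and packages are assumptions from the log.

# Install the toolchain the build scripts expect: cmake plus a C/C++ compiler.
apt-get update && apt-get install -y --no-install-recommends \
    cmake build-essential git

# The build scripts cd into vendor/llama.cpp, which only exists after the
# llama-cpp-python submodule has been fetched recursively.
git -C /app/repositories/llama_cpp submodule update --init --recursive
```

With both in place, the `build-llama-cpp-*.sh` scripts should at least find their sources and toolchain; whether gcc 11 specifically is needed would depend on the llama.cpp revision being built.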
oppokui commented 10 months ago

By the way, a side effect is that the UI didn't show the login form.

Screenshot from 2023-09-19 15-54-05

oppokui commented 10 months ago

It works after waiting for a while.

oppokui commented 10 months ago

I hit the issue again. I can see the login form when I access it through the hostname (http://eng-test-sp15:8000/chat/), but when I access it through the IP (http://10.206.237.241:8000/chat/), it fails to show the login form, sometimes showing just a blank page. I checked the network panel in the browser's debugger, and it seems two style/icon files can't be downloaded.

Screenshot from 2023-09-19 17-41-28

Screenshot from 2023-09-19 17-46-46
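A note on the 304 lines in the earlier log: those are normal cache revalidations, where the browser refetches main.dart.js conditionally and the server confirms the cached copy is still valid. A blank page despite 304s usually means the cached assets themselves are stale or were cached under a different origin (the browser treats hostname and IP as distinct origins). A self-contained sketch of the mechanism, using only the Python standard library and not the project's code:

```python
# Sketch: how a conditional GET produces the "304 Not Modified" responses
# seen in the api logs. The file name mirrors the one in the log; nothing
# here is tied to LLMChat itself.
import http.server
import os
import tempfile
import threading
import urllib.error
import urllib.request

# Serve a temp directory containing one "asset" file.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "main.dart.js"), "w") as f:
    f.write("console.log('app');")

def handler(*args, **kwargs):
    return http.server.SimpleHTTPRequestHandler(*args, directory=tmp, **kwargs)

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/main.dart.js"

# First fetch: a plain 200 carrying a Last-Modified validator.
with urllib.request.urlopen(url) as resp:
    first_status = resp.status
    last_modified = resp.headers["Last-Modified"]

# Revalidation: the browser sends If-Modified-Since; the server answers 304
# with an empty body, so the browser must serve the asset from its cache.
req = urllib.request.Request(url, headers={"If-Modified-Since": last_modified})
try:
    with urllib.request.urlopen(req) as resp:
        second_status = resp.status
except urllib.error.HTTPError as err:
    second_status = err.code

server.shutdown()
print(first_status, second_status)  # 200 304
```

If the cached copy is the broken one, a hard refresh (bypassing the conditional request) for the IP origin is the quick way to rule caching in or out.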