Closed ljw20180420 closed 3 months ago
Make sure to add a correct PUBLIC_ORIGIN env variable! That's probably your issue here.
Thank you. I set PUBLIC_ORIGIN to http://localhost:8000, but it does not work.
```yaml
services:
  test_Phi-3-mini-4k-instruct-gguf:
    restart: always
    container_name: test_Phi-3-mini-4k-instruct-gguf
    image: ghcr.io/ggerganov/llama.cpp:server
    volumes:
      - "./chat-ui/llama.cpp:/models"
    command: -m /models/Phi-3-mini-4k-instruct-q4.gguf --host 0.0.0.0 -c 4096
    environment:
      PUBLIC_ORIGIN: "http://localhost:8000"
```
PUBLIC_ORIGIN should be set on the chat-ui service, not on the llama.cpp server 😄
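For reference, a minimal sketch of the corrected compose layout, assuming a separate `chat-ui` service alongside the llama.cpp server (the `chat-ui` service name, image tag, and port mapping here are illustrative, not confirmed from the thread):

```yaml
services:
  llama-cpp-server:
    image: ghcr.io/ggerganov/llama.cpp:server
    volumes:
      - "./chat-ui/llama.cpp:/models"
    command: -m /models/Phi-3-mini-4k-instruct-q4.gguf --host 0.0.0.0 -c 4096

  chat-ui:
    image: chat-ui  # illustrative image name; build or pull your own
    environment:
      # PUBLIC_ORIGIN belongs here, on the chat-ui service,
      # not on the llama.cpp server container
      PUBLIC_ORIGIN: "http://localhost:8000"
    ports:
      - "3000:3000"
```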
It works! Thank you. 😄
This is .env.local
This is nginx configuration.
When I access chat-ui at localhost:3000 and chat with Phi-3-mini-4k-instruct, it works. However, if I access chat-ui at localhost:8000/chat-ui-db and chat with Phi-3, it reports an error.
How can I proxy chat-ui under an nginx subpath?
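In case it helps others hitting the same subpath problem, a sketch of one common nginx approach, assuming chat-ui listens on port 3000 and has been configured to know its base path (the `/chat-ui-db/` prefix and upstream address are taken from this thread; the WebSocket headers are a general pattern, not confirmed as required by chat-ui):

```nginx
# Proxy the /chat-ui-db subpath to the chat-ui container.
# The app itself must also be aware of the prefix, or its
# internal links and API calls will still point at the root.
location /chat-ui-db/ {
    proxy_pass http://localhost:3000/chat-ui-db/;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto $scheme;

    # Allow WebSocket upgrades, which streaming chat UIs often use
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```

Note that rewriting paths in nginx alone is usually not enough: PUBLIC_ORIGIN would need to match the externally visible origin (http://localhost:8000 here), and the app must generate links under the subpath.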