dermot-mcg opened 10 months ago
I'm hitting the same problem as you:
http://localhost:8090/backend-api/conversations?offset=0&limit=2
Can anyone offer a suggestion?
Seems I may have found the solution.
For example, say the VPS address is "192.168.0.10".
Change localhost to the VPS address, as in the docker-compose.yml below.
version: '3'
services:
  chatgpt-client:
    image: soulteary/chatgpt
    restart: always
    ports:
      - 8090:8090
    environment:
      # service port
      APP_PORT: 8090
      # the ChatGPT client domain, keep it the same as the sparrow `WEB_CLIENT_HOSTNAME` option
      APP_HOSTNAME: "http://192.168.0.10:8090"
      # the ChatGPT backend upstream, or connect a sparrow dev server `"http://host.docker.internal:8091"`
      APP_UPSTREAM: "http://sparrow:8091"

  sparrow:
    image: soulteary/sparrow
    restart: always
    environment:
      # [Basic Settings]
      # => The ChatGPT Web Client Domain
      WEB_CLIENT_HOSTNAME: "http://192.168.0.10:8090"
      # => Service port, default: 8091
      APP_PORT: 8091

      # [Advanced Settings] *optional
      # => Enable the new UI
      FEATURE_NEW_UI: "on"
      # => Enable the history list
      ENABLE_HISTORY_LIST: "on"
      # => Enable i18n
      ENABLE_I18N: "on"
      # => Enable the data control
      ENABLE_DATA_CONTROL: "on"
      # => Enable the model switch
      ENABLE_MODEL_SWITCH: "on"
      # => Enable the OpenAI official model (includes the plugin model)
      # ENABLE_OPENAI_OFFICIAL_MODEL: "on"

      # [Plugin Settings] *optional
      # => Enable the plugin
      # ENABLE_PLUGIN: "on"
      # => Enable the plugin browsing
      # ENABLE_PLUGIN_BROWSING: "on"
      # => Enable the plugin code interpreter
      # ENABLE_PLUGIN_CODE_INTERPRETER: "on"
      # => Enable the plugin dev feature
      # ENABLE_PLUGIN_PLUGIN_DEV: "on"

      # [Private OpenAI API Server Settings] *optional
      # => Enable OpenAI 3.5 API
      ENABLE_OPENAI_API: "on"
      # => OpenAI API Key
      OPENAI_API_KEY: "sk-CHANGE ME WITH OWN API KEY"
      # => Enable OpenAI API Proxy
      # OPENAI_API_PROXY_ENABLE: "on"
      # => OpenAI API Proxy Address, eg: `"http://127.0.0.1:1234"` or ""
      # OPENAI_API_PROXY_ADDR: "http://127.0.0.1:1234"

      # [Private Midjourney Server Settings] *optional
      # => Enable Midjourney
      # ENABLE_MIDJOURNEY: "on"
      # => Enable Midjourney only
      # ENABLE_MIDJOURNEY_ONLY: "on"
      # => Midjourney API Key
      # MIDJOURNEY_API_SECRET: "your-secret"
      # => Midjourney API Address, eg: `"ws://...."`, or `"ws://host.docker.internal:8092/ws"`
      # MIDJOURNEY_API_URL: "ws://localhost:8092/ws"

      # [Private FlagStudio Server Settings] *optional
      # => Enable FlagStudio
      # ENABLE_FLAGSTUDIO: "on"
      # => Enable FlagStudio only
      # ENABLE_FLAGSTUDIO_ONLY: "off"
      # => FlagStudio API Key
      # FLAGSTUDIO_API_KEY: "your-flagstudio-api-key"

      # [Private Claude Server Settings] *optional
      # => Enable Claude
      # ENABLE_CLAUDE: "on"
      # => Enable Claude only
      # ENABLE_CLAUDE_ONLY: "on"
      # => Claude API Key
      # CLAUDE_API_SECRET: "your-secret"
      # => Claude API Address, eg: `"ws://...."`, or `"ws://host.docker.internal:8093/ws"`
      # CLAUDE_API_URL: "ws://localhost:8093/ws"
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
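With the file saved as docker-compose.yml, the stack can be brought up from the same directory; a minimal sketch, assuming Docker Compose v2 (`docker compose`) is installed on the VPS:

docker compose up -d
# then open http://192.168.0.10:8090 in a browser (the example VPS address above)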
After that, I go to the page served by the VPS where Docker is running.
If ENABLE_OPENAI_OFFICIAL_MODEL is enabled, the error "The administrator has disabled the export capability of this model." will show up.
To change the model, just append /?model=gpt-4 to the URL for GPT-4, or /?model=gpt-3 for GPT-3.
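With the example VPS address above (substitute your own), and assuming the query parameter is applied at the site root, the full URLs would look like this:

http://192.168.0.10:8090/?model=gpt-4
http://192.168.0.10:8090/?model=gpt-3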
Quick note: press Ctrl+R to refresh the page after updating the Docker containers. Try an incognito tab with no extensions to rule out a problematic extension blocking the request.
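To rule out the browser entirely, the same backend request can be checked from a terminal; a quick sketch, assuming the example VPS address above:

curl -i "http://192.168.0.10:8090/backend-api/conversations?offset=0&limit=2"

A successful response (rather than a connection error) suggests the containers are fine and the problem is on the browser side.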
I get the following issue at the front end (hosting on a VPS, not locally; the VPS has other containers for other services and thus a shared network managed by Nginx Proxy Manager):
docker-compose.yml
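When the stack shares a Docker network with Nginx Proxy Manager, the proxy container also needs to reach the chatgpt-client container before it can forward requests; a minimal sketch, assuming the shared network is named npm-network and the running container is named chatgpt-client (both names are hypothetical; check yours with docker ps and docker network ls):

docker network connect npm-network chatgpt-client
# then point the Nginx Proxy Manager host entry at http://chatgpt-client:8090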