edgar971 / open-chat

A self-hosted, offline, ChatGPT-like chatbot with different LLM support. 100% private, with no data leaving your device.
MIT License

unRAID docker container (cpu version) not starting; curl missing #2

Closed CodingAna closed 1 year ago

CodingAna commented 1 year ago

Hi, I cannot start my unRAID Docker container and the log says `/app/run.sh: 19: curl: not found`. I am using the default configuration and the CPU version.

There's also a warning:

/usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:127: UserWarning: Field "model_alias" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
  warnings.warn(
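Line 19 of `run.sh` evidently invokes `curl`, which is not present in the published image, so the model download step fails. A more defensive version of that step could fall back to `wget` and fail loudly if neither tool exists; this is a sketch, and the function and variable names here are hypothetical, not the script's actual ones:

```shell
#!/bin/sh
# Hypothetical sketch of a more defensive model-download step for run.sh.
# ensure_model, MODEL_PATH, and MODEL_URL are illustrative names only.
ensure_model() {
    MODEL_PATH="$1"
    MODEL_URL="$2"
    [ -f "$MODEL_PATH" ] && return 0   # model already present, nothing to do
    echo "Model file not found. Downloading..."
    if command -v curl >/dev/null 2>&1; then
        curl -fL -o "$MODEL_PATH" "$MODEL_URL"
    elif command -v wget >/dev/null 2>&1; then
        wget -O "$MODEL_PATH" "$MODEL_URL"
    else
        echo "error: neither curl nor wget is installed" >&2
        return 1
    fi
}
```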
jsnel commented 1 year ago

I was just about to comment the same. Starting from the unraid template:

/app/run.sh: 19: curl: not found
Model file not found. Downloading...


viktortrass commented 1 year ago

Hello,

I can't start the virtual machine, it stops!


donkeykong74 commented 1 year ago

Same here, not sure what to download either to put in the models directory.

chrizzo84 commented 1 year ago

> Hello,
>
> I can't start the virtual machine, it stops!

Same problem here :( Also got the model_config error.

dword4 commented 1 year ago

Looking at the two Dockerfiles, they build containers on Alpine but use apt-get to install packages, which is probably at least part of why this doesn't start correctly. They need to use apk instead, since that is Alpine's default package manager.
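If the base image really is Alpine, the change dword4 describes would look roughly like this (a sketch, not the repository's actual Dockerfile):

```dockerfile
FROM alpine:3.18
# apt-get does not exist on Alpine, so a Debian-style
#   RUN apt-get update && apt-get install -y curl
# fails at build time. Alpine's package manager is apk:
RUN apk add --no-cache curl
```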

timocapa commented 1 year ago

FWIW this fails even if the model is present.

/usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:127: UserWarning: Field "model_alias" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
  warnings.warn(
llama.cpp: loading model from /models/llama-2-7b-chat.bin
/models/llama-2-7b-chat.bin model found.
Initializing server with:
Batch size: 2096
Number of CPU threads: 4
Context window: 4096

> ai-chatbot-starter@0.1.0 start
> next start

ready - started server on 0.0.0.0:3000, url: http://localhost:3000

** Press ANY KEY to close this window ** 
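The pydantic `UserWarning` in the log above is a warning, not the crash itself, but it can be silenced where the app defines its models. A minimal sketch assuming pydantic v2; the class name here is illustrative, while the `model_alias` field name comes from the warning in the log:

```python
# Sketch assuming pydantic v2: opting a model out of the "model_"
# protected namespace so fields like `model_alias` don't warn.
from pydantic import BaseModel, ConfigDict

class ServerSettings(BaseModel):  # hypothetical name, for illustration
    # An empty tuple disables the protected-namespace check entirely;
    # the warning's own suggestion, ('settings_',), would also work.
    model_config = ConfigDict(protected_namespaces=())

    model_alias: str = "llama-2-7b-chat"

print(ServerSettings().model_alias)
```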
Dustinhoefer commented 1 year ago

> FWIW this fails even if the model is present.
>
> /usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:127: UserWarning: Field "model_alias" has conflict with protected namespace "model_".
>
> You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
>   warnings.warn(
> llama.cpp: loading model from /models/llama-2-7b-chat.bin
> /models/llama-2-7b-chat.bin model found.
> Initializing server with:
> Batch size: 2096
> Number of CPU threads: 4
> Context window: 4096
>
> > ai-chatbot-starter@0.1.0 start
> > next start
>
> ready - started server on 0.0.0.0:3000, url: http://localhost:3000
>
> ** Press ANY KEY to close this window **

Same here, downloaded manually, container starts and then just stops

ptichalouf commented 1 year ago

Same here with cuda version :)

guizaodev commented 1 year ago

Same error

AnterosOberon commented 1 year ago

I am having the same error, but as someone who has been using Anaconda to do this on my gaming computer, I am so stoked to see an unRAID container and look forward to seeing the first round of issues resolved. Great effort!

jimserio commented 1 year ago

> FWIW this fails even if the model is present.
>
> /usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:127: UserWarning: Field "model_alias" has conflict with protected namespace "model_".
>
> You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
>   warnings.warn(
> llama.cpp: loading model from /models/llama-2-7b-chat.bin
> /models/llama-2-7b-chat.bin model found.
> Initializing server with:
> Batch size: 2096
> Number of CPU threads: 4
> Context window: 4096
>
> > ai-chatbot-starter@0.1.0 start
> > next start
>
> ready - started server on 0.0.0.0:3000, url: http://localhost:3000
>
> ** Press ANY KEY to close this window **
>
> Same here, downloaded manually, container starts and then just stops

+1 same error. Manually downloaded the model. It finds it fine, but the container still stops with the Python namespace error.

edgar971 commented 1 year ago

Fixing this now; the CPU version was not supposed to be released yet. Sorry about that.

edgar971 commented 1 year ago

Fixed in v1.0.4: `ghcr.io/edgar971/open-chat-cpu:v1.0.4`