Closed CodingAna closed 1 year ago
I was just about to comment the same. Starting from the unraid template:
/app/run.sh: 19: curl: not found
Model file not found. Downloading...
Hello,
I can't start the virtual machine, it stops!
Same here, not sure what to download either to put in the models directory.
Same problem here :( Also got the model_config error.
Looking at the two Dockerfiles, they are trying to build containers on Alpine but using apt-get for package installs, which is probably at least part of why this isn't starting correctly. They need to use apk instead, since that is Alpine's default package manager.
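A minimal sketch of what that fix would look like (the base image tag and package list here are assumptions, not the project's actual Dockerfile):

```dockerfile
# Alpine images do not ship apt-get; apk is the package manager,
# so `RUN apt-get install ...` fails at build or run time.
FROM alpine:3.18

# Hypothetical package list -- the real dependencies may differ.
# This would also fix the "curl: not found" error from run.sh.
RUN apk add --no-cache curl bash python3
```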
FWIW this fails even if the model is present.
/usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:127: UserWarning: Field "model_alias" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
warnings.warn(
llama.cpp: loading model from /models/llama-2-7b-chat.bin
/models/llama-2-7b-chat.bin model found.
Initializing server with:
Batch size: 2096
Number of CPU threads: 4
Context window: 4096
> ai-chatbot-starter@0.1.0 start
> next start
ready - started server on 0.0.0.0:3000, url: http://localhost:3000
** Press ANY KEY to close this window **
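For what it's worth, the pydantic warning in that log can be silenced the way the message suggests, by opting the model out of the protected `model_` namespace. A hedged sketch (the class and field names are assumptions based on the warning text, not the project's actual code):

```python
from pydantic import BaseModel, ConfigDict

class Settings(BaseModel):
    # In pydantic v2, field names starting with "model_" clash with the
    # protected "model_" namespace; overriding protected_namespaces
    # suppresses the UserWarning for fields like "model_alias".
    model_config = ConfigDict(protected_namespaces=("settings_",))

    model_alias: str = "llama-2-7b-chat"

s = Settings()
print(s.model_alias)
```

Note the warning itself is harmless; the container stopping is a separate problem.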
Same here, downloaded manually, container starts and then just stops
Same here with cuda version :)
Same error
I am having the same error, but as someone who has been using Anaconda to do this on my gaming computer, I am so stoked to see an unRAID container, and I look forward to seeing the first round of issues resolved. Great effort!
+1 same error. Manually downloaded the model. It finds it fine, but the container still stops with the Python namespace error.
Fixing this now; the CPU version was not supposed to be released yet. Sorry about that.
Fixed in v1.0.4: ghcr.io/edgar971/open-chat-cpu:v1.0.4
Hi, I cannot start my unRAID Docker container and the log says
/app/run.sh: 19: curl: not found
I am using the default configuration and the CPU version. There's also a warning