Closed: fastdaima closed this issue 1 month ago
Is it showing any kind of error?
Try running the following command:
chmod +x model_name.llamafile
and then load it again.
Alternatively, you can:
Then, try loading it again.
By the way, I'm using Linux Mint. I'm not sure which distribution you're using, but good luck!
I'm running llamafile as a service under Rocky Linux 9. Here are the systemd unit file and environment file I created to run the dolphin-2.9-llama3-8b model, with llamafile listening on localhost port 8082.
/etc/systemd/system/llamafile.service
[Unit]
Description=Launch llamafile terminal program
After=network.target
[Service]
Type=simple
User=llamafile
Group=llamafile
EnvironmentFile=/etc/sysconfig/llamafile
ExecStart=/bin/sh /usr/local/bin/llamafile $LLAMA_ARGS
#StandardInput=tty
StandardOutput=journal
StandardError=journal
Restart=on-failure
[Install]
WantedBy=multi-user.target
/etc/sysconfig/llamafile
LLAMA_ARGS=--server --port 8082 --nobrowser --ctx-size 0 -m /home/llamafile/dolphin-2.9-llama3-8b-Q5_K_M.gguf
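With the two files above in place, the service can be registered and started with systemctl. This is a sketch of the steps I'd expect to work, assuming the files are at the paths shown and a `llamafile` user and group have already been created (e.g. with `useradd -r llamafile`):

```shell
# Pick up the new unit file
sudo systemctl daemon-reload
# Start the service now and enable it at boot
sudo systemctl enable --now llamafile
# Verify it's running
systemctl status llamafile
# Tail the logs if it fails to start
journalctl -u llamafile -f
```

If the service fails, the journal output usually shows whether the problem is the binary path, the model path, or permissions on the `llamafile` user's home directory.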
Thanks a lot @vlasky. I created a .sh script for it and ran it as a Linux service. I'll try this approach too.
I tried both ways, but I was not able to get the service running either way. Can anyone direct me on how to run it as a service?