Mte90 closed this issue 18 hours ago
Hi @Mte90, thank you for trying Tabby.
It seems that the llama-server binary lacks execute permission. Could you try the following and then restart Tabby:
chmod +x llama-server
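To double-check that the change took effect, here is a quick sketch, assuming llama-server sits next to the tabby executable in the extracted release archive (adjust the path if your layout differs):
ls -l llama-server   # the mode should now include the execute bit, e.g. -rwxr-xr-x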
We have fixed this issue, but it hasn't been released yet. The fix will be included in the next release.
I confirm that this fixes it :-D
Describe the bug
Information about your version
0.20.0
Information about your GPU
Additional context
So I just downloaded the latest version for CUDA and ran it with
./tabby serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct --device cuda
and I see that error, but it still proceeds. The model file has the right permissions and I ran tabby as a normal user (other models were downloaded as well, but the error appears only for this one).
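For anyone who wants to check whether they are hitting the same thing, a rough sketch, assuming the release archive was extracted into the current directory (that layout is an assumption, adjust the path as needed):
test -x ./llama-server || echo "llama-server is missing the execute bit"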