Kas1o opened 1 week ago
In the latest llama.cpp build (b3190), if you provide a grammar without a definition of `root`, you get an error like the one below:

```
llama_sampling_init: grammar does not contain a 'root' symbol
ERR [ send_error] task error | ... error="Failed to parse grammar"
```
In koboldcpp 1.68 (Windows, CUDA 12), however, you only get:

```
[WinError -529697949] Windows Error 0xe06d7363
```
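For reference, a minimal GBNF sketch of the two cases (the rule name `answer` is arbitrary, chosen here only for illustration). llama.cpp starts parsing from the `root` rule, so a grammar whose rules never define `root` cannot be used, no matter how well-formed its other rules are:

```gbnf
# Rejected: no root rule is defined, so llama_sampling_init
# reports "grammar does not contain a 'root' symbol".
answer ::= "yes" | "no"

# Accepted: root is defined (it may simply reference other rules).
root ::= answer
answer ::= "yes" | "no"
```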