Closed BoQsc closed 1 month ago
Very outdated.
It seems the llama.cpp bundled in llamafile is so old that it does not support mlock on Windows (where the equivalent call is VirtualLock).
We haven't updated the bundled llama.cpp server since vision support was removed upstream. We're currently planning to replace it with our own new server.
Contact Details
No response
What happened?
mlock doesn't work at all, and the web UI of `llamafile --server` clearly shows that the bundled llama.cpp `llama-server` is very outdated in this release of llamafile.
Version
llamafile-0.8.14
What operating system are you seeing the problem on?
Windows
Relevant log output