Open d-z-m opened 3 days ago
llama.cpp upstream has support for `-fa` in server. I noticed llamafile only has support for this option in CLI mode.
I propose it be added to server mode as well, so that server-mode users can reap the benefits.
This issue has been fixed in https://github.com/Mozilla-Ocho/llamafile/commit/4aea6060b202c2f17f393640ce8b7689ef6412b9
The fix has been incorporated into the recent llamafile v0.8.8 release.
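For anyone picking this up in v0.8.8 or later, a sketch of the invocation might look like the following. This assumes llamafile still selects server mode with `--server` and passes `-fa` through to the underlying llama.cpp server; the model path and port are placeholders for your own setup.

```sh
# Launch llamafile in server mode with flash attention enabled (-fa).
# "model.gguf" and the port value are placeholders, not defaults.
./llamafile --server -fa -m model.gguf --port 8080
```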