I'm trying to reproduce examples of EM German with llamafile v0.6.2 in server mode.
The example page lists the options used and the results for different models. Although llamafile's usage help lists most of these options in its "common" section, the server doesn't accept the following invocation:
and fails with various "unknown argument:" errors.
Invoking with only the model and a system-prompt specification is accepted, but the system prompt appears to be ignored (the prompt, user name, and bot name on the start/setup page are unchanged; the --prompt option is accepted in CLI mode only).
Would it be possible to accept more of these options in server mode, ideally all of the parameters available on the startup page?
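For reference, the minimal invocation that is accepted looks roughly like the following sketch (the model filename is a placeholder, and the system-prompt flag shown is my assumption based on llama.cpp's server options, not taken from the original report):

```shell
# Hypothetical minimal server-mode invocation (accepted, but the system
# prompt seems to have no effect on the start/setup page):
#   - model filename is a placeholder
#   - -spf / --system-prompt-file is assumed from upstream llama.cpp server
./llamafile --server \
  -m em_german_leo_mistral.Q4_K_M.gguf \
  -spf system_prompt.txt
```

Sampling options such as temperature or repeat penalty, which the EM German examples specify, are among those rejected with "unknown argument:" when passed on this command line.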