Closed: hololeo closed this issue 2 months ago
I believe so, yes. The llamafile executable is a ZIP archive. For example:
$ unzip -vl foo.llamafile
Archive: o//llama.cpp/main/main
  Length  Method     Size  Cmpr    Date    Time   CRC-32   Name
--------  ------  -------  ---- ---------- ----- --------  ----
    5782  Defl:N     1917   67% 2022-03-17 07:00 7dea32f2  llama.cpp/server/public/completion.js
       0  Stored        0    0% 2022-03-17 07:00 00000000  llama.cpp/server/public/history-template.txt
   44464  Defl:N    11409   74% 2022-03-17 07:00 68515dd1  llama.cpp/server/public/index.html
   22472  Defl:N     8487   62% 2022-03-17 07:00 bec881f4  llama.cpp/server/public/index.js
    3695  Defl:N     1314   64% 2022-03-17 07:00 b4c6a62f  llama.cpp/server/public/json-schema-to-grammar.mjs
       0  Stored        0    0% 2022-03-17 07:00 00000000  llama.cpp/server/public/prompt-template.txt
...
You can see the frontend web assets are all in there. You can change them. You'd do that using the zipalign
command we distribute. After you run make -j32 && sudo make install,
run `man zipalign`.
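Roughly, the workflow looks like the sketch below. Treat it as a sketch only: it assumes zipalign keeps the path you pass it and replaces an existing entry of the same name, so check `man zipalign` for the exact flags and semantics before relying on it.

$ unzip foo.llamafile 'llama.cpp/server/public/*'              # extract the stock frontend from the archive
$ $EDITOR llama.cpp/server/public/index.html                   # edit the extracted SPA assets in place
$ zipalign foo.llamafile llama.cpp/server/public/index.html    # put the edited file back into the llamafile

After that, running foo.llamafile should serve the modified page from its embedded ZIP.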
Contact Details
No response
What happened?
Currently it's possible to add more things into the llamafile, like the model.
I would like to add some frontend application code so I can make an SPA + LLM with the llamafile server running. Is this possible?
The benefit would be that a developer could distribute a customized chat interface together with a customized model. A quick check of the server API such an SPA would talk to is sketched below.
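For illustration only: a bundled SPA would presumably call the same HTTP API that the shipped completion.js uses. Assuming the llamafile server is running on its default port of 8080, a quick sanity check against the /completion endpoint might look like this:

$ curl http://localhost:8080/completion \
    -H 'Content-Type: application/json' \
    -d '{"prompt": "Hello, world", "n_predict": 32}'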
Version
0.8.4
What operating system are you seeing the problem on?
No response
Relevant log output
No response