ggerganov / llama.cpp

LLM inference in C/C++
MIT License

Bug: llama-server not loading the UI #10404

[Open] anagri opened this issue 2 hours ago

anagri commented 2 hours ago

What happened?

Building llama-server from scratch using the latest commit 8e752a7:

./examples/server/deps.sh
rm -rf build && cmake -S . -B build && cmake --build build --config Release -j $(sysctl -n hw.logicalcpu) --target llama-server
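For reference, a minimal way to start the built server afterwards (a sketch; the model path is a placeholder for any local GGUF file):

./build/bin/llama-server -m /path/to/model.gguf --host 127.0.0.1 --port 8080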

On running the server and loading http://127.0.0.1:8080, I get the following logs:

main: server is listening on http://127.0.0.1:8080 - starting the main loop
srv  update_slots: all slots are idle
request: GET / 127.0.0.1 200
request: GET /index.js 127.0.0.1 404
request: GET /completion.js 127.0.0.1 200
request: GET /json-schema-to-grammar.mjs 127.0.0.1 404

index.html is requesting /index.js and other files, which return 404. I have downloaded the dependencies, but a few files are still missing. Kindly check whether these files are downloaded by deps.sh.
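A quick way to confirm which static assets 404, independent of any browser cache (a sketch assuming the server is still running on 127.0.0.1:8080):

# print the HTTP status code for each asset the old index.html requests
for f in / /index.js /completion.js /json-schema-to-grammar.mjs; do
  curl -s -o /dev/null -w "$f %{http_code}\n" "http://127.0.0.1:8080$f"
done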

Name and Version

$ ./build/bin/llama-cli --version
version: 4131 (8e752a77) built with Homebrew clang version 18.1.5 for arm64-apple-darwin23.3.0

What operating system are you seeing the problem on?

Mac

Relevant log output

No response

anagri commented 2 hours ago

attn: @ngxson

ngxson commented 1 hour ago

Either your browser cached the old page, or you need to re-clone the repo. Ref: https://github.com/ggerganov/llama.cpp/pull/10175#issuecomment-2463316247
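If a stale page is the culprit, a hard refresh (Cmd+Shift+R in most Mac browsers) bypasses the cache. Otherwise, a clean re-clone and rebuild might look like this (a sketch; the clone directory name is arbitrary):

git clone https://github.com/ggerganov/llama.cpp.git llama.cpp-clean
cd llama.cpp-clean
cmake -S . -B build && cmake --build build --config Release --target llama-server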

ngxson commented 1 hour ago

the index.html is expecting /index.js and other files, which are returning 404

Clarification: the new UI never calls /index.js
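One way to verify this from the command line is to check whether the index.html the server actually serves references index.js at all (a sketch, assuming the server from the report is still running):

# count lines referencing index.js in the served page;
# 0 suggests the new UI is being served and the 404s come from a stale cached page
curl -s http://127.0.0.1:8080/ | grep -c 'index\.js'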