meta-llama / llama-stack-apps

Agentic components of the Llama Stack APIs
MIT License
3.23k stars 320 forks

404 NOT Found! #59

Open chuznhiwu opened 2 weeks ago

chuznhiwu commented 2 weeks ago

After running `llama distribution start --name ollama --port 5000 --disabled ipv6` I get:

```
Serving POST /inference/batch_chat_completion
Serving POST /inference/batch_completion
Serving POST /inference/chat_completion
Serving POST /inference/completion
Serving POST /safety/run_shields
Serving POST /agentic_system/memory_bank/attach
Serving POST /agentic_system/create
Serving POST /agentic_system/session/create
Serving POST /agentic_system/turn/create
Serving POST /agentic_system/delete
Serving POST /agentic_system/session/delete
Serving POST /agentic_system/memory_bank/detach
Serving POST /agentic_system/session/get
Serving POST /agentic_system/step/get
Serving POST /agentic_system/turn/get
Listening on 0.0.0.0:5000
INFO:     Started server process [293531]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)
INFO:     127.0.0.1:47696 - "GET /?vscodeBrowserReqId=1725413273912 HTTP/1.1" 404 Not Found
```

The output is the same without `--disabled ipv6`. Has anyone encountered this situation?

dltn commented 1 week ago

The server running on 5000 is an API server, not a webserver you'd be able to visit in your browser. In a separate window, run `mesop app/main.py` to start the web app.
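To illustrate the point above: the routes in the startup log accept JSON `POST` requests, so a browser `GET` on `/` matches nothing and returns 404. A minimal sketch of what a programmatic call would look like, using only the `/inference/chat_completion` route from the log — the model name and payload field names here are illustrative assumptions, not the documented Llama Stack schema:

```python
import json
import urllib.request

# Hypothetical payload; the real field names are defined by the Llama Stack
# inference API, so treat this shape as an illustration only.
payload = {
    "model": "8b-instruct",
    "messages": [{"role": "user", "content": "Hello"}],
}

def build_request(host="localhost", port=5000):
    # POST to an API route from the server log; a plain browser GET on "/"
    # matches no registered route, which is why the log shows 404 Not Found.
    return urllib.request.Request(
        url=f"http://{host}:{port}/inference/chat_completion",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request()
print(req.method, req.full_url)
```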

chuznhiwu commented 1 week ago

> The server running on 5000 is an API server, not a webserver you'd be able to visit in your browser. In a separate window, run `mesop app/main.py` to start the web app.

Thank you for your reply. I have reinstalled llama-stack-apps and gone through the process again. Below are the results. (I did not use ollama; I ran the local distribution with `llama stack run local --name 8b-instruct --port 5000`.)

[screenshot]

In a separate window I ran `mesop app/main.py`:

[screenshot] [screenshot]

Then when I ran `PYTHONPATH=. python examples/scripts/vacation.py localhost 5000`, I got this:

[screenshot] [screenshot]

I have been confused for several days now.

PS: `llama stack run local-ollama --name 8b-instruct --port 5000` gives the same result.
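When a client script like `vacation.py` fails, a useful first check is whether the API server is actually listening at all. A minimal sketch using only the Python standard library (`server_is_listening` is a hypothetical helper, not part of llama-stack) — note that a 404 on `GET /` is the *expected* response from this API server, so an HTTP error still means the server is up:

```python
import urllib.error
import urllib.request

def server_is_listening(host="localhost", port=5000):
    # Any HTTP response, including the 404 this API server returns for
    # GET "/", proves the server is up; only a connection-level failure
    # (refused, unreachable, timeout) means it is down.
    try:
        urllib.request.urlopen(f"http://{host}:{port}/", timeout=2)
        return True
    except urllib.error.HTTPError:
        return True   # server responded (e.g. 404) -> listening
    except (urllib.error.URLError, OSError):
        return False  # connection refused / timeout -> not listening
```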

JoseGuilherme1904 commented 1 week ago

62

chuznhiwu commented 1 week ago

> 62

Thank you~