Open maswadkar opened 2 weeks ago
Try going into the llama-stack-client-python code repo and looking at the API.md documentation; you might find what you are looking for there.
Hi maswadkar, see: https://github.com/meta-llama/llama-stack/blob/main/docs/zero_to_hero_guide/
and
https://github.com/meta-llama/llama-stack/blob/main/docs/resources/llama-stack-spec.html
llama-stack-spec.html should be opened in a browser; there you will find the function definitions. I'm not part of the project, but I've had the same questions, and these two links helped me a lot.
Guilherme
The feature, motivation and pitch
Hi,
I was successfully able to POST to
http://localhost:5000/inference/chat_completion
and get a response, because there is nicely documented sample code at https://llama-stack.readthedocs.io/en/latest/getting_started/index.html
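For reference, a minimal sketch of such a POST using only the Python standard library is below. The payload shape (model name, `messages` list, `stream` flag) is an assumption modeled on typical chat-completion APIs; the authoritative schema is in llama-stack-spec.html, and the model identifier is a placeholder for whatever you have served.

```python
import json
from urllib import request

# Hypothetical payload; field names are assumptions -- check
# llama-stack-spec.html for the authoritative request schema.
payload = {
    "model": "Llama3.2-3B-Instruct",  # placeholder model id
    "messages": [
        {"role": "user", "content": "Hello, what can you do?"}
    ],
    "stream": False,
}

# Build the POST request against the local llama-stack server.
req = request.Request(
    "http://localhost:5000/inference/chat_completion",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once a llama-stack server is actually running locally:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```

The official docs use the llama-stack-client SDK instead, which wraps this HTTP call; the raw request above is only to show what goes over the wire.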
But unfortunately I was not able to call any of the other endpoints,
because there is no documentation for them, not even sample code or a cookbook.
Could you please guide us? I believe this would help many enthusiasts like me.
I am ready to contribute if the information is provided.
Alternatives
No response
Additional context
No response