aidando73 opened 18 hours ago
Might be an error with llama-stack-client 🤔

Running `pip install httpx==0.27.2` resolves the issue (a requirements pin sketch follows the `pip show` output below). I was on 0.28.0 before:
```
(llama-stack) ubuntu@168-138-112-243:~/1xa100-2/llama-stack-apps$ pip show httpx
Name: httpx
Version: 0.28.0
Summary: The next generation HTTP client.
Home-page:
Author:
Author-email: Tom Christie <tom@tomchristie.com>
License: BSD-3-Clause
Location: /home/ubuntu/.local/lib/python3.10/site-packages
Requires: anyio, certifi, httpcore, idna
Required-by: gradio, gradio_client, llama_stack, llama_stack_client, safehttpx
```
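If you want the workaround to survive fresh installs, one option (just a sketch, not an official recommendation from either project) is to pin the version in your requirements until the upstream fix lands:

```
# requirements.txt (or a pip constraints file) — temporary pin until
# llama-stack-client supports httpx 0.28.x
httpx==0.27.2
```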
Might be something wrong with the latest release of httpx? It just came out yesterday:
https://github.com/encode/httpx/releases/tag/0.28.0
Asking maintainers: https://github.com/encode/httpx/discussions/3425
Maintainers have replied:

> Heya. That parameter became deprecated in 0.26.0, and was removed in 0.28.0. If you could follow up with llama_stack_client to help their team get this resolved, that'd be helpful. You can pin to httpx=0.27.2 in the meantime.

https://github.com/encode/httpx/discussions/3425
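The deprecation timeline they describe (deprecated in 0.26.0, removed in 0.28.0) matches the `proxies` keyword on `httpx.Client` / `httpx.AsyncClient`; the reply doesn't name the parameter, so treat that as an assumption. A minimal sketch of the failure mode and the replacement keyword:

```python
import httpx

# Assumption: the removed parameter is `proxies` (deprecated in httpx 0.26.0,
# removed in 0.28.0). On 0.28.0, passing it raises TypeError.
try:
    httpx.Client(proxies="http://localhost:8080")
except TypeError as exc:
    print(f"httpx 0.28.0 rejects the old keyword: {exc}")

# The replacement keyword, accepted since httpx 0.26.0:
client = httpx.Client(proxy="http://localhost:8080")
```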
Opened an issue on llama-stack-client-python: https://github.com/meta-llama/llama-stack-client-python/issues/54
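On the client-library side, one possible interim fix (purely a sketch, not llama-stack-client's actual code) is to prefer the new keyword and fall back for older httpx releases:

```python
import httpx

# Hypothetical helper, not taken from llama-stack-client: build a client that
# works both before and after the `proxies` -> `proxy` rename in httpx.
def make_client(proxy_url: str | None = None, **kwargs) -> httpx.Client:
    if proxy_url is None:
        return httpx.Client(**kwargs)
    try:
        return httpx.Client(proxy=proxy_url, **kwargs)    # httpx >= 0.26.0
    except TypeError:
        return httpx.Client(proxies=proxy_url, **kwargs)  # httpx < 0.26.0
```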
### System Info

### Information

### 🐛 Describe the bug

### Error logs

### Expected behavior

No error occurs and the command runs with output as described in the README.md.