JuanmaMenendez opened 1 month ago
@JuanmaMenendez I'm looking into this issue, but as of now I'm unable to reproduce the error with the example provided. Could you share a more complete code sample and stack trace for the errors you are encountering?
Describe the bug
In v0.5, when I run a SensitiveTopic validation with disable_llm=True (LLM disabled) and device left at its default value of -1, the validation script fails with the following error:
```
ports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
```
On the server side, the process logs the request and then crashes with a bus error:
```
INFO:werkzeug:127.0.0.1 - - [26/Jul/2024 12:19:28] "POST /guards/sensitive_topics/openai/v1/chat/completions HTTP/1.1" 200 -
zsh: bus error  guardrails start --config config.py
```
NOTE: The validation works as expected if I change device to a non-negative value, e.g. device=0. I am using an M1 MacBook Pro. This is my server config:
Library version: 0.5
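For anyone trying to reproduce this, a minimal `config.py` along these lines should exercise the same code path. The validator arguments and guard name below are assumptions (inferred from the `/guards/sensitive_topics/...` URL in the log), not copied from the reporter's actual config:

```python
# Hypothetical repro config.py for `guardrails start --config config.py`.
# All argument values here are assumptions -- adjust to match your setup.
from guardrails import Guard
from guardrails.hub import SensitiveTopic

guard = Guard(name="sensitive_topics").use(
    SensitiveTopic(
        sensitive_topics=["politics"],  # example topic list (assumption)
        disable_llm=True,               # run only the local classifier, no LLM
        device=-1,                      # the default; reported to crash on M1
        on_fail="exception",
    )
)
```

Changing `device=-1` to `device=0` in this sketch corresponds to the workaround described in the NOTE above.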