Open jack2684 opened 3 months ago
Hi @jack2684, this should be available at guard.history.last.compiled_prompt.
Could you share where you found the guard_llm.history.last.compiled_instructions reference so we can update the docs accordingly? Thanks!
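For reference, a minimal sketch of inspecting the compiled prompt after a call, assuming the guardrails-ai Python package with an OpenAI-style llm_api callable and a hypothetical RAIL spec (adjust to your own Guard construction, prompt params, and library version):

```python
import openai
from guardrails import Guard

# Hypothetical RAIL spec path; substitute your own Guard setup.
guard = Guard.from_rail("my_spec.rail")

# Run the guard as usual.
guard(
    llm_api=openai.chat.completions.create,
    prompt_params={"question": "What is 2 + 2?"},  # hypothetical params
)

# After the call completes, the fully compiled prompt that was sent
# (with any injected schema/instructions) is recorded on the call history.
print(guard.history.last.compiled_prompt)
```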
Description
When using the following code, I don't know what will be sent to the LLM endpoint.
Even the history doesn't show the prompt, and I don't want to be able to view it only after it has been sent successfully (a wrapper-based workaround is sketched just after this description).
This will output something like:
I don't see where the XML schemas are provided.
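A rough workaround sketch, assuming a Guard call that takes an llm_api callable: wrapping that callable prints whatever prompt/messages Guardrails hands to it, before the request is actually made and regardless of whether it succeeds.

```python
import openai

def logging_llm_api(*args, **kwargs):
    # Guardrails forwards the compiled prompt/messages to this callable,
    # so printing here shows exactly what would reach the LLM endpoint,
    # even if the request later fails.
    print("LLM call args:", args)
    print("LLM call kwargs:", kwargs)
    return openai.chat.completions.create(*args, **kwargs)

# Pass the wrapper instead of the raw API function, e.g.:
# guard(llm_api=logging_llm_api, prompt_params={...})
```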
Why is this needed
I would like to get the original prompt for debugging purposes.
Implementation details
[If known, describe how this change should be implemented in the codebase]
End result