Closed AadarshBhalerao closed 3 months ago
Hi @AadarshBhalerao
How are you setting verbose?
For your reference.
So do something like:

```python
new_message = await app.generate_async(
    messages=[{
        "role": "user",
        "content": "What is life and health insurance?"
    }],
    options={"verbose": True}
)
```
You can also use the `nemoguardrails` CLI:
https://docs.nvidia.com/nemo/guardrails/user_guides/cli.html?highlight=options
Or like below:

```python
app = LLMRails(config=config, verbose=True)
```
Thanks for the prompt reply @Pouyanpi
I had already set verbose here:

```python
# Configuration of LLMs is passed
app = LLMRails(config=config, llm=chat_model, verbose=True)
```
After that I ran the code, and it gave me the "no LLM calls" error. I restarted the kernel, this time keeping verbose=False, and a third time removed the verbose parameter entirely. The issue still remains.
Hi @AadarshBhalerao ,
I examined your case. The issue is not enabling verbose mode, but your rails definition. Change it to:

```yaml
rails:
  input:
    flows:
      - self check input
```

And you'd see that your off-topic flow gets applied.
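For context, a minimal `config.yml` along these lines might look as follows. The `models` section here is illustrative (substitute your own engine and model); only the `rails` block reflects the fix above:

```yaml
# Illustrative config.yml sketch, not the reporter's actual file.
models:
  - type: main
    engine: openai          # assumption: engine/model are placeholders
    model: gpt-3.5-turbo-instruct

rails:
  input:
    flows:
      - self check input
```

Note that the `self check input` flow also expects a corresponding `self_check_input` prompt to be defined in your configuration (e.g. in a `prompts` section), otherwise the input rail cannot run.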
I am using this code
config.yml
flow.co
My output was working fine, but just to test, I added verbose=True once to my app configuration, i.e. app = LLMRails(config=config, llm=chat_model). Since then it has given me no output.
Every time I am getting this as output: