Open · Komal-99 opened this issue 9 months ago
Hi @Komal-99, did you find a solution to your query? I am facing the same issue.
@rohitgarud @Komal-99, the error message is indeed misleading. This exception is thrown when there is no LLM configured in the config. To fix the original example, there are two ways: configure the LLM in `config.yml`, or pass it directly to the `LLMRails` instance, since it's already initialized before the QA chain, together with the following in `config.yml`:

```yaml
rails:
  dialog:
    user_messages:
      embeddings_only: True
```

The second option is recommended, since the flow doesn't care about the actual user intent, so we don't need to generate one. Also, for both options to work correctly, make sure there is at least one user message defined:
```colang
define user ask something
  "something"
```
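For the first option, a minimal `config.yml` with an LLM configured might look roughly like the sketch below. The `engine` and `model` values are illustrative placeholders (an OpenAI example from the NeMo Guardrails docs), not the llama2 setup from the original question; substitute the provider that matches your own setup:

```yaml
# Sketch of configuring a main LLM in config.yml.
# engine/model are placeholder values, not a recommendation.
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-instruct
```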
I'll mark this as a bug so we provide a more explicit error message. It should actually be fixed by https://github.com/NVIDIA/NeMo-Guardrails/pull/223 once merged. I'll also make it trigger a warning if flows are defined but no user message is.
I have implemented NeMo Guardrails in a RetrievalQA chain with the llama2 model, but it is giving an error.

My code:

`rails.co` is as follows:

`config.yml`:

Error occurring:

Please help: how can I use the llama2 model correctly? 🙏🙏