In the next example, the exception occurs at the call
(llm (Agent &retrieval_folder (doc_name ("story2.txt"))) $question)
inside the following script:
!(extend-py! motto)
!(bind! &retrieval_folder (retrieval-agent "./data/texts_for_retrieval" 200 3 "data"))
!(let $question (user "Who strides over the rushing rivers and broad lakes?")
   (let $prompt (llm (Agent &retrieval_folder (doc_name ("story2.txt"))) $question)
      (llm
         (Agent (chat-gpt "gpt-3.5-turbo-0613"))
         (Messages
            (system ("Taking this information into account, answer the user question"
                     $prompt))
            $question))))
and the exception text is then passed, as part of the messages, into the next llm call:
{'role': 'system', 'content': 'Taking this information into account, answer the user question Error llm Agent <motto.agents.retrieval_agent.RetrievalAgent object at 0x7fa8aa46f820> doc_name story2.txt\n\n\n user Who strides over the rushing rivers and broad lakes?\n\n Exception caught:\nTypeError: GroundedAtom is expected as input to a non-MeTTa agent. .. }
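The failure mode above can be sketched in plain Python (all names here are hypothetical illustrations, not motto's actual implementation): a non-MeTTa agent rejects input that is not a GroundedAtom, and because the inner llm call's error is caught and returned as text, that error text is folded into the system content handed to the next agent instead of stopping evaluation.

```python
class GroundedAtom:
    """Stand-in for a grounded (Python-level) atom; hypothetical."""
    def __init__(self, value):
        self.value = value

def non_metta_agent(inp):
    # A non-MeTTa agent requires a GroundedAtom; anything else fails fast.
    if not isinstance(inp, GroundedAtom):
        raise TypeError("GroundedAtom is expected as input to a non-MeTTa agent.")
    return inp.value

def call_inner_llm(question):
    # Simulate the failing inner (llm (Agent &retrieval_folder ...)) call:
    # the question arrives as a plain string, not a GroundedAtom.
    try:
        return non_metta_agent(question)
    except TypeError as e:
        # The caught error becomes the "result" handed to the outer call.
        return f"Error ... Exception caught:\nTypeError: {e}"

prompt = call_inner_llm("Who strides over the rushing rivers and broad lakes?")
messages = [{
    "role": "system",
    "content": "Taking this information into account, answer the user question "
               + prompt,
}]
```

This reproduces the observed shape of the log: the system message carries the exception text rather than retrieved document content, which is why the downstream model never sees the story.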