dalberto closed this issue 1 year ago
@dalberto just added a PR to fix this issue, along with some integration tests around specific type parsing to make sure that this is robust!

Can you try installing the guardrails-ai package from that commit directly? Either `pip install git+ssh://git@github.com/shreyar/guardrails.git@1e02c67` or a later commit should work.

I'm planning on cutting a new release tomorrow with this bug fix as well as some additional features.
Updated the `from_str` function to check whether the value is already a correctly parsed boolean, and otherwise fall back to parsing it from the string.
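The described behavior can be sketched roughly like this (a minimal illustration of the idea, not the actual guardrails implementation; the standalone function name and signature here are assumptions):

```python
def from_str(s):
    """Hypothetical sketch: if the value is already a parsed bool
    (e.g. from valid JSON), return it as-is; otherwise parse the string."""
    if isinstance(s, bool):
        # Already a correctly parsed boolean; no string parsing needed.
        return s
    lowered = str(s).strip().lower()
    if lowered in ("true", "false"):
        return lowered == "true"
    raise ValueError(f"Cannot parse {s!r} as a boolean")
```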
Closing this issue, but once again lmk if you run into this again and feel free to reopen!
Hey, another small parsing issue.
My sample case:
```python
output_parser = GuardrailsOutputParser.from_rail_string(rail_spec)

transcript = """
Person: Hi, can you help me draft a business letter?
AI Assistant: Of course, what type of letter are you looking to draft?
Person: I need to send a formal email to a potential client to introduce my business.
AI Assistant: Great. Can you give me some more details about your business and the purpose of the email?
Person: Sure, my business provides marketing services and I want to introduce our services to the potential client and request a meeting to discuss potential collaboration.
AI Assistant: Understood. Would you like me to provide you with a template to start with?
Person: Yes, that would be helpful.
AI Assistant: Okay, here's a draft you can start with:
Dear [Client Name],
I hope this email finds you well. My name is [Your Name], and I am reaching out from [Your Business Name], a marketing services company. We specialize in helping businesses like yours increase their brand awareness and drive sales through various marketing strategies. I would like to take this opportunity to introduce our services to you and discuss potential collaboration opportunities. I would like to request a meeting to discuss in more detail how we can help your business grow. Please let me know if this is something that interests you, and we can schedule a time that works best for you. Thank you for considering [Your Business Name]. I look forward to hearing back from you.
Best regards,
[Your Name]
"""

chat_prompt = PromptTemplate(
    template=output_parser.guard.base_prompt, input_variables=["transcript"]
)
chain = LLMChain(llm=ChatOpenAI(temperature=0), prompt=chat_prompt)
output = chain.run(transcript=transcript)
output_parser.parse(output)
```
Notably, the LLM is producing good output; these are the values I see in the last stack frame:
I'm not yet too familiar with the codebase, but my hunch is that we don't need to parse bools at all, assuming the LLM produces valid JSON. Alternatively, if the goal is to support non-JSON formats down the line, it may be worthwhile to pattern match on the type of `s` to route to the right parsing logic.