antonio-bravo / globaldevopsexperience_gdex-afterevent


Challenge 5: Integrating Diverse Event Data and Enhancing Email Entry Efficiency #47

Open antonio-bravo opened 1 week ago

antonio-bravo commented 1 week ago

Challenge 5: Integrating Diverse Event Data and Enhancing Email Entry Efficiency

As a Product Owner, I want to create a solution that seamlessly integrates event data from various formats into our primary events database and makes email entry more efficient for Shirley, Robert Green's PA. This will ensure consistency across our expanded event listings and reduce manual data-entry errors.

Why:

Acceptance Criteria:

  1. Command-Line Tool for Data Integration:

    • Develop a command-line tool capable of importing files in EDIFACT format, as well as text dumps from websites and emails, into our storage account.
    • This tool should parse these diverse formats and standardize them into a consistent format that can be directly imported into our events database.
  2. Enhanced Data Entry Form:

    • Upgrade the current form used by Shirley with Smart Components to facilitate easy pasting and automatic parsing of email content (a rough sketch of such a form follows this list).
    • Implement simple validations to ensure data integrity and reduce errors in the manually entered data.
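
For reference, here is a minimal sketch of what such an upgraded form could look like. It assumes the experimental SmartComponents.AspNetCore package is installed, registered in Program.cs, and its component namespace imported via _Imports.razor; the EventEntry model, field names, and SaveAsync handler are illustrative placeholders rather than the real form.

```razor
@using System.ComponentModel.DataAnnotations

@* Sketch only: assumes the SmartComponents package is set up for this app. *@
<EditForm Model="entry" OnValidSubmit="SaveAsync">
    <DataAnnotationsValidator />
    <ValidationSummary />

    <InputText @bind-Value="entry.Title" placeholder="Event title" />
    <InputText @bind-Value="entry.Venue" placeholder="Venue" />
    <InputDate @bind-Value="entry.Date" />
    <InputText @bind-Value="entry.ContactEmail" placeholder="Contact e-mail" />

    @* SmartPasteButton inspects the form fields and tries to fill them from
       whatever Shirley has copied to the clipboard (e.g. an email body). *@
    <SmartPasteButton DefaultIcon />
    <button type="submit">Save</button>
</EditForm>

@code {
    private EventEntry entry = new();

    // Placeholder: the real handler would write to the events database.
    private Task SaveAsync() => Task.CompletedTask;

    public class EventEntry
    {
        [Required] public string? Title { get; set; }
        [Required] public string? Venue { get; set; }
        public DateTime? Date { get; set; }
        [EmailAddress] public string? ContactEmail { get; set; }
    }
}
```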

Tasks:


"Technology should improve your life... not become your life." - Billy Cox

antonio-bravo commented 1 week ago

Alex Fletcher Braindump

Alex Fletcher:

Hey team! 🌟 Had a brainwave while I was deep in the code trenches last night. We've been looking for ways to parse these files, right? Well, I think I've stumbled upon a couple of game-changers that might just do the trick! 🚀

You know those unstructured files that have been a headache? Well, I think I've found our aspirin. Plus, I've got something that could really help Shirley with her daily "put the email in the system" tasks! 🚀

Here's the scoop! Instead of banging our heads against the wall trying to write custom parsers, we can use AI to do the parsing for us. Semantic Kernel (LLMs in code!) is a C# library for calling Large Language Models (LLMs) from our own code, so we can point it at these files and let it handle the parsing. It's like having a data whisperer in our toolkit, turning chaos into structured data without the migraine. And for Shirley, I found the Smart Paste Blazor component! It's like the clipboard has suddenly got a brain! 🧠✂️
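
To make this concrete, here's a rough sketch of how the command-line tool could hand a raw file to Semantic Kernel and let the model pull out the fields we need. It assumes an Azure OpenAI chat deployment; the deployment name, endpoint, and environment variable below are placeholders, not our real settings.

```csharp
using Microsoft.SemanticKernel;

// Sketch only: assumes an Azure OpenAI chat deployment; the deployment name,
// endpoint and API key source are placeholders.
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "gpt-4o",
        endpoint: "https://<your-resource>.openai.azure.com/",
        apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!)
    .Build();

// Read whatever the CLI was pointed at: an EDIFACT file, a website text dump,
// or a saved email.
string raw = await File.ReadAllTextAsync(args[0]);

// Let the model do the parsing: extract the fields our events database needs.
var result = await kernel.InvokePromptAsync(
    """
    Extract the event title, venue, start date and contact e-mail address from
    the text below. Reply with only a JSON object using those field names.

    {{$input}}
    """,
    new KernelArguments { ["input"] = raw });

Console.WriteLine(result.ToString());
```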

I've already created the basic command-line tool; we only need to program the AI part. I've also mapped out some initial ideas on how to tackle the problem and set up a Wiki page to brainstorm how we can integrate these tools seamlessly into our workflow. Simply accept my pull request and take it from there.

I already have some preliminary thoughts on how we should write our prompts so we get JSON back (sketched below)! Check out the initial ideas and contribute your thoughts on our Wiki. 📝
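
As a starting point for that discussion, here's one possible shape for the "get JSON back" half, assuming the prompt instructs the model to answer with a bare JSON object. The ImportedEvent record is illustrative only and should be aligned with the actual events schema.

```csharp
using System.Text.Json;

// Illustrative target shape for a standardized event record; the real schema
// should match whatever the events database expects.
public record ImportedEvent(string Title, string Venue, DateTime StartDate, string? ContactEmail);

public static class EventJson
{
    private static readonly JsonSerializerOptions Options = new(JsonSerializerDefaults.Web);

    // The prompt asks the model to reply with only a JSON object, so the raw
    // response can be deserialized directly; a failed parse returns null so
    // the caller can retry or flag the file for manual review.
    public static ImportedEvent? TryParse(string llmResponse)
    {
        try
        {
            return JsonSerializer.Deserialize<ImportedEvent>(llmResponse, Options);
        }
        catch (JsonException)
        {
            return null;
        }
    }
}
```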

  • Stuck or need some insights? Just hit /help for guidance. 🆘
  • Want to dive deeper into how these technologies can transform our work? /expert-tip is your go-to. 📘
  • Ready for me to spearhead the integration of these tools? Type /fix and I'll get right on it. 🔧
  • Want to ensure the implementation aligns with our project standards? /verify to have me double-check. 🔍

Oh! And a quick reminder: if you're ready for the next challenge, type /finish. Emily will close the issue for you, so don't close it yourself. But before that, make sure to use /fix and implement the provided code so you're all set for the next challenge.