GatorEducator / gatorgrade

:heavy_check_mark: Generate or Run a Suite of GatorGrader Checks

feat: gatorgrade feedback #141

Open · Chezka109 opened this issue 5 days ago

Chezka109 commented 5 days ago

Description:

The current version of GatorGrade does not ask users for feedback after they have used the tool, which misses an opportunity to collect insights about the user experience, potential bugs, and areas for improvement.

Expected Behavior:

After running GatorGrade, users should be prompted to leave feedback on their experience with the tool. This feature would display a brief message asking if they would like to share feedback, with an option to submit it through a form or directly via the tool. Gathering feedback would help identify pain points and areas for improvement.

Actual Behavior:

Currently, GatorGrade runs its checks and provides output, but there is no mechanism for users to offer feedback directly after usage. As a result, opportunities to understand the user's perspective and to catch otherwise unreported issues are missed.

Proposed Solution:

Implement a feature that prompts users for feedback after they use GatorGrade. This could involve displaying a short message at the end of the output that directs users to a feedback form or collects responses directly within the tool. This feedback mechanism would help the development team gather valuable insights and continuously improve the tool.
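As a rough illustration only, the sketch below shows one way such a prompt could run after the checks finish; the function name, the `GATORGRADE_NO_FEEDBACK` environment variable, and the form URL are all hypothetical and are not part of GatorGrade today:

```python
"""Hypothetical sketch of an end-of-run feedback prompt for GatorGrade."""

import os
import webbrowser

# Placeholder URL; a real implementation would point at an actual form
FEEDBACK_FORM_URL = "https://example.com/gatorgrade-feedback"


def prompt_for_feedback() -> None:
    """Ask the user once, after the checks finish, whether to share feedback."""
    # Skip the prompt in non-interactive runs (e.g., CI) or when opted out
    if not os.isatty(0) or os.environ.get("GATORGRADE_NO_FEEDBACK"):
        return
    answer = input("Would you like to share feedback on GatorGrade? [y/N] ")
    if answer.strip().lower().startswith("y"):
        # Try to open the feedback form; fall back to printing the link
        if not webbrowser.open(FEEDBACK_FORM_URL):
            print(f"Please visit {FEEDBACK_FORM_URL} to share your feedback.")
    else:
        print(f"No problem! You can always share feedback later at {FEEDBACK_FORM_URL}")


if __name__ == "__main__":
    # Example usage: call this after the check results have been displayed
    prompt_for_feedback()
```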

gkapfham commented 2 days ago

Hello @Chezka109, here are some questions to consider:

- Where would the feedback be stored?
- For how long would the feedback be stored?
- How would the data be analyzed?
- How would the data influence the development of GatorGrade and other affiliated tools?
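To make the storage question concrete, one possibility would be appending each submission to a JSON Lines log for later analysis; the record shape below is only a sketch, and every field name in it is hypothetical:

```python
"""Sketch of one possible feedback record, assuming JSON Lines storage."""

import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone


@dataclass
class FeedbackRecord:
    """One anonymous feedback submission."""

    submitted_at: str        # ISO-8601 timestamp; no user identity stored
    gatorgrade_version: str  # helps correlate feedback with releases
    rating: int              # e.g., a 1-5 satisfaction score
    comment: str             # free-form text from the user


def append_feedback(record: FeedbackRecord, path: str = "feedback.jsonl") -> None:
    """Append one record to a JSON Lines file for later analysis."""
    with open(path, "a", encoding="utf-8") as log:
        log.write(json.dumps(asdict(record)) + "\n")


if __name__ == "__main__":
    # Example usage with placeholder values
    append_feedback(
        FeedbackRecord(
            submitted_at=datetime.now(timezone.utc).isoformat(),
            gatorgrade_version="0.0.0",
            rating=5,
            comment="The check output was easy to read.",
        )
    )
```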