Call-for-Code-for-Racial-Justice / Open-Sentencing-UI

The Open Sentencing User Interface (UI) is meant to give public defenders and others an easy way to review the contents of a case and determine when bias was detected.

Epic - Public Defender enters a new case #86


chenhuyr commented 2 years ago

Background on the problem the feature will solve/improved user experience

In this main use-case scenario, no data has yet been entered for the case. The person entering it may be a public defender or a staff member.

User Story 1

A public defender, ‘Bob’, has a new case to work on. He has been assigned a young man who is accused of committing second-degree arson in Michigan and is facing the full penalty of 20 years. From quick research he learns:

The penalty for second-degree arson is a felony conviction punishable by up to 20 years in prison, or a fine of not more than $20,000 or three times the value of the property damaged or destroyed, whichever is greater, or both.

Bob wants to know if his client’s possible sentence of 20 years is fair given that his client is Black. He has seen his white clients receive lesser sentences.

He will go into our tool and enter all of the case information. He is curious whether bias detection will flag that Black defendants typically receive longer sentences for this crime. If it does, he will share this information with his client and present it to the prosecution. Bob will share the information by printing a final result form from his computer. The print-out should clearly show that BIAS was detected per the algorithm and that his defendant is facing more time in jail simply because of the color of his skin. He will give the prosecutor, judge, and defendant copies of the document. With this data to help, he talks the sentencing team into less time in jail.
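As a rough illustration of what the printable “final result form” in this story might carry, here is one possible shape for the bias check's output. Every name below is an assumption made for discussion, not the project's actual API or data model:

```typescript
// Hypothetical result of the bias check for one case; all names here are
// illustrative assumptions, not the Open Sentencing UI's actual model.
interface BiasResult {
  biasDetected: boolean;          // the flag the print-out must show clearly
  charge: string;                 // e.g. "Arson - 2nd degree"
  proposedSentenceMonths: number; // e.g. 240 (the 20 years Bob's client faces)
  typicalSentenceMonths: number;  // typical sentence for comparable cases, all races
  disparityMonths: number;        // how much longer the proposed sentence is
  summary: string;                // plain-language explanation for prosecutor and judge
}
```

A call along the lines of `checkBias(caseDetails): Promise<BiasResult>` (again, a made-up name) would give the UI enough to render both the on-screen flag and the copies Bob prints for the prosecutor, judge, and defendant.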

User Story 2

A public defender, Lea, has asked her staff member Jill to enter 15 new cases into the Open Sentencing tool. Jill gets busy entering case details one at a time and saving. In the end, she prints the details of each case for Lea and places them in a binder. All the bias found in each case is highlighted. Jill makes sure Lea knows which cases have disparate sentences based on race so action can be taken. (Note: this makes me think a summary report would be good in some way.)
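Following up on the summary-report note, here is a minimal sketch of how per-case results could be rolled up for Lea. The types and function below are assumptions for discussion only, not existing code in this repo:

```typescript
// Minimal per-case record, mirroring the sketch in the previous comment;
// every name is an illustrative assumption.
interface SavedCase {
  caseId: string;
  defendantName: string;
  biasDetected: boolean;
  disparityMonths: number; // extra months versus the typical sentence
}

// Roll all saved cases up into the summary Jill would hand to Lea.
function summarizeCases(cases: SavedCase[]) {
  const flagged = cases.filter(c => c.biasDetected);
  return {
    totalCases: cases.length,
    flaggedCases: flagged.length,
    // Worst disparities first, so Lea can prioritize which cases to act on.
    flaggedDetails: flagged.sort((a, b) => b.disparityMonths - a.disparityMonths),
  };
}
```

With something like this, the tool could print one summary page on top of the 15 individual case print-outs in Jill's binder.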

Tasks

Adding Defendant Demographic Details

Adding Defendant Case Details
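Purely to make the two tasks above concrete, the form data might split along these lines. None of these field names come from the repo; they are placeholders for discussion:

```typescript
// Hypothetical field groups for the two entry tasks; every field name is
// an assumption, not the project's actual schema.

// "Adding Defendant Demographic Details"
interface DefendantDemographics {
  firstName: string;
  lastName: string;
  age: number;
  race: string;
  gender?: string;
}

// "Adding Defendant Case Details"
interface CaseDetails {
  state: string;                  // e.g. "MI"
  charge: string;                 // e.g. "Arson - 2nd degree"
  priorConvictions: number;
  proposedSentenceMonths: number; // e.g. 240 for 20 years
}
```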

Acceptance Criteria

N/A

github-actions[bot] commented 2 years ago

Thank you so much for contributing to our work!

github-actions[bot] commented 2 years ago

:wave: Hi! This issue has been marked stale due to inactivity. If no further activity occurs, it will automatically be closed.