
5.3 Computing Bias Lesson Team Teach | CompSci Blogs #7

Open utterances-bot opened 9 months ago

utterances-bot commented 9 months ago


Computing Bias Lesson

https://davidl0914.github.io/student/2023/12/11/Computing-Bias-Team-Teach_IPYNB2.html

DavidL0914 commented 9 months ago

To address bias in the predictive policing algorithm, the city should prioritize transparency: regularly audit the algorithm's impact and clearly explain how its decisions are made. It should also diversify and regularly update the training data to reduce historical biases, incorporating feedback from the affected communities. Beyond that, it could apply bias mitigation techniques, such as pre-processing the training data and imposing fairness constraints, and establish continuous monitoring with a feedback loop so the algorithm can adapt over time. Together, these measures would help create a fair, transparent, and accountable predictive policing system that respects civil rights and community concerns. The mitigation method described here is commonly known as "fairness constraints": restrictions or adjustments built into the algorithm to ensure fair and equitable treatment across different demographic groups.
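As a minimal sketch of what monitoring a fairness constraint could look like, the snippet below checks demographic parity: whether the model flags members of different groups at similar rates. The group labels, predictions, and tolerance value are all hypothetical illustrations, not taken from any real policing system or dataset.

```python
# Hypothetical sketch: auditing a model's outputs against a
# demographic-parity fairness constraint. All data here is made up.

def selection_rates(groups, predictions):
    """Fraction of positive (flagged) predictions per demographic group."""
    totals, positives = {}, {}
    for g, p in zip(groups, predictions):
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + (1 if p else 0)
    return {g: positives[g] / totals[g] for g in totals}

def violates_parity(groups, predictions, tolerance=0.1):
    """True if the largest gap in selection rates exceeds the tolerance."""
    rates = selection_rates(groups, predictions)
    return max(rates.values()) - min(rates.values()) > tolerance

# Illustrative audit: the model flags 75% of group A but only 25% of group B.
groups      = ["A", "A", "A", "A", "B", "B", "B", "B"]
predictions = [1, 1, 1, 0, 1, 0, 0, 0]

print(selection_rates(groups, predictions))  # {'A': 0.75, 'B': 0.25}
print(violates_parity(groups, predictions))  # True: 0.5 gap exceeds 0.1
```

In a continuous-monitoring setup, a check like this would run on each batch of new predictions, and a parity violation would trigger review or retraining rather than silent deployment.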