fani-lab / Adila

Fairness-Aware Team Formation

Investigating gender fairness of recommendation algorithms in the music domain #76

Open Rounique opened 1 year ago

Rounique commented 1 year ago

Link: https://dl.acm.org/doi/10.1016/j.ipm.2021.102666 Year: 2021 Venue: Information Processing and Management

In this paper, a novel dataset of music listening records is introduced to study bias in recommender systems with respect to users' demographics. The authors define fairness in terms of the performance gap between demographic groups and evaluate several collaborative filtering algorithms on both accuracy and fairness metrics. They find significant unfairness between the male and female user groups and examine how recommender algorithms can amplify the underlying population bias.
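
For concreteness, here is a minimal Python sketch of that performance-gap notion of fairness (our own toy code, not the authors' implementation; the `ndcg_at_k` helper, the user-to-group mapping, and the scores are assumptions):

```python
# Minimal sketch (not the paper's code): evaluate a ranking metric per user,
# then compare the mean metric across demographic groups.
import numpy as np

def ndcg_at_k(relevances, k=10):
    """NDCG@k for a single ranked list of binary/graded relevances (assumed helper)."""
    rel = np.asarray(relevances, dtype=float)[:k]
    if rel.size == 0:
        return 0.0
    discounts = 1.0 / np.log2(np.arange(2, rel.size + 2))
    dcg = float(np.sum(rel * discounts))
    idcg = float(np.sum(np.sort(rel)[::-1] * discounts))
    return dcg / idcg if idcg > 0 else 0.0

def group_performance_gap(per_user_scores, user_groups):
    """Per-group mean of a metric and the absolute gap between two groups."""
    groups = {}
    for user, score in per_user_scores.items():
        groups.setdefault(user_groups[user], []).append(score)
    means = {g: float(np.mean(s)) for g, s in groups.items()}
    (g1, m1), (g2, m2) = list(means.items())[:2]
    return means, abs(m1 - m2)

# Toy example: per-user NDCG@10 and a male/female group label (made up).
scores = {"u1": ndcg_at_k([1, 0, 1, 0]), "u2": ndcg_at_k([0, 0, 1, 1]),
          "u3": ndcg_at_k([1, 1, 0, 0]), "u4": ndcg_at_k([0, 1, 0, 0])}
groups = {"u1": "F", "u2": "F", "u3": "M", "u4": "M"}
print(group_performance_gap(scores, groups))
```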

Additionally, they explore the effectiveness of a resampling strategy as a debiasing method, which slightly improves the fairness measures while maintaining accuracy. The main contributions of the study are the introduction of a large-scale real-world dataset, the identification of algorithms that are robust to gender bias, and an assessment of the impact of data debiasing.
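
The resampling idea could look roughly like this (a sketch under our own assumptions about the data layout; the column names and the toy frame are not from the paper):

```python
# Sketch: balance the interaction data so both gender groups contribute
# equally before training, by downsampling the majority group.
import pandas as pd

def downsample_to_minority(interactions: pd.DataFrame, group_col: str = "gender",
                           seed: int = 0) -> pd.DataFrame:
    """Randomly drop rows from the majority group until group sizes match."""
    target = interactions[group_col].value_counts().min()
    return (interactions.groupby(group_col, group_keys=False)
                        .apply(lambda g: g.sample(n=target, random_state=seed)))

# Toy example: 3 interactions from male users, 1 from a female user
# -> 1 of each after resampling.
toy = pd.DataFrame({"user": ["u1", "u2", "u3", "u4"],
                    "item": ["i1", "i2", "i3", "i1"],
                    "gender": ["M", "M", "M", "F"]})
print(downsample_to_minority(toy))
```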

Metrics used: NDCG, Recall, Diversity, Coverage
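
NDCG and Recall are standard; for the other two, below is a rough sketch of how catalogue coverage and intra-list diversity are typically computed (our own assumptions, not necessarily the paper's exact definitions; the item embeddings are made up):

```python
# Sketch of two list-level metrics over the recommendation lists per user.
import itertools
import numpy as np

def catalogue_coverage(rec_lists, catalogue):
    """Fraction of the item catalogue that appears in at least one list."""
    recommended = set(itertools.chain.from_iterable(rec_lists))
    return len(recommended & set(catalogue)) / len(catalogue)

def intra_list_diversity(rec_list, item_vectors):
    """Average pairwise (1 - cosine similarity) among items in one list."""
    dists = []
    for a, b in itertools.combinations(rec_list, 2):
        va, vb = item_vectors[a], item_vectors[b]
        cos = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
        dists.append(1.0 - cos)
    return float(np.mean(dists)) if dists else 0.0

# Toy example with hypothetical item embeddings.
vecs = {"i1": np.array([1.0, 0.0]), "i2": np.array([0.0, 1.0]), "i3": np.array([1.0, 1.0])}
lists = [["i1", "i2"], ["i2", "i3"]]
print(catalogue_coverage(lists, ["i1", "i2", "i3", "i4"]))  # 0.75
print(intra_list_diversity(["i1", "i2", "i3"], vecs))
```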

The results show that diversity and coverage can be improved without any significant change in NDCG or Recall.

Github code: https://github.com/CPJKU/recommendation_systems_fairness

hosseinfani commented 1 year ago

@Rounique thanks.

Rounique commented 1 year ago

@hosseinfani These metrics seem applicable to the team formation problem, and I believe we can use their resampling method as a pre-processing debiasing step and see whether it makes the formed teams fairer, as roughly sketched below.
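
For example, a toy version of such pre-processing resampling on team data could look like this (all names and the data layout here are hypothetical, just to illustrate the idea on a toy sample):

```python
# Toy sketch: oversample teams that contain members of the underrepresented
# gender so the training set is more balanced before training the model.
import random

def oversample_minority_teams(teams, member_gender, minority="F", seed=0):
    """teams: list of member-id lists; member_gender: member id -> gender label."""
    rng = random.Random(seed)
    has_min = [t for t in teams if any(member_gender[m] == minority for m in t)]
    no_min = [t for t in teams if not any(member_gender[m] == minority for m in t)]
    if not has_min or not no_min:
        return list(teams)
    # Duplicate minority-inclusive teams until the two sides are balanced.
    extra = [rng.choice(has_min) for _ in range(max(0, len(no_min) - len(has_min)))]
    return teams + extra

# Toy sample: 3 all-male teams, 1 mixed team -> 2 extra copies of the mixed team.
gender = {"a": "M", "b": "M", "c": "M", "d": "F"}
teams = [["a", "b"], ["a", "c"], ["b", "c"], ["c", "d"]]
print(oversample_minority_teams(teams, gender))
```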

hosseinfani commented 1 year ago

@Rounique perfect. please start prototyping them (metrics and resampling) on toy samples of team formation and keep me posted. thanks.