Primary Submission Category: Causal Inference and Bias/Discrimination

Detecting and Mitigating Discriminatory Bias in Treatment Assignment Policies: A Causal Algorithmic Fairness Approach with a Field Experiment

Authors: Joel Persson, Jurriën Bakker, Dennis Bohle, Florian von Wangenheim

Presenting Author: Joel Persson*

Heterogeneous treatment effect (HTE) prediction is often used to learn and optimize treatment assignment policies. Typically, treatment is assigned to those individuals for whom the predicted HTE exceeds a specified threshold. Discriminatory bias arises when the prediction error in the HTE varies systematically across protected groups (e.g., race), because members of some groups are then assigned incorrect treatment at a higher rate than others. We develop methods for detecting and mitigating such discriminatory bias. Our methods are based on group-wise estimation and inference of the error between the average HTE predicted by a model and the average HTE obtained from a consistent and unbiased estimator. The methods are general in that they make minimal assumptions on the prediction model and the estimator; we propose estimators based on randomized, regression discontinuity, and instrumental variable designs. Via simulations, we show that our methods are consistent and unbiased in detecting and mitigating discriminatory bias. To test the methods in practice, we partner with a leading travel marketplace and use data from a field experiment on targeted offers exceeding 1B USD in costs. We find discriminatory bias towards people from some countries and show that our methods mitigate it. Our work contributes to previous research by developing methods for detecting and mitigating discriminatory bias in treatment assignment policies and by demonstrating their performance in practice.
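The abstract does not give an implementation, but the core detection idea can be illustrated with a minimal sketch. Under a randomized design, the difference in mean outcomes between treated and control units within a protected group is an unbiased estimate of that group's average HTE, so the group-wise prediction error can be estimated as the mean predicted HTE minus this difference-in-means. Everything below (the simulated data, the biased prediction model, the variable names) is hypothetical and only illustrates the comparison; it is not the authors' method or code.

```python
# Hypothetical sketch: group-wise detection of discriminatory bias in HTE
# predictions, assuming data from a randomized experiment.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
group = rng.integers(0, 2, n)       # protected attribute (0/1), simulated
treat = rng.integers(0, 2, n)      # randomized treatment assignment
true_tau = np.where(group == 1, 2.0, 1.0)
y = true_tau * treat + rng.normal(0.0, 1.0, n)

# A hypothetical prediction model that systematically under-predicts
# the HTE for group 1, i.e., exhibits discriminatory bias.
tau_hat = np.where(group == 1, 1.2, 1.0) + rng.normal(0.0, 0.1, n)

errors = {}
for g in (0, 1):
    m = group == g
    # Unbiased difference-in-means estimate of the group's average HTE
    dm = y[m & (treat == 1)].mean() - y[m & (treat == 0)].mean()
    # Estimated group-wise prediction error: predicted minus estimated HTE
    errors[g] = tau_hat[m].mean() - dm
    print(f"group {g}: estimated error = {errors[g]:.3f}")
```

In this simulation, group 0's estimated error is close to zero while group 1's is clearly negative, flagging the biased predictions; in practice one would pair each error estimate with a standard error or confidence interval, and swap in regression discontinuity or instrumental variable estimators when randomization is unavailable.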