Primary Submission Category: Machine Learning and Causal Inference
Tailored Overlap for Observational Causal Inference and Domain Adaptation
Authors: Avi Feller, Alexander D’Amour, and Steven Yadlowsky
Presenting Author: David Bruns-Smith*
In observational causal inference and predictive modeling under distribution shift, assumptions about overlap between treatment/covariate groups or between training and test distributions are critical for identifying causal effects or finding an optimal predictive model. Standard theory quantifies overlap in terms of bounds on the inverse propensity score, which are typically measured using $\chi^2$ divergences. However, in modern settings with high-dimensional covariates, these standard divergence measures are often infinite. In this paper, we propose a new approach to measuring overlap that is tailored to a specific function class, which allows us to better capture the relationship between the treatment and the outcome, or between the covariates and the target variable. We show how $\chi^2$ divergences can be generalized to this restricted function class setting, and use this to motivate more widespread use of balancing-weight methods, which adjust the relative influence of different observations in the training data. These methods allow us to more accurately identify causal effects and optimal predictors, even in settings with high-dimensional covariates and limited overlap.
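The two ideas in the abstract can be illustrated with a minimal sketch (not the authors' method; the simulated logistic propensity, coefficients, and sample sizes are all illustrative assumptions). The first part estimates the standard $\chi^2$ overlap measure from propensity scores, via the second moment of the normalized density-ratio weights; the second part computes simple balancing weights tailored to the class of linear functions, by choosing minimum-norm control-unit weights that exactly match the treated-group covariate means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation: d-dimensional covariates, logistic propensity.
n, d = 5000, 10
X = rng.normal(size=(n, d))
beta = np.full(d, 0.3)                       # assumed propensity coefficients
e = 1.0 / (1.0 + np.exp(-X @ beta))          # true propensity scores e(x)
T = rng.binomial(1, e)                       # treatment assignment

# Standard overlap measure: density ratio between the treated covariate
# distribution P1 and the marginal P is w(x) = e(x) / P(T=1), so
# chi^2(P1 || P) = E_P[w(x)^2] - 1, estimated by a sample average.
w = e / e.mean()
chi2 = np.mean(w**2) - 1.0
print(f"estimated chi^2 overlap divergence: {chi2:.3f}")

# Balancing weights tailored to linear functions: minimum-norm weights q on
# control units satisfying the moment constraint X0^T q = mean(X1), i.e.
# exact mean balance, which suffices when outcomes are linear in X.
X1, X0 = X[T == 1], X[T == 0]
x1bar = X1.mean(axis=0)
q = X0 @ np.linalg.solve(X0.T @ X0, x1bar)
imbalance = np.max(np.abs(X0.T @ q - x1bar))
print(f"max covariate imbalance after weighting: {imbalance:.2e}")
```

The contrast is the point of the abstract: the $\chi^2$ divergence penalizes mismatch over all functions of the covariates and blows up as dimension grows, while the balancing weights only need to control imbalance over the restricted (here, linear) function class, which remains feasible in high dimensions.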