Primary Submission Category: Difference in Differences, Synthetic Control, Methods for Panel and Longitudinal Data

A Meta-learner for Heterogeneous Effects in DiD and General Conditional Functionals Under Covariate Shift

Authors: Hui Lan, Haoge Chang, Eleanor Dillon, Vasilis Syrgkanis

Presenting Author: Hui Lan*

We address the problem of estimating heterogeneous treatment effects in panel data, leveraging the popular difference-in-differences (DiD) framework under the parallel trends assumption. We propose a novel doubly robust meta-learner for the conditional average treatment effect on the treated. Our framework allows for interpretable projections onto lower-dimensional subsets of interest, and can be easily implemented as a convex loss minimization problem involving a set of auxiliary models. By leveraging Neyman orthogonality, our proposed approach is robust to estimation errors in the auxiliary models. As a generalization of this problem, we also provide an estimation framework for general functionals under covariate shift. Additionally, we extend our methodology to handle binary instruments, motivated by practical settings with non-compliance in treatment assignment. Empirical results demonstrate the superiority of our approach over existing baselines, and we provide a detailed discussion of how our meta-learner performs under different identifying assumptions.
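
As a point of reference, the sketch below illustrates one standard doubly robust pseudo-outcome construction for the conditional average treatment effect on the treated under parallel trends in a two-period panel. The variable names (X, D, dY), the choice of nuisance learners, and the final-stage regression are illustrative assumptions; this is not the specific convex loss minimization proposed in the abstract, only a minimal example of the general recipe (fit auxiliary models, form an orthogonal signal, then learn effect heterogeneity).

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.model_selection import cross_val_predict

def dr_did_catt(X, D, dY):
    """X: covariates, D: 0/1 treatment indicator, dY: post-minus-pre outcome change."""
    # Auxiliary model 1: treatment propensity given covariates (cross-fitted, clipped).
    prop = cross_val_predict(GradientBoostingClassifier(), X, D, method="predict_proba")[:, 1]
    prop = np.clip(prop, 0.01, 0.99)
    # Auxiliary models 2 and 3: outcome-change regressions in each arm,
    # m_d(X) = E[dY | X, D = d], evaluated on all units.
    m1 = GradientBoostingRegressor().fit(X[D == 1], dY[D == 1]).predict(X)
    m0 = GradientBoostingRegressor().fit(X[D == 0], dY[D == 0]).predict(X)
    # Doubly robust pseudo-outcome: under parallel trends its conditional mean
    # given X equals the CATT, and it stays valid if either the propensity or
    # the outcome-change models are misspecified.
    psi = m1 - m0 + D * (dY - m1) / prop - (1 - D) * (dY - m0) / (1 - prop)
    # Final stage: regress the pseudo-outcome on the covariates, or on a
    # lower-dimensional subset of interest for interpretable projections.
    return GradientBoostingRegressor().fit(X, psi)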