Primary Submission Category: Causal Bounds, Partial Identification
Information-Theoretic Causal Bounds
Authors: Yonghan Jung
Presenting Author: Yonghan Jung*
We develop an information-theoretic framework for partial identification of causal effects under unmeasured confounding. Existing partial identification approaches suffer from one or more of the following limitations: restricting outcomes to be bounded or discrete, requiring auxiliary inputs (instruments, proxies, or user-specified sensitivity parameters), necessitating full structural modeling, or neglecting effect heterogeneity. We address all of these limitations simultaneously through a novel information-theoretic divergence bound. Our key insight is that the f-divergence between the observational distribution P(Y | A=a, X=x) and the interventional distribution P(Y | do(A=a), X=x) is upper bounded by a function of the propensity score P(A=a | X=x). We translate these f-divergence bounds into sharp lower and upper bounds on conditional causal effects without requiring boundedness assumptions, auxiliary variables, or full specification of the data-generating process. We develop a semiparametric estimator of the proposed causal bounds that attains fast convergence rates even when its nuisance components converge slowly. Simulation studies and real-data applications demonstrate the practical utility of our bounds.
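The flavor of the propensity-score bound can be illustrated numerically in a toy model with a binary unmeasured confounder U. The sketch below checks one plausible instance of such a bound, D_KL(P(Y | A=a) || P(Y | do(a))) <= -log P(A=a), which follows from the data-processing inequality because both conditionals push U through the same channel P(Y | a, u) while P(U | a) tilts the prior by at most a factor 1/P(a). All probability values are hypothetical, and this inequality is only an illustrative example of the kind of bound described above, not the paper's exact result.

```python
import math

# Hypothetical toy model: binary confounder U, treatment A, outcome Y.
p_u = {0: 0.7, 1: 0.3}           # P(U = u)
p_a1_given_u = {0: 0.2, 1: 0.8}  # P(A = 1 | U = u)
p_y1_given_au = {0: 0.3, 1: 0.9} # P(Y = 1 | A = 1, U = u)

# Propensity score P(A = 1), marginalizing over the confounder.
p_a1 = sum(p_a1_given_u[u] * p_u[u] for u in (0, 1))

# Observational P(Y = 1 | A = 1): U's distribution is tilted by selection into A = 1.
p_y_obs = sum(p_y1_given_au[u] * p_a1_given_u[u] * p_u[u] for u in (0, 1)) / p_a1

# Interventional P(Y = 1 | do(A = 1)): U keeps its marginal distribution.
p_y_int = sum(p_y1_given_au[u] * p_u[u] for u in (0, 1))

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

kl = kl_bernoulli(p_y_obs, p_y_int)
bound = -math.log(p_a1)  # propensity-based upper bound, -log P(A = 1)
print(f"KL(obs || int) = {kl:.4f} <= -log P(A=1) = {bound:.4f}")
```

In this example the observational and interventional outcome distributions differ (0.679 vs 0.480 for P(Y = 1)), yet their divergence stays below the propensity-based ceiling; inverting such an inequality over all interventional distributions within the divergence ball is what yields lower and upper bounds on the causal effect.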
