
Primary Submission Category: Instrumental Variables

Regularized DeepIV with Model Selection

Authors: Hui Lan, Zihao Li, Vasilis Syrgkanis, Mengdi Wang, Masatoshi Uehara

Presenting Author: Hui Lan*

In this paper, we study nonparametric estimation of instrumental variable (IV) regressions. While recent advances in machine learning have introduced flexible methods for IV estimation, they often suffer from one or more of the following limitations: (1) requiring the IV regression to be uniquely identified; (2) relying on a minimax optimization oracle, which is highly unstable in practice; (3) lacking a model selection procedure. We analyze a Tikhonov-regularized variant of the seminal DeepIV method, called Regularized DeepIV (RDIV) regression, which converges to the least-norm IV solution and overcomes all three limitations. RDIV consists of two stages: first, we learn the conditional distribution of the covariates given the instrument; then, using the learned distribution, we learn the estimator by minimizing a Tikhonov-regularized loss function. We further show that RDIV admits model selection procedures that achieve the oracle rates in the misspecified regime. When extended to an iterative estimator, RDIV matches the current state-of-the-art convergence rate. Furthermore, we conduct numerical experiments that demonstrate the empirical efficiency of RDIV. Our results provide the first rigorous guarantees for the empirically well-established DeepIV method, showcasing the importance of regularization, which was absent from the original work.
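To make the two-stage structure concrete, here is a minimal numerical sketch of the RDIV idea on synthetic data. All modeling choices below are illustrative assumptions, not the paper's implementation: the first stage uses a Gaussian conditional model fit by least squares (rather than a learned deep conditional density), and the second stage uses a polynomial feature map in place of a neural network, so the Tikhonov-regularized loss reduces to a closed-form ridge regression on conditionally averaged features.

```python
# Illustrative two-stage RDIV-style estimator on synthetic IV data.
# Assumptions (not from the paper): Gaussian first stage, polynomial
# features for h, closed-form ridge solve for the regularized loss.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=n)                        # instrument
u = rng.normal(size=n)                        # unobserved confounder
x = z + 0.5 * u + 0.1 * rng.normal(size=n)    # endogenous covariate
y = np.sin(x) + u + 0.1 * rng.normal(size=n)  # outcome; true h(x) = sin(x)

def features(x):
    # Simple polynomial feature map standing in for a neural net h.
    return np.stack([np.ones_like(x), x, x**2, x**3], axis=1)

# Stage 1: learn the conditional distribution P(x | z); here a Gaussian
# with linear mean, estimated by ordinary least squares.
A = np.stack([np.ones_like(z), z], axis=1)
coef, *_ = np.linalg.lstsq(A, x, rcond=None)
resid_sd = np.std(x - A @ coef)

# Stage 2: approximate E[phi(x) | z] by Monte Carlo draws from the fitted
# distribution, then minimize the Tikhonov-regularized least-squares loss
#   min_theta  mean((y - E[phi(x)|z] @ theta)^2) + lam * ||theta||^2,
# which has a closed-form ridge solution.
m = 50
x_draws = (A @ coef)[:, None] + resid_sd * rng.normal(size=(n, m))
phi_bar = features(x_draws.ravel()).reshape(n, m, -1).mean(axis=1)

lam = 1e-2
d = phi_bar.shape[1]
theta = np.linalg.solve(phi_bar.T @ phi_bar / n + lam * np.eye(d),
                        phi_bar.T @ y / n)

# Evaluate the estimated structural function against the truth h(x) = sin(x).
x_grid = np.linspace(-2.0, 2.0, 50)
h_hat = features(x_grid) @ theta
print("max abs error vs sin(x):", np.max(np.abs(h_hat - np.sin(x_grid))))
```

Note how regularization enters only through `lam`: setting it to zero recovers the original (unregularized) DeepIV second stage, while a positive `lam` stabilizes the solve and targets the least-norm solution, which is the abstract's central point.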