Primary Submission Category: Randomized Designs and Analyses
Simulation-Based Inference After Adaptive Experiments
Authors: Aurelien Bibaut, Brian Cho, Nathan Kallus
Presenting Author: Aurelien Bibaut*
In recent years, much work has been done on inference after adaptive data collection. Most of these works provide methods that enforce asymptotic normality of test statistics. As has been noted in prior work, controlling the distribution of test statistics in adaptive experiments comes at the expense of power. In this article, we remove the need to control the asymptotic distribution of statistics by simulating the distribution of the statistic under candidate values of the parameter of interest. Our sole focus in constructing test statistics is then to maximize power. Unlike some existing works, our framework allows for arbitrary adaptive experimental designs; in particular, it does not require non-zero propensities, thereby accommodating UCB and LinUCB experimental designs. We provide generic conditions for valid inference on scalar parameters in semiparametric models. Numerical experiments demonstrate drastic improvements in power and confidence interval width over all existing baselines.
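The core idea described above — simulating the distribution of a test statistic under candidate parameter values and inverting the resulting tests into a confidence interval — can be illustrated with a minimal sketch. The setup below (a two-arm Gaussian bandit with UCB allocation, a sample-mean-difference statistic, and the grid of candidate values) is hypothetical and chosen only for illustration; it is not the authors' implementation. Note that UCB allocation here is deterministic given the history, so propensities can be zero, yet the simulation-based calibration still applies.

```python
import numpy as np

def run_ucb(theta, n, rng):
    # Two-arm Gaussian bandit with arm means (0.0, theta) and UCB allocation.
    means = np.array([0.0, theta])
    counts = np.zeros(2)
    sums = np.zeros(2)
    arms, rewards = [], []
    for t in range(n):
        if t < 2:
            a = t  # pull each arm once to initialize
        else:
            ucb = sums / counts + np.sqrt(2 * np.log(t + 1) / counts)
            a = int(np.argmax(ucb))  # deterministic: propensities may be zero
        r = rng.normal(means[a], 1.0)
        counts[a] += 1
        sums[a] += r
        arms.append(a)
        rewards.append(r)
    return np.array(arms), np.array(rewards)

def statistic(arms, rewards):
    # Sample-mean difference between arms; chosen for power, with no
    # normality correction, since calibration comes from simulation.
    return rewards[arms == 1].mean() - rewards[arms == 0].mean()

def pvalue(theta0, obs_stat, n, n_sim, rng):
    # Monte Carlo distribution of the statistic under the candidate theta0:
    # rerun the full adaptive experiment n_sim times under theta0.
    sims = np.array([statistic(*run_ucb(theta0, n, rng))
                     for _ in range(n_sim)])
    extreme = np.abs(sims - theta0) >= np.abs(obs_stat - theta0)
    return (1 + extreme.sum()) / (1 + n_sim)

rng = np.random.default_rng(0)
arms, rewards = run_ucb(0.5, 200, rng)     # the "observed" experiment
obs = statistic(arms, rewards)
grid = np.linspace(-0.5, 1.5, 41)          # candidate parameter values
ci = [th for th in grid if pvalue(th, obs, 200, 200, rng) > 0.05]
print(min(ci), max(ci))                    # simulation-based 95% CI endpoints
```

Inverting the simulated tests over the grid yields a confidence interval whose validity rests on the simulated and actual experiments sharing the same adaptive design, not on any asymptotic normality of the statistic.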