Double Debiased Machine Learning Nonparametric Inference with Continuous Treatments


Bibliographic Details
Main Authors: Kyle Colangelo, Ying-Ying Lee
Format: Report
Language: English
Subjects: DML
Online Access:http://arxiv.org/pdf/2004.03036
Description
Summary: We propose a nonparametric inference method for causal effects of continuous treatment variables, under unconfoundedness and in the presence of high-dimensional or nonparametric nuisance parameters. Our double debiased machine learning (DML) estimators for the average dose-response function (or the average structural function) and the partial effects are asymptotically normal with nonparametric convergence rates. The nuisance estimators for the conditional expectation function and the conditional density can be nonparametric kernel or series estimators or ML methods. Using a kernel-based doubly robust influence function and cross-fitting, we give primitive conditions under which the nuisance estimators do not affect the first-order large-sample distribution of the DML estimators. We further give low-level conditions for kernel and series estimators, as well as for modern ML methods, namely the generalized random forest and deep neural networks. We justify the use of a kernel to localize the continuous treatment at a given value via the Gateaux derivative. We implement various ML methods in Monte Carlo simulations and in an empirical application evaluating a job training program.
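
The following is a minimal sketch, not the authors' implementation, of the kernel-localized doubly robust score with cross-fitting described in the summary: beta_hat(t) averages gamma_hat(t, X_i) plus a kernel-weighted, density-adjusted residual correction, with nuisance functions fit on held-out folds. The random forest learners, the Gaussian residual model for the conditional density f(t|x), the Epanechnikov kernel, and all function and variable names here are illustrative assumptions.

# Sketch of a cross-fitted, kernel-localized doubly robust estimator of the
# average dose-response E[Y(t)] at a treatment level t.  Nuisance learners
# and the conditional-density model are placeholder choices.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75 (1 - u^2) 1{|u| <= 1}."""
    return 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)

def dml_dose_response(Y, T, X, t, h, n_folds=5, seed=0):
    """Cross-fitted estimate of E[Y(t)] at treatment value t with bandwidth h."""
    n = len(Y)
    psi = np.zeros(n)  # per-observation influence-function contributions
    for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
        # gamma(t, x) = E[Y | T = t, X = x], fit on the complement folds
        gamma = RandomForestRegressor(random_state=seed).fit(
            np.column_stack([T[train], X[train]]), Y[train])
        # crude conditional density f(t | x): Gaussian model for T given X
        t_model = RandomForestRegressor(random_state=seed).fit(X[train], T[train])
        sigma = (T[train] - t_model.predict(X[train])).std()
        mu_test = t_model.predict(X[test])
        f_tx = np.exp(-0.5 * ((t - mu_test) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
        # doubly robust, kernel-localized score evaluated on the held-out fold
        g_t = gamma.predict(np.column_stack([np.full(len(test), t), X[test]]))
        g_T = gamma.predict(np.column_stack([T[test], X[test]]))
        K = epanechnikov((T[test] - t) / h) / h
        psi[test] = g_t + K * (Y[test] - g_T) / np.clip(f_tx, 1e-3, None)
    est = psi.mean()
    se = psi.std(ddof=1) / np.sqrt(n)  # plug-in standard error; the kernel's 1/h variance is absorbed in psi
    return est, se

Evaluating dml_dose_response over a grid of treatment values traces out the estimated average dose-response curve; the bandwidth h controls the usual bias-variance trade-off of the kernel localization, consistent with the nonparametric (nh)^{1/2} rate mentioned in the summary.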