Transparent and Robust Causal Inferences in the Social and Health Sciences

Carlos Leonardo Kulnig Cinelli
PhD, 2021
Advisor: Hazlett, Chad J
The past few decades have witnessed rapid and unprecedented theoretical progress in the science of causal inference. Most of this progress, however, relies on strong, exact assumptions, such as the absence of unobserved confounders or the absence of certain direct effects. Unfortunately, more often than not these assumptions are not only untestable but also very hard to defend in practice. This dissertation develops new theory, methods, and software for drawing causal inferences in more realistic settings. These tools allow applied scientists both to examine the sensitivity of their causal inferences to violations of the underlying assumptions and to draw robust (albeit more modest) conclusions in settings where traditional methods fail. Specifically, our contributions are: (i) a novel, powerful, yet simple suite of sensitivity analysis tools for popular methods, such as confounding adjustment and instrumental variables, that can be immediately put to use to improve the robustness and transparency of current applied research; (ii) the first formal, systematic approach to sensitivity analysis for arbitrary linear structural causal models; and (iii) novel (partial) identification results that marry two apparently disparate areas of causal inference research: the generalization of causal effects and the identification of “causes of effects.” These methods are illustrated with examples from the social and health sciences.
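As a rough illustration of the kind of sensitivity quantities that contribution (i) refers to, the minimal Python sketch below computes an omitted-variable bias bound and a robustness value in the standard partial R² parameterization used in this line of work; the function names and the numbers in the usage lines are illustrative assumptions, not values taken from the dissertation.

import math

def ovb_bound(se, dof, r2yz_dx, r2dz_x):
    # Bound on the omitted-variable bias of an OLS coefficient, given the
    # partial R^2 of a hypothetical unobserved confounder Z with the outcome
    # (r2yz_dx) and with the treatment (r2dz_x), conditional on observed covariates.
    return se * math.sqrt(r2yz_dx * r2dz_x / (1.0 - r2dz_x)) * math.sqrt(dof)

def robustness_value(t_stat, dof, q=1.0):
    # Minimal strength of association (a partial R^2 shared with both treatment
    # and outcome) an unobserved confounder would need to reduce the point
    # estimate by 100*q percent.
    fq = q * abs(t_stat) / math.sqrt(dof)
    return 0.5 * (math.sqrt(fq**4 + 4.0 * fq**2) - fq**2)

# Made-up regression summary for illustration only: estimate 0.10, se 0.02, 500 dof.
estimate, se, dof = 0.10, 0.02, 500
print(ovb_bound(se, dof, r2yz_dx=0.1, r2dz_x=0.1))       # worst-case bias at these confounder strengths
print(robustness_value(t_stat=estimate / se, dof=dof))   # strength needed to explain away the estimate

In words, the first function answers "how much could a confounder of a given strength change my estimate?", and the second answers "how strong would a confounder have to be to overturn my conclusion?"; these are sketches under the stated assumptions, not the dissertation's full toolkit.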