eScholarship
Open Access Publications from the University of California

UC Riverside Electronic Theses and Dissertations

Topics in Artificial Neural Networks: Causal Inference and Functional Derivative Estimation

Abstract

Advances in computing technology have brought renewed attention to artificial neural networks (ANNs). ANNs have proved powerful and efficient in a wide range of classification and regression problems and are gaining popularity across many fields. In this dissertation, we investigate treatment effect estimation using fully connected shallow networks, and we explore sparse deep neural network regression and functional derivative estimation. The first part of this dissertation provides a unified framework for efficient estimation of various types of treatment effects (TE) in observational data with a diverging number of covariates through a generalized optimization. We allow the number of confounders to increase with the sample size and investigate how fast it can grow while still ensuring root-n consistency of the resulting TE estimator. Moreover, we establish asymptotic normality and semiparametric efficiency of the TE estimator. The second part of this dissertation proposes a penalized deep ReQU network estimator (PDRN) obtained within an empirical risk minimization framework. The proposed network is based on Jacobi polynomial approximation on the hyperbolic cross/sparse grid and alleviates the curse of dimensionality. The PDRN estimator also provides smooth functional derivative estimates. Our estimators are illustrated through simulation studies and multiple real data examples.
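To make the second part more concrete, the following is a rough, illustrative sketch only, not the dissertation's PDRN construction from Jacobi polynomials on a hyperbolic cross: a small deep network with ReQU activations (squared ReLU), fit by penalized empirical risk minimization on toy data, with the input gradient of the fitted function obtained by automatic differentiation as a stand-in for functional derivative estimation. The architecture sizes, the L1 penalty, and the toy data are all assumptions made for the example.

```python
import torch
import torch.nn as nn

# ReQU (rectified quadratic unit) activation: relu(x)**2. Squaring the ReLU
# makes the fitted network smooth in its inputs, which is what allows
# derivative estimation by differentiating the fitted function.
class ReQU(nn.Module):
    def forward(self, x):
        return torch.relu(x) ** 2

# Small deep ReQU regression network (illustrative architecture only).
net = nn.Sequential(
    nn.Linear(3, 32), ReQU(),
    nn.Linear(32, 32), ReQU(),
    nn.Linear(32, 1),
)

# Toy regression data: y = sin(x1) + x2 * x3 plus noise (assumed example).
torch.manual_seed(0)
X = torch.randn(512, 3)
y = torch.sin(X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * torch.randn(512)

opt = torch.optim.Adam(net.parameters(), lr=1e-2)
lam = 1e-4  # penalty weight (hypothetical choice)
for _ in range(500):
    opt.zero_grad()
    pred = net(X).squeeze(-1)
    # Empirical risk plus an L1 weight penalty, standing in for the
    # dissertation's sparsity-inducing penalization.
    penalty = sum(p.abs().sum() for p in net.parameters())
    loss = ((pred - y) ** 2).mean() + lam * penalty
    loss.backward()
    opt.step()

# Functional derivative estimate at a point: gradient of the fitted
# regression function with respect to its inputs, via autograd.
x0 = torch.zeros(1, 3, requires_grad=True)
grad = torch.autograd.grad(net(x0).sum(), x0)[0]
print(grad)  # estimated partial derivatives of the fitted function at x0
```

Because the ReQU activation is continuously differentiable, the input gradient above is itself a continuous function, which is the sense in which such estimators can deliver smooth derivative estimates.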
