Survey Methodology
Are deep learning models superior for missing data imputation in surveys? Evidence from an empirical comparison

by Zhenhua Wang, Olanrewaju Akande, Jason Poulos and Fan Li

  • Release date: December 15, 2022

Abstract

Multiple imputation (MI) is a popular approach for dealing with missing data arising from non-response in sample surveys. Multiple imputation by chained equations (MICE) is one of the most widely used MI algorithms for multivariate data, but it lacks a theoretical foundation and is computationally intensive. Recently, missing data imputation methods based on deep learning models have been developed with encouraging results in small studies. However, there has been limited research on evaluating their performance in realistic settings compared to MICE, particularly in big surveys. We conduct extensive simulation studies based on a subsample of the American Community Survey to compare the repeated sampling properties of four machine learning-based MI methods: MICE with classification trees, MICE with random forests, generative adversarial imputation networks, and multiple imputation using denoising autoencoders. We find the deep learning imputation methods are superior to MICE in terms of computational time. However, with the default choice of hyperparameters in the common software packages, MICE with classification trees consistently outperforms, often by a large margin, the deep learning imputation methods in terms of bias, mean squared error, and coverage under a range of realistic settings.
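To make the chained-equations idea concrete, the following is a minimal sketch of MICE-style multiple imputation with a tree-ensemble conditional model, using scikit-learn's `IterativeImputer` as a stand-in. This is an illustration only: the study itself relies on established MI software (e.g., the R `mice` package), and the toy data, seeds, and the choice of `RandomForestRegressor` here are assumptions for demonstration, not the paper's configuration.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy data: three correlated continuous variables with ~20% values
# missing completely at random (an assumption for this sketch).
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)
X_miss = X.copy()
X_miss[rng.random(X.shape) < 0.2] = np.nan

# Produce m completed datasets: each pass cycles through the variables,
# imputing one conditional on the current values of the others (the
# "chained equations" step). Varying the forest's seed across passes
# mimics the between-imputation variability of multiple imputation.
m = 5
completed = []
for seed in range(m):
    imputer = IterativeImputer(
        estimator=RandomForestRegressor(n_estimators=10, random_state=seed),
        max_iter=10,
        random_state=seed,
    )
    completed.append(imputer.fit_transform(X_miss))
```

Downstream, an estimate would be computed on each of the `m` completed datasets and combined with Rubin's rules; the repeated sampling properties of that combined estimate (bias, mean squared error, coverage) are what the simulation studies evaluate.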

Key Words: Deep learning; Hyperparameter selection; Missing data; Multiple imputation by chained equations; Simulation studies; Survey data.

How to cite

Wang, Z., Akande, O., Poulos, J. and Li, F. (2022). Are deep learning models superior for missing data imputation in surveys? Evidence from an empirical comparison. Survey Methodology, Statistics Canada, Catalogue No. 12-001-X, Vol. 48, No. 2. Paper available at http://www.statcan.gc.ca/pub/12-001-x/2022002/article/00009-eng.htm.
