How to decompose the non-response variance: A total survey error approach
Section 5. Conclusion
The proposed unit-level score is a good approximation of a unit's impact on the variance due to non-response. It is applicable to different survey designs, compatible with calibration estimators for domain totals, and works with many common imputation methods. The assumptions on which the decomposition relies are generally valid in common surveys that use unbiased imputation methods and consistent estimators of the imputation model parameters. The simulation results show that the approach becomes more accurate as the sample size grows. The decomposition of the non-response variance is biased because of its non-linearity; however, the bias is smaller in asymmetric populations and when the focus is on a small number of non-responding units. The fact that ordering units by their estimated contribution to the variance due to non-response closely reproduces the true ordering is important when the priority is to identify the largest contributors to the total error, not necessarily to measure their actual contributions.
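This ranking property can be illustrated with a minimal sketch: if the estimated scores preserve the ordering of the true contributions, then the set of top contributors selected from the scores matches the set selected from the true values. All unit labels and numbers below are invented for the example; in practice the true contributions are unknown.

```python
# Hypothetical example: compare the units ranked highest by the estimated
# score with those ranked highest by the (normally unknown) true
# contribution to the non-response variance. All figures are invented.
est_score = {"u1": 0.42, "u2": 0.05, "u3": 0.31, "u4": 0.02, "u5": 0.20}
true_contrib = {"u1": 0.45, "u2": 0.04, "u3": 0.25, "u4": 0.03, "u5": 0.23}

def top_k(contrib, k):
    """Return the k units with the largest contributions."""
    return set(sorted(contrib, key=contrib.get, reverse=True)[:k])

# The three largest contributors are identified either way, even though
# the individual score values differ from the true contributions.
overlap = top_k(est_score, 3) & top_k(true_contrib, 3)
print(sorted(overlap))
```

The point of the sketch is only that agreement of the *ordering*, not of the score values themselves, is what matters when the goal is to target the largest contributors.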
This paper presented the method in a univariate context, but it can easily be extended to a multivariate framework by using a distance function to combine the item contributions into a unit contribution. The idea remains to focus collection treatments or manual verification on the cases with the highest unit scores. In this setting, the non-response follow-up treatment might differ between unit non-response and partial non-response. For example, a telephone follow-up could be used to collect all the items from the total non-respondents with the largest scores, while partial non-respondents with a large score could be sent to an analyst for review, depending on the follow-up budget. Moreover, if the score can be computed several times during the collection period, non-response follow-ups become more efficient because the unit score becomes more accurate and the quality may already be satisfactory for some estimates. Simulation results show that the proposed score is a good approximation to a unit's contribution to the variance due to non-response. The score could then be used to determine how many, and which, non-responding units should be followed up in order to reach a given estimated coefficient of variation.
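A follow-up selection driven by a target coefficient of variation could be sketched as follows. This is not the paper's algorithm, only an illustration under simplifying assumptions: each unit's estimated score is taken as its contribution to the non-response variance, and a successful follow-up is assumed to remove that contribution entirely. The scores, sampling variance, total estimate, and target CV are all invented.

```python
import math

def select_for_followup(scores, sampling_var, total_estimate, target_cv):
    """Pick non-responding units for follow-up, in decreasing order of
    their estimated score, until the estimated CV of the domain total
    falls below the target. Assumes a successful follow-up removes the
    unit's contribution to the non-response variance (a simplification).
    """
    nr_var = sum(scores.values())  # current estimated non-response variance
    selected = []
    for unit in sorted(scores, key=scores.get, reverse=True):
        cv = math.sqrt(sampling_var + nr_var) / total_estimate
        if cv <= target_cv:
            break  # quality target already reached
        selected.append(unit)
        nr_var -= scores[unit]  # contribution removed after follow-up
    return selected

# Invented figures: following up the two largest contributors is enough
# to bring the estimated CV under the 4% target.
units = select_for_followup(
    scores={"u1": 900.0, "u2": 2500.0, "u3": 400.0, "u4": 1600.0},
    sampling_var=3600.0, total_estimate=2000.0, target_cv=0.04)
print(units)
```

Ranking by score before selecting means the budget is spent on the units whose resolution reduces the estimated variance fastest, which is the practical use of the score described above.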
This work was initially carried out for non-response prioritization under the Rolling Estimate iterative adaptive design process of the Integrated Business Statistics Program (IBSP). Under the original plan, key item estimates and their associated quality indicators would be computed at several specific times during the collection period; after each of these times, the units with the largest contributions according to our method would be prioritized for follow-up.
Acknowledgements
The authors wish to thank the reviewers (Cynthia Bocci and Jessica Andrews), the associate editor, the referees and the assistant editor for their valuable feedback.
Appendix
Proof 1
Proof 2
Proof 3