
Results

All (107) (40 to 50 of 107 results)

  • Articles and reports: 11-522-X20010016272
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The French survey of homeless people using support services is unique because of its scope and the conditions under which it was conducted. About 4,000 users of shelters and soup kitchens were surveyed in January and February 2001. Because some users move from one service point to another, it was necessary to collect precise data on the number of times each respondent used such services (meals and person-nights) during the week preceding the survey. Data quality is extremely important since it has a major impact on the sampling weight assigned to each individual.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016273
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    For a multivariate survey based on simple random sampling, the problem of calculating an optimal sampling size becomes one of solving a stochastic programming problem in which each constraint corresponds to a bounded estimate of the variance for a commodity. The problem is stochastic because the set of data collected from a previous survey makes the components of each constraint random variables; consequently, the calculated size of a sample is itself a random variable and is dependent on the quality of that set of data. By means of a Monte Carlo technique, an empirical probability distribution of the optimal sampling size can be produced for finding the probability of the event that the prescribed precision will be achieved. Corresponding to each set of previously collected data, there is an optimal size and allocation across strata. While reviewing these over several consecutive periods of time, it may be possible to identify troublesome strata and to see a trend in the stability of the data. The review may reveal an oscillatory pattern in the sizes of the samples that might have evolved over time due to the dependency of one allocation on another.
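    As a sketch of the Monte Carlo idea described above, the following Python snippet bootstraps a set of prior data to build an empirical distribution of the optimal simple-random-sampling size for a single variance constraint. All numbers (population size, variance bound, prior data) are illustrative assumptions, not figures from the paper.

```python
import random
import statistics

random.seed(42)

# Hypothetical "previous survey" data for one commodity; the values and
# the variance bound below are illustrative, not taken from the paper.
prior = [random.gauss(100, 15) for _ in range(200)]

N = 10_000       # assumed population size
V_BOUND = 4.0    # required upper bound on the variance of the mean

def required_n(data):
    """Smallest n with (S^2 / n) * (1 - n / N) <= V_BOUND,
    i.e. n = S^2 / (V_BOUND + S^2 / N) for simple random sampling."""
    s2 = statistics.variance(data)
    return s2 / (V_BOUND + s2 / N)

# Monte Carlo: resample the prior data to obtain an empirical
# distribution of the optimal sample size.
sizes = sorted(
    required_n([random.choice(prior) for _ in prior])
    for _ in range(1000)
)

# The 95th percentile is a size that meets the precision bound with
# roughly 95% probability, given the uncertainty in the prior data.
n_95 = sizes[int(0.95 * len(sizes))]
```

    Reviewing such a distribution over consecutive periods is one way to spot the unstable strata and oscillating allocations the paper describes.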

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016274
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Since the late 1950s, the probability samples for surveys in the manufacturing sector within the Manufacturing and Construction Division (MCD) had been selected almost exclusively by Poisson sampling, with unit probabilities assigned proportionate to some measure of size. Poisson sampling has the advantage of simple variance calculations. Its disadvantage is that the sample size is a random variable, adding an additional (and usually positive) component of variance to the survey estimates. In the 1998 survey year, MCD initiated the use of the modified Tillé sampling procedure in some of its surveys. This procedure yields unequal probabilities of selection with a fixed sample size. This paper briefly describes the modified procedure and some of its features and, for a variety of dissimilar surveys, contrasts variance results obtained using the Tillé procedure with those resulting from the earlier Poisson procedure.
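    A minimal sketch of the contrast described above: Poisson sampling produces a random sample size, while a fixed-size unequal-probability design does not. The fixed-size method shown is systematic PPS, used here only as a simple stand-in; it is not the modified Tillé procedure the paper evaluates, and the size measures are made up.

```python
import random

random.seed(7)

# Illustrative size measures for 300 establishments (not real MCD data).
x = [random.uniform(1, 50) for _ in range(300)]
n_target = 30
total = sum(x)
pi = [n_target * xi / total for xi in x]   # inclusion probs, all < 1 here

# Poisson sampling: independent Bernoulli draws, so the realized
# sample size is a random variable.
poisson_sizes = [
    sum(1 for p in pi if random.random() < p) for _ in range(500)
]
mean_size = sum(poisson_sizes) / len(poisson_sizes)   # close to n_target
spread = max(poisson_sizes) - min(poisson_sizes)      # > 0: size varies

# A fixed-size unequal-probability alternative (systematic PPS; this is
# NOT the modified Tillé procedure described in the paper): select unit i
# whenever the cumulative probability crosses a threshold u + k.
def systematic_pps(pi):
    u = random.random()
    cum, prev, sample = 0.0, 0.0, []
    for i, p in enumerate(pi):
        cum += p
        if int(cum - u + 1) > int(prev - u + 1):
            sample.append(i)
        prev = cum
    return sample

fixed = systematic_pps(pi)   # always exactly n_target units
```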

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016275
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Hot deck imputation, in which missing items are replaced with values from respondents, is often used in survey sampling. One model supporting such procedures assumes that response probabilities are equal within imputation cells. This paper describes an efficient version of hot deck imputation, derives its variance under the cell response model, and presents an approximation to the fully efficient procedure in which a small number of values are imputed for each non-respondent. Variance estimation procedures are presented and illustrated in a Monte Carlo study.
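    A minimal sketch of cell-based hot deck imputation as described above, with made-up records; the efficient and fully efficient variants studied in the paper are not reproduced here.

```python
import random

random.seed(3)

# Illustrative records: (imputation cell, reported value or None).
records = [
    ("A", 12.0), ("A", 15.0), ("A", None),
    ("B", 40.0), ("B", None), ("B", 44.0), ("B", None),
]

# Pool respondent values by cell; the cell response model assumes an
# equal response probability within each cell.
donors = {}
for cell, y in records:
    if y is not None:
        donors.setdefault(cell, []).append(y)

# Hot deck: each missing item receives the value of a randomly chosen
# donor ("respondent") from the same cell.
completed = [
    (cell, y if y is not None else random.choice(donors[cell]))
    for cell, y in records
]
```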

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016276
    Description:

    In surveys where interviewers need a high degree of specialist knowledge and training, one is often forced to make do with a small number of highly trained people, each with a high case load. It is well known that this can lead to interviewer variability having a relatively large impact on the total error, particularly for estimates of simple quantities such as means and proportions. A previous paper (Davis and Scott, 1995) examined the impact on continuous responses using a linear components-of-variance model. However, most responses in health questionnaires are binary, and this approach is known to underestimate the intra-cluster and intra-interviewer correlations for binary responses. In this paper, a multi-level binary model is used to explore the impact of interviewer variability on estimated proportions.
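    The impact of heavy case loads can be illustrated with the standard cluster design-effect formula, deff = 1 + (m − 1)ρ. This textbook approximation is used here only for illustration; it is not the multi-level binary model fitted in the paper.

```python
# Standard design-effect approximation for interviewer clustering:
# with an average workload of m interviews per interviewer and
# intra-interviewer correlation rho, the variance of a mean or
# proportion is inflated by deff = 1 + (m - 1) * rho.
def interviewer_deff(m: float, rho: float) -> float:
    return 1.0 + (m - 1.0) * rho

# Even a tiny rho matters when each interviewer carries many cases:
light = interviewer_deff(20, 0.01)    # 1.19
heavy = interviewer_deff(200, 0.01)   # 2.99: variance nearly tripled
```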

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016277
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The advent of computerized record-linkage methodology has facilitated the conduct of cohort mortality studies in which exposure data in one database are electronically linked with mortality data from another database. In this article, the impact of linkage errors on estimates of epidemiological indicators of risk, such as standardized mortality ratios and relative risk regression model parameters, is explored. It is shown that these indicators can be subject to bias and additional variability in the presence of linkage errors, with false links and non-links leading to positive and negative bias, respectively, in estimates of the standardized mortality ratio. Although linkage errors always increase the uncertainty in the estimates, bias can be effectively eliminated in the special case in which the false positive rate equals the false negative rate within homogeneous states defined by cross-classification of the covariates of interest.
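    The direction of the biases described above can be seen in a toy standardized-mortality-ratio calculation; all counts are hypothetical.

```python
# SMR = observed deaths / expected deaths, with invented counts.
expected = 100.0
true_observed = 110.0

smr_true = true_observed / expected                # 1.10

# False links attach deaths that do not belong to the cohort...
smr_false_links = (true_observed + 6) / expected   # 1.16 (positive bias)
# ...while missed links drop deaths that do belong.
smr_missed = (true_observed - 6) / expected        # 1.04 (negative bias)

# When the two error counts balance, the biases cancel, echoing the
# special case noted in the paper (equal false positive and false
# negative rates within homogeneous strata).
smr_balanced = (true_observed + 6 - 6) / expected  # 1.10
```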

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016278
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The first round of quality reporting for the statistics produced by Eurostat has almost been completed. This paper presents the experiences so far and, in particular, some of the methodological problems encountered when measuring the quality of statistics produced for international comparisons. A proposal is also presented for indicators that summarize the detailed information provided in these quality reports. Two sets of indicators are discussed: the first more producer-oriented, the second more user-oriented.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016279
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Rather than having to rely on traditional measures of survey quality, such as response rates, the Social Survey Division of the U.K. Office for National Statistics has been looking for alternative ways to report on quality. In order to achieve this, all the processes involved throughout the lifetime of a survey, from sampling and questionnaire design through to production of the finished report, have been mapped out. Having done this, we have been able to find quality indicators for many of these processes. By using this approach, we hope to be able to appraise any changes to our processes as well as to inform our customers of the quality of the work we carry out.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016280
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Survey response rates serve as one key measure of the quality of a data set. However, they are only useful to a statistical agency in the evaluation of ongoing data collections if they are based on a predefined set of formulas and definitions that are uniformly applied across all data collections.

    In anticipation of a revision of the current National Center for Education Statistics (NCES) statistical standards, several agency-wide audits of statistical practices were undertaken in the late 1990s. In particular, a compendium documenting major survey design parameters of NCES surveys was drafted. Related to this, NCES conducted a targeted audit of the consistency in response rate calculations across these surveys.

    Although NCES has had written statistical standards since 1988, the audit of the reported response rates from 50 survey components in 14 NCES surveys revealed considerable variability in the procedures used to calculate response rates. During the course of the response rate audit, the Statistical Standards Program staff concluded that the organization of the 1992 Standards made it difficult to find all of the information associated with response rates. In fact, references to response rates are scattered across a number of separate standards throughout the 1992 Statistical Standards.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016281
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Methodology for estimating the sampling error of the non-seasonally adjusted estimate of the level of the Index of Production (IoP) has previously been developed using Taylor linearization and parametric bootstrap methods, with both producing comparable results. That study concluded that the parametric bootstrap approach would be more practical to implement. This paper describes the methodology being developed to estimate the sampling error of the non-seasonally adjusted IoP change using the parametric bootstrap method, along with the data needed from the contributing surveys, the assumptions made, and the practical problems encountered during development.
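    The parametric bootstrap idea can be sketched as follows: each component estimate is redrawn from a normal distribution with its estimated standard error, and the index change is recomputed on every replicate. The weights, estimates and standard errors below are invented for illustration and are not real IoP inputs.

```python
import random
import statistics

random.seed(1)

# Hypothetical index components: (estimate, standard error) for each
# industry in two periods, with fixed weights (all values illustrative).
weights = [0.5, 0.3, 0.2]
period1 = [(100.0, 1.5), (98.0, 2.0), (103.0, 1.0)]
period2 = [(102.0, 1.5), (97.0, 2.0), (106.0, 1.2)]

def index(levels):
    return sum(w * v for w, v in zip(weights, levels))

# Parametric bootstrap: redraw each component estimate from a normal
# distribution centred on the estimate with its estimated SE, then
# recompute the index change on every replicate.
changes = []
for _ in range(2000):
    l1 = [random.gauss(m, s) for m, s in period1]
    l2 = [random.gauss(m, s) for m, s in period2]
    changes.append(index(l2) - index(l1))

se_change = statistics.stdev(changes)   # bootstrap SE of the change
```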

    Release date: 2002-09-12
Stats in brief (1)

  • Stats in brief: 13-604-M2002039
    Description:

    The latest annual results for the US/Canada purchasing power parities (PPPs) and real expenditures per head in the US compared with Canada are published in this paper. The data were developed for the period 1992 to 2001, using the latest US and Canadian expenditure data from the National Accounts and price comparisons for 1999. The paper summarizes the differences between the results of the multilateral (OECD) study and the Statistics Canada bilateral study. Some differences in classifications have been incorporated, as well as normal National Accounts revisions. Ten tables for 21 categories of GDP expenditure are presented in an appendix.

    Release date: 2002-06-28
Articles and reports (105) (60 to 70 of 105 results)

  • Articles and reports: 11-522-X20010016292
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Statistics can serve to benefit society, but, if manipulated politically or otherwise, statistics may also be used by the powerful as instruments to maintain the status quo or even to oppress. Statisticians working internationally, usually employed by international, supra-national or bilateral agencies, face a range of problems as they try to 'make a difference' in the lives of the poorest people in the world. One of the most difficult challenges statisticians face is the dilemma between open accountability and national sovereignty (in relation to what data are collected, the methods used and who is to have access to the results). Because of increasing globalization and new modalities of development co-operation and partnership, statisticians work in a constantly changing environment.

    This paper addresses the problems of improving the quality of cross-national data. This paper aims to raise consciousness of the role of statisticians at the international level; describe some of the constraints under which statisticians work; address principles which ought to govern the general activities of statisticians; and evaluate, in particular, the relevance of such principles to international statisticians. This paper also draws upon the recent Presidential Address to the Royal Statistical Society (Presented June 2001, JRSS Series D forthcoming).

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016294
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists. Outcome rates in telephone surveys are usually based on an entire sample. However, telephone samples from commercial sample vendors contain identifiable subsets of records with very different probabilities of obtaining particular dispositions. In such a case, component outcome rates could vary in ways unrelated to rates based on the entire sample. Using the 2000 Behavioral Risk Factor Surveillance System (BRFSS) survey, this paper examines the degree to which selected outcome rates (by state, for different subsets of records) correlate with the corresponding global rates. Although the correlations tend to be large, they are not large in every case, making it worthwhile to examine component outcome rates.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016295
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    In order to compensate for unreported, missing, unreasonable, or unusable data, the Uniform Crime Reporting (UCR) Program conducts data estimations and imputations using a variety of statistical methods. This paper illustrates how offence and arrest data are estimated using a variety of different approaches. The paper also points out the strengths and the shortcomings of each approach.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016296
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Canadian Labour Force Survey (LFS) is one of Statistics Canada's most important surveys. It is a monthly survey that collects data on each person's labour force status, the nature of the person's work or reason for not working, and the person's demographics. The survey sample consists of approximately 52,000 households. Coverage error is a measure of data quality that is important to any survey. One of the key measures of coverage error in the LFS is the percentage difference between the Census of Population estimates and the LFS population counts; this difference is called slippage. A negative value indicates overcoverage in the LFS, while a positive value indicates undercoverage. In general, slippage is positive, meaning that the LFS consistently misses people who should be enumerated.
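    A small illustration of the slippage measure described above. The formula uses the Census estimate as the base, an assumed convention the abstract does not state explicitly, and the population figures are invented.

```python
def slippage(census_estimate, lfs_count):
    """Slippage as a percentage difference between the Census
    population estimate and the LFS population count. Positive
    values indicate LFS undercoverage, negative values overcoverage."""
    return 100.0 * (census_estimate - lfs_count) / census_estimate

undercoverage = slippage(30_000_000, 28_800_000)   # 4.0: LFS misses people
overcoverage = slippage(30_000_000, 30_300_000)    # -1.0: LFS over-counts
```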

    The purpose of this study was to determine why slippage is increasing and what can be done to remedy it. The study was conducted in two stages. The first stage was a historical review of the projects that have studied and tried to control slippage in the LFS, as well as the operational changes that have been implemented over time. The second stage was an analysis of factors such as vacancy rates, non-response, demographics, urban and rural status and the impact of these factors on the slippage rate.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016297
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Danish National Institute of Social Research is an independent institution under the Ministry of Social Affairs. The Institute carries out surveys on social issues encompassing a broad range of subjects. The Sustainable Financing Initiative Survey (SFI-SURVEY) is an economically independent section within the Institute that carries out scientific surveys for the Institute, for other public organizations and for the private sector. The SFI-SURVEY interviewer body has 450 interviewers spread throughout Denmark, with five supervisors, each with a regional office, in contact with the interviewers. On a yearly basis, SFI-SURVEY conducts 40 surveys; the average sample size (gross) is 1,000 persons and the average response rate is 75%. Since January 1999, the following information about the surveys has been recorded:

    • Type of method used (face-to-face or telephone)
    • Length of questionnaire (interviewing time in minutes)
    • Whether or not a folder was sent to the respondents in advance
    • Whether or not an interviewer instruction meeting was held
    • Number of interviews per interviewer per week
    • Whether or not the subject of the survey was of interest to the respondents
    • Interviewing month
    • Target group (random selection of the total population or special groups)

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016298
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    This paper discusses the Office for National Statistics' (ONS) approach to developing systematic quality measurements and reporting methods. It is presented against the background of European developments and the growing demand for quality measurement. Measuring the quality of statistics presents considerable practical and methodological challenges. The paper describes the main building blocks to be used for the new quality measurement programme, and includes specific examples. Working with other national statistical institutes, and developing an enhanced measurement framework, output measures and reporting procedures, are all vital ingredients in achieving recognition of the ONS as a quality organization.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016300
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Australian Bureau of Statistics (ABS) produces many statistics that help the government and the wider community make more informed decisions. However, if these decisions are to be truly informed, it is essential that the users are able to understand the limitations of the statistics and how to use the data in an appropriate context. As a result, the ABS has initiated a project entitled Qualifying Quality, which focuses on two key directions: presentation and education. Presentation provides people with information about the quality of the data in order to help them answer the question "Are the data fit for the purpose?"; while education assists those people in appreciating the importance of information on quality and knowing how to use such information. In addressing these two issues, the project also aims to develop and identify processes and technical systems that will support and encourage the appropriate use of data.

    This paper provides an overview of the presentation and education initiatives which have arisen from this project. The paper then explores the different methods of presentation, the systems that support them, and how the education strategies interact with each other. In particular, the paper comments on the importance of supporting education strategies with well developed systems and appropriate presentation methods.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016301
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Integrated Metadatabase is a corporate repository of information for each of Statistics Canada's surveys. The information stored in the Integrated Metadatabase includes a description of data sources and methodology, definitions of concepts and variables measured, and indicators of data quality. It provides an effective vehicle for communicating data quality to data users. Its coverage of Statistics Canada's data holdings is exhaustive, the provided information on data quality complies with the Policy on Informing Users of Data Quality and Methodology, and it is presented in a consistent and systematic fashion.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016302
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    This session provides three more contributions to the continuing discussion of national statistical offices' response to the topic of quality, in particular the subtopic of communicating quality. These three papers share the important and necessary assumption that national statistical offices have an obligation to report the limitations of their data; that users should know and understand those limitations; and that, as a result, users ought to be able to determine whether the data are fit for their purposes.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016303
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    In large-scale surveys, it is almost guaranteed that some level of non-response will occur. Generally, statistical agencies use imputation as a way to treat non-response items. A common preliminary step to imputation is the formation of imputation cells. In this article, the formation of these cells is studied using two methods. The first method is similar to that of Eltinge and Yansaneh (1997) in the case of weighting cells and the second is the method currently used in the Canadian Labour Force Survey. Using Labour Force data, simulation studies are performed to test the impact of the response rate, the response mechanism, and constraints on the quality of the point estimator in both methods.
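    The role of imputation cells can be sketched with a small simulation: when response depends on an auxiliary variable, the naive respondent mean is biased, while imputing within cells formed on that variable largely removes the bias. The data and the simple cell-mean imputation scheme below are illustrative assumptions, not the methods compared in the paper.

```python
import random
import statistics

random.seed(5)

# Simulated data (illustrative, not LFS data): auxiliary x is known for
# every unit; y is missing for non-respondents, and the response
# probability increases with x, so respondents are unrepresentative.
units = []
for _ in range(1000):
    x = random.uniform(0, 1)
    y = 10 + 5 * x + random.gauss(0, 1)          # true mean about 12.5
    responded = random.random() < 0.4 + 0.5 * x  # response depends on x
    units.append((x, y if responded else None))

# Form imputation cells from quintiles of x.
K = 5
xs = sorted(x for x, _ in units)
cuts = [xs[len(xs) * k // K] for k in range(1, K)]

def cell(x):
    return sum(x > c for c in cuts)

# Impute the respondent mean within each cell.
cell_values = {}
for x, y in units:
    if y is not None:
        cell_values.setdefault(cell(x), []).append(y)
cell_means = {c: statistics.mean(v) for c, v in cell_values.items()}

naive = statistics.mean(y for x, y in units if y is not None)
cell_based = statistics.mean(
    y if y is not None else cell_means[cell(x)] for x, y in units
)
# naive over-estimates the true mean; cell_based is much closer.
```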

    Release date: 2002-09-12
Journals and periodicals (1)

  • Journals and periodicals: 85F0036X
    Geography: Canada
    Description:

    This study documents the methodological and technical challenges involved in analysing small groups using a sample survey: oversampling, response rates, non-response due to language, release feasibility and sampling variability. It is based on the 1999 General Social Survey (GSS) on victimization.

    Release date: 2002-05-14