All (107) (0 to 10 of 107 results)

  • Articles and reports: 82-005-X20020016479
    Geography: Canada
    Description:

    The Population Health Model (POHEM) is a policy analysis tool that helps answer "what-if" questions about the health and economic burden of specific diseases and the cost-effectiveness of administering new diagnostic and therapeutic interventions. This simulation model is particularly pertinent in an era of fiscal restraint, when new therapies are generally expensive and difficult policy decisions are being made. More importantly, it provides a base for a broader framework to inform policy decisions using comprehensive disease data and risk factors. Our "base case" models comprehensively estimate the lifetime costs of treating breast, lung and colorectal cancer in Canada. Our cancer models have shown the large financial burden of diagnostic work-up and initial therapy, as well as the high costs of hospitalizing those dying of cancer. Our core cancer models (lung, breast and colorectal cancer) have been used to evaluate the impact of new practice patterns. We have used these models to evaluate new chemotherapy regimens as therapeutic options for advanced lung cancer; the health and financial impact of reducing the hospital length of stay for initial breast cancer surgery; and the potential impact of population-based screening for colorectal cancer. To date, the most interesting intervention we have studied has been the use of tamoxifen to prevent breast cancer among high-risk women.

    Release date: 2002-10-08

  • Articles and reports: 11-522-X20010016227
    Description:

    The reputation of a national statistical office depends on the level of service it provides. Quality must be a core value and providing excellent service has to be embedded in the culture of a statistical organization.

    The paper outlines what is meant by a high quality statistical service. It explores factors that contribute to a quality work culture. In particular, it outlines the activities and experiences of the Australian Bureau of Statistics in maintaining a quality culture.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016228
    Description:

    The Current Population Survey is the primary source of labour force data for the United States. Throughout any survey process, it is critical that data quality be ensured. This paper discusses how quality issues are addressed during all steps of the survey process, including the development of the sample frame, sampling operations, sample control, data collection, editing, imputation, estimation, and questionnaire development. It also reviews the quality evaluations that are built into the survey process. The paper concludes with a discussion of current research and possible future improvements to the survey.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016230
    Description:

    This publication consists of three papers, each addressing data quality issues associated with a large and complex survey. Two of the case studies involve household surveys of labour force activity and the third focuses on a business survey. The papers each address a data quality topic from a different perspective, but share some interesting common threads.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016231
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    In 2000, the Behavioral Risk Factor Surveillance System (BRFSS) conducted monthly telephone surveys in the 50 American states, the District of Columbia, and Puerto Rico; each jurisdiction was responsible for collecting its own survey data. In Maine, data collection was split between the state health department and ORC Macro, a commercial market research firm. Examination of survey outcome rates, selection biases and missing values for income suggests that the Maine health department data are more accurate. However, of 18 behavioural health risk factors, only four differ statistically by data collector, and for these four factors the data collected by ORC Macro seem more accurate.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016233
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Since January 2000, data collection for the Finnish Consumer Survey has changed from a Labour Force Survey panel design to an independent survey. All interviews are now carried out centrally from Statistics Finland's Computer Assisted Telephone Interview (CATI) Centre. There have been suggestions that the new survey mode may have influenced respondents' answers. This paper analyses the extent of apparent changes in the results of the Finnish Consumer Survey, with the help of a pilot survey. It also studies the interviewer's role in the data collection process. The analysis is based on cross-tabulations, chi-square tests and multinomial logit models. It shows that the new survey method produces more optimistic estimates and expectations concerning economic matters than the old method did.
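    The chi-square comparison of answer distributions across the two modes can be sketched as follows. This is an illustrative, standard-library-only example with invented counts, not the survey's actual data or code:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: old panel mode, new CATI mode; columns: pessimistic / neutral / optimistic.
table = [[120, 300, 180],
         [ 90, 280, 230]]
stat = chi_square(table)
df = (len(table) - 1) * (len(table[0]) - 1)   # df = 2 here
print(f"chi-square = {stat:.2f} on {df} df")  # compare with 5.99, the 5% critical value
```

    A statistic above the critical value, as in this invented example, would indicate that answers are not independent of survey mode.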

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016235
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Police records collected by the Federal Bureau of Investigation (FBI) through the Uniform Crime Reporting (UCR) Program are the leading source of national crime statistics. Recently, audits to correct UCR records have raised concerns as to how to handle the errors discovered in these files. Concerns centre around the methodology used to detect errors and the procedures used to correct errors once they have been discovered. This paper explores these concerns, focusing on sampling methodology, establishment of a statistical-adjustment factor, and alternative solutions. The paper distinguishes the difference between sample adjustment and sample estimates of an agency's data, and recommends sample adjustment as the most accurate way of dealing with errors.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016236
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Uniform Crime Reporting (UCR) Program has devoted considerable resources to a continuous effort to improve the quality of its data. In this paper, the authors introduce and discuss the use of cross-ratios and chi-square measures to evaluate the rationality of the data. The UCR data are used to illustrate this approach empirically.
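    Assuming the cross-ratio here is the familiar odds ratio of a 2 x 2 table, a minimal sketch with invented counts would be:

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio (odds ratio) a*d / (b*c) of the 2 x 2 table [[a, b], [c, d]]."""
    return (a * d) / (b * c)

# e.g. offences cleared vs. not cleared, this year vs. last year (invented counts);
# values near 1 suggest a stable association between the two classifications
print(cross_ratio(400, 100, 380, 120))
```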

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016237
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Secondary users of health information often assume that administrative data provides a relatively sound basis for making important planning and policy decisions. If errors are evenly or randomly distributed, this assumption may have little impact on these decisions. However, when information sources contain systematic errors, or when systematic errors are introduced during the creation of master files, this assumption can be damaging.

    The most common systematic errors involve underreporting activities for a specific population; inaccurate re-coding of spatial information; and differences in data entry protocols, which have raised questions about the consistency of data submitted by different tracking agencies. The Central East Health Information Partnership (CEHIP) has identified a number of systematic errors in administrative databases and has documented many of these in reports distributed to partner organizations.

    This paper describes how some of these errors were identified and notes the processes that give rise to the loss of data integrity. The conclusion addresses some of the impacts these problems have for health planners, program managers and policy makers.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016238
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Research programs building on population-based, longitudinal administrative data and record-linkage techniques are found in England, Scotland, the United States (the Mayo Clinic), Western Australia and Canada. These systems can markedly expand both the methodological and the substantive research in health and health care.

    This paper summarizes published Canadian data quality studies regarding registries, hospital discharges, prescription drugs and physician claims. It makes suggestions for improving registries, facilitating record linkage and expanding research into social epidemiology. It also notes new trends in case identification and health status measurement using administrative data, and highlights the differing needs for data quality research in each province.

    Release date: 2002-09-12
Stats in brief (1) (1 result)

  • Stats in brief: 13-604-M2002039
    Description:

    This paper presents the latest annual results for US/Canada purchasing power parities (PPPs) and real expenditures per head in the United States compared with Canada. The data were developed for the period 1992 to 2001, using the latest US and Canadian expenditure data from the National Accounts and price comparisons for 1999. The paper summarizes the differences between the results of the multilateral (OECD) study and the Statistics Canada bilateral study. Some differences in classifications have been incorporated, as well as normal National Accounts revisions. Ten tables covering 21 categories of GDP expenditure are presented in an appendix.
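    The core PPP arithmetic behind such a bilateral comparison can be sketched as follows; the function name, figures and PPP value are illustrative, not taken from the paper:

```python
def real_expenditure_per_head(nominal_per_head, ppp):
    """Convert nominal expenditure per head in Canadian dollars to real
    US-dollar terms, given a PPP of `ppp` CAD per USD.

    Dividing by the PPP (rather than the market exchange rate) removes
    differences in price levels between the two countries."""
    return nominal_per_head / ppp

# e.g. CAD 30,000 per head at an assumed PPP of 1.20 CAD per USD
print(round(real_expenditure_per_head(30000, 1.20), 2))  # 25000.0
```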

    Release date: 2002-06-28
Articles and reports (105) (10 to 20 of 105 results)

  • Articles and reports: 11-522-X20010016241
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Leslie Kish long advocated the use of the "rolling sample" design. With non-overlapping, monthly panels that can be cumulated over different lengths of time for domains of different sizes, the rolling sample design enables a single survey to serve multiple purposes. The Census Bureau's new American Community Survey uses such a rolling sample design with annual averages to measure change at the state level, and three-year or five-year moving averages to describe progressively smaller domains. This paper traces Kish's influence on the development of the American Community Survey, and discusses some practical methodological issues that had to be addressed during the implementation of the design.
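    The cumulation idea behind the rolling sample design can be sketched as a moving average over successive monthly panel estimates; this is an illustrative sketch with an invented series, not American Community Survey code:

```python
def moving_average(series, window):
    """Simple moving average over the trailing `window` observations.

    In a rolling sample design, each observation is an estimate from a
    non-overlapping monthly panel; longer windows (e.g. 36 or 60 months)
    trade timeliness for precision in smaller domains."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

monthly = [10, 12, 11, 13, 12, 14, 13, 15, 14, 16, 15, 17]  # 12 monthly panel estimates
print(moving_average(monthly, 12))  # a single annual average: [13.5]
```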

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016242
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    "Remembering Leslie Kish" provides us with a personal view of his many contributions to the international development of statistics. One of the elements that made his contributions so special and effective was the "Kish approach". The characteristic features of this approach include: identifying what is important; formulating and answering practical questions; seeking patterns and frameworks; and above all, persisting in the promotion of good ideas. Areas in which his technical contributions have made the most impact on practical survey work in developing countries have been identified. A unique aspect of Leslie's contribution is the motivation he created for the development of a world-wide community of survey samplers.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016243
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Since 1996, the Census Bureau has been creating Web Computerized Self-Administered Questionnaires (CSAQs). These electronic questionnaires have some data quality advantages over paper questionnaires, such as the availability of online help; pre-loaded data; the use of interactive edits (which allow respondents to correct their responses as they are entered); and, for establishment surveys, the ability to import data from spreadsheets. This paper provides an overview of the Census Bureau's Web CSAQs. Each of the Web CSAQ design features that promote data quality is explained, as are the features that impose data quality obstacles. Finally, some recent empirical data quality results from both establishment and household surveys are presented.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016244
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Over the past few years, Statistics New Zealand (SNZ) has experienced an increase in the volume of business survey data supplied by e-mail. However, up until now, SNZ has not had the business processes available to support electronic collection in a way that meets both the needs of SNZ and data suppliers. To this end, SNZ has invested a lot of effort over the last year in investigating how best to approach the problems and opportunities presented by electronic data collection. This paper outlines SNZ's plans to move the e-mail supplied data to a secure lodgement facility and the future development of an internet-based data collection system. It also presents a case study of the Monthly Retail Trade Survey data currently supplied by e-mail. This case study illustrates some of the benefits of electronic data, but also examines some of the costs to the organization and the data quality problems encountered. It also highlights the need to consider the data collection methodology within the wider context of the total survey cycle.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016245
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    This paper summarizes recent Australian Bureau of Statistics (ABS) methodological developments and other experiences with electronic data reporting (EDR). It deals particularly with the part of EDR loosely defined as 'e-forms', or screen-based direct collection instruments, where the respondent manually enters all or most of the data. In this context, the paper covers recent ABS experiences and current work, but does not revisit the historical EDR work or cover other developments in Australia outside the ABS.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016246
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Sample sizes in small areas are typically very small. As a result, customary area-specific direct estimators of small area means do not provide acceptable quality in terms of mean square error (MSE). Indirect estimators that borrow strength from related areas through linking models based on similar auxiliary data are now widely used for small area estimation. Such linking models are either implicit (as in the case of synthetic estimators) or explicit (as in the case of model-based estimators). In the frequentist approach, the quality of an indirect estimator is measured by its estimated MSE, while in the Bayesian approach the posterior variance of the small area mean is used. This paper reviews some recent work on estimating MSE and evaluating posterior variance.
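    One standard way an indirect estimator borrows strength is the composite estimator: a weighted average of the direct and synthetic estimates, with the weight chosen to minimise MSE. A minimal sketch under the assumption of uncorrelated errors, with invented numbers not drawn from the paper:

```python
def composite(direct, synthetic, var_direct, mse_synthetic):
    """Composite small area estimate w*direct + (1-w)*synthetic, with the
    weight w = mse_synthetic / (var_direct + mse_synthetic) that minimises
    the MSE when the two errors are uncorrelated.

    A noisy direct estimate (large var_direct) pushes w toward 0, so the
    synthetic estimate borrowed from related areas dominates."""
    w = mse_synthetic / (var_direct + mse_synthetic)
    return w * direct + (1 - w) * synthetic

# tiny area: the direct estimate is noisy, so the synthetic value gets more weight
print(composite(direct=52.0, synthetic=48.0, var_direct=9.0, mse_synthetic=3.0))  # 49.0
```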

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016247
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    This paper describes joint research by the Office for National Statistics (ONS) and Southampton University regarding the evaluation of several different approaches to the local estimation of International Labour Office (ILO) unemployment. The need to compare estimators with different underlying assumptions has led to a focus on evaluation methods that are (partly at least) model-independent. Model-fit diagnostics that have been considered include: various residual procedures, cross-validation, predictive validation, consistency with marginals, and consistency with direct estimates within single cells. These diagnostics have been used to compare different model-based estimators with each other and with direct estimators.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016248
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Sawmill Survey is a voluntary census of sawmills in Great Britain. It is limited to fixed mills using domestically-grown timber. Three approaches to assess the coverage of this survey are described:

    (1) A sample survey of the sawmilling industry, drawn from the UK's business register and excluding businesses already sampled in the Sawmill Survey, is used to assess undercoverage in the list of known sawmills;

    (2) A non-response follow-up, using the local knowledge of regional officers of the Forestry Commission, is used to estimate the activity of sawmills that do not respond (mostly the smaller mills); and

    (3) A survey of small-scale and mobile sawmills (many of these businesses are micro-enterprises) is conducted to analyse their significance.

    These three approaches are synthesized to give an estimate of the coverage of the original survey compared with the total activity identified, and to estimate the importance of micro-enterprises to the sawmilling industry in Great Britain.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016249
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The United States' Census 2000 operations were more innovative and complex than ever before. State population totals were required within nine months and, using the coverage measurement survey, adjusted counts were expected within one year. Therefore, all operations had to be implemented and completed quickly, with quality assurance (QA) that had both an effective and prompt turnaround. The QA challenges included: getting timely information to supervisors (such as enumerator re-interview information); performing prompt checks of "suspect" work (such as monitoring contractors to ensure accurate data capture); and providing reports to headquarters quickly. This paper presents these challenges and their solutions in detail, thus providing an overview of the Census 2000 QA program.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016250
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    This paper describes the Korea National Statistics Office's (KNSO) experiences in data quality assessment and introduces its strategies for institutionalizing the assessment procedure. It starts by briefly describing the definition of quality assessment, quality dimensions and indicators at the national level. It then outlines the current state of the quality assessment process at KNSO and lists the six dimensions of quality that have been identified: relevance, accuracy, timeliness, accessibility, comparability and efficiency. Based on the lessons learned from these experiences, the paper points out three essential elements required in an advanced system of data quality assessment: an objective and independent planning system, a set of appropriate indicators, and competent personnel specialized in data quality assessment.

    Release date: 2002-09-12
Journals and periodicals (1) (1 result)

  • Journals and periodicals: 85F0036X
    Geography: Canada
    Description:

    This study documents the methodological and technical challenges involved in analysing small groups using a sample survey: oversampling, response rates, non-response due to language, release feasibility and sampling variability. It is based on the 1999 General Social Survey (GSS) on victimization.

    Release date: 2002-05-14