All (107) (0 to 10 of 107 results)

  • Articles and reports: 82-005-X20020016479
    Geography: Canada
    Description:

    The Population Health Model (POHEM) is a policy analysis tool that helps answer "what-if" questions about the health and economic burden of specific diseases and the cost-effectiveness of administering new diagnostic and therapeutic interventions. This simulation model is particularly pertinent in an era of fiscal restraint, when new therapies are generally expensive and difficult policy decisions are being made. More important, it provides a base for a broader framework to inform policy decisions using comprehensive disease data and risk factors. Our "base case" models comprehensively estimate the lifetime costs of treating breast, lung and colorectal cancer in Canada. Our cancer models have shown the large financial burden of diagnostic work-up and initial therapy, as well as the high costs of hospitalizing those dying of cancer. Our core cancer models (lung, breast and colorectal cancer) have been used to evaluate the impact of new practice patterns. We have used these models to evaluate new chemotherapy regimens as therapeutic options for advanced lung cancer; the health and financial impact of reducing the hospital length of stay for initial breast cancer surgery; and the potential impact of population-based screening for colorectal cancer. To date, the most interesting intervention we have studied has been the use of tamoxifen to prevent breast cancer among high risk women.

    Release date: 2002-10-08
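
The abstract above describes POHEM as a "what-if" simulation of the health and economic burden of disease. As rough intuition for how such a cohort cost simulation works, here is a minimal Python sketch; the incidence rate, cost figures and intervention effect are invented placeholders, not POHEM parameters or results.

```python
# A toy cohort cost simulation in the spirit of a "what-if" microsimulation.
# All rates and costs below are invented placeholders, not POHEM inputs.
import random

def average_lifetime_cost(n=100_000, incidence=0.05, initial_cost=30_000,
                          terminal_cost=45_000, seed=1):
    """Average lifetime treatment cost per person in a simulated cohort."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        if rng.random() < incidence:       # person develops the disease
            total += initial_cost          # diagnostic work-up and initial therapy
            if rng.random() < 0.4:         # assumed share who die of the disease
                total += terminal_cost     # terminal hospital care
    return total / n

base = average_lifetime_cost()
what_if = average_lifetime_cost(incidence=0.04)   # e.g. a preventive intervention
print(f"baseline cost per person:     {base:,.0f}")
print(f"intervention cost per person: {what_if:,.0f}")
print(f"expected saving per person:   {base - what_if:,.0f}")
```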

  • Articles and reports: 11-522-X20010016227
    Description:

    The reputation of a national statistical office depends on the level of service it provides. Quality must be a core value and providing excellent service has to be embedded in the culture of a statistical organization.

    The paper outlines what is meant by a high quality statistical service. It explores factors that contribute to a quality work culture. In particular, it outlines the activities and experiences of the Australian Bureau of Statistics in maintaining a quality culture.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016228
    Description:

    The Current Population Survey is the primary source of labour force data for the United States. Throughout any survey process, it is critical that data quality be ensured. This paper discusses how quality issues are addressed during all steps of the survey process, including the development of the sample frame, sampling operations, sample control, data collection, editing, imputation, estimation, and questionnaire development. It also reviews the quality evaluations that are built into the survey process. The paper concludes with a discussion of current research and possible future improvements to the survey.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016230
    Description:

    This publication consists of three papers, each addressing data quality issues associated with a large and complex survey. Two of the case studies involve household surveys of labour force activity and the third focuses on a business survey. The papers each address a data quality topic from a different perspective, but share some interesting common threads.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016231
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    In 2000, the Behavioral Risk Factor Surveillance System (BRFSS) conducted monthly telephone surveys in 50 American states, the District of Columbia, and Puerto Rico; each was responsible for collecting its own survey data. In Maine, data collection was split between the state health department and ORC Macro, a commercial market research firm. Examination of survey outcome rates, selection biases and missing values for income suggests that the Maine health department data are more accurate. However, out of 18 behavioural health risk factors, only four are statistically different by data collector, and for these four factors, the data collected by ORC Macro seem more accurate.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016233
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    From January 2000, the data collection method of the Finnish Consumer Survey was changed from a Labour Force Survey panel design mode to an independent survey. All interviews are now carried out centrally from Statistics Finland's Computer Assisted Telephone Interview (CATI) Centre. There have been suggestions that the new survey mode has influenced respondents' answers. This paper analyses the extent of obvious changes in the results of the Finnish Consumer Survey. This is accomplished with the help of a pilot survey. Furthermore, this paper studies the interviewer's role in the data collection process. The analysis is based on cross-tabulations, chi-square tests and multinomial logit models. It shows that the new survey method produces more optimistic estimates and expectations concerning economic matters than the old method did. (A minimal sketch of such a chi-square comparison follows this entry.)

    Release date: 2002-09-12
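
The analysis above rests on cross-tabulations and chi-square tests. This sketch runs a chi-square test of independence between survey mode and answer category; the counts are invented for illustration and are not Finnish Consumer Survey data.

```python
# Chi-square test of independence between survey mode and response category.
import numpy as np
from scipy.stats import chi2_contingency

# rows: old panel mode, new CATI mode; columns: pessimistic / neutral / optimistic
table = np.array([[220, 410, 370],
                  [180, 390, 430]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value suggests the distribution of answers differs between modes.
```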

  • Articles and reports: 11-522-X20010016235
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Police records collected by the Federal Bureau of Investigation (FBI) through the Uniform Crime Reporting (UCR) Program are the leading source of national crime statistics. Recently, audits to correct UCR records have raised concerns as to how to handle the errors discovered in these files. Concerns centre on the methodology used to detect errors and the procedures used to correct errors once they have been discovered. This paper explores these concerns, focusing on sampling methodology, establishment of a statistical-adjustment factor, and alternative solutions. The paper distinguishes between sample adjustment and sample estimates of an agency's data, and recommends sample adjustment as the most accurate way of dealing with errors. (An illustrative sketch of a sample-based adjustment factor follows this entry.)

    Release date: 2002-09-12
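
A generic ratio-adjustment sketch of the idea mentioned above: audit a random sample of an agency's records, estimate the share verified as valid, and apply that factor to the reported total. This is an invented illustration under assumed figures, not the FBI's actual UCR audit procedure.

```python
# Sample-based statistical adjustment factor applied to an agency total.
import random

random.seed(7)
reported_total = 12_500                                        # offences reported by the agency
audit_sample = [random.random() < 0.93 for _ in range(400)]    # True = record verified in the audit

adjustment_factor = sum(audit_sample) / len(audit_sample)      # estimated share of valid records
adjusted_total = reported_total * adjustment_factor

print(f"adjustment factor: {adjustment_factor:.3f}")
print(f"adjusted total:    {adjusted_total:,.0f}")
```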

  • Articles and reports: 11-522-X20010016236
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Uniform Crime Reporting (UCR) Program has devoted a considerable amount of resources in a continuous effort to improve the quality of its data. In this paper, the authors introduce and discuss the use of cross-ratios and chi-square measures to evaluate the rationality of the data. The UCR data are used to empirically illustrate this approach. (A brief sketch of these measures follows this entry.)

    Release date: 2002-09-12
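
Reading the cross-ratio as the odds ratio of a 2x2 table (an assumption; the paper's exact definition may differ), both measures named above can be computed directly from a contingency table of counts. The table below is invented.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Invented 2x2 table, e.g. offence cleared (yes/no) by weapon involved (yes/no).
table = np.array([[150,  50],
                  [300, 250]])

cross_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
chi2, p, dof, _ = chi2_contingency(table)

print(f"cross-ratio (odds ratio): {cross_ratio:.2f}")
print(f"chi-square: {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```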

  • Articles and reports: 11-522-X20010016237
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Secondary users of health information often assume that administrative data provides a relatively sound basis for making important planning and policy decisions. If errors are evenly or randomly distributed, this assumption may have little impact on these decisions. However, when information sources contain systematic errors, or when systematic errors are introduced during the creation of master files, this assumption can be damaging.

    The most common systematic errors involve underreporting activities for a specific population; inaccurate re-coding of spatial information; and differences in data entry protocols, which have raised questions about the consistency of data submitted by different tracking agencies. The Central East Health Information Partnership (CEHIP) has identified a number of systematic errors in administrative databases and has documented many of these in reports distributed to partner organizations.

    This paper describes how some of these errors were identified and notes the processes that give rise to the loss of data integrity. The conclusion addresses some of the impacts these problems have for health planners, program managers and policy makers.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016238
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Research programs building on population-based, longitudinal administrative data and record-linkage techniques are found in England, Scotland, the United States (the Mayo Clinic), Western Australia and Canada. These systems can markedly expand both the methodological and the substantive research in health and health care.

    This paper summarizes published Canadian data quality studies regarding registries, hospital discharges, prescription drugs, and physician claims. It makes suggestions for improving registries, facilitating record linkage and expanding research into social epidemiology. New trends in case identification and health status measurement using administrative data are also noted, and the differing needs for data quality research in each province are highlighted. (A minimal record-linkage sketch follows this entry.)

    Release date: 2002-09-12
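
A minimal deterministic record-linkage sketch of the kind such research programs rely on: join a population registry to hospital discharge records on a shared person identifier. The identifiers, fields and records are invented; real systems typically use probabilistic linkage on several partially identifying fields.

```python
# Deterministic linkage of discharge records to a registry on a person identifier.
registry = {
    "A001": {"birth_year": 1948, "sex": "F"},
    "A002": {"birth_year": 1962, "sex": "M"},
}
discharges = [
    {"person_id": "A001", "diagnosis": "C50", "year": 2000},
    {"person_id": "A002", "diagnosis": "I21", "year": 2001},
    {"person_id": "A999", "diagnosis": "J18", "year": 2001},  # no registry match
]

linked, unlinked = [], []
for record in discharges:
    person = registry.get(record["person_id"])
    (linked if person else unlinked).append({**record, **(person or {})})

print(f"linked records: {len(linked)}, unlinked records: {len(unlinked)}")
```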

Stats in brief (1) (1 result)

  • Stats in brief: 13-604-M2002039
    Description:

    The latest annual results for the US/Canada purchasing power parities (PPPs) and real expenditures per head in the US compared with Canada are published in this paper. The data were developed for the period 1992 to 2001, using the latest US and Canada expenditure data from the National Accounts and price comparisons for 1999. The paper contains summaries of differences between the results of the multilateral (OECD) study and the Statistics Canada bilateral study. Some differences in classifications have been incorporated, as well as normal National Accounts revisions. Ten tables are presented in an appendix for 21 categories of GDP expenditure. (A worked example of how a PPP converts nominal expenditure per head follows this entry.)

    Release date: 2002-06-28
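
How a purchasing power parity is applied: divide nominal expenditure per head in one country's currency by the PPP to express it at the other country's price level. All figures below are invented for illustration and are not the results published in the paper.

```python
# Converting per-head expenditure to a common price level with a PPP.
canada_expenditure_cad = 27_000     # nominal expenditure per head, Canadian dollars (invented)
us_expenditure_usd = 33_000         # nominal expenditure per head, US dollars (invented)
ppp_cad_per_usd = 1.20              # assumed PPP: CAD needed to buy what 1 USD buys in the US

canada_at_us_prices = canada_expenditure_cad / ppp_cad_per_usd
print(f"Canada per-head expenditure at US prices: {canada_at_us_prices:,.0f} USD")
print(f"Canada as a share of the US level: {canada_at_us_prices / us_expenditure_usd:.1%}")
```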

Articles and reports (105) (50 to 60 of 105 results)

  • Articles and reports: 11-522-X20010016282
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The Discharge Abstract Database (DAD) is one of the key data holdings of the Canadian Institute for Health Information (CIHI). The institute is a national, not-for-profit organization that plays a critical role in the development of Canada's health information system. The DAD contains acute care discharge data from most Canadian hospitals. The data generated are essential for determining, for example, the number and types of procedures and the length of hospital stays. CIHI is conducting the first national data quality study of selected clinical and administrative data from the DAD. This study is evaluating and measuring the accuracy of the DAD by returning to the original data sources and comparing this information with what exists in the CIHI database, in order to identify any discrepancies and their associated reasons. This paper describes the DAD data quality study and some preliminary findings. The findings are also briefly compared with those of another similar study. In conclusion, the paper discusses subsequent steps for the study and how the findings from the first year are contributing to improvements in the quality of the DAD.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016283
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The accurate recording of patients' Indigenous status in hospital separations data is critical to analyses of health service use by Aboriginal and Torres Strait Islander Australians, who have relatively poor health. However, the accuracy of these data is not well understood. In 1998, a methodology for assessing the data accuracy was piloted in 11 public hospitals. Data were collected for 8,267 patients using a personal interview, and compared with the corresponding, routinely collected data. Among the 11 hospitals, the proportion of patients correctly recorded as Indigenous ranged from 55% to 100%. Overall, hospitals with high proportions of Indigenous persons in their catchment areas reported more accurate data. The methodology has since been used to assess data quality in hospitals in two Australian states and to promote best practice data collection. (A sketch of this accuracy measure follows this entry.)

    Release date: 2002-09-12
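
A sketch of the accuracy measure described above: for each hospital, the share of patients who reported being Indigenous at interview and were also recorded as Indigenous in the routine data. The records below are invented.

```python
# Per-hospital share of self-identified Indigenous patients correctly recorded.
from collections import defaultdict

# (hospital, indigenous_at_interview, indigenous_in_routine_data)
records = [
    ("H1", True, True), ("H1", True, False), ("H1", False, False),
    ("H2", True, True), ("H2", True, True),  ("H2", False, False),
]

totals, correct = defaultdict(int), defaultdict(int)
for hospital, interview, recorded in records:
    if interview:                      # denominator: patients who self-identified as Indigenous
        totals[hospital] += 1
        correct[hospital] += recorded  # True counts as 1

for hospital in sorted(totals):
    print(f"{hospital}: {correct[hospital] / totals[hospital]:.0%} correctly recorded as Indigenous")
```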

  • Articles and reports: 11-522-X20010016284
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Since 1965, the National Center for Health Statistics has conducted the National Hospital Discharge Survey (NHDS), a national probability sample survey of discharges from non-federal, short-stay and general hospitals. A major aspect of the NHDS redesign in 1988 was to use electronic data from abstracting service organizations and state data systems. This paper presents an overview of the development of the NHDS and the 1988 redesign. Survey methodologies are reviewed in light of the data collection and processing issues arising from the combination of "manually" abstracted data and "automated" data. Methods for assessing the overall quality and accuracy of the NHDS data are discussed for both data collection modes. These methods include procedures to ensure that incoming data meet established standards and that abstracted data are processed and coded according to strict quality control procedures. These procedures are presented in the context of issues and findings from the broader literature about the quality of hospital administrative data sets.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016285
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The three papers presented in this session offer excellent insight into the issues concerning the quality of hospital morbidity data. Richards, Brown, and Homan sampled hospital records to evaluate administrative data in Canada; Hargreaves sampled persons in hospitals to evaluate administrative data in Australia; and McLemore and Pokras describe the quality assurance practices of an ongoing sample survey of hospital records in the United States. Each paper is discussed, along with the issues and challenges for the future.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016286
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    It is customary for statistical agencies to audit tables containing suppressed cells in order to ensure that there is sufficient protection against inadvertent disclosure of sensitive information. If the table contains rounded values, this fact may be ignored by the audit procedure. This oversight can result in over-protection, reducing the utility of the published data. This paper provides a correct auditing formulation and gives examples of over-protection. (A small numeric illustration of the effect of rounding follows this entry.)

    Release date: 2002-09-12
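
A small illustration of why rounding matters to the audit, assuming the simplest case: a suppressed cell X satisfying X = T - K for a published row total T and a known cell K. If T is rounded to base B, an attacker only knows T lies within B/2 of the published value, so the feasible range for X is wider than an audit that treats T as exact would conclude. The numbers are invented.

```python
# Feasible range for a suppressed cell, with and without allowing for rounding.
def feasible_range(published_total, known_cell, rounding_base=0):
    half = rounding_base / 2
    return (published_total - half - known_cell,
            published_total + half - known_cell)

print("audit ignoring rounding:  ", feasible_range(120, 85))                    # value pinned to 35
print("audit respecting rounding:", feasible_range(120, 85, rounding_base=5))   # interval (32.5, 37.5)
```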

  • Articles and reports: 11-522-X20010016287
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    In this paper we discuss a specific component of a research agenda aimed at disclosure protections for "non-traditional" statistical outputs. We argue that these outputs present different disclosure risks than normally faced and hence may require new thinking on the issue. Specifically, we argue that kernel density estimators, while powerful (high quality) descriptions of cross-sections, pose potential disclosure risks that depend materially on the selection of bandwidth. We illustrate these risks using a unique, non-confidential data set on the statistical universe of coal mines and present potential solutions. Finally, we discuss current practices at the U.S. Census Bureau's Center for Economic Studies for performing disclosure analysis on kernel density estimators.

    Release date: 2002-09-12
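
A short illustration of how bandwidth choice drives the disclosure risk of a kernel density estimate: with a very small bandwidth the density spikes at individual observations, effectively revealing them, while a larger bandwidth smooths them away. The data are synthetic, not the coal-mine data used in the paper.

```python
# Kernel density estimates of the same data under a narrow and a wide bandwidth.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
values = rng.normal(loc=500, scale=100, size=30)     # synthetic establishment-level variable
grid = np.linspace(0, 1000, 4001)

narrow = gaussian_kde(values, bw_method=0.02)(grid)  # near-spikes at the observations
wide = gaussian_kde(values, bw_method=0.5)(grid)     # smooth, far less revealing

peak = grid[np.argmax(narrow)]
nearest = values[np.argmin(np.abs(values - peak))]
print(f"highest narrow-bandwidth peak at {peak:.1f}; nearest raw value {nearest:.1f}")
print(f"maximum density, narrow vs wide: {narrow.max():.4f} vs {wide.max():.4f}")
```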

  • Articles and reports: 11-522-X20010016288
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    The upcoming 2002 U.S. Economic Census will give businesses the option of submitting their data on paper or by electronic media. If reporting electronically, they may report via Windows-based Computerized Self-Administered Questionnaires (CSAQs). The U.S. Census Bureau will offer electronic reporting for over 650 different forms to all respondents. The U.S. Census Bureau has assembled a cross-divisional team to develop an electronic forms style guide, outlining the design standards to use in electronic form creation and ensuring that the quality of the form designs will be consistent throughout.

    The purpose of a style guide is to foster consistency among the various analysts who may be working on different pieces of a software development project (in this case, a CSAQ). The team determined that the style guide should include standards for layout and screen design, navigation, graphics, edit capabilities, additional help, feedback, audit trails, and accessibility for disabled users.

    Members of the team signed up to develop various sections of the style guide. The team met weekly to discuss and review the sections. Members of the team also conducted usability tests on edits, and subject-matter employees provided recommendations to upper management. Team members conducted usability testing on prototype forms with actual respondents. The team called in subject-matter experts as necessary to assist in making decisions about particular forms where the constraints of the electronic medium required changes to the paper form.

    The style guide will become the standard for all CSAQs for the 2002 Economic Census, which will ensure consistency across the survey programs.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016289
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Increasing demand for electronic reporting in establishment surveys has placed additional emphasis on incorporating usability into electronic forms. We are just beginning to understand the implications surrounding electronic forms design. Cognitive interviewing and usability testing are analogous in that both types of testing have similar goals: to build an end instrument (paper or electronic) that reduces both respondent burden and measurement error. Cognitive testing has greatly influenced paper forms design and can also be applied towards the development of electronic forms. Usability testing expands on existing cognitive testing methodology to include examination of the interaction between the respondent and the electronic form.

    The upcoming U.S. 2002 Economic Census will offer businesses the ability to report information using electronic forms. The U.S. Census Bureau is creating an electronic forms style guide outlining the design standards to be used in electronic form creation. The style guide's design standards are based on usability principles, usability and cognitive test results, and Graphical User Interface standards. This paper highlights the major electronic forms design issues raised during the preparation of the style guide and describes how usability testing and cognitive interviewing resolved these issues.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016290
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Over the last five years, the United Kingdom Office for National Statistics has been implementing a series of initiatives to improve the process of collecting business statistics data in the UK. These initiatives include the application of a range of new technology solutions to data collection; document imaging and scanning have replaced paper handling for all processes. For some inquiries, the paper form has been eliminated altogether by the adoption of Telephone Data Entry (TDE). Having all incoming data in electronic format has allowed workflow systems to be introduced across a wide range of data collection activities.

    This paper describes the recent history of these initiatives and covers proposals that are presently at a pilot stage or being projected for the next four years. It also examines the future strategy of TDE data collection via the Internet, and the current pilots and security issues under consideration.

    Release date: 2002-09-12

  • Articles and reports: 11-522-X20010016291
    Description:

    This paper discusses in detail issues dealing with the technical aspects of designing and conducting surveys. It is intended for an audience of survey methodologists.

    Many types of Web surveys are not based on scientific sampling and do not represent any well-defined population. Even when Web surveys are based on a general sample, it is not known whether they yield reliable or valid results. One way to test the adequacy of Web surveys is to conduct experiments comparing Web surveys with well-established, traditional survey methods. One such test was performed by comparing the 2000 General Social Survey of the National Opinion Research Center with a Knowledge Networks Web survey.

    Release date: 2002-09-12

Journals and periodicals (1) (1 result)

  • Journals and periodicals: 85F0036X
    Geography: Canada
    Description:

    This study documents the methodological and technical challenges involved in performing analysis on small groups using a sample survey: oversampling, response rates, non-response due to language, release feasibility and sampling variability. It is based on the 1999 General Social Survey (GSS) on victimization. (A sketch of a simple release-feasibility check follows this entry.)

    Release date: 2002-05-14
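
One simple release-feasibility check of the kind alluded to above: the coefficient of variation (CV) of an estimated proportion for a small group, inflated by an assumed design effect. The design effect, threshold and sample sizes are generic examples, not GSS rules.

```python
# Approximate CV of a small-group proportion estimate, with a releasability flag.
import math

def cv_of_proportion(p, n, design_effect=1.5):
    """Approximate CV of an estimated proportion p based on n respondents."""
    se = math.sqrt(design_effect * p * (1 - p) / n)
    return se / p

for n in (50, 200, 1000):                     # respondents in the small group
    cv = cv_of_proportion(p=0.10, n=n)
    verdict = "releasable" if cv <= 0.25 else "too unreliable to release"
    print(f"n = {n:4d}: CV = {cv:.2f} -> {verdict}")
```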