Results

All (6 results)

  • Articles and reports: 12-001-X202300200014
    Description: Much has been written about Jean-Claude Deville in tributes from the statistical community (see Tillé, 2022a; Tillé, 2022b; Christine, 2022; Ardilly, 2022; and Matei, 2022) and from the École nationale de la statistique et de l’administration économique (ENSAE) and the Société française de statistique. Pascal Ardilly, David Haziza, Pierre Lavallée and Yves Tillé provide an in-depth look at Jean-Claude Deville’s contributions to survey theory. To pay tribute to him, I would like to discuss Jean-Claude Deville’s contribution to the more day-to-day application of methodology for all the statisticians at the Institut national de la statistique et des études économiques (INSEE) and at the public statistics service. To do this, I will draw on my work experience, particularly the four years (1992 to 1996) I spent working with him in the Statistical Methods Unit, and on the discussions we had thereafter, especially in the 2000s on the rolling census.
    Release date: 2024-01-03

  • Articles and reports: 12-001-X202200100005
    Description:

    Methodological studies of the effects that human interviewers have on the quality of survey data have long been limited by a critical assumption: that interviewers in a given survey are assigned random subsets of the larger overall sample (also known as interpenetrated assignment). Absent this type of study design, estimates of interviewer effects on survey measures of interest may reflect differences between interviewers in the characteristics of their assigned sample members, rather than recruitment or measurement effects specifically introduced by the interviewers. Previous attempts to approximate interpenetrated assignment have typically used regression models to condition on factors that might be related to interviewer assignment. We introduce a new approach for overcoming this lack of interpenetrated assignment when estimating interviewer effects. This approach, which we refer to as the “anchoring” method, leverages correlations between observed variables that are unlikely to be affected by interviewers (“anchors”) and variables that may be prone to interviewer effects to remove components of within-interviewer correlations that lack of interpenetrated assignment may introduce. We consider both frequentist and Bayesian approaches, where the latter can make use of information about interviewer effect variances in previous waves of a study, if available. We evaluate this new methodology empirically using a simulation study, and then illustrate its application using real survey data from the Behavioral Risk Factor Surveillance System (BRFSS), where interviewer IDs are provided on public-use data files. While our proposed method shares some of the limitations of the traditional approach – namely the need for variables associated with the outcome of interest that are also free of measurement error – it avoids the need for conditional inference and thus has improved inferential qualities when the focus is on marginal estimates, and it shows evidence of further reducing overestimation of larger interviewer effects relative to the traditional approach.

    Release date: 2022-06-21

  • Articles and reports: 12-001-X202100100006
    Description:

    It is now possible to manage surveys using statistical models and other tools that can be applied in real time. This paper focuses on three developments that reflect the attempt to take a more scientific approach to the management of survey field work: 1) the use of responsive and adaptive designs to reduce nonresponse bias, other sources of error, or costs; 2) optimal routing of interviewer travel to reduce costs; and 3) rapid feedback to interviewers to reduce measurement error. The article begins by reviewing experiments and simulation studies examining the effectiveness of responsive and adaptive designs. These studies suggest that these designs can produce modest gains in the representativeness of survey samples or modest cost savings, but can also backfire. The next section of the paper examines efforts to provide interviewers with a recommended route for their next trip to the field. The aim is to bring interviewers’ field work into closer alignment with research priorities while reducing travel time. However, a study testing this strategy found that interviewers often ignore such instructions. Then, the paper describes attempts to give rapid feedback to interviewers, based on automated recordings of their interviews. Interviewers often read questions in ways that affect respondents’ answers; correcting these problems quickly yielded marked improvements in data quality. All of the methods are efforts to replace the judgment of interviewers, field supervisors, and survey managers with statistical models and scientific findings.

    Release date: 2021-06-24

  • Articles and reports: 62F0014M2020016
    Description:

    A summary of methodological treatments as applied to the August 2020 CPI in response to the effects of the COVID-19 pandemic on price collection, price availability, and business closures.

    Release date: 2020-09-16

  • Articles and reports: 82-003-X202000800002
    Description:

    The purpose of this study was to examine the psychometric properties of the parent-rated Strengths and Difficulties Questionnaire with a nationally representative sample of Canadian children and adolescents.

    Release date: 2020-08-19

  • Journals and periodicals: 89-639-X
    Geography: Canada
    Description:

    Beginning in late 2006, the Social and Aboriginal Statistics Division of Statistics Canada embarked on a review of the questions used in the Census and in surveys to produce data about Aboriginal peoples (North American Indian, Métis and Inuit). This process is essential to ensure that Aboriginal identification questions are valid measures of contemporary Aboriginal identification, in all its complexity. The questions reviewed (from the Census 2B questionnaire) were the Ethnic origin / Aboriginal ancestry question; the Aboriginal identity question; the Treaty / Registered Indian question; and the Indian band / First Nation Membership question.

    Additional testing was conducted on Census questions with potential Aboriginal response options: the population group question (also known as visible minorities) and the Religion question. The review process to date has involved two major steps: regional discussions with data users and stakeholders, and qualitative testing. The regional discussions, with over 350 users of Aboriginal data across Canada, were held in early 2007 to examine the four questions used on the Census and other Statistics Canada surveys. Data users included national Aboriginal organizations, Aboriginal provincial and territorial organizations, federal, provincial and local governments, researchers and Aboriginal service organizations. User feedback showed that the main areas of concern were data quality, undercoverage, the wording of questions, and comparability over time.

    Release date: 2009-04-17
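
The entry for 12-001-X202200100005 above contrasts its “anchoring” method with the traditional practice of conditioning on covariates related to interviewer assignment. The sketch below is a minimal, hypothetical illustration of that traditional conditioning approach only, not the paper's anchoring estimator: simulated data with non-random assignment, where a naive random-intercept model overstates the interviewer variance and conditioning on an assignment-related covariate reduces the overstatement. All variable names and parameter values are invented for this sketch.

```python
# Hypothetical sketch of the *traditional* conditioning approach that the
# abstract for 12-001-X202200100005 contrasts with its anchoring method.
# Everything here (variable names, effect sizes) is invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# 50 interviewers, 40 respondents each, with NON-random (non-interpenetrated)
# assignment: each interviewer works an area whose respondents differ in age.
n_int, n_per = 50, 40
interviewer_id = np.repeat(np.arange(n_int), n_per)
area_shift = np.repeat(rng.normal(0.0, 1.0, n_int), n_per)      # assignment confounder
age = rng.normal(45.0 + 10.0 * area_shift, 10.0, n_int * n_per) # differs by interviewer
true_effect = np.repeat(rng.normal(0.0, 0.5, n_int), n_per)     # real measurement effect
y = 2.0 + 0.1 * age + true_effect + rng.normal(0.0, 1.0, n_int * n_per)

df = pd.DataFrame({"y": y, "age": age, "interviewer_id": interviewer_id})

# Naive random-intercept model: the interviewer variance component absorbs
# both genuine interviewer effects and differences in assigned samples.
naive = smf.mixedlm("y ~ 1", df, groups=df["interviewer_id"]).fit()

# Conditioning on the assignment-related covariate moves the estimate back
# toward the true interviewer (measurement) variance of 0.5**2 = 0.25.
conditioned = smf.mixedlm("y ~ age", df, groups=df["interviewer_id"]).fit()

print("naive interviewer variance:      ", float(naive.cov_re.iloc[0, 0]))
print("conditioned interviewer variance:", float(conditioned.cov_re.iloc[0, 0]))
```

Under these invented settings the naive variance estimate is several times the true value, while the conditioned one recovers it. The anchoring method described in the abstract instead uses anchor variables to remove the assignment-induced component without relying on conditional inference; that estimator is beyond this sketch.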