Analysis
Results
All (58) (0 to 10 of 58 results)
- 1. Opening remarks of the Symposium 2008: Data Collection: Challenges, Achievements and New Directions (Archived)
Articles and reports: 11-522-X200800010920
Description:
On behalf of Statistics Canada, I would like to welcome you all, friends and colleagues, to Symposium 2008. This is the 24th International Symposium organized by Statistics Canada on survey methodology.
Release date: 2009-12-03
- 2. International surveys: Motives and methodologies (Archived)
Articles and reports: 11-522-X200800010937
Description:
The context of the discussion is the increasing incidence of international surveys, of which one is the International Tobacco Control (ITC) Policy Evaluation Project, which began in 2002. The ITC country surveys are longitudinal, and their aim is to evaluate the effects of policy measures being introduced in various countries under the WHO Framework Convention on Tobacco Control. The challenges of organization, data collection and analysis in international surveys are reviewed and illustrated. Analysis is an increasingly important part of the motivation for large-scale cross-cultural surveys. The fundamental challenge for analysis is to discern the real response (or lack of response) to policy change, separating it from the effects of data collection mode, differential non-response, external events, time-in-sample, culture, and language. Two problems relevant to statistical analysis are discussed. The first problem is the question of when and how to analyze pooled data from several countries, in order to strengthen conclusions that might be generally valid. While in some cases this seems to be straightforward, there are differing opinions on the extent to which pooling is possible and reasonable. It is suggested that for formal comparisons, random effects models are of conceptual use. The second problem is to find models of measurement across cultures and data collection modes which will enable calibration of continuous, binary and ordinal responses, and produce comparisons from which extraneous effects have been removed. It is noted that hierarchical models provide a natural way of relaxing requirements of model invariance across groups.
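The pooling idea raised in this abstract can be made concrete with a small sketch. The fragment below is offered only as an illustration of a country-level random-intercept model of the general kind mentioned, not as the ITC project's actual method; the file name and column names (pooled_survey.csv, support, wave, age_group, country) are hypothetical.

```python
# Illustrative sketch only: a random-intercept model for pooled cross-country
# survey data. Dataset and column names are hypothetical, not from the ITC project.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pooled_survey.csv")  # hypothetical pooled data file

# A random intercept for country relaxes the assumption that every country shares
# a single baseline level, while still pooling information across countries.
model = smf.mixedlm("support ~ wave + age_group", data=df, groups=df["country"])
result = model.fit()
print(result.summary())
```

A fuller analysis would also need to address the mode, non-response and measurement-invariance issues the abstract raises; the sketch shows only the basic pooled structure.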
Release date: 2009-12-03
- Articles and reports: 11-522-X200800010939
Description:
A year ago, the Communications and Operations field initiated what is considered Statistics Canada's first business architecture activity. This concerted effort focused on collection-related activities and processes and was conducted over a short period during which over sixty STC senior and middle managers were consulted.
We will introduce the discipline of business architecture, an approach based on "business blueprints" that interfaces between an enterprise's needs and its enabling solutions. We will describe the specific approach used to conduct the Statistics Canada Collection Business Architecture, summarize the key lessons learned from this initiative, and provide an update on where we are and where we are heading.
We will conclude by illustrating how this approach can serve as the genesis and foundation for an overall Statistics Canada business architecture.
Release date: 2009-12-03
- 4. Organisation of data collection methodology services in the Australian Bureau of Statistics (Archived)
Articles and reports: 11-522-X200800010940
Description:
Data Collection Methodology (DCM) enables the collection of good-quality data by providing expert advice and assistance on questionnaire design, methods of evaluation and respondent engagement. DCM assists in the development of client skills, undertakes research and leads innovation in data collection methods. This is done in a challenging environment of organisational change and limited resources. This paper will cover 'how DCM does business' with clients and the wider methodological community to achieve our goals.
Release date: 2009-12-03
- Articles and reports: 11-522-X200800010941
Description:
Prior to 2004, the design and development of collection functions at Statistics New Zealand (Statistics NZ) was done by a centralised team of data collection methodologists. In 2004, an organisational review considered whether the design and development of these functions was being done in the most effective way. A key issue was the rising costs of surveying as the organisation moved from paper-based data collection to electronic data collection. The review saw some collection functions decentralised. However, a smaller centralised team of data collection methodologists was retained to work with subject matter areas across Statistics NZ.
This paper will discuss the strategy used by the smaller centralised team of data collection methodologists to support subject matter areas. There are three key themes to the strategy. First is the development of best-practice standards and a central standards repository. Second is training and the introduction of knowledge-sharing forums. Third is the provision of advice and independent review to the subject matter areas that design and develop collection instruments.
Release date: 2009-12-03
- Articles and reports: 11-522-X200800010946
Description:
In the mid-1990s, the first question testing unit was set up in the UK Office for National Statistics (ONS). The key objective of the unit was to develop and test the questions and questionnaire for the 2001 Census. Since the establishment of this unit, the area has expanded into a Data Collection Methodology (DCM) Centre of Expertise, which now sits in the Methodology Directorate. The DCM centre has three branches which support DCM work for social surveys, business surveys, the Census and external organisations.
In the past ten years, DCM has achieved a variety of things. For example, it has introduced survey methodology involvement in the development and testing of business survey questions and questionnaires; introduced a mixed-method approach to the development of questions and questionnaires; developed and implemented standards, e.g. for the 2011 Census questionnaire and showcards; and developed and delivered DCM training events.
This paper will provide an overview of data collection methodology at the ONS from the perspective of achievements and challenges. It will cover areas such as methods, staff (e.g. recruitment, development and field security), and integration with the survey process.
Release date: 2009-12-03
- Articles and reports: 11-522-X200800010947
Description:
This paper addresses the efforts of the U.S. Energy Information Administration (EIA) to design, test and implement new and substantially redesigned surveys. The need to change EIA's surveys has become increasingly important as U.S. energy industries have moved from highly regulated to deregulated businesses, which has substantially affected both their ability and willingness to report data. The paper focuses on how EIA has deployed current tools for designing and testing surveys and the reasons that these methods have not always yielded the desired results. It suggests some new tools and methods that we would like to try in order to improve the quality of our data.
Release date: 2009-12-03
- Articles and reports: 11-522-X200800010948
Description:
Past survey instruments, whether in the form of a paper questionnaire or telephone script, were their own documentation. On this basis the ESRC Question Bank was created, providing free-access internet publication of questionnaires, enabling researchers to re-use questions, saving them trouble, whilst improving the comparability of their data with that collected by others. Today, however, as survey technology and computer programs have become more sophisticated, accurate comprehension of the latest questionnaires seems more difficult, particularly when each survey team uses its own conventions to document complex items in technical reports. This paper seeks to illustrate these problems and suggest preliminary standards of presentation to be used until the process can be automated.
Release date: 2009-12-03
- Articles and reports: 11-522-X200800010949
Description:
The expansion in scope of UK equality legislation has led to a requirement for data on sexual orientation. In response, the Office for National Statistics has initiated a project aiming to provide advice on best practice with regard to data collection in this field, and to examine the feasibility of providing data that will satisfy user needs. The project contains qualitative and quantitative research methodologies in relation to question development and survey operational issues. This includes:
- a review of UK and international surveys already collecting data on sexual orientation/identity;
- a series of focus groups exploring conceptual issues surrounding "sexual identity", including related terms and the acceptability of questioning on multi-purpose household surveys;
- a series of quantitative trials with particular attention to item non-response, question administration and data collection;
- cognitive testing to ensure questioning was interpreted as intended; and
- quantitative research on potential bias issues in relation to proxy responses.
Future analysis and reporting issues are being considered alongside question development, e.g. accurately capturing statistics on populations with low prevalence.
The presentation also discusses the practical survey administration issues relating to ensuring privacy in a concurrent interview situation, both face to face and over the telephone.
Release date: 2009-12-03
- 10. Testing for the 2011 Census of Canada (Archived)
Articles and reports: 11-522-X200800010950
Description:
The next census will be conducted in May 2011. Being a major survey, it presents a formidable challenge for Statistics Canada and requires a great deal of time and resources. Careful planning has been done to ensure that all deadlines are met. A number of steps have been planned in the questionnaire testing process. These tests apply to both census content and the proposed communications strategy. This paper presents an overview of the strategy, with a focus on combining qualitative studies with the 2008 quantitative study so that the results can be analyzed and the proposals properly evaluated.
Release date: 2009-12-03
Journals and periodicals (1) (1 result)
- Journals and periodicals: 89-639-X
Geography: Canada
Description:
Beginning in late 2006, the Social and Aboriginal Statistics Division of Statistics Canada embarked on a review of the questions used in the Census and in surveys to produce data about Aboriginal peoples (North American Indian, Métis and Inuit). This process is essential to ensure that Aboriginal identification questions are valid measures of contemporary Aboriginal identification, in all its complexity. Questions reviewed included the following (from the Census 2B questionnaire):
- the Ethnic origin / Aboriginal ancestry question;
- the Aboriginal identity question;
- the Treaty / Registered Indian question; and
- the Indian band / First Nation Membership question.
Additional testing was conducted on Census questions with potential Aboriginal response options: the population group question (also known as the visible minority question) and the Religion question. The review process to date has involved two major steps: regional discussions with data users and stakeholders, and qualitative testing. The regional discussions, with over 350 users of Aboriginal data across Canada, were held in early 2007 to examine the four questions used on the Census and other surveys of Statistics Canada. Data users included national Aboriginal organizations, Aboriginal provincial and territorial organizations, federal, provincial and local governments, researchers and Aboriginal service organizations. User feedback showed that the main areas of concern were data quality, undercoverage, the wording of questions, and the importance of comparability over time.
Release date: 2009-04-17