Documentation

Scope and purpose
Principles
Guidelines
Quality indicators
References

Scope and purpose

Documentation constitutes a record of the statistical activity, including the concepts, definitions and methods used to collect, process and analyze data and produce statistical products. It is intended to promote effective and informed use of the data. Quality indicators produced during a statistical activity should be included in the documentation, along with an analysis of those indicators in terms of their impact on the use of the resulting statistical products.

During implementation, documentation is a means of communication to ensure effective development of a statistical activity. It includes not only what decisions were made, but also why they were made, and provides information that will be useful for future development and implementation of the same statistical activity or a similar or redesigned activity.

Principles

The goal of documentation is to provide a complete, unambiguous and multi-purpose record of the statistical activity, including its outputs. Documentation may be intended for various target audiences, such as management, technical staff, planners of other surveys, and users. It should be readily accessible, up to date, timely enough to remain relevant, and comprehensible to its main audience. It can be produced in a variety of formats (e.g. hard copy, electronic format and visual presentation). Care must be taken to preserve statistical activity documents.

Guidelines

Tailor the documentation to the target audience and the general context

  • The level of detail in the documentation should reflect the target audience for which it is intended; this determines whether the documentation should be detailed or condensed, technical or general. In cases where statistical products are disseminated by Statistics Canada, the documentation must meet the requirements of the Policy on Informing Users of Data Quality and Methodology (Statistics Canada, 2000d).

  • The scope of the documentation should take into consideration the context in which the statistical activity concerned was carried out, the importance of the statistical activity, whether it is new or recurrent, and whether it is similar to or different from other statistical activities conducted by the organization. This determines what must be covered by new documentation and what can be covered by references to existing documentation.

  • Documentation priorities should also take into account the statistical activity's budget, the appropriate time for publishing the documentation, and the documentation's short-term and long-term benefits. Documentation should be prepared without delay once the statistical activity is complete; any delay may not only affect the documentation's timeliness and relevance, but also erode its accuracy.

Provide complete, accurate documentation

  • Statistical activity documentation generally comes in three types: (1) general documents that, for the most part, provide a current picture of the statistical activity, (2) thematic documents that provide details on implementation, and (3) thematic evaluations.

General documentation

  • Objectives: Include information on the objectives and uses of the data, timeliness, frequency of the statistical activity, and data quality targets. Objectives can change as the survey progresses (e.g. because of budgetary constraints, perceived feasibility, results of new pilot studies, or new technology). Such changes must be documented as they have an impact on questionnaire design and on test result analysis.

  • Content: Include the concepts, definitions and the questionnaire used. To facilitate integration with other sources, use standardized concepts, questions, methods, processes and classifications. Highlight differences, if warranted. Mention the role of advisory committees and users.

  • Methodology: Deal with issues such as target population, sampling frame, coverage, reference period, sample design, sample size and selection method, collection method and follow-up procedures for nonresponse, edit and imputation, estimation, benchmarking and revision, seasonal adjustment and confidentiality. Provide a methodological overview, emphasizing different aspects for different readers, and provide a consolidated document of technical issues for technical staff.

  • Data quality: Provide general information about coverage, sampling error, non-sampling error, response rates, the rates and effects of edit and imputation, comparability over time and with other data, validation studies, quality assurance measures and any other relevant measures specific to the statistical activity concerned. Describe any unexpected events affecting data quality (e.g. flooding, high nonresponse rate). For technical users, include total variance or its components by source, nonresponse and response biases, and the impact and interpretation of seasonal adjustment.
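
The quality measures listed above are typically reported as simple rates and variance-based statistics. As an illustration only (the guidelines do not prescribe any tool or language), the following minimal Python sketch shows how a documentation annex might compute an unweighted response rate, an item imputation rate and the coefficient of variation of an estimate; all figures and variable names are hypothetical.

```python
# Hypothetical counts; in practice these come from collection and processing systems.
units_in_scope = 12000       # in-scope sampled units
responding_units = 9480      # units that provided a usable response
imputed_values = 310         # item values filled in by imputation
reported_values = 9170       # item values reported by respondents

# Unweighted response rate.
response_rate = responding_units / units_in_scope

# Item imputation rate: share of final values that were imputed.
imputation_rate = imputed_values / (imputed_values + reported_values)

def coefficient_of_variation(estimate: float, standard_error: float) -> float:
    """CV of an estimate, expressed as a percentage of the estimate."""
    return 100.0 * standard_error / estimate

print(f"Response rate:   {response_rate:.1%}")
print(f"Imputation rate: {imputation_rate:.1%}")
print(f"CV of estimate:  {coefficient_of_variation(2450000, 61250):.1f}%")
```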

Detailed documentation on implementation

  • Activity planning and budget

  • Operations: Include an interviewer manual, training manuals, instructions or a manual for supervisors and quality control staff, manuals for data capture and processing staff, and feedback and debriefing reports.

  • For computer-assisted interviewing, provide the computer application's development specifications.

  • Systems: Include information on the data files (record layouts, explanation of codes, basic frequencies, edit procedures), systems documentation (construction, algorithms, use, storage and retrieval) and monitoring reports (time spent on specific activities, trouble areas, scheduling of runs to determine whether processing is on time). A sketch of a machine-readable record layout follows this list.

  • Implementation: Document all operations, with inputs and outputs clearly specified. Attach work schedules for each implementation step.

  • Resources: List the resources used and when. Provide an account of all salary and non-salary expenditures (amounts and time). Comment on expenditures in relation to budgets.
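
As noted under "Systems" above, a record layout and its code explanations can themselves be kept in a structured, machine-readable form so that they are easy to search and to keep current. The sketch below is purely illustrative and assumes nothing about Statistics Canada's actual systems; the variable, codes and frequencies are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class VariableEntry:
    """One entry in a hypothetical record layout / data dictionary."""
    name: str                    # variable name on the data file
    label: str                   # plain-language description
    positions: tuple             # (start, end) column on the record
    codes: dict                  # code -> explanation of the code
    frequencies: dict = field(default_factory=dict)  # basic frequencies by code

layout = [
    VariableEntry(
        name="EMPSTAT",
        label="Employment status of respondent",
        positions=(12, 12),
        codes={"1": "Employed", "2": "Unemployed", "3": "Not in labour force"},
        frequencies={"1": 6210, "2": 540, "3": 2730},
    ),
]

for entry in layout:
    print(entry.name, "-", entry.label, entry.codes)
```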

Evaluations

  • Prepare a general evaluation of the statistical activity process.

  • Describe cognitive tests, field tests or pilot surveys and report on results and recommendations in relation to specifications.

  • Document methodology evaluations, such as alternative sample designs considered or the performance of the sample design used.

  • If the documentation is to be disseminated outside Statistics Canada, it must undergo institutional and peer review as specified in the Policy on the Review of Information Products (Statistics Canada, 2003). Even if it is exclusively for internal use, documentation should be reviewed by managers, target audience representatives or peers for relevance, accuracy and comprehensibility.

Make the documentation accessible

  • In the case of documentation intended for users, provide the documentation elements required for the Integrated Metadatabase (IMDB) (Statistics Canada, 2007). As the archive for information about Statistics Canada's surveys and programs, the IMDB contains most of the information users need regarding methodology and data accuracy. Electronic products contain a link to the IMDB, which is used to access documentation about the product. For print products, the IMDB provides adequate documentation, in accordance with the Policy on Informing Users of Data Quality and Methodology (Statistics Canada, 2000d).

  • Choose tools that, as much as possible, provide a central repository for documentation about statistical activities and encourage structured storage and file search. At a minimum, each document should present a clear title, a date and the names of the authors (institutions or individuals); a minimal sketch of such a catalogue entry follows this list. The preservation of Statistics Canada documents is required under the Policy on Document Management (Statistics Canada, 2000).

  • Sort and document references (theoretical and general articles and documents related to the project that were, however, not produced as part of the project).
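
One simple way to satisfy the minimum requirements above (a clear title, a date and the names of the authors) while supporting structured storage and file search is to keep a small machine-readable catalogue alongside the documents. The sketch below is only an illustration of that idea; the titles, authors and paths are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DocumentRecord:
    """Minimum descriptive metadata for one archived document (hypothetical)."""
    title: str
    issued: date
    authors: list            # institutions or individuals
    path: str                # location in the central repository

catalogue = [
    DocumentRecord("Methodology overview", date(2009, 3, 4),
                   ["Methodology Branch"], "docs/methodology_overview.pdf"),
    DocumentRecord("Interviewer manual", date(2008, 11, 17),
                   ["Collection Planning Division"], "docs/interviewer_manual.pdf"),
]

def find(keyword: str) -> list:
    """Return catalogue entries whose title contains the keyword."""
    return [d for d in catalogue if keyword.lower() in d.title.lower()]

print([d.path for d in find("manual")])
```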

Quality indicators

Main quality elements: interpretability, accessibility, timeliness

  • Number of externally published documents that underwent institutional and peer review.

  • Published statistical products that satisfy the Policy on Informing Users of Data Quality and Methodology.

References

Statistics Canada. 2000. "Policy on document management." Statistics Canada Policy Manual. Section 5.9. Last updated March 5, 2009.

Statistics Canada. 2000d. "Policy on informing users of data quality and methodology." Statistics Canada Policy Manual. Section 2.3. Last updated March 4, 2009.

Statistics Canada. 2002c. Statistics Canada's Quality Assurance Framework – 2002. Catalogue No. 12-586-XIE.

Statistics Canada. 2003. "Policy on the review of information products." Statistics Canada Policy Manual. Section 2.5. Last updated March 4, 2009.

Statistics Canada. 2004. "Policy on standards." Statistics Canada Policy Manual. Section 2.10. Last updated March 4, 2009.

Statistics Canada. 2007. "Integrated Metadatabase – Guidelines for Authors." Standards Division Internal Communications Network. http://stdsweb/standards/imdb/imdb-menu.htm

United Nations, Conference of European Statisticians. 1983. Draft guidelines for the preparation of presentations of the scope and quality of statistics for users. Geneva, Switzerland.
