Chapter 2.8: Program evaluation


Program evaluation is the systematic collection and analysis of evidence on program outcomes. This function is crucial for a national statistical agency. It is used to assess program relevance and performance, to facilitate decision-making, and to confirm or adjust resource priorities and allocations in a national and international context of budget pressures. The evaluation function is a management best practice, since it helps to continually improve programs and processes.

Above all, program evaluation is part of a sound management cycle that provides managers with information so that they can validate or adjust the programs for which they are responsible.

In Canada, this function is mandatory for all government departments and agencies. They must all comply with the requirements of the Treasury Board Secretariat, the federal central agency that oversees the evaluation function throughout the Government of Canada.

Program evaluation provides Canadians, parliamentarians, ministers, central agencies and deputy heads with an evidence-based, neutral assessment of resource optimization, i.e., the relevance and performance of programs. Specifically, evaluation

  • supports accountability to Parliament and taxpayers by helping the government to credibly report on the results achieved with the resources invested in programs;
  • informs government decisions, as well as those of managers of the statistical agency, on resource allocation and reallocation;
  • supports deputy heads in managing for results by informing them about whether their programs are producing the outcomes that they were designed to produce, at an affordable cost;
  • supports policy and program improvements by helping to identify lessons learned and best practices.

In Canada, several laws, policies and directives govern the evaluation function. The most relevant of these are the following:

  • Policy on Evaluation: Implemented by the Government of Canada's central agencies (Endnote 1) to provide a coherent legal framework for all federal departments and agencies, the Policy on Evaluation aims to create a comprehensive and reliable base of evaluation evidence that supports policy and program improvement, expenditure management, Cabinet decision-making and public reporting. The Policy also stipulates that all direct program expenditures must be evaluated every five years. To this end, each federal department or agency prepares a rolling five-year organizational evaluation plan that identifies the evaluation projects or programs planned for a given year.
  • Directive on the Evaluation Function: Developed by the central agencies, this directive clarifies the roles and responsibilities of departmental staff involved in evaluation so that departmental evaluation functions work effectively to support the evaluation information needs of Canadians, parliamentarians, ministers, central agencies and deputy heads. At Statistics Canada, stakeholders include the chief evaluation executive and their team, program managers, and the Departmental Evaluation Committee chaired by the chief statistician.

In the interests of transparency, all government departments and agencies are obligated to publicly disclose on their website the summary results of their program evaluation reports.

It is important to add that at Statistics Canada, the program evaluation function is an impartial function not tied to any of the agency's programs or services. The chief evaluation executive reports directly to the chief statistician and to the Departmental Evaluation Committee chaired by the chief statistician.

Strategies, mechanisms and tools

The process to develop and implement program evaluation at Statistics Canada is as follows:

  • Form a Departmental Evaluation Committee chaired by the chief statistician and made up of assistant chief statisticians;
  • Develop a five-year evaluation plan to assess all direct expenditures of statistical programs;
  • Develop performance measurement strategies for the agency and its statistical programs;
  • Evaluate relevance, such as the degree to which a program meets the needs of its users, aligns with government priorities and complies with legislation;
  • Evaluate performance, such as the achievement of expected results and demonstrated savings and efficiency;
  • Have the Departmental Evaluation Committee and the Departmental Audit Committee approve the evaluation reports and performance measurement strategies;
  • Publish evaluation reports on the Statistics Canada website to respect the principles of accountability in accordance with the Access to Information Act, the Privacy Act and the Policy on Government Security. Information about these reports as well as the date of publication is shared in advance with the Minister's Office;
  • Provide a formal follow-up on recommendations and on the progress of action plans and reports to the Departmental Evaluation Committee; and
  • Develop an annual report on the state of program performance measurement that supports evaluations.

The most relevant tools used to achieve the evaluation function are as follows:

  • The evaluation plan covers a rolling five-year period, in compliance with the requirements of the Treasury Board Secretariat, the federal central agency that coordinates the evaluation function across the Government of Canada. This plan must
    • align with and support the departmental Management, Resources and Results Structure;
    • include all direct program spending, excluding grants and contributions;
    • include all ongoing grant and contribution programs for which the department is responsible;
    • include the administrative aspect of all major statutory spending;
    • include programs that are set to terminate automatically after a specified period of time, if requested by the Secretary of the Treasury Board following consultation with the affected deputy head;
    • support the requirements of the Expenditure Management System, including strategic reviews;
    • identify and recommend a risk-based approach to the Departmental Evaluation Committee. In practice, this means that the risks associated with programs are taken into consideration and a program evaluation hierarchy by risk level is established;
    • be submitted, as approved by the deputy head, to the Treasury Board of Canada Secretariat at the beginning of each fiscal year.
  • Performance measurement is fundamental, since it satisfies not only the requirements to improve programs, but also the necessity to prepare reports and be accountable for the organization's performance. At the program level, performance measurement helps program managers to
    • make informed decisions to ensure that their program outcomes remain on track and within budget;
    • adapt to changing priorities;
    • improve their performance reporting (internal and external);
    • support the evaluation of their programs by providing a line of evidence.
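As a purely illustrative sketch, the risk-based evaluation hierarchy mentioned above can be thought of as an ordering rule over programs. Everything in the example below (the program names, risk scores, and scoring rule itself) is hypothetical, not Statistics Canada's actual method:

```python
from dataclasses import dataclass

@dataclass
class Program:
    name: str
    risk_score: int        # hypothetical scale: 1 (low) to 5 (high)
    years_since_eval: int  # years since the last evaluation

def plan_order(programs):
    """Rank programs for the rolling five-year plan: higher-risk and
    longer-overdue programs come first (a simple illustrative rule)."""
    return sorted(programs,
                  key=lambda p: (p.risk_score, p.years_since_eval),
                  reverse=True)

# Hypothetical program inventory
programs = [
    Program("Survey A", risk_score=2, years_since_eval=4),
    Program("Survey B", risk_score=5, years_since_eval=1),
    Program("Survey C", risk_score=5, years_since_eval=3),
]

for p in plan_order(programs):
    print(p.name, p.risk_score, p.years_since_eval)
```

In this sketch, "Survey C" would be evaluated first: it shares the highest risk score with "Survey B" but has gone longer without an evaluation. A real plan would also weigh materiality, legislative requirements and the five-year coverage constraint.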

The performance measurement system includes the four phases shown in Figure 2.8.1.

Figure 2.8.1: Four phases of the Statistics Canada Performance System

Description for Figure 2.8.1

This diagram uses boxes to show the four phases of the Statistics Canada performance measurement system. It is made up of four main boxes, one describing each phase.

The Phase 1 box describes the vision, planning and governance:

  • PM vision and plan
  • Committee structure
  • Study/working groups

The Phase 2 box describes the fundamental elements:

  • Integrated performance measurement framework
  • Performance measurement strategies
  • Potential indicators

The Phase 3 box describes the indicator feasibility test:

  • Define and test indicators
  • Data collection process
  • Format of performance reports

The Phase 4 box describes the system implementation:

  • Reinforce HR/system capabilities
  • Do an early pilot test and adjust
  • Accelerate implementation
  • Monitor performance and produce performance reports

Three arrows, one from each of the Phase 2, 3 and 4 boxes, point to a box entitled Feedback loop.

Two arrows from the Feedback loop point to another box entitled Departmental Performance Report and Program PM reports; this box connects the feedback loop with the two reports.

Lastly, an arrow from the Departmental Performance Report and Program PM reports box points back to box 1, Vision, planning and governance.

Phase 1 consists of developing integrated approaches that ensure the application and coordination of mechanisms, in order to operationalize and streamline the implementation of performance measurement at Statistics Canada.

Phase 2 aligns Statistics Canada's strategic outcomes (e.g., access to statistical data, relevance) with the expected outcomes (e.g., use of data by the Canadian population). Alignment of the strategic outcomes is based on the Program Activity Architecture. This structure establishes relationships between all services or programs that are comparable from one period to another. Once the strategic outcomes have been determined, the performance measurement strategies must be defined.

The objective of Phase 3 is to implement indicator feasibility tests. Working groups prepare and test different versions of an indicator. Feasibility testing can take more than one year before a final version is determined.
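To make the idea of defining and testing an indicator concrete, here is a minimal hypothetical sketch; the indicator definition and the data are invented for illustration and are not drawn from the Statistics Canada system. It computes a "timeliness" indicator as the share of releases published on or before their planned date:

```python
from datetime import date

# Hypothetical release records: (planned date, actual date)
releases = [
    (date(2015, 3, 1), date(2015, 2, 27)),
    (date(2015, 6, 1), date(2015, 6, 10)),
    (date(2015, 9, 1), date(2015, 9, 1)),
]

def timeliness(records):
    """Share of releases published on or before the planned date."""
    on_time = sum(1 for planned, actual in records if actual <= planned)
    return on_time / len(records)

print(f"{timeliness(releases):.0%}")  # prints 67% (2 of 3 releases on time)
```

Feasibility testing of such an indicator would involve checking whether the underlying data (here, planned and actual dates) can be collected reliably and whether the resulting figure is meaningful across programs, which is why several versions may be tried before a final definition is retained.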

Lastly, Phase 4 consists of implementing the system at the data collection, monitoring and report production levels, in particular the departmental Reports on Plans and Priorities (RPP) and Departmental Performance Reports (DPR).

At the governance level, the mandate of the Departmental Evaluation Committee is to advise the deputy head on the departmental evaluation plan, resource allocation, final evaluation reports, and all other final decisions on other departmental evaluation activities. The Committee is chaired by the chief statistician and includes all the assistant chief statisticians and the Chief Evaluation Executive.

Key success factors

The engagement and leadership of senior management as well as the ongoing participation and engagement of program managers ensure that the evaluation function has constant support within the agency.

In this vein, the evaluation function must have the necessary human and financial resources to carry out its responsibilities.

Furthermore, developing this function externally makes it possible to strengthen networking and share best practices with the evaluator community through the Treasury Board Centre of Excellence for Evaluation.


Challenges

The program evaluation function requires a solid understanding of the programs and their inner workings. For this reason, recruiting qualified professional evaluators is a challenge, particularly given the shortage of this specialized profile, as well as the certification requirements and the bilingual profile. All program evaluators are members of the Canadian Evaluation Society (CES), which is responsible for implementing a professional development framework for evaluators.

The other challenge pertains to the program evaluation cycle: calibrating rotating program evaluations over five years, as required by the Policy on Evaluation, is a constraint that must be managed alongside program managers' expectations with regard to outcomes.

Looking ahead

The evaluation function consists of continually ensuring that programs are relevant and perform well.

Program evaluation: Example
Brief overview of the evaluation of the Canadian Health Measures Survey (CHMS)
2007/2008 to 2012/2013

Context and scope

The CHMS is Canada's first nationally representative survey of direct health measures. It was launched in 2007 to address long-standing limitations in Canada's health information system. The principal objective of the CHMS is to collect new and important data on Canadians' health status.


The approach used to evaluate CHMS can be defined as theory-driven and based on a non-experimental design using post-collected information. Findings are collected from multiple lines of evidence: document and literature reviews, administrative and financial data reviews, a series of key informant interviews, and a survey of primary data users. The evaluation efforts ensure an appropriate balance between the level of effort and the context.

Key findings


The CHMS represents a unique program in Canada, providing direct health measures data to support health research, policy and decision-making. Evidence shows that the program is relevant to Canadians and health organizations, with a clear present and future need for the program. However, although most feel that the CHMS responds to current and emerging content needs, no content plan was ever developed for the CHMS.

Performance – Effectiveness

The CHMS outputs have been produced as expected, and the expected immediate outcome is being achieved: reliable and usable data are available on the baseline health status of Canadians and on levels of, and exposure to, environmental contaminants. Some issues exist concerning the accessibility of the data, which could impair the long-term outcomes if not addressed. More specifically, even though most researchers are aware of the data, many are unaware of how to access them. This view is corroborated by external interviewees, who believe that limited or complex access is a barrier to the use of the data.

Performance – Economy and efficiency

The shift from predominantly cost recovery to core funding of CHMS, as a result of the 2008 Action Plan, helped stabilize the survey, which had a positive impact on CHMS. In particular, it allowed for better long-term planning and for seeking opportunities to increase operational efficiency. The evaluation results demonstrated sufficient human and financial resources to support the program.

The CHMS is the only nationally representative direct measures survey in Canada and complements other Canadian and international studies. Therefore, there is no evidence of duplication of efforts that might influence efficiency.

Once the findings are revealed, those responsible for the evaluation make recommendations for each finding that requires a change or an improvement. The program then develops an action plan to follow up on the recommendations.


Endnote 1

The central agencies refer to the Government of Canada's federal agencies responsible for developing policies and common tools.



Government of Canada (2012). Policy on Evaluation. Consulted on March 31, 2016.

Government of Canada (2012). Directive on the Evaluation Function. Consulted on March 31, 2016.
