Chapter 5: A lasting legacy: 1995 to 2008
Table of Contents
- Privacy concerns come to the forefront
- The Statistics Act is amended
- Privacy changes in the Census Program
- Job shortages make for a nimble program
- Privacy impact assessments are introduced
- One mortal sin
- Management initiatives
- Navigating the budgetary tides
- A collaborative solution to strategic direction
- Strategic streamlining initiatives help relieve pressure
- Regional office restructuring
- A renewed focus on quality
- A modernized public service
- An increased focus on analysis
- The dawn of the research data centres
- Outreach and communication
- The Daily prevails through two natural disasters
- Software standardization
- CANSIM rides again
- The education program reaches out
- Unique promotions in the Census Program
- Fostering public relations
- Internal communications
- The online momentum continues
- Y2K
- Evolving modes of data collection
- Human resources
- The evolution of employment equity
- Recruitment and training expand
- A new focus on wellness
- New staffing initiatives are implemented
- Generic competitions level the playing field
- The social statistics program
- The Centre for Education Statistics is launched
- Canada's ongoing Family Violence Initiative
- A new and powerful health survey is born
- Additional landmark health initiatives
- The new Aboriginal Statistics Program
- Two new longitudinal initiatives
- Two new workplace surveys
- The brain drain
- Canada participates in an international survey of youth
- Measuring hate crime
- An international focus on youth delinquency
- The Census Program
- The power of mapping
- Access to historical census information
- The business statistics program
- The Project to Improve Provincial Economic Statistics turns the tides
- A new classification system is introduced
- Science and technology statistics
- Measuring commerce in new ways
- The Survey of Financial Security
- Improving measures of trade
- Mad cow disease
- An environment statistics program takes shape
- Genetically modified organisms
- International technical assistance continues
- A path of continuous improvement
Privacy concerns come to the forefront
As it had when computers first emerged a few decades earlier, the issue of privacy again came to the forefront, this time as a result of the emergence of the Internet and the resulting proliferation of available information. At the same time, the growth of computing power was providing the opportunity to link data to vastly increase their power and utility, and to decrease the cost and respondent burden associated with data collection.
The concept and power of record linkage had been recognized as early as 1946, when it was the subject of an analytical report published by the Dominion Bureau of Statistics, in reference to the requirements of family allowance legislation or insurance for proof of age, birth order or death. The report began: "Each person in the world creates a Book of Life. This Book starts with birth and ends with death. Its pages are made up of the records of the principal events in life. Record linkage is the name given to the process of assembling the pages of this Book into a volume." While the concept of individual privacy had not yet arisen in 1946, it is now recognized that the linking of records intrudes on privacy by its nature. Thus, beginning in the mid-1980s, the agency established policies and directives on record linkage to ensure the practice occurred only when the public good was clear and outweighed the necessary privacy intrusion.
In the statistical context, privacy provides respondents with protection against intrusive enquiries and with some controls over information about themselves. Confidentiality, on the other hand, refers to keeping information from being revealed in an identifiable manner to any unauthorized person. Both the Privacy Act and the Statistics Act confer an obligation on Statistics Canada to protect the confidentiality of information. In 1998, a request for client information under the Access to Information Act, along with the increased use of Statistics Canada's website by commercial clients and other visitors, prompted the development of the Client Information Policy to clearly delineate the agency's practices with respect to client information.
In line with the requirements of the Privacy Act that provide individuals the right to know why and for what purpose personal information is collected, the agency developed the Policy on Informing Survey Respondents in 1998. This ensured that respondents were fully informed of the expected use of the information they provided, the authority under which any given survey was taken, the confidentiality protection afforded to their responses and the existence of any related data sharing agreements.
Other ways in which the agency seeks to avoid unwanted invasions of privacy include making all of its household surveys voluntary, with the exception of the Labour Force Survey and the census, so that Canadians who feel their privacy is being infringed upon can choose not to participate. The agency also carefully reviews the questions it asks to avoid any actual or perceived invasions of privacy, or to handle them with the utmost care.
In 2004, Lockheed Martin Canada—a global security and aerospace company—was successful in an open-bid process for a contract to process the 2006 Census test data. Under North American Free Trade Agreement rules, Canadian, American and Mexican companies were eligible to apply for the contract. Canadians and their members of Parliament began expressing concerns about information potentially being accessible to Lockheed Martin Canada's U.S. parent company. While the company had no access to census returns, either at the data operations centre or through the agency's census response database, the agency nonetheless reduced the scope of the company's involvement to assuage the public's concerns. The contract was revised so that the company was no longer responsible for processing the data but would continue to provide hardware, software, printing and support services. All processing was carried out exclusively by agency employees on agency premises.
The Statistics Act is amended
Historical and genealogical researchers have long used historical census information to tell the story of yesteryear and analyze the transformation of societies over time. Records from historical censuses were routinely declassified and transferred to the public archives for research, including the 1891 and 1901 censuses, which were released in 1983 and 1993, respectively, 92 years after their collection.
However, in 1998, the 1906 records were not released as scheduled. The 1906 Census was a special census restricted to the "Northwest Provinces" of Manitoba, Saskatchewan and Alberta. While wooing Alberta and Saskatchewan to join Confederation, then-Prime Minister Sir Wilfrid Laurier indicated that all census data collected would remain confidential. The 1906 Census captured Western Canada at a key point in its development: Saskatchewan and Alberta had just joined Confederation the previous year, and the population of Western Canada was rising rapidly. In fact, the 1906 Census found a 93% increase in population compared with five years earlier. The regulations under the statistical legislation of the time (the Census and Statistics Act of 1905) required the census enumerators to assure people that their information would never be released, except to those accessing their own information. Interestingly, the instructions for enumerators suggest that such confidentiality provisions were aimed at census enumerators and were designed to reassure citizens that census information would not be shared with tax collectors, for example. Recall that as far back as 1851, there was a general belief that the census had some direct or indirect connection to taxation or even military conscription.
When the Act respecting the Dominion Bureau of Statistics was written in 1918 by R.H. Coats largely as a consolidation of previous statistical legislation, it codified these earlier confidentiality provisions into law. As a result, subsequent censuses were conducted under legislation that did not include any provision for declassification and release. In fact, legal opinion from the Department of Justice concluded that these later censuses were conducted under changes to the law that guaranteed that the information would never be shown to any other person. The potential therefore existed that no further censuses would ever be released again.
In 1996, researchers were beginning to lobby various ministers, including the Minister Responsible for Statistics Canada and the Minister of Canadian Heritage, as well as the Chief Statistician and local members of Parliament, to allow the release of historical records. Some of the ambiguity with respect to their release was a result of the more recent Privacy Act of 1983, which provided for the release of census records after a 92-year waiting period. That act, however, also stipulated that where other acts provided specific protection to personal records, the provisions of other acts were to prevail.
In 1999, to resolve the ambiguity, the Minister Responsible for Statistics Canada (the Honourable John Manley) established the Expert Panel on Access to Historical Census Records, to provide independent and expert advice on the legal, privacy and archival implications of releasing census records. The panel was chaired by Dr. Richard Van Loon, President of Carleton University, and also comprised the Honourable Lorna Marsden, President and Vice-Chancellor of York University; Professor Chad Gaffield of the University of Ottawa; Professor John McCamus of Osgoode Hall Law School; and the Honourable Gérard La Forest, retired Supreme Court of Canada judge. The panel concluded in its June 2000 report that no perpetual guarantee of confidentiality was ever intended to be attached to census records, that the passage of time diminished concerns about individual privacy and that the value of public access to the records took precedence after a sufficient period of time (92 years was agreed to suffice). It was further recommended that all Canadians be informed that the guarantee of confidentiality with respect to future censuses endured for only 92 years. After a review of the expert panel's report, the Department of Justice reassessed its previous advice and considered that release of the census records from 1906, 1911 and 1916 was indeed possible, even without an amendment to the Statistics Act. However, an amendment to the Statistics Act was recommended for greater clarity, as well as to allow the future release of data from 1921 onward. In fact, the 1906 records were released to the National Archives in 2003.
At the time, the Privacy Commissioner of Canada was opposed to the release of census records, as well as their transfer to the National Archives, and Statistics Canada was nervous that the statistical system could be negatively affected by the release of the historical census records, especially since the 2001 Census was imminent. The agency believed there was a risk that participation in the census might be jeopardized. This was further substantiated by public opinion research undertaken in 2000, which suggested that Canadians disapproved of retroactive amendments to the confidentiality provisions of the act and that this would make them less likely to answer or provide accurate responses to the next census.
A number of bills were introduced in the early 2000s that proposed removing the legal ambiguities in the Statistics Act. After two were unsuccessful, Bill S-18 was assented to in 2005. It proposed permitting the declassification and transfer to the archives of census records taken between 1910 and 2005, 92 years after each census, and proposed that starting with the 2006 Census, Canadians be asked for consent to release their records after 92 years. It was felt that the latter stipulation was required to temper the perception that the release of old census records could be seen as removing a guarantee by the government, and to not jeopardize census compliance. The enactment also called for a review of the administration of that requirement to determine how Canadians responded and whether further change was required. The need to request consent would later be removed in 2017 as part of a future modernization of the act.
Privacy changes in the Census Program
In 1991, despite assurances of confidentiality, respondents voiced concerns that their census forms were received and reviewed by a local census representative who could very well be a neighbour. For the 1996 Census, a test was conducted in eastern Ontario whereby completed forms could be mailed to a district office instead of being provided to a local census representative, and any necessary follow-up was carried out by telephone by anonymous interviewers. It was determined that the approach was too operationally risky to be used nationally for 2001. Such a system, however, would eventually be adopted for 2006.
In fact, the 2006 Census embodied a number of important changes that resulted in greater privacy protection. For example, a new Internet response option was introduced, which 2.26 million households took advantage of. As well, while the Canada Revenue Agency had been contracted to manually key all responses in previous censuses, the adoption of automated character recognition technology meant that all returns could be fully processed in-house at the agency's data processing centre. A master list of dwellings was also introduced for all of Canada—and the agency mailed questionnaires to 73% of them. The need for local enumerators to carry out manual edits and follow-up with respondents was mostly eliminated, with follow-up carried out by computer-assisted telephone interviewing from three call centres. As a result, instead of 50,000 field staff, only about 27,000 were required.
Job shortages make for a nimble program
In spite of the agency needing fewer field staff than anticipated for the 2006 Census, hiring and retention of staff was challenging. Only about 17,000 staff could be hired, as the labour market was saturated in urban centres—particularly in many parts of Alberta. Alberta was in the midst of the strongest period of economic growth ever recorded by a province. In fact, over a mere three years (from 2002 to 2005), its gross domestic product in current dollars rose by 43%. It had the highest wages in Canada, resulting in growing shortages of labour and housing. The 2006 Census would find that Alberta's population had grown at a rate of 10.6% since 2001, almost twice the national rate. As a result, the collection period was extended, and staff, including permanent Central Region staff, were deployed to various regions to help alleviate shortages. Follow-up was reduced and targeted to manage the financial fallout.
Privacy impact assessments are introduced
A new government-wide policy in 2002 required all federal institutions to develop and maintain privacy impact assessments (PIAs) to evaluate whether programs or services comply with privacy requirements. The process would assess the privacy, confidentiality and security risks involved with the collection, use or disclosure of information, and allow the development of mitigation measures to avoid or reduce those risks. Statistics Canada formed the Privacy Impact Assessment Review Group to develop an agency-specific policy to meet the requirements of the new policy. The policy was finalized in 2005, along with a generic PIA to address the majority of survey collections undertaken by the agency under the authority of the Statistics Act.
One mortal sin
In January 2006, as part of his annual address to all staff, Dr. Ivan Fellegi addressed the importance of confidentiality at the agency by reminding employees that "there is only one mortal sin in Statistics Canada and that is to not protect respondent information."
Ensuring confidentiality had become a lot more difficult with the increased use of computers and processing power, and with greater demand for access to microdata for more sophisticated research. While the agency aspires to release as much data as it can, it must continue to protect the confidentiality of individuals and their businesses.
A task force of middle managers was set up in the fall of 2005 to recommend measures to ensure employees' continued awareness of their responsibilities with respect to the confidentiality of information. One of the recommendations was to launch an agency-wide confidentiality awareness program, whose first step was mandatory computer-based training to be completed within the first three months of employment. Similar training was available to continuing staff and was linked to the renewal of employee identification cards every three years. The role of Confidentiality Awareness Coordinator was assigned to the Director of the Data Access and Control Services Division to oversee all activities related to confidentiality awareness. An Internal Communications Network page was also developed to provide centralized access to information on confidentiality, including best practices and responsibilities. New procedures for administering the oath of secrecy required directors to personally administer the oath on employees' first day of employment. As well, all departing employees would be reminded of their lifelong obligation to abide by the oath, even after leaving the agency.
Management initiatives
Navigating the budgetary tides
One of the constant challenges of a centralized statistical agency is to marry the many different signals of user demand with the budgetary framework provided by the government. This is especially challenging in times of budgetary reduction—and the agency had seen a 10% reduction in its personnel from 1985 to 1995, on top of the 20% reduction that happened in the late 1970s. The agency's robust planning methods were integral to navigating the times and had continued to be refined since the corporate planning process was first implemented in the early 1980s. By the mid-1990s, this process involved preparing annual program reports, reviewing and updating strategic long-term (five-year) directions and priorities, conducting operational planning, and examining specific proposals and making decisions about them. This was also integrated with an annual submission to the Treasury Board covering resource requirements for the next three years, called the multi-year operational plan.
The planning process became known as the long-term planning (LTP) process in the late 1980s. Following submissions of annual program reports each spring, a formal review of strategic priorities and directions would be undertaken by each field over the summertime, culminating in a strategic planning conference in the fall. At this conference, long-term plans were brought forward, including proposals for new initiatives and any opportunities for efficiencies. The lowest-priority programs were identified for possible reduction or cancellation. Human resources and business planning were involved to allow decisions to be taken based on all resource requirements and risks. Proposals were then reviewed, and the Corporate Planning Committee took final decisions on strategic priorities in the new year. The LTP process eventually gave way to the Integrated Strategic Planning Process in 2011, which extended the planning horizon to 10 years and continued to guide and streamline the agency's strategic planning priorities and resource allocation.
A collaborative solution to strategic direction
In 1995, the Clerk of the Privy Council, Jocelyne Bourgon, launched a series of nine task forces led by deputy ministers to explore a variety of issues that had been identified through program reviews. Dr. Fellegi chaired the Task Force on Strengthening the Policy Capacity of the Federal Government, tasked with reviewing Canada's policy development capacity and recommending improvements. The task force produced a report published in December 1996, which became widely known as the Fellegi Report. One of its themes was "the need for more attention to longer-term and strategic issues, including the major horizontal issues cutting across departments, and better interdepartmental forums for considering such issues." In fact, all of the task forces pointed to a need for horizontal integration. This led to the creation of the Policy Research Secretariat—later renamed the Policy Research Initiative (PRI)—which was launched in 1996 in the Privy Council Office to foster collaboration across the public service and identify key issues related to the government's policy agenda. It began as an interdepartmental committee of assistant deputy ministers from over 30 departments and agencies.
In 1998, the role of the PRI changed from that of a facilitator to that of a leader of horizontal research projects. One of the ways in which it contributed to horizontal research collaboration was through its leadership of a new interdepartmental committee called the Policy Research Data Group (PRDG). The group consisted of about 25 policy departments and central agencies that focused on identifying data gaps and areas for potential collaboration in priority horizontal policy areas. The PRDG managed a special venture capital fund of about $20 million per year that was available for experimental statistical projects with fixed lifespans. Projects would be funded for four years, after which a policy department would take over funding if they proved successful; otherwise, they would be dropped. The priority data projects were thus identified by the group, and then the data were developed by Statistics Canada under what the agency referred to as the Data Gaps II initiative. The PRDG offered an important forum for collaborating and sharing funds on projects of common interest, and facilitated the establishment of priorities.
Some of the funding for the Data Gaps II initiative for fiscal year 1998/1999 covered the continued funding of the Environment Statistics Program (its historical funding had come from the now-expired Green Plan), along with some other developmental work. Each Data Gaps II project had a lead department, and the agency signed memoranda of understanding with each of these departments. The main projects funded by the initiative included the new Longitudinal Survey of Immigrants to Canada, the Survey of Financial Security, investment in environment and education statistics, information on hate-motivated crime and diversity in the Canadian justice system, the Workplace and Employee Survey, the Labour Cost Index, the reconciliation and improvement of the international merchandise trade data, investment in science and technology statistics, and information on the extent and use of Internet-based communication and commerce.
Strategic streamlining initiatives help relieve pressure
In September 2002, six strategic streamlining initiatives were launched at the agency to relieve increasing budgetary pressures by identifying efficiencies in workflows and operations across the organization. One of the reasons for these pressures was that costs related to both field travel and employee benefits were increasing, and the agency's budget had not been increased to compensate. As well, the agency had grown by 20% over the previous five years. The initiatives looked critically at processes that added relatively little value to outputs and activities that could be performed at a less detailed or complex level. They also examined opportunities to use less expensive inputs, such as a greater use of administrative data in lieu of survey data. The intent was to return to the more normal pace that had preceded this rapid expansion.
Some of the key initiatives were conducting a business survey review looking at efficiencies to be gained in workflows and processes, increasing the use of tax data to reduce respondent burden and costs, improving and increasing the use of computer-assisted telephone interviewing to streamline collection of household surveys, reviewing the replacement cycle for computer workstations, and restructuring the regional offices. In the spring of 2003, the agency began consolidating the management and administrative overhead for its regional offices so that regional directors and some of their administrative staff would be located in three areas: Eastern, Central and Western. All nine offices remained open, and front-line operations remained essentially the same—but some activities were realigned to even out the workload and staff between the three regions.
By 2004, the agency was entering a period of greater financial tightness as a result of budget cuts and new collective agreements. Departments were asked to provide proposals for cutting their lowest-priority programs to reduce their operating budgets by a further 5%, which would be phased in over three years. In preparation for the budget decisions, the agency was continuing with its strategic streamlining initiatives, which in this new environment now needed to go beyond garnering efficiencies. The agency also reduced its recruitment to a minimum and was beginning to look at statistical programs that could be candidates for elimination should this become necessary. With an annual turnover rate of about 4%, the agency was committed to achieving its potential reduction in staffing levels without lay-offs. It was planning to achieve its cuts through staff turnover, redeployment and training, especially given that the peak workload of the 2006 Census would soon provide additional possibilities for absorbing staff.
Under a new mandate to undertake cyclical expenditure management reviews of all departments, agencies and programs, the Treasury Board undertook a major review of the agency from June 2003 to November 2004. The agency received top marks for overall management practices, including planning, human resources and finance. The review also underlined the agency's focus on relevance, as well as its limited flexibility for budget reallocation or for new information requests because of its legislative, regulatory and contractual obligations. It encouraged the agency to continue its current efforts at opening lines of communication with other departments and agencies, and recommended that the Treasury Board develop a more coherent and timely funding process for the Census of Population, with earlier Cabinet engagement on its scope and overall budget.
The government's expenditure review eventually exempted the agency from budgetary cutbacks, and hiring was restarted in 2006. The rollercoaster continued, however, with a new budgetary review (the 2007 Strategic Program Review) by a blue-ribbon panel the following year. Programs that had legal requirements and those that involved agreements with the provinces were exempted from the scope of the review. The panel categorized the agency's programs by how it believed they served the public interest. Some of the programs had come about as a result of the Data Gaps II initiative, and some were specific to certain policy departments. Funding for Data Gaps II was conditionally renewed in 2007 for four years, subject to the outcome of the review.
The outcome of the Strategic Program Review was a budget reduction of $21.5 million. Five surveys were discontinued, the Canadian Community Health Survey cycle was reduced to every three years instead of every two, the annual report on the Canadian Environmental Sustainability Indicators was discontinued, and the PRDG investment fund was reduced by 80%, leaving $2.4 million annually for new projects. Most of the projects funded by the PRDG fund at the time were discontinued according to their existing schedules over the following three years. A few of the projects continued to be funded, as they were determined to be fundamental to the national statistical system.
Regional office restructuring
The new initiatives in the early 2000s resulted in increased collection requirements. A new health survey proved particularly challenging, not only because it was such a massive undertaking, but also because it was conducted in parallel with other major surveys such as the Labour Force Survey, the new Longitudinal Survey of Immigrants to Canada, and the National Longitudinal Survey of Children and Youth. This prompted a necessary reprofiling of collection activities and a major restructuring to free up field interviewing personnel. Most of the Labour Force Survey collection work was moved from the field to the regional offices' computer-assisted telephone interviewing centres. As well, annual and quarterly business surveys were moved from the regional offices to the Operations and Integration Division at headquarters, and monthly business surveys were consolidated among the Montréal, Toronto and Edmonton regional offices. A new computer-assisted telephone interviewing centre was opened in Sherbrooke, Quebec, and the Sturgeon Falls regional office was expanded. Plans for additional capacity were prepared but ended up not being required. In fact, the call centres in Montréal and Vancouver were closed in 2007/2008, largely as a result of a decline in survey workload since 2001/2002, but also to reduce collection overhead through consolidation and restructuring.
A renewed focus on quality
In 1997, an audit on data quality management by the Office of the Auditor General (OAG) reinvigorated the agency's focus on quality. In preparation for the audit, the agency summarized its existing quality management practices into the Quality Assurance Framework, which defined data quality along six dimensions: accuracy, relevance, timeliness, accessibility, interpretability and coherence. The framework was used to conduct assessments of four programs at the agency for the audit, after which the OAG recommended it be applied to the entirety of the statistical program. A team of experts from the International Monetary Fund (IMF) was at Statistics Canada in January 2003 to assess Canada's compliance with the Special Data Dissemination Standard. This new standard had been established by the IMF in 1996 in the wake of recent financial turmoil, as a guide for countries providing economic and financial data to the public. It covered several dimensions of quality, as well as prescribed fundamental rules of behaviour for statistical offices. The team found the agency's Quality Assurance Framework to be "commendably complete and (an) effective example of quality assurance practice."
The OAG also recommended the agency-wide adoption of a reporting mechanism. In response, the agency put in place a formal and integrated program reporting mechanism, through which programs reported on relevance, quality, costs, human resources management and strategic direction. It included an extensive quadrennial review and a shorter biennial report on performance, direction and any proposals for change in between. To respond to one of the observations from the OAG that the information the agency provided to the public on the quality of its statistics was inconsistent, a proposal to create a quality secretariat was brought forward. The secretariat would monitor, on behalf of the Methods and Standards Committee, information provided to clients about the quality of statistical products. The Quality Secretariat was established in 2000 to develop and support some of the agency's key quality management policies and practices, develop and manage quality management reviews, and provide advice on quality management to programs.
With the increased use of electronic databases, the agency started to invest resources in the late 1990s in developing metadata for data users, including details about underlying concepts, collection methodology and data limitations. Two fundamental priorities guided the process: the new database needed to be comprehensive in coverage and it needed to be driven by what clients were likely to want rather than what the agency thought they should know. The Integrated Metadatabase (commonly referred to as the IMDB) was implemented in November 2000, initially with documentation describing data sources and methods. It was later expanded to include the definitions of concepts and variables used in the statistical programs.
In 2005 and 2006, three serious errors were discovered after data were released, the most serious of which affected the Consumer Price Index (CPI). Statistics Canada has a long-standing practice of never revising the CPI, which makes the index quite valuable to those who link or index various contracts to its movements over time. However, this also means that there is absolutely no room for error. The income tax brackets are adjusted every year for inflation using the CPI, as are pension payments, labour contracts, rent increases, and large financial contracts such as government and private sector bond and debenture issuances. The potential consequences of an error are thus widespread and enormous.
The problem was traced to a programming error introduced five years earlier, when a new methodology was adopted for the traveller accommodation price index. The error was estimated to have caused the annual average change in the CPI to be understated by 0.1% every year from 2001 to 2005. The mistake was corrected going forward, but the published index was not revised backward. As a result, by the time the error was discovered, the CPI had cumulatively risen by 0.5% less than it should have over that period.
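The compounding behind these figures can be sketched quickly (a minimal illustration only; the 0.1% annual understatement is the agency's published estimate, and the calculation here is an approximation, not the agency's methodology):

```python
# Illustration of how a 0.1% annual understatement of CPI growth
# compounds over the five affected years (2001 to 2005).
annual_understatement = 0.001  # growth understated by 0.1 percentage points

cumulative = 1.0
for _ in range(5):  # five years of compounding
    cumulative *= 1 + annual_understatement

shortfall_pct = (cumulative - 1) * 100
print(f"Cumulative shortfall: {shortfall_pct:.2f}%")  # roughly 0.50%
```

Because each year's understatement applies to an already-understated base, the five-year shortfall is very slightly more than five times 0.1%, which rounds to the 0.5% cited above.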
When the agency discovered the error in 2005, it was not immediately transparent about it. The admission eventually appeared in a two-sentence footnote in a monthly publication in July 2005, which made no explicit mention of an error and was interpreted as an attempt to downplay the situation. The issue caused considerable embarrassment for the agency when it became public a month later, with some calling it the biggest mistake in the agency's history. The error and the agency's handling of it led to widespread coverage in the media and in Parliament. Members of Parliament and government departments received countless letters of protest.
Having prided itself on its data quality assurance practices, especially after having put in place the Quality Assurance Framework and Quality Secretariat, the agency was quite shaken by the incident. The Quality Assurance Task Force, led by a committee of directors general, was struck to conduct a comprehensive review to identify weaknesses and underlying factors, and identify best practices that should be applied across all programs at the agency. The review, which was launched in September 2006, covered nine of the most critical programs: the Labour Force Survey, the Monthly Retail Trade Survey, the Monthly Survey of Manufacturing, the CPI, International Trade, Gross Domestic Product by Industry, the National Income and Expenditure Accounts, Labour Productivity, and the Balance of Payments. A 10th program, the dissemination and communications program, was also added, as the review focused on accuracy in the execution of the programs, particularly in their later stages. Because the review was to be undertaken within four months, 10 separate teams were formed to carry it out.
By February 2007, the review had found areas where further investments would strengthen quality assurance practices and identified best practices that were shared with other programs. The results and measures taken were made public in The Daily on June 4, 2007. One of the recommendations was the development of the Quality Incident Response Plan, with standard procedures to follow in the wake of data quality incidents. As a result, since 2006/2007, the Quality Secretariat has monitored corrections to The Daily in terms of reloads (after release) and preloads (within 24 hours prior to release). The agency also completed a large-scale quality assurance learning exercise, which was mandatory for all staff at the unit head level and above involved in data production.
The agency learned a number of valuable lessons from the CPI error, not least the importance of not letting its guard down when implementing changes to programs or quality assurance practices. Just as importantly, it was reminded of the need to communicate transparently, to openly admit when it is wrong and to explain the situation. The agency's culture was permanently shaped by the event.
A new government-wide focus on strengthening oversight and accountability led to a revised governmental Policy on Internal Audit in 2006. As a result, the agency invested in bringing its internal audit function into line with the new policy requirements. By April 2009, when the policy would come into force, the agency was required to have an audit committee composed of professional and experienced members drawn largely from outside the public service, and to have auditable financial statements. In 2009, the agency established the Departmental Audit Committee, composed of three independent members from outside the public service, as well as a chief audit executive who would report directly to the Chief Statistician. The new committee would provide the Chief Statistician with independent, objective advice and guidance, as well as assurance on the adequacy of the agency's control and accountability frameworks.
A modernized public service
A number of influential reports, including an Auditor General's report from 2000, which criticized the inflexibility of human resources (HR) management in the public service, led to a commitment to change in the 2001 Speech from the Throne and the creation of the Task Force on Modernizing Human Resources Management in the Public Service. A new model was developed, and November 2003 marked the passing of the Public Service Modernization Act. The law, described as the single biggest change to public service HR management in over 35 years, came into force over the following two years. The act created or revised key pieces of legislation, including creating the new Public Service Employment Act (PSEA) to address staffing, employment and political activities; creating the new Public Service Labour Relations Act (PSLRA) to address labour relations, collective bargaining and the resolution of related disputes; amending the Financial Administration Act to address authority and accountability; and amending the Canadian Centre for Management Development Act, which would be renamed the Canada School of Public Service Act, to address learning and development. This wide-ranging reform of HR management in the public service changed the way the government hired, managed and supported its employees. It modernized the staffing system, fostered better labour–management relations and implemented a more corporate approach to learning by creating the new Canada School of Public Service.
The Senior Steering Committee on Staffing at Statistics Canada took the lead on the plan to carry out the necessary changes at the agency, while management, union and HR working groups were formed to develop strategies, policies and procedures. One of the changes for the agency stemming from the new PSLRA was a move away from formal internal grievances toward resolving issues with alternative dispute resolution before they became formal complaints. To this end, the agency and the local unions co-developed a policy and procedures for a new informal conflict management system to help employees confront and resolve difficult situations.
The new PSEA was based more on values than rules. As a result, candidates in competitions would no longer be ranked (and employers would no longer have to appoint people in the order in which they were ranked)—a pool of candidates would be established from which any of the qualified candidates could be selected.
A new performance management program was also introduced as a result of a long-standing feeling among senior managers that the agency could do a better job setting explicit goals for employees and assessing their performance frankly and fairly. As noted in Chapter 2, the process at the agency in the 1970s had been rather labour intensive and therefore costly, with supervisors required to complete long forms every year—the whole process was widely seen as ineffective. This was later replaced by a less bureaucratic exercise that encouraged more frank and direct face-to-face discussions between employees and supervisors. These petered out a bit in the late 1990s and early 2000s, and many felt it was time for a renewal. Thus, in 2005, the new Performance Management Steering Committee was formed to assess performance management practices. The new program increased support for managers in addressing cases of poor performance and improved goal-setting, assessment and feedback for employees. A new course for supervisors called Improving Employee Performance was developed, and greater capacity was established within HR to help supervisors deal with performance management cases. A performance management site was also launched on the internal network.
An increased focus on analysis
The agency's focus on analysis was aligned with the government's focus on quantitatively informed policy development. As if to prove that nothing is ever new and that what goes around comes around, Dr. Fellegi remarked more than 20 years ago in the 1997 special issue of Scan that "From the agency's point of view, the most important shift is a greater recognition of the importance of 'evidence-based decision making,' a catch phrase that's coming to the fore in this area. What it means is that decisions should be based on relevant and accurate information rather than on hunches or outdated theories…" Any statistical agency faces a permanent tension between the need to publish timely analysis on policy-relevant topics and the need to ensure the impartiality and objectivity of that analysis. Sometimes even the choice of topic for analysis can be seen as a subjective decision, and this applies even to research conducted in the agency's research data centres by non-public servants. To use the data held in the data centres, the research must be within the realm of study that could conceivably be conducted by Statistics Canada.
The research data centres were one way of encouraging Canadian social science researchers to use Canadian data. Another way was through the establishment of a new fellowship program, whereby, each year, about eight fellowships would be granted to enable young researchers to work at the agency on doctoral or postdoctoral projects. The success of these programs was evident, in that researchers were increasingly teaming up with Statistics Canada even outside the program. A number of other initiatives were undertaken over the years to promote research and analysis. For example, in 1999, the agency began offering a research stipend for access to longitudinal survey data to PhD students to promote not only the agency at large but also an awareness of the value of these new social surveys to the general research community.
The year 2000 marked the germination of an idea then referred to as the Economic Research Institute, intended to attract researchers to work with business microdata, much as the research data centres did for social files. Issues of confidentiality are very different for business data compared with household data—it is virtually impossible to "disguise" a very large business within a dataset. Households, on the other hand, tend to be more similar, more numerous and more easily disguised within datasets by removing names, addresses or other identifiers. The potential for misuse of business microdata for commercial gain is also significantly greater than for household data. The idea would not come to fruition until 2012, when the Canadian Centre for Data Development and Economic Research was established.
Remote access to data was also proving to be extremely helpful in increasing the volume and breadth of analysis conducted by external researchers. This was an innovative initiative that provided researchers with a dummy file structured in the same way as the given survey data, but with fictitious, non-confidential data. Researchers would use the dummy data to formulate their analysis plans and computer programs. They would then submit the programs to Statistics Canada, which would run them, vet the output for confidentiality and email the results back to the researchers. Through greater analysis, the agency's traditional role of monitoring social, economic and environmental issues was deepening, as the agency sought to understand the various factors at work behind the data and serve as a facilitator of valuable research across the country.
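The dummy-file workflow described above can be pictured with a small sketch (all file names, column names and figures here are invented for illustration; this is not Statistics Canada code):

```python
# Hypothetical sketch of the remote-access workflow: a researcher
# develops an analysis script against a dummy file that mirrors the
# real survey's layout, then submits the script to the agency, which
# runs it against the confidential microdata and vets the output.
import csv
import statistics

def mean_income_by_region(path):
    """Compute mean income per region from a survey-style CSV file."""
    by_region = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            by_region.setdefault(row["region"], []).append(float(row["income"]))
    return {region: statistics.mean(values) for region, values in by_region.items()}

# The researcher debugs against fictitious data:
#     mean_income_by_region("dummy_survey.csv")
# The agency later runs the same script on the real file and emails
# back only the vetted, non-confidential aggregates.
```

The key point is that the script, not the microdata, crosses the boundary: the researcher never sees confidential records, only aggregate output that has been screened for disclosure risk.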
The dawn of the research data centres
It was quickly realized that with the increasing supply of rich datasets from the new longitudinal surveys, the agency did not have the resources to fully exploit the data and was struggling with how to make this new type of information available to Canadians. Publishing summary tables could not do justice to the richness of the data available while protecting confidentiality, and access to the microdata was restricted to Statistics Canada employees or researchers working on its premises. This fundamentally restricted the amount of analysis that could potentially be conducted by the broader research community. Information is valuable only if it can be used, and the very richness of the longitudinal data was, in essence, its limitation.
To explore the issue, Dr. Fellegi and Marc Renaud, President of the Social Sciences and Humanities Research Council (SSHRC), created the Joint Working Group on the Advancement of Research Using Social Statistics, chaired by Dr. Paul Bernard, professor of sociology at the Université de Montréal and member of the National Statistics Council. The working group identified three main barriers to the analytical exploitation of the data, including difficulty in gaining access to detailed microdata collected by Statistics Canada, a lack of effective linkages between researchers and those involved in public policy development, and an insufficient number of researchers trained in quantitative analysis. The working group released a report in 1998 that proposed, among other things, the establishment of research data centres (RDCs) in different parts of the country, where researchers could access microdata while meeting the confidentiality requirements of the Statistics Act. Within the year, discussions were underway to set up an initial set of RDCs at universities operating under the same security provisions as Statistics Canada, with a full-time Statistics Canada employee on site.
Nine centres were created, the first of which opened at McMaster University in Hamilton in the fall of 2000. Some of the universities funded their new infrastructure through the Canada Foundation for Innovation. By 2002, the centres were fully operational, and, two years later, there were more than 500 researchers working on over 300 projects in the RDCs. Seven more centres opened from 2004 to 2006, and "branch" RDCs, essentially operated as extensions of existing RDCs, were piloted at the Université de Sherbrooke and Université Laval.
The early Canadian Research Data Centres Network was funded for three years (from 2001 to 2004) by SSHRC. In 2006, the RDC network was awarded a multi-year operating grant of about $1.4 million per year for five years from SSHRC and the Canadian Institutes of Health Research. The network also received a grant from the Canada Foundation for Innovation to link all the RDCs. The types of files in the centres would be increased (for example, historical and contemporary census files were placed in the centres), and the feasibility of adding federal and provincial administrative data would be assessed.
The network was immensely successful, enabling access to a rich source of powerful social data for Canadian researchers. New ideas for potential expansion emerged, including the ability to make international comparisons with the data. With respect to subject matter, researchers who were getting to know the data pointed to new areas that would be highly valuable to study, which would require changes to the longitudinal surveys that were originally conceived to fill policy needs. As a result, questions began to be asked about the possibility of creating a tripartite governance mechanism involving both policy departments and academic researchers. From 2000 to 2010, the network was governed by an academic council that included a representative from each of the centres. The executive director of the network and the program manager from Statistics Canada held ex officio positions on the council, while funding partners were observers. However, a governing mechanism involving policy departments would not come about until 2017.
Outreach and communication
When Statistics Canada first established its Internet presence in 1995, the main challenge was to move statistical information from the historical print medium to the new electronic environment and to promote the website. A mere decade later, the agency's website had become its principal dissemination and communication channel to the world.
At first, the agency had two online "products."
One was the free public good module on the website, which offered The Daily, Canadian Dimensions tables and other services. Canadian Dimensions tables were an array of about 160 free statistical tables of general interest to the Canadian public. The module was built on the infrastructure that already existed for the Canada Year Book, with the tables arranged around four themes: the economy, the state, the land and the people. The public good module also contained a commercial gateway, whereby users could identify the CANSIM series they were interested in and preview the costs involved. Then, if they proceeded with the transaction, payment was arranged online via the Toronto–Dominion Bank. The Statistics Canada website was revamped in 2001, with expanded Canadian Statistics tables, which were updated automatically from CANSIM. IMDB content was also accessible, along with community profiles and a new search engine.
The second online product the agency offered was a premium dissemination service, called StatsCan Online, for which registration and subscription fees applied. It was primarily an interface for large-volume users who wanted to avoid the frequently long wait times associated with the Internet at the time. This second module, which operated by direct dial-up using a modem, provided access to The Daily, CANSIM, and international trade and horticultural databases. StatsCan Online was much more user-friendly and provided guaranteed access and a free helpline. The service was eventually displaced by the Statistics Canada website, as visits to the site were growing by leaps and bounds each year.
In November 1996, through a project called Partners in Accessibility that had been proposed to the Diversity Management Directorate of the Public Service Commission, a speech synthesis service was offered for The Daily. By dialling a 1-800 number, people who were print disabled, who had visual impairments, who could not turn pages or who had other disabilities could listen to the publication. It was also presented in braille and in large print.
In the late 1990s, as the agency was managing the transition from paper to electronic technology, it was also starting to restrict paper publication to frequently used reference works or flagship publications that appealed to the general public. Others were revamped or consolidated, such as when 13 different publications on income data were replaced by the new Income in Canada publication. In 2006, the electronic versions of agency publications became free to access. This was immediately reflected in increased website traffic, especially for Canadian Social Trends, the Canadian Economic Observer, and Perspectives on Labour and Income, which experienced 10-fold increases in views and downloads. One result of moving away from a subscription-based system was that clients became more anonymous, and the agency could no longer use its client lists to alert them to updates or revisions in datasets. In response, the website began to offer a registration system through which clients could sign up to receive notifications. A new client relationship management system was also introduced in 2007, after three years of development, to centralize information on interactions with clients and to help support the communication program.
The Daily prevails through two natural disasters
Many people in the Central Region will remember the ice storm that ravaged eastern Ontario and western Quebec on Thursday, January 8, 1998. Widespread power outages occurred, as many electrical towers crumpled under the immense weight of the ice. Several areas declared a state of emergency, including the National Capital Region and Montréal, which was one of the most severely affected areas in Canada. Luckily, the power stayed on at Tunney's Pasture, and, although the offices were closed on Friday and Monday, employees came to work to release The Daily every day, including the Labour Force Survey data on Friday, and to make final preparations for the census release on Aboriginal data slated to go out the following Tuesday. In terms of operations, many interviewers were themselves affected by the storm, and many respondents obviously had more pressing concerns than responding to surveys. As a result, collection was either delayed or cancelled in some areas, while some work was reassigned to other offices across the country. True to form, the agency soon released a statistical portrait of the event, detailing its impact on employment, retail sales and agricultural operations.
Five years later, in August 2003, the electrical grid in Ontario and the northeastern United States experienced a failure affecting about 50 million people. Nuclear plants had to be shut down because there was no grid to receive their power; some needed repairs, and it took up to two weeks to bring the reactors back up to speed. The electrical failure was followed by a week of strict energy conservation. The event resulted in the closure of Ontario-based Statistics Canada offices for six working days. However, with the coordinated efforts of staff across the country, the agency was able to maintain building security and safety, protect networks, continue critical data collection activities and publish The Daily every weekday. A supplemental survey was also quickly added to September's Labour Force Survey to allow for analysis of the impact of work hours lost in Ontario and Gatineau, Quebec, as a result of the event.
Software standardization
In the late 1990s, employees were quick to jump on new word processing or spreadsheet programs as soon as they came on the market (a phenomenon dubbed "software creep," though "software diversity" might be more accurate). This caused problems: colleagues without those programs could not open files, and the proliferation was costly and time-consuming to manage. As a result, in 1998, the decision was made to coordinate and manage all software centrally, which was facilitated by the new capability of remotely rolling out software packages and updates to all desktops. "Project 2000" was the name of the project to convert to a standard desktop, including the operating system and software for email, calendar, attachment viewing, virus detection, and word processing and spreadsheet functions. It was deployed in 2002. In addition, with more and more electronic documents being created, software was being developed to send electronic documents to the Document Management Centre, and a new document management system for email was also introduced.
CANSIM rides again
In 1995/1996, a new data model was developed and tested for CANSIM to make the database more user-friendly, with multidimensional tables, harmonized labels, documentation, new data sources and a modernized platform. Then, in October 1996, a contest for employees was held to name the new version of CANSIM. Five months later, the new name was unveiled as… CANSIM II! It was decided to maintain the high recognition value of the name.
In 1997, six survey areas took part in a pilot test to redefine their data structure so it would fit into the new CANSIM II format. The pilot proved that the redesign would be beneficial, but that a large investment in data harmonization and standardization would be necessary. Some new data were made available directly from the new version, and other data were migrated to the new database over time. CANSIM II was up and running on the internal network in April 2000 and was made available on external networks a year later, with about 3 million time series, compared with 1 million in the original version. Just two years later, this had grown to 13 million series, mostly as a result of new labour force and health data. One of CANSIM II's new capabilities was the ability to generate tables and other portions of publications in different formats directly from its database. It was also built to be multidimensional, unlike its predecessor, which could present only one dimension at a time. It was linked to the IMDB and was more easily searchable. The automated generation of tables was called "dynamic publishing," which would later give way to "smart publishing" to allow the creation, assembly and composition of an entire publication from the database. The system greatly reduced the developmental effort required to produce separate paper and electronic outputs. By 2002, a new pricing strategy for CANSIM II would be implemented, allowing for three options: a set fee per accessed vector, prepayments with volume discounts, and a subscription service for unlimited access with an annual fee.
The education program reaches out
While Statistics Canada's educational program had begun with the development of E-STAT in the mid-1980s, the shift to online services and dissemination prompted the establishment of the Education Outreach Program in 1996. The program offered curriculum-based learning resources to help improve the statistical literacy of students through a customized portal of free information, learning tools and online support. A team of agency employees offered advice and training to teachers across the country and built partnerships with faculties of education, textbook publishers and other organizations supporting the education community. In 2000, the learning resources section of the website was redesigned to offer separate entry pages for students, teachers and postsecondary institutions, and E-STAT was made available online free of charge. By 2003, teachers and students were visiting the online learning resources at a rate of about 4,000 visits per day, and over 9,800 schools were registered to use E-STAT. The program developed other outreach activities, including the international Census at School project, the Classroom Outreach Program, student internship programs, expert speaker programs, and an electronic newsletter about the educational products and services available. The program also worked at ensuring that the latest and most reliable Canadian statistics were included in Canada's education materials, including textbooks and lesson plans.
The Census at School project was an international classroom project that began in the United Kingdom in 2000, developed to expand the statistical knowledge of students. Some of the census questions were common to all countries, while others were developed in Canada by an interprovincial teacher advisory board. Students were involved in the collection and analysis of their own data, which became part of national and global databases for teachers and students to use for research and analysis around the world.
The Classroom Outreach Program started as a pilot project in 1999 as a community outreach activity and a way of introducing statistical literacy to students. Employees shared their expertise in math and technology or contributed other skills or knowledge by working for up to two hours per week in local schools. When it started, 22 employees from the Business and Trade Statistics Field were involved in the pilot project, and, by 2001/2002, about 125 employees were participating from all areas of the agency. In 2004, the University Liaison Program, which was aimed at postsecondary students, began. The Education Outreach Program was in place for 15 years, until it came to an end in 2012.
Unique promotions in the Census Program
The 1996 Census communications program used some novel methods of getting the message out, such as advertising on bus seats, an electronic billboard near the CN Tower, milk bags and cartons, margarine containers, sugar packages, and inserts in the seat pockets of regional airlines. The Quebec regional office created two 32-second videos featuring a well-known comedian, which were shown on closed-circuit televisions at CEGEPs and athletic clubs. Other communications innovations included the "Let our circle enlarge" artwork by Cree Nation artist George Hester that was featured on a 1996 Census poster, and a public service announcement in the eastern and western Arctic with Inuk singer-songwriter Susan Aglukark. Vancouver artist Barb Wood created artwork to be used throughout British Columbia and Yukon for the 1986, 1991 and 1996 censuses. Among the talent who agreed to participate in census promotions was Canadian actor Leslie Nielsen, who promoted the census in public service announcements in 1981 and again in 1996, when they ran on 70 cinema screens in Alberta and British Columbia, and were played every half hour by Blockbuster Video in its 225 Canadian stores and by Rogers Video in its 142 stores. Other celebrities who participated in public service announcements in the Prairie region included Jean Béliveau, Nettie Wiebe and Allan Blakeney.
A community outreach program that had begun for the 1991 Census was recommended for continuation for the 1996 Census, in response to concerns from the African Nova Scotian community, who felt that census figures did not accurately reflect their numbers. An employee on special assignment from Human Resources Development Canada continued and expanded the outreach program for 1996. A significant focus was also placed on encouraging the participation of the Acadian population.
The 2001 Census of Agriculture also had a fairly unique promotion—the Western Region had arranged that 100 railway cars be wrapped in the census logo. An American model railway company later made models of the car for sale, and when the Census of Agriculture team met in Washington with its U.S. counterparts a few years later, several of the miniature models were presented to the Canadian team. A census manager quite aptly described the situation as "a case of art imitating life imitating art." Apparently, some of the full-size cars can still be seen on the railways today.
Fostering public relations
A comprehensive marketing and dissemination plan was prepared in conjunction with the long-term planning process in 1999. Marketing and dissemination would be focused on corporate priorities, including migrating from print to an electronic format, using the Internet to expand products and services, and better serving clients through an increased coordination of sales with regional offices.
In 2005 and again in 2007, Statistics Canada contracted with Environics Research Group Limited to assess the extent to which the brand and role of the agency were recognized, perceptions around the value of the agency, and the extent to which people were willing to participate in its surveys. The 2007 survey found a strong public awareness of the agency and its role, with most adults (80%) holding a positive impression of the agency and feeling that it made a contribution to the quality of life in Canada. In addition, more than half preferred using the Internet to respond to surveys, double the percentage from two years earlier.
The agency continued to develop co-operative relationships with key federal departments and provincial departments—in fact, every time a statistical release was issued with some significant and non-routine information, Dr. Fellegi would send a personal letter containing analytical highlights to the federal and provincial deputy ministers of the appropriate departments.
Internal communications
While an internal communications network had been operational since 1994, it functioned through what was referred to as "FolioVIEWS"—a user interface program that employees accessed through an icon on their desktops. When the intranet arrived in 1996, it gave the regions better access to the network, as well as to its new cousin on an externally facing network, which could be accessed off-site. For a time, employees could use whichever interface they preferred, although the FolioVIEWS version was discontinued by 1997. Divisional intranet sites also began to appear, with 17 in place by the fall of 1996.
April 1997 marked the first issue of @StatCan, a new weekly electronic communications product for employees. All the stories in the quarterly print publication Scan were also published in @StatCan, often in longer formats with more photos, with many articles appearing only in @StatCan because of space or time limitations with the print publication. The final issue of Scan appeared in September 2000, fully passing the baton of internal communications to its electronic progeny.
The online momentum continues
The 1999 Speech from the Throne stated that the Government of Canada was to be "known around the world as the government most connected to its citizens, with Canadians able to access all government information and services on-line at the time and place of their choosing." This prompted the creation of the Government On-Line initiative, which endeavoured to make the most commonly used government services available online, anywhere, anytime and in both official languages. The initiative spanned the years 1999 to 2006, and all federal departments and agencies were to target the end of 2004 to make all their information and services available online. Although most of what the agency produced was available online, the initiative provided additional impetus to continue to evolve into the online sphere.
The final report from the Government On-Line initiative in 2006 remarked that the Internet had emerged as Statistics Canada's primary distribution channel and that the agency's website was among the most frequently accessed sites in all of government. As an example of information as an asset to decision making, it cited the Canada e-Book, produced by Statistics Canada. The Canada e-Book was an online version of the Canada Year Book, which used sound, images, tables, graphs, and analytical and descriptive text to provide an overview of the country. The agency had introduced it as a complement to the Canadian Statistics tables, following market research conducted to gauge interest in an electronic version of the Canada Year Book. It was free of charge and updated dynamically as new information became available. Employees were invited to lend their voices to the e-book to make it accessible to the visually impaired, and amateur photographers were also asked to contribute photographs. The first version in 2003 had the same four sections as the year book and the Canadian Dimensions tables: the land, the people, the economy and the state.
It was updated periodically until its demise in 2005. At that time, the print version of the year book was being redesigned for 2006, going back to its roots as an almanac-style publication, and past year books were beginning to be digitized into an online collection. As well, a new text component to the Canadian Statistics tables was being added, called the Canadian Statistics Overview (CSO)—short articles analyzing the tabular data, essentially a "companion guide" to the facts. Unlike the e-book, the CSO would have no "year" but would gradually evolve with regular, minor updates instead of the massive undertaking to update the e-book content every few years.
Y2K
As the year 2000 drew near, there were widespread concerns that the switch to the new millennium could affect both hardware and software, causing them to operate unreliably. At the heart of the issue, also known as "Y2K," was that to conserve then-precious computer memory, years were stored in computer programs with only two digits, making the year 2000 indistinguishable from the year 1900. Special committees were set up by governments, including a year 2000 task force created by then Minister of Industry John Manley in September 1997. The task force commissioned the agency to conduct a survey to determine the willingness and capacity of businesses to ward off any potential crisis, and to produce the results by November 1997. It did so, and, by the end of 1999, it had conducted three such surveys, called the National Survey on Preparedness for the Year 2000. The intention was that federal departments would use these data to assess how Canadian businesses were dealing with the issue and would identify industrial sectors that could require particular attention. The second survey showed that the problem had been effectively solved; however, the task force recommended the third survey to examine the testing of system fixes and contingency plans in the private sector and by public utilities.
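The two-digit-year problem described above, and the "windowing" remediation commonly applied at the time, can be sketched in a few lines (a hypothetical illustration, not code from the agency or the task force):

```python
# The heart of the Y2K bug: with only two digits stored, the year 2000
# ("00") subtracts as if it were 1900, so date arithmetic goes wrong.

def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    """Elapsed years as a pre-Y2K program might compute them."""
    return end_yy - start_yy

def years_elapsed_windowed(start_yy: int, end_yy: int, pivot: int = 50) -> int:
    """A common remediation: a 'sliding window' maps each two-digit year
    to a full century (here, 00-49 -> 2000s and 50-99 -> 1900s)."""
    def expand(yy: int) -> int:
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(end_yy) - expand(start_yy)

# A record created in 1997 ("97") and read in 2000 ("00"):
print(years_elapsed_two_digit(97, 0))   # -97: the naive arithmetic fails
print(years_elapsed_windowed(97, 0))    # 3: the windowed fix recovers it
```

Windowing was only a stopgap—the pivot merely moves the ambiguity a few decades out—which is why so much of the remediation effort went into testing system fixes rather than trusting any single patch.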
The Y2K issue was top of mind for Statistics Canada for a number of years, second only to the maintenance of its key monthly and quarterly releases. The agency was not just worried about its own systems, it was concerned about its respondents' systems and their ability to respond to surveys. The widespread preparations were ultimately successful, as the rollover at the agency was completely smooth.
Evolving modes of data collection
By 1997, data collection had progressed from paper questionnaires to computer-assisted telephone interviewing, then to computer-assisted personal interviewing, and on to imaging and intelligent character recognition, and was beginning to move into electronic data reporting (EDR).
The first large-scale use of questionnaire imaging at the agency was for the 1996 Census of Agriculture. This permitted the capture of data from imaged questionnaires, allowing processing staff to reference them easily. It also allowed analysts to view the questionnaires immediately without having to submit requests for specific forms, wait for them to come back and then resubmit them for filing. Intelligent character recognition was also underway in 1997 for a number of surveys, including the Business Conditions Survey; for the tax forms used in the Survey of Employment, Payrolls and Hours; for the Address Register; and for the Salary and Wages Survey. The new technology saved both money and time and was being assessed for data capture for the next census. It was used successfully for the 2001 Census of Agriculture, which also served as a feasibility test for the 2006 Census of Population.
By the end of 2004/2005, the agency had moved all but one survey to the new Blaise collection application developed by Statistics Netherlands for computer-assisted interviewing, in the interest of achieving one standard approach for collecting both business and social data. The agency began to standardize and reuse question modules to decrease costs and increase timeliness, and to take advantage of the software's call scheduling functions. Where computer-assisted interviewing consolidated interviewing, data capture and some editing into a single process, EDR was an evolving technology that shifted some of these activities into the user sphere.
The first EDR initiative was in the early 1990s, through what was called the Personalized Electronic Reporting Questionnaire System, a diskette-based application that was distributed to business respondents and installed on their workstations. Respondents were led through a questionnaire that had a number of built-in edits, and then the business would send the diskette back to the agency via courier. Among the first were diskette-based versions of the Annual Chain and Department Store Survey and the Steel Survey. After the dawn of the Internet, the desktop application could be downloaded from the Statistics Canada website instead of being distributed on diskette by mail, and responses were also collected via other modes, including electronic data interchange, email attachments or file transfer protocol. The advantages and savings were many, with respect to timeliness, data quality and cost, but the Internet also brought the challenge of ensuring the security and confidentiality of the transmitted information. When an EDR application was first used for the computer services and Internet providers surveys, the response rate on the first day of launch was higher than that of the entire previous year.
The Government On-Line initiative had provided funding for accelerating the development of the agency's electronic data collection for 2001/2002, which allowed for pilot electronic data applications for eight enterprise surveys and three agricultural surveys. The following year, the agency was granted multi-year financing until 2005/2006 to expand EDR. The approach was twofold: the Secure Internet Response Site, which supported the 11 pilot surveys, and the Personalized Reporting and Exchange Services Site for key data providers, to streamline the reporting of large businesses to reduce their reporting burden. Businesses were able to access their survey inventory, collection calendar and electronic versions of their questionnaires to coordinate their responses.
In 2006, about 50 surveys were using EDR; however, the agency was experiencing a number of technical difficulties. The technology was complex and not user-friendly, and technical problems abounded, pushing costs beyond the funding envelope. Download times via modem were very long, and deployment issues arose from the many different configurations of respondent computers. The agency was perhaps pushing the envelope a little too far, as most businesses were not ready for the technology. Its adoption by respondents was slow, a phenomenon also being experienced in other countries. As a result, the number of surveys using EDR was cut in half, and the agency went back to the drawing board to review and simplify the electronic data collection strategy and make it more cost-effective. It began working toward server-based solutions (what were referred to as "zero-install" and "zero-footprint" solutions), as opposed to client-based solutions.
In the meantime, the Internet response channel for the 2006 Census had shown promising results—a take-up rate of about 20%. In 2010, a corporate initiative to use web-based electronic questionnaires as the primary collection mode was introduced. It certainly helped that faster broadband access to the Internet began slowly replacing dial-up access in the mid-2000s.
Human resources
Human resource (HR) management at the agency had moved away from a time of rigid compartmentalization and was undergoing significant modernization. The modernized HR management provided employees with a feeling of security while encouraging horizontal movement, fostered well-being through an organized wellness program, and allowed voices to be heard and change to be made through employee opinion surveys. It also levelled the playing field through generic competitions and strengthened the agency through robust recruitment and training mechanisms.
The agency was implementing a federal public service initiative in 1997 called "La Relève," which was aimed at improving HR management. The name was an acronym for Leadership, Action, Renewal, Energy, Learning, Expertise, Values and Excellence. It was initiated to help manage the rapid downsizing of the public service in the 1990s, as well as the growth in the use of computers, which it was felt had increased the pressures and demands on public servants. The initiative was introduced by the Clerk of the Privy Council, Jocelyne Bourgon, in her fourth annual report to the Prime Minister on the public service. She drew attention to the poor track record of the public service in HR and career planning and asked federal departments to assess their HR requirements and develop HR management plans. Her report emphasized that the federal public service must reflect and embrace different backgrounds, cultures, experiences, interests and styles.
She also suggested departments might follow the examples of Statistics Canada and Natural Resources Canada and focus on medium- to long-term HR planning. The agency had started to renew its HR a few years earlier (realizing it was faced with many impending retirements and increasing technological change), with a strategy based on recruitment, training, career-broadening initiatives and a positive work environment. In fact, Statistics Canada had pioneered a microsimulation model for the age structure and the promotion and retirement patterns of staff to identify bottlenecks, and set recruitment and promotion targets. This was called PERSIM (the Personnel Simulation Model), and it was shared with the Treasury Board and the Public Service Commission to help them develop an overall recruitment strategy.
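PERSIM's actual rates and structure are not described in the source, but the general idea of a personnel microsimulation can be sketched under assumed retirement and promotion probabilities (all figures below are illustrative, not PERSIM's):

```python
# A toy personnel microsimulation: each employee is aged year by year,
# retiring or being promoted with fixed probabilities, and recruitment
# backfills departures so that level counts can be projected forward.
import random

random.seed(1)

def simulate(staff, years, retire_age=58, promote_p=0.08, recruit_age=27):
    """staff: list of (age, level) tuples; returns level counts after `years`."""
    for _ in range(years):
        next_staff = []
        for age, level in staff:
            age += 1
            if age >= retire_age:            # retirement: replace with a recruit
                next_staff.append((recruit_age, 1))
                continue
            if random.random() < promote_p:  # promotion to the next level
                level += 1
            next_staff.append((age, level))
        staff = next_staff
    counts = {}
    for _, level in staff:
        counts[level] = counts.get(level, 0) + 1
    return counts

# Project a 200-person workforce, all starting at level 1, ten years out.
staff = [(random.randint(25, 55), 1) for _ in range(200)]
print(simulate(staff, 10))
```

Running many such projections under different recruitment and promotion assumptions is what lets a model of this kind expose bottlenecks—levels where too few staff will be available to fill expected vacancies—and suggest targets to close the gap.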
Departmental action plans included three pillars: recruitment, employment equity and retention. One of the vulnerabilities that was identified in the agency's La Relève plan was the need to increase representation of employment equity group members. Although significant progress had been made first with respect to Francophones, and then women, Aboriginal people and persons with disabilities, there had been relatively little progress with respect to visible minorities at Statistics Canada or in the federal government.
The evolution of employment equity
In 1967, the first Public Service Employment Act was enacted, raising questions about employment equity and representativeness in the federal public service. The United Nations had just established March 21 as the International Day for the Elimination of Racial Discrimination in 1966, calling upon the international community to redouble its efforts to eliminate all forms of racial discrimination in the wake of the Sharpeville massacre in 1960, which shocked the world and drew worldwide condemnation of South Africa's apartheid policies.
In 1971, Canada adopted an official policy of multiculturalism to ensure that all citizens could keep their identity, take pride in their ancestry and have a sense of belonging. The Canadian Human Rights Act, which was enacted by Parliament in 1977, protected Canadians from discrimination based on any of 10 grounds: race, national or ethnic origin, colour, religion, age, sex, marital status, family status, disability and conviction for an offence for which a pardon has been granted. Sexual orientation was later added to the list of prohibited grounds of discrimination in 1996. In 1982, multiculturalism was recognized in the Canadian Charter of Rights and Freedoms, and, in 1988, Canada enacted the Canadian Multiculturalism Act.
Two important reports published in 1984 shaped the future of employment equity in Canada. The first was Equality Now!, a report produced by the Special Committee on Participation of Visible Minorities in Canadian Society. The committee presented 80 recommendations in the areas of social integration, employment, public policy, legal and justice issues, media, and education. The second key report was Equality in Employment, which was produced by the Royal Commission on Equality in Employment, chaired by Justice Rosalie Abella. The commission explored ways of promoting equality in employment among women, Aboriginal people, persons with disabilities and visible minorities. Their report introduced the term "employment equity" and made a number of recommendations, including the need for employment equity laws. In 1985, visible minorities were added to the groups covered by the federal government's affirmative action program, a voluntary program introduced in 1983. The same year, the right to equality was added to the Canadian Charter of Rights and Freedoms, stipulating that all individuals in Canada, regardless of race, religion, nationality, ethnic origin, colour, sex, age, or mental or physical disability, are equal in the eyes of the law.
These two landmark reports prepared the ground for the Employment Equity Act in 1986, which aimed to ensure that no one would be denied employment opportunities and benefits for reasons unrelated to ability. It also aimed to identify and eliminate systemic barriers faced by designated groups. However, the act did not apply to the federal public service, the Royal Canadian Mounted Police or the military until a later revision in 1995. The Employment Equity Act required employers to implement positive practices to achieve a representative public service through the recruitment, retention and promotion of persons in the designated employment equity groups (members of visible minorities, women, persons with disabilities and Aboriginal peoples).
It also led to the need for data on the designated groups, which prompted the creation of the Employment Equity Data Program at Statistics Canada in 1986 to coordinate data development. Data meeting the definitions for designated groups under the Employment Equity Regulations were fairly easy to produce on women, Aboriginal peoples and persons with disabilities, using the census and the postcensal Health and Activity Limitation Survey. Data were a little more challenging to produce for visible minorities. The 1986 and 1991 censuses did not include questions that would enable direct identification of members of visible minorities, so an indirect multistep approach was used to derive estimates primarily from ethnic origin, in conjunction with place of birth and mother tongue. The following census in 1996 included a question about "population group" to enable the measurement of the visible minority population more directly. The new question became the most publicized issue in the 1996 Census, especially in western provinces where a small number of journalists and politicians encouraged Canadians to identify as "Martians."
Statistics Canada's Visible Minority Consultative Group was created in the mid-1990s to develop and implement regular action plans addressing priorities to further the agency's goal of an inclusive organization that equally supports all employees, focusing on issues related to visible minority groups. Today, it is a consultative body to senior management on issues affecting the employment, retention, career development and progression of employees from the visible minority community. It reports to the Employment Equity and Diversity Committee, which provides policy advice and implements programs to ensure equitable representation and treatment of employees in designated groups.
With respect to Aboriginal people employed in the public service, their resignation rate was more than double that of non-Aboriginal employees, which prompted the Treasury Board to initiate a study in 1989, carried out by the Public Service Commission in 1990. The interview data suggested that some underlying causes included a sense of isolation from their communities and fellow employees, ongoing discrimination, and a feeling of being concentrated in what was referred to as "Native content positions." These findings offered insights and a basis for further consultation and change.
The government-wide Task Force on the Participation of Visible Minorities in the Federal Public Service was launched by the President of the Treasury Board in 1999, as the government had not yet met the employment equity objectives required by the Employment Equity Act. Two of the core recommendations of the task force were to establish a recruitment benchmark targeting the same proportion of visible minorities in the public service as in Canadian society, and to change the corporate culture to make the public service a more welcoming and trusted environment for visible minority employees. The government accepted the recommendations of the task force, and a government policy called "Embracing Change" was launched in 2000 to implement them. An information session was held at Statistics Canada in January 2003 to announce the changes that would take place and to build support for the initiative. Statistics Canada's action plan included developing a communications strategy to promote an open environment, conducting sensitivity training, reviewing generic competition results, offering career counselling for members of the employment equity groups, and developing a comprehensive policy on accommodating the needs of employees. It also established champions representing each equity group to help the agency set priorities.
To monitor the progress of legislative and policy initiatives, in 2004, the Standing Senate Committee on Human Rights began to monitor issues of discrimination in the hiring and promotion practices of the federal public service and to study the extent to which targets to achieve employment equity were being met. The item became an ongoing order of reference for the committee, and its first report was released in 2007, entitled Employment Equity in the Federal Public Service: Not There Yet. It found that although women, Aboriginal peoples and persons with disabilities were represented at rates above their workforce availability, visible minorities remained underrepresented, and none of these groups were well represented at executive levels or across all occupational groups. It called for strengthened leadership, concrete measures and the removal of systemic staffing barriers. The second report in 2010 and third report in 2013 found that while progress had been made in achieving employment equity goals over the years, there was still work to be done to ensure a federal public service that was truly representative of all Canadians at all levels.
Recruitment and training expand
Recruitment, training and employee development were taking centre stage as a result of a massive new project in economic statistics, a major expansion in health statistics and new postcensal surveys, as well as new work stemming from the Data Gaps and Data Gaps II initiatives. To complement the increased emphasis on recruitment, the agency produced a video in 2002 entitled "We are Statistics Canada" on its core mission and values. The video was first produced on CD and, in 2003, became the first video to be featured on Statistics Canada's website. It also earned a merit award in audiovisual presentation from the International Association of Business Communicators.
To help train employees to transform data into information, a pilot run of the new six-week Data Interpretation Workshop was held in early 1996 with 12 employees, to replace the previous three-week Principles of Data Analysis course. Senior analysts acted as advisors to the participants, who prepared a manuscript that could be submitted for publication. As a complement to the course, the Analysis Coaching Program was launched in 2003 to coach employees through the preparation of a short analytical article without having to leave their work to go on a course.
Similarly, the pilot run of the 14-week Business and Economics Statistics Training (BEST) program took place in September 1996. The BEST program grew out of the Project to Improve Provincial Economic Statistics (PIPES), as the massive new initiative needed employees with the skills to design, implement and run major changes to the business and economic statistics programs. The tight deadlines and sheer magnitude of PIPES also meant that the project was drawing heavily on staff from across the agency, and therefore leaving unsustainable labour gaps in other programs. As an example of the degree of turnover, the Industrial Organization and Finance Division held a party in 1997 to bid farewell to 30 employees and welcome 15 new ones—significant numbers, considering the division had about 100 employees at the time. The BEST program was designed to give new employees a solid base of knowledge and skills, and some exposure to the agency's subjects and disciplines in business and economics, such as PIPES, the System of National Accounts, business survey methods, the Business Register, classification systems, business financial statements and project management. It was designed to quickly bring employees up to speed to be able to deploy them into positions at an accelerated pace. Over 80 employees gave presentations to the first class.
Recruitment and development programs that combined job assignments and training were taking off in the late 1990s. Computer Systems (CS) staff were in high demand with PIPES, Y2K and other new programs under development. They were also increasingly in demand from the private sector, so the CS Recruitment and Development Program went into overdrive in 1998 to hire about 125 new university graduates into the two-year development program. The agency also introduced the new Social Sciences Support (SI) Recruitment and Development Program to help address the growing shortage of junior-level technical skills and provide career opportunities for support staff.
The Recruitment and Development Division was created in the early 2000s to consolidate the previously decentralized recruitment efforts and coordinate initial training and development. The new division also helped to make the agency's recruitment more competitive, speeding up its process so the agency could make earlier offers to the best and the brightest among the potential new hires. It placed greater emphasis on recruitment at all levels and helped those divisions that were experiencing shortages of staff or that were soon to be affected by high rates of retirement.
In 2002, projections of the volume of upcoming retirements by 2010 highlighted that this was a vulnerability, especially among Executive (EX) positions, and the agency was working to mitigate the risk. In the 2002 @StatCan special issue, Dr. Fellegi referred to the rejection of the notion of picking one successor per future vacancy—the "crown-princing" approach. "Rather, we decided that we must develop a pool of talented and well-trained individuals at each level from whom we can select the best as and when vacancies at the next higher level occur. This is a much fairer, more robust and, I believe, more effective strategy. However, it is clearly much more labour-intensive." Much of the increased focus on training, including the new Senior Management Development Program in 2000 and the EX Selection and Development Program in 2001, was aimed at mitigating the shortfalls. Some of the other initiatives to assist with succession planning were the Alumni Program, generic competitions, the new Recruitment and Development Division, and a mentoring program that was being reviewed in 2002.
The Alumni Program, created in 2000, offered retiring employees with significant corporate knowledge the opportunity to continue to work part time and share their skills. They could return to pass on historical knowledge—for example, by optimizing or developing projects, helping with selection boards, or increasing the agency's flexibility to handle periods of high workload. The initiative was highly successful and continues to exist.
A new focus on wellness
In recognition of the importance of a positive working environment, a working group of middle managers was tasked with the issue of supporting the well-being of staff in 2000, and carried out extensive research on innovative practices in the private sector. As a result of this work, the Workforce Wellness Committee was established in 2001 as part of the management committee structure. Its aim was to recommend concrete measures to promote workplace wellness, to research issues affecting employees and to facilitate positive initiatives. One of its first initiatives was to launch a wellness website for employees. Two years later, Statistics Canada received a National Quality Institute (NQI) award for demonstrating that employee health and well-being were an integral and strategic part of its basic activities. As part of its evaluation process, the institute sent a team of five people to the agency for three days and held focus group sessions with over 200 agency employees. While the NQI's Healthy Workplace Award was well established in the private sector, Statistics Canada was the first public sector organization to receive the award.
New staffing initiatives are implemented
In the late 1990s, the public service implemented the Universal Classification Standard initiative, through which it aimed to simplify the job classification system and increase fairness in the evaluation of public service jobs. As a result, the agency consolidated its job descriptions and ended up with about 250 work descriptions, compared with over 2,300 before the initiative started. It also established the Career Streams Committee to identify the skills and the depth of training and knowledge needed at each level within the different career streams at the agency. Information on the training and career-broadening experience needed for progression within each major occupational group and level was provided to employees to use as career planning tools.
In 2006, the federal public service was in the process of amalgamating the Economics, Sociology and Statistics (ES) and Social Science Support (SI) groups into the new Economics and Social Science Services (EC) occupational group. It was a challenge to ensure that the generic job descriptions reflected the duties performed, and, as part of the conversion process, each employee had an opportunity to comment on the description to which their job would be mapped. They could also seek recourse if they were unhappy with the results for their position.
Another key change occurred when the federal public service implemented the new Term Employment Policy in 2003/2004. This change would enable term employees to more quickly become indeterminate employees by decreasing the cumulative working period required from five to three years. In 2004, about 10% of Statistics Canada employees had term status, and 153 employees subsequently became indeterminate as a result of the policy.
As well, starting in April 2004, all bilingual positions in the federal public service had to be staffed by individuals who were bilingual when they were hired, as a result of the Policy on Official Languages for Human Resources Management. This created a great demand for language training at the agency and across the public service. Increased funding was made available through the government's five-year Action Plan for Official Languages, also referred to as the "Dion plan," as it was part of the mandate of Minister Stéphane Dion. The plan was to strengthen linguistic duality in the country, strengthen the vitality of minority official language communities and better reflect both official languages in the federal public service. In fact, starting in 2001 on a de facto basis, and formally in 2003, a Minister Responsible for Official Languages was appointed for the first time. To evaluate the new action plan and prepare for its possible renewal in 2008, the government asked Statistics Canada to design and implement a postcensal survey: the Survey on the Vitality of Official-Language Minorities. Data were released in December 2007 and were used to develop policies and programs for official language minorities. Ten federal agencies and departments helped to finance the survey, which sought information from the Francophone minority outside Quebec and the Anglophone minority in Quebec.
Generic competitions level the playing field
Generic competitions were introduced on a large scale in the mid-1990s in response to the first Employee Opinion Survey, which found that employees believed that the job competition process was not always fair. Local competitions naturally favoured candidates already working in the area holding the competition, and they made promotional opportunities haphazard, depending on the mobility of the more senior personnel in each area. The generic competition process continued to be expanded and refined over the years to increase fairness and transparency.
The social statistics program
Dr. Fellegi referred to social statistics as "the Cinderella of statistical systems." There had been numerous attempts over the years to create a comprehensive social statistics framework, but it was not until the government's new focus in the mid-1980s on which programs and policies were working, and why, that major investments began to be made in the social statistics program. The social statistics program underwent significant expansion and redefinition, with longitudinal surveys playing a key role in trying to understand the transitions toward successful outcomes. Social statistics were finally moving beyond simply monitoring processes and expenditures toward social outcomes.
The Centre for Education Statistics is launched
The education statistics program at the agency saw another boost in 1996 when the Centre for Education Statistics was established as a joint undertaking between Statistics Canada, the provinces and territories, and the Council of Ministers of Education. The advantage of a collaborative approach had been demonstrated by the successful Canadian Centre for Justice Statistics, which had opened 15 years earlier; such an approach afforded a greater sense of ownership for those who provided and used the centre's data. Within a few years, the agency and the Council of Ministers of Education published a report on education indicators as part of the Pan-Canadian Education Indicators Program. This report contained the most extensive range of comparative indicators ever accumulated on the Canadian education system to aid in decision making, policy formulation and program development. It also marked the first major project born of the collaboration with the provincial ministers of education under the aegis of the Canadian Education Statistics Council.
Canada's ongoing Family Violence Initiative
Under Canada's ongoing Family Violence Initiative, Statistics Canada also collected and analyzed data from a variety of sources to produce an annual report called Family Violence in Canada: A Statistical Profile. First released in 1997, it provided the most current data on the nature and extent of family violence in Canada, as well as trends over time. The first issue addressed topics including spousal assault; child abuse; criminal abuse of older adults; and criminal harassment or stalking, as a result of the 1993 amendments to the Criminal Code, which had added the new anti-stalking offence of criminal harassment. In 1999, the first General Social Survey on Victimization was released, providing information on spousal violence against women and men. The General Social Survey had doubled its sample size through funding from the Policy Research Initiative. The second cycle on victimization was released in 2004, and, two years later, Statistics Canada and the federal, provincial and territorial ministers responsible for the status of women released a report entitled Measuring Violence Against Women: Statistical Trends 2006.
A new and powerful health survey is born
Health policy in the country was undergoing a major shift, with an increased emphasis on health promotion, as well as regionalization and integration of various care and support programs. The 1999 federal budget approved a major expansion in health funding for four years, called the Health Information Roadmap Initiative. This was a collaborative effort that had come about in early 1998 as a result of consultations carried out by the federal Minister of Health's Advisory Council on Health Infostructure, the Canadian Institute for Health Information (CIHI) and Statistics Canada. They consulted with health administrators, researchers, caregivers, government officials, health advocacy groups and consumers to identify the country's health information needs. One of the priorities that came out of these consultations was the need for an integrated health information system that incorporated regional and community-level information and allowed for meaningful comparison across jurisdictions. The vision and action plan resulting from the consultations were endorsed by the Conference of Federal-Provincial-Territorial Deputy Ministers of Health. Soon thereafter, Budget 1999 earmarked funding over three years to implement the work plan.
In response to the consultations, Health Canada, CIHI and Statistics Canada launched a collaborative process to identify indicators that could be used to report on the health of Canadians, as well as the Canadian health system. The intention was to share this information, while respecting privacy and confidentiality, and support regional health authorities in monitoring the progress of their health-related initiatives through high-quality comparable information. A strategic framework was developed to guide the work and monitor achievements, and experts from the regional health authorities and from provincial, territorial and federal health ministries, as well as academics, were consulted to develop a set of indicators for the framework. This work culminated with the 1999 National Consensus Conference on Population Health Indicators, where a first set of comparable health indicators in the areas of health status, outcomes of health services and quality of health services was selected. The suite of indicators would expand with the development of new data sources, new benchmarks and knowledge growth. To extend the reach of the project to a wider audience and improve access to the indicator data, Statistics Canada and CIHI created the Health Indicators Internet publication, accessible from both websites, containing the full suite of regional-level indicator data produced through the project.
In 2002, via a separate initiative, all provinces and territories and the federal government published a set of comparative health indicators for their jurisdictions. Statistics Canada provided most of the data for these reports, including by launching a special survey to fill two key data gaps on waiting times for key diagnostic and treatment services and on access to first contact services. A new 2003 health accord resulted in a need for additional indicators, which Statistics Canada was again involved in developing, and for which it supplied nearly three-quarters of the required data.
The two key questions the government was seeking regular information on were "How healthy are Canadians?" and "How healthy is the health care system?" Statistics Canada and CIHI co-published two reports answering these questions in 1999 and 2000, with the agency taking the lead on the first question in a special issue of its Health Reports publication, and CIHI taking the lead on the second question in a report entitled Health Care in Canada, 2000: A First Annual Report.
One of the central elements for the development of regional-level data was a health survey that could provide estimates for individual health regions, where a larger number of decisions about the health system were being made. The new Canadian Community Health Survey (CCHS) began in 2000 with about 130,000 respondents, providing a range of information on health status and risk factors for 136 health regions across the country. It became the agency's largest household survey after the census. All provinces, territories and regions were offered the opportunity to choose optional content modules based on their data needs, which resulted in 27 different versions of the survey being conducted. Canadians were also asked for, and largely provided, consent to link their provincial health records with the survey data. The rich dataset created would enable researchers to link lifestyle practices (such as smoking, exercise, regularity of physician visits, stress and workload) to health outcomes (including use of the health care system, hospital stays and physician visits), as well as to study the long-term benefits of major health interventions.
Until 2007, the CCHS ran on a two-year cycle, surveying 130,000 respondents one year on core questions, and the next year surveying 30,000 respondents with a more in-depth questionnaire on a special topic that involved a separate stream of consultations. For example, in preparation for the 2002 cycle, which addressed mental health and well-being, extensive consultations were carried out with privacy commissioners, health associations and mental health experts to develop an appropriate approach. This survey provided national estimates of the prevalence of major mental disorders and problems, helping to shed light on issues such as access to and utilization of mental health services, the prevalence of episodic and chronic mental health problems, and the availability of social support. Beginning in 2007, the core survey switched to continuous collection, surveying 130,000 respondents over the two-year period, to level out the interviewing workload and yield annual releases to support more timely health surveillance. The special in-depth surveys continued to operate every two years. The same year, a rapid response functionality was developed, whereby external clients could have data collected, processed and released within a four- to six-month window.
Additional landmark health initiatives
The agency conducted a joint survey of health with the U.S. National Center for Health Statistics in 2003, to improve North American comparability of data on health—especially given the difference between the largely private U.S. health care system and Canada's publicly funded system. This was the first time Statistics Canada conducted data collection in a foreign country.
The agency also received new funding in 2003 for another major health survey, but one that would take a wide range of physical measures. Under the direction of an expert advisory committee, and with guidance from the U.S. National Center for Health Statistics, which had experience with a similar type of survey, work began to launch this unique survey. The new Canadian Health Measures Survey would provide data on indicators of chronic diseases, fitness, environmental exposures, nutritional status, infectious disease and risk factors, as well as protective characteristics. A pretest was conducted with the help of the Calgary Health Region in 2004 to determine whether Canadians would agree to participate in such a survey and to help delineate the costs and logistics required. After a successful pretest, the project team conducted a dress rehearsal in early 2006 to prepare for full-scale data collection. The survey included a household interview as well as a clinic visit to collect the physical measures, with the first cycle involving 5,500 Canadians in 15 communities across the country over a two-year period. The mobile clinics where the physical measures were collected were customized trailers, including two that were on loan from the National Health and Nutrition Examination Survey, a similar direct-measures survey already conducted in the United States. A new division was also created at the agency—the Physical Health Measures Division—to support the survey. The first data release in November 2008 presented preliminary data on blood levels of lead, mercury and cadmium from the first eight collection sites. Budget 2008 secured the future of the program.
Two key postcensal surveys included the Health and Activity Limitation Survey and the Ethnic Diversity Survey.
The Health and Activity Limitation Survey was a postcensal survey previously carried out in 1986 and 1991 that was renamed the Participation and Activity Limitation Survey (PALS) for 2001 and 2006. The new name reflected the fact that the survey would focus on the participation of persons with activity limitations. The 2001 sample size was about 40,000 and provided information on the characteristics of adults and children with disabilities; their need for support; and their participation in education, employment and everyday activities. The survey was funded by Human Resources and Skills Development Canada (Human Resources Development Canada in 2001). The 2006 cycle was the last time PALS was conducted, as the New Disability Data Strategy was launched by Human Resources and Skills Development Canada in 2010.
The agency was also collaborating with Canadian Heritage on the postcensal Ethnic Diversity Survey to provide information on ethnic diversity in Canada and its impact on socioeconomic outcomes, and to help improve information on how people interpret and report their ethnicity. Starting in April 2002, 30 representatives from the agency interviewed about 42,000 individuals in the two official languages, as well as in Mandarin, Cantonese, Italian, Punjabi, Portuguese, Vietnamese and Spanish, using computer-assisted telephone interviewing. The information collected was used not only to support policy and program development at Canadian Heritage but also to inform content development for the 2006 Census.
The new Aboriginal Statistics Program
The Aboriginal Peoples Survey (APS) was first conducted in 1991 to develop both core national data and data for specific Aboriginal groups. Responding to a recommendation from the Royal Commission on Aboriginal Peoples, and as part of the federal Gathering Strength initiative, the agency was asked to conduct the survey on a regular basis and develop a program to build statistical capacity in Aboriginal organizations. As a result, the survey became a postcensal survey. The 1991 and 2001 surveys were designed to produce data for both on- and off-reserve populations, while the 2006 cycle was the first to exclude the on-reserve population in the provinces. Later, the 2012 survey was the first to exclude the on-reserve population for both the provinces and the territories.
To respond to the request to help build statistical capacity, the agency created the Aboriginal Statistical Training Program, for which it conducted a pilot in February 1999. The two-week course was aimed at Aboriginal people whose jobs required them to work with statistics. It was designed to show participants how to define their data needs; how to find data; and how to use data effectively to support their organizations' decision making, planning, programming and evaluation. It also provided an introduction to survey taking to address cases where primary data did not exist. As well, the Aboriginal Internship Program was established, whereby interns would be hired for two years at the agency to learn from a variety of statistical activities.
The agency was asked in 2003/2004 to develop a plan for a comprehensive Aboriginal statistics program that would meet the information needs of Aboriginal groups, governments and others. Two years of funding were provided, totalling $10 million. The program was to deliver statistics similar to those available for the non-Aboriginal population and provide statistical training and skill development to First Nations people, Inuit and Métis to facilitate Aboriginal self-government. Program options were to be prepared for consideration by Cabinet in the spring of 2005. In the meantime, the agency was engaging with Aboriginal groups, conducting training, and developing survey approaches for both the on-reserve population and the off-reserve population. Nine Aboriginal liaison officers were appointed to serve as primary contacts with Aboriginal groups and organizations. Questions were added to the Labour Force Survey, and experimental estimates were produced for the four western provinces. Pilot household surveys were also conducted in five First Nations communities, as well as pilot surveys of public sector statistics for First Nations governments. Work was also underway to develop a postcensal survey of Aboriginal children to collect information about the early development of children younger than 6 living on and off reserves across Canada. Much of this ongoing work was dependent on securing additional funding through the Cabinet proposal for an ongoing program, as the initial funding was set to expire in March 2005. Funding was secured over five years in 2007 for the Aboriginal Peoples Survey.
Two new longitudinal initiatives
Funded through the Data Gaps II initiative, a new longitudinal survey was implemented to better understand how immigrants adjusted to life in Canada. Results also allowed for an analysis of the association between socioeconomic background and success in Canada, and showed which services were most effective in helping immigrants settle into Canadian society. Interviews were offered in 15 languages and began in the fall of 2000. An initial sample size of 20,000 recent immigrants was planned, with respondents interviewed three times during their first four years in the country.
Three years of consultations with ministries and departments of health and the Canadian Institute for Health Information came to fruition in 2008 when respondents began giving permission to link their survey results to their provincial health care records. The linkage had enormous analytical potential by connecting survey data on socioeconomic background, risk factors and self-assessed health to provincial records to analyze relationships between risk factors, socioeconomic characteristics, health care utilization, interventions and outcomes. This became known as the Longitudinal Health and Administrative Data Initiative. At the first federal–provincial steering committee meeting, the members agreed on priority research topics such as end-of-life care, impacts of mental health problems on care utilization, Aboriginal health, cancer survival and acute health care episodes that could have been avoided. Provinces signed memoranda of understanding in 2008 and 2009 to participate in the initiative.
Two new workplace surveys
The Workplace and Employee Survey was a first attempt at a large-scale employer–employee survey to answer productivity and competitiveness questions. It was funded through the Data Gaps II initiative and conducted in collaboration with Human Resources Development Canada. This was a dual survey—first starting with a sample of employers and then drawing a sample of their employees—allowing for information from both the supply and demand sides of the labour market. It provided information on the impact of business practices on employees, including technology use, training, wages, downsizing, and the existence of foreign and domestic partners. It also allowed for the examination of the positive or negative effects of employee education, experience and turnover on business. The pilot test was run in 1996 and the first full-scale survey was conducted in 1999. The annual longitudinal survey tracked workplaces over a six-year period and followed their employees for two years. The survey was conducted from 1999 to 2006, although the last wave included only employer data.
Data from a new survey of 315,000 nurses, conducted in partnership with the Canadian Institute for Health Information and Health Canada, were released in 2006, covering working conditions, challenges, and physical and mental well-being. The National Survey of the Work and Health of Nurses was the first nationally representative survey to focus on the working conditions and health of the largest occupational group in the health care sector, including registered nurses, licensed practical nurses and registered psychiatric nurses employed in all provinces and territories. The survey was developed in collaboration with organizations representing practising nurses, health care researchers, health information specialists and federal government departments.
The brain drain
There were increasing concerns about the "brain drain," which referred to the loss of well-educated individuals to other countries, primarily to the United States, seeking better opportunities or higher salaries. The Survey of 1995 Graduates Who Moved to the United States was undertaken to examine graduates' characteristics, their reasons for relocating, their education and work experience, and their future plans. In 1998, the agency used the results from this survey, as well as other data, in a comprehensive study of the flow of knowledge workers between Canada and other countries. It found that while Canada did have a relatively small brain drain to the United States, this was offset by an inflow of highly skilled workers into the country from the rest of the world. For every university degree holder migrating from Canada to the United States, there were four university degree holders migrating from the rest of the world to Canada.
Canada participates in an international survey of youth
Canada was one of over 30 countries participating in the Organisation for Economic Co-operation and Development's Programme for International Student Assessment (PISA), which became a policy tool leading to education reform in some countries. The program was designed to provide indicators of student achievement at age 15, with reading literacy assessed in 2000, mathematical literacy in 2003 and scientific literacy in 2006. It provided information on student competence and on the impact of socioeconomic background and schools. In Canada, over 30,000 15-year-olds from more than 1,000 schools took part in 2000.
The Youth in Transition Survey complemented the Organisation for Economic Co-operation and Development survey by tracking movements of young people and looking at factors influencing school–work transitions. In fact, the first cycle was integrated with PISA for the younger cohort. First launched in 2000, the Youth in Transition Survey followed two cohorts, one aged 15 and one aged 18 to 20, collecting information every two years until the youth reached their mid to late 20s.
Measuring hate crime
In 1965, the Minister of Justice appointed a special committee (the Cohen Committee) to study and report on hate propaganda in Canada. The report was made public in 1966 and resulted in hate propaganda being made an offence under the Criminal Code.
The first country to mandate the collection of hate crime statistics was the United States, through the Hate Crime Statistics Act of 1990. In Canada, while various police departments voluntarily collected hate crime statistics, there was no centralized system, and hate-motivated crime was identified as a major data gap in the late 1990s. The 1999 General Social Survey on Victimization was the first to ask specific questions related to hate crimes, a major development in hate crime research, as it yielded the first available national estimates on hate crimes.
In 1999, the Canadian Centre for Justice Statistics received four years of funding from the Policy Research Initiative to gather information about criminal behaviour motivated by hate or discrimination in the Canadian justice system. Consultations were first carried out through the Police Information and Statistics Committee of the Canadian Association of Chiefs of Police to establish a common definition of hate crime. A pilot study on hate crime with 12 major police services then assessed the feasibility of collecting hate crime statistics through police agencies. The same project looked at the diversity of victims, offenders and workers in the justice system to allow for the assessment of equality of access to justice services.
The Uniform Crime Reporting (UCR) Survey had been producing a continuous record of crime and traffic statistics from every police agency in Canada since 1962, and, in 1988, a significantly revised version referred to as "incident-based" began, which captured data on the characteristics of incidents, victims and the accused. The survey was again modified in 2005 to allow police to identify hate-motivated crimes by capturing data on incidents motivated by hate based on race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, or sexual orientation. This aligned with changes to the Criminal Code that allowed for increased penalties when there was evidence that an offence was motivated by bias, prejudice or hatred toward a particular group. Canada was one of the first countries in the world to collect such data, and the program was seen internationally as a model for measuring the nature and extent of hate-motivated crime.
In 2006, the Hate Crime Supplemental Survey was funded by Canadian Heritage in support of Canada's Action Plan Against Racism, a five-year project to fight racism and promote inclusion. This was a special survey of police services that reported data as part of the UCR Survey but whose electronic reporting systems had not been converted to the new system. The affected services manually provided information on incidents motivated by hate, which Statistics Canada could link to the UCR Survey data.
An international focus on youth delinquency
Because statistics on youth delinquency were based on police sources, they referred only to acts of crime or mischief that were reported. To provide broader data, including data in the context of relationships or bonds with parents, schools and friends, the International Self-Report Delinquency Study was initiated by the Dutch Ministry of Justice and first conducted in 1992 with 13 European countries, as well as the state of Nebraska in the United States. A second study was conducted in 2006 with about 30 European countries, the United States and Canada. The Canadian arm of the study, called the International Youth Survey, was sponsored by the National Crime Prevention Centre, a division of Public Safety and Emergency Preparedness Canada. The International Youth Survey was conducted as a voluntary survey with about 60,000 students in Toronto, after obtaining parental consent. It provided information to help address questions related to risk and protective factors for misbehaviour (such as drug and alcohol use, parental supervision and relationships), and how schools and communities could assist high-risk children in developing prosocial behaviours and positive school outcomes.
The Census Program
Census Day for 1996 was switched to mid-May from early June to maximize the number of Canadians at home during enumeration and allow sufficient time to conduct follow-up processes before the summer holiday period. For the Census of Agriculture, the Progress of Seeding Follow-Up Survey was conducted for roughly 100,000 farm operators who reported less than 90% of their field crops seeded on Census Day, to evaluate the impact of the date change and verify or update the crop data they reported.
The 1996 Census of Population included new questions on unpaid household activities, as well as a new question to identify the visible minority population. For the first time, all census questionnaires were printed on recycled paper, and all standard products were made available in electronic format, including CD-ROM and diskette. Some information was made available on the Internet. In addition, as funding provided by the Treasury Board was sufficient only to conduct a basic enumeration, six federal departments that relied heavily on census results to implement many of their programs and policies contributed $55 million so that a full-scale census could be carried out. It was also the first time the agency drew attention up front to the fact that it was the law to complete the census. For previous censuses, the emphasis had been placed on individuals' civic responsibility for the benefit of all Canadians, and the legal implications were raised only with delinquent respondents.
The agency once again used Revenue Canada facilities to process questionnaires for the 1996 Census. Soon, however, the agency was beginning to investigate new methods to capture census records, as the permanence of the tax facilities was questionable, given that a large portion of the population was beginning to file their income tax returns electronically. The agency was also working toward making its census forms available to Canadians via the Internet. In preparation, the agency was enhancing and increasing the scope of the Address Register with the intention of mailing out questionnaires in 2006 to about two-thirds of the households. It was also working on an Internet option with appropriate security and built-in edits, intelligent character recognition, and automated checks for completeness, and it aimed to conduct follow-ups from a series of regional computer-assisted telephone interviewing sites.
The 2001 Census marked the first time information was collected on same-sex couples living common law, while the next census in 2006 collected information on same-sex marriages after their legalization in July 2005. The 2006 Census also included a landmark new question seeking permission to use data from income tax records to lower respondent burden.
It was very difficult to recruit and train staff, as the employment situation was very strong in the spring of 2001 and there was a great deal of competition for qualified personnel. For the first time, the 2001 Census used the Internet as the primary delivery vehicle for data products and services to the public and media. A new edit and imputation system, the Canadian Census Edit and Imputation System (CANCEIS), was used for processing the demographic, labour, mobility, place of work and mode of transport variables—about half of all the variables from the 2001 Census of Population. It was enhanced and successfully used again for the 2006 Census, processing nearly 100% of the census variables. CANCEIS was also used in the processing of the 2001 Ukrainian census, the 2000 Brazilian census, the 2000 Swiss census, the 2005 Peruvian census and the 2011 U.K. census.
Because of the substantial changes in methodology being planned for the 2006 Census, it was deemed necessary to test these changes in a full dress rehearsal in 2004, which had not happened since 1974. As a result, there was much less time than usual to develop the numerous systems feeding the census, and the agency decided to contract out a large portion of the development process. The contracted-out systems dealt with the logistical part of the operation, including keeping track of returns and ensuring census takers were up to date with the status of each questionnaire to cue them for required follow-up.
The power of mapping
When computerized census enumeration maps were first produced in the mid-1980s, they proved to be valuable collection tools. However, their production was manual and labour intensive, with varied quality depending on the source documents. For the first time, for the 2001 Census, all maps were produced in an automated fashion using a set of digital geographic databases that were operated and updated in partnership with Elections Canada.
The agency had a long track record of informal co-operation with Elections Canada, and, in April 1998, they signed a memorandum of understanding to formalize this co-operation on a "joint build project." The project was to develop a shared national database of streets and to share mapping data, updates and infrastructure in a single network file called the National Geocartographic Database. This file of streets, names and address ranges combined with geographic and political boundaries would support Elections Canada in its voter enumeration and Statistics Canada in its collection and dissemination activities. The two agencies jointly maintained the geographic frame and, in the lead-up to the 2006 Census, reached out to other federal government departments and provincial agencies to acquire road updates, which resulted in improved road network information.
Until 2006, the census enumeration area was the smallest nationwide geographic unit, but data comparisons proved difficult, as these areas were not always stable over time. As a result, city blocks in urban areas and analogous entities bounded by stable features in rural areas began to be used as the smallest standard building unit for dissemination purposes. The use of these smaller areas as the foundational building blocks also gave the flexibility to design enumeration areas more accurately and to allow custom dissemination areas based on the needs of users.
In 2005, the agency's Road Network File, which used the National Geocartographic Database as its source, became free to the public. In 2001, it had been a $25,000 product. The move was in part to promote partnerships with the provinces under data sharing agreements. This decision was also made because more and more departments were making such datasets available, and Statistics Canada wanted to promote the adoption of its geographic products for applications including mapping, geocoding, searching, area delineation and database maintenance. A longer-term project was also initiated to migrate the Road Network File to a more accurate and timely GPS-compliant model from Natural Resources Canada in preparation for the 2011 Census.
Access to historical census information
The Canadian Century Research Infrastructure (CCRI) project was launched in 2002 and aimed to create public-use sample files from the historical censuses of 1911, 1921, 1931, 1941 and 1951. It facilitated the availability of census databases spanning 130 years, as the new databases were linked to existing databases for 1871 to 1901 and for 1961 to 2001. It was one of the largest-ever social science initiatives at the time, as it created the foundation for research on the transformation of Canadian society since the late 19th century. Funding was provided by the Canada Foundation for Innovation; the provincial governments of Ontario and Quebec; private sector corporations; and various other institutions, trust funds and foundations.
The internationally recognized research initiative was led by Dr. Chad Gaffield of the University of Ottawa's Institute of Canadian and Aboriginal Studies, along with team leaders at seven partner Canadian universities (the University of Ottawa, the University of Victoria, York University, the University of Toronto, the Université du Québec à Trois-Rivières, the Université Laval and Memorial University of Newfoundland). Each university developed a CCRI research centre that met the security requirements of Statistics Canada. Other partners included the Institut de la statistique du Québec, IBM Canada, the International Microdata Access Group, Library and Archives Canada, the Newfoundland and Labrador Statistics Agency, and Statistics Canada. By 2006, the project had grown to involve over 130 researchers, students and professional staff across the country.
Much of the work across the seven university centres involved keying data from manuscript census records that were often hard to decipher, as well as cleaning and coding the data. In addition to creating the census databases, the project also involved research into qualitative and contextual data, such as what was reported in the newspapers for each census about enumeration or census results and what was discussed in the House of Commons, the Senate and provincial assemblies, along with other published or unpublished documents relating to each census. The data were also overlaid with Geographic Information System map layers to allow georeferencing. The result, a large-scale multidisciplinary searchable and interactive research infrastructure, was made available through research data centres across the country in 2009.
The business statistics program
The Project to Improve Provincial Economic Statistics turns the tides
In 1996, the Government of Canada and the governments of New Brunswick, Newfoundland and Labrador, and Nova Scotia agreed to harmonize their sales taxes to reduce business costs, simplify taxes and reduce administrative costs. Starting on April 1, 1997, a single common tax of 15% (the harmonized sales tax) began to be charged instead of the four separate taxes that had previously applied (the federal goods and services tax and the individual provincial taxes). The federal and provincial governments needed an unbiased and trusted intermediary to provide the data required to calculate the revenue shares, about $25 billion annually, between the four governments. Statistics Canada was asked to be that intermediary, despite its status as a federal agency, which was quite a distinction and an honour.
The provinces had agreed to the harmonization only so long as their tax revenues would remain as stable and predictable as they had been under their own sales taxes. This added a complication because businesses were entitled to a rebate of the goods and services tax they paid on the inputs to what they sold. As a result, tax revenue needed to be shared on the basis of the final sale within a province, although the tax was collected at all stages of production, wherever that production occurred. Until this time, there had been no tracking of whether a sale was final or whether an item would become an input into further production and be sold again.
This new project was called the Project to Improve Provincial Economic Statistics (PIPES)—a massive undertaking that increased the agency's annual budget by about $43 million. The broad goal of the project was to produce economic statistics of roughly equal reliability for all 13 provinces and territories, not just for the provinces participating in the harmonized sales tax. As a result, larger samples, in relative terms, would be needed in the smaller provinces. A large number of staff needed to be hired, while, at the same time, the agency needed to manage downsizing in other areas as a result of budget reduction requirements.
Work on PIPES started in December 1996, as the agency began conducting consultations and detailing action plans. A number of task groups were created to spearhead work on the project, the first consisting of 14 directors general from across the agency to provide oversight and coordination. The other groups focused on managing external relations and response burden; designing and implementing a new unified enterprise-based business survey system; managing the transition between the existing and new survey system; and managing human resources issues such as training, recruitment and staffing. The project, led by Philip Smith, then Director General of the PIPES Implementation Branch, involved new business and household survey activities, a program of annual provincial input-output tables, and initiatives to expand the use of tax and other administrative data.
This massive agency-wide mobilization of resources over several years had the primary objective of producing input-output accounts and income and expenditure accounts for all of Canada's provinces and territories. The level of provincial industrial detail available needed to be improved, along with the integration of economic statistics at the provincial and national levels. The agency needed to develop annual interprovincial trade data, implement a new classification system and double the scope of the Business Register to include "zero-employee businesses" (essentially adding about a million businesses). Some surveys needed to be expanded, such as the Survey of Family Expenditures, which would be conducted annually instead of every four years. This survey would give the agency two measures of consumer expenditures—from the perspective of businesses and from the perspective of consumers. The Methodology Branch and the Informatics Branch needed to assist in the redesign of all affected surveys, and the Regional Operations Branch and the Operations and Integration Division needed to handle increased collection and processing, and assist businesses with the change in reporting activities. The Human Resources Branch needed to hire many new people and mobilize experienced people through rotations, reassignments and corporate assignments.
The new Unified Enterprise Statistics Program (UESP) was a modernized, consistent approach to business surveys that grew out of PIPES. The six main elements of the UESP were to use the Business Register; use tax data to the fullest extent, where feasible; harmonize and integrate questionnaires; adopt enterprise-centric collection; have a unified approach to sampling, data capture, edit, imputation, allocation, calendarization and estimation; and use common microdatabases. It started in 1998 for reference year 1997 with seven pilot industries that had previously not been surveyed, or surveyed only in a limited way. The pilot profiled aquaculture, couriers and messengers, taxis and limousines, construction, food services and drinking places, real estate lessors, and real estate agents and brokers. The second edition covered reference year 1998 and was expanded to include the first major industry—wholesale trade. After four years of development, the UESP came to integrate more than 20 surveys, including the Annual Wholesale Trade Survey, the Annual Retail Trade Survey and the Annual Survey of Manufactures.
The Business Survey Redesign Project, which began in 1984, had first initiated the corporate Business Register (BR), but it was with the infusion of funds into the UESP from PIPES that the register was strengthened and finally adopted by all business surveys. A major re-engineering of the BR started in 2005 and was expected to take three years. The register was modernized to simplify operational concepts and processes, purge outdated technology, and reduce production costs. Then, the migration of surveys to the new register began in the fall of 2007 and was to take about six months, respecting the individual survey cycles. The redesign was completed in 2008, resulting in an integrated environment and improved tools. Plans were also being made to add the public sector universe to the BR.
The agency also realized it would need to create case managers for large enterprises, since the new reporting requirements could appear quite complicated. The agency began to look at data collection from complex enterprises as a single integrated requirement negotiated on a case-by-case basis instead of a series of individual surveys. The Key Provider Manager Program (KPMP) was expanded to encompass more large enterprises, and, by 2005, covered about 180 of the most data-critical Canadian businesses, providing one-stop assistance and response coordination services. From 2005 onward, the target was to double the number of enterprises covered without a net cost increase. To achieve this, a pilot project was conducted to merge the KPMP with the Large Businesses Profiling Program from the Business Register Division, which had been focusing on staying on top of the constantly changing structure of large enterprises. The new merged program would be called the Enterprise Portfolio Management Program and would manage the agency's relationship with about 350 of the country's largest businesses in a more integrated fashion, covering profiling, survey reporting arrangements, issue resolution, coherence analysis and data collection.
The number of small businesses exempted from the UESP grew as the agency began to make greater use of tax data in lieu of survey data, through the Tax Replacement Project. The target to replace questionnaires with tax records for 50% of simple businesses (operating in one province and in one industry) was achieved a year ahead of schedule and was raised to 60% by 2006. The content of the UESP questionnaires was also being streamlined to reduce respondent burden, and questionnaires were mailed out earlier to coincide with the fiscal year end of businesses, when their records would be closer at hand. While the agency had been reducing respondent burden in an aggregated way through various initiatives since the late 1970s, it was not until 1998/1999 that it was able to control the burden for an individual business. A system was developed to track and archive all previous survey contacts for a given business.
Some of the new annual household surveys implemented as part of PIPES included the Survey of Household Spending, the Homeowner Repair and Renovation Survey, and the Canadian Travel Survey. The new surveys brought about by PIPES were initially referred to as "feeder surveys." By 2000, PIPES had transitioned from a project to an ongoing operational program, with many of the new surveys and the national accounts expansion continuing to the present day.
The need for improvement of provincial economic statistics had been identified years earlier, and PIPES, although designed to assist the tax system, was instrumental in improving provincial economic statistics, as well as integrating and improving the coherence of all economic statistics. The project realized its ultimate objective of producing detailed provincial input-output tables and income and expenditure accounts annually for every province and territory.
A new classification system is introduced
After two years of intensive work, the North American Industry Classification System (NAICS) was signed off on in December 1996 by the chief statisticians of Canada, the United States and Mexico. The new system allowed for comparison of the performance of different industries within the North American free trade area, replacing the Standard Industrial Classification, which had been developed in 1948 to measure the Canadian economy and which had been revised at 10-year intervals. The North American Free Trade Agreement had come into force in January 1994, and, by August, agreement was reached that the three statistical agencies would develop a common industrial classification. Each country conducted its own extensive consultations, including with industry and trade associations, forecasters and research institutes, advisory committees, and policy departments. Canada's efforts were led by Standards Division director Shaila Nijhowne (in fact, she would receive a career excellence award in 2001 for her work at the agency, including for her contributions to NAICS). The three statistical agencies met at regular intervals to reach agreement.
The new system was implemented for reference year 1997 for Canada and the United States, and the following year for Mexico, when it was to conduct its economic census. The new system contained a completely new sector on information and cultural activities, greater prominence of business services, and less emphasis on manufacturing. Statistics Canada began publishing data from the System of National Accounts using NAICS in 2002 and began the work of recasting past series.
With NAICS in place, the agency began working with the statistical office of the European Union (Eurostat) toward a common industry classification for Europe and North America by 2007. A two-year pilot was also launched to assess the feasibility of developing common lists of products across Canada, the United States and Mexico. This pilot was called the North American Product Classification System, and it started with five service industries for which a comprehensive list of products was developed in 2000.
Science and technology statistics
When the 2007 Strategic Program Review resulted in the cessation of many of the projects funded by the Data Gaps initiative, the core Information System for Science and Technology was considered fundamental to the national statistical system and was preserved. The government had launched the new Science and Technology Strategy in 1996, and, in fact, the Data Liberation Initiative was part of this strategy. Two years earlier, the government had announced its intention of reviewing federal science and technology to investigate how federal investment could create economic growth and jobs, and it subsequently launched a long-term consultation process. One of the outcomes of that review was the new Information System for Science and Technology project to develop indicators of activity and a framework to paint a picture of science and technology in Canada. In collaboration with Industry Canada, the agency was developing statistical measures in the areas of innovation systems, innovation, government science and technology activities, industry, and human resources.
Measuring commerce in new ways
The SARTRE program had nothing to do with the French philosopher, unless you count Jean-Paul Sartre's influence on disciplines such as sociology. The program was the Small Area Retail Trade Estimates program, which was formalized in 1999 as a new custom program to produce annual estimates on retail trade and the number of stores within a small geographic region (by the first three digits of the postal code for urban areas and by all six digits for rural areas). The estimates were produced by combining the Annual Chain and Department Store Survey and the tax returns from corporations. The cost-recovery program was terminated and the agency stopped producing the custom data tables in 2007.
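The small-area breakdown used by SARTRE — the first three postal code characters for urban areas, all six for rural areas — can be sketched as a simple grouping rule. The sketch below is illustrative only: the records, amounts and helper name are invented, and it relies on the convention that a "0" as the second character marks a rural Canadian postal code.

```python
from collections import defaultdict

# Hypothetical sketch of a SARTRE-style small-area grouping rule:
# urban sales are aggregated by the first 3 postal code characters,
# rural sales are kept at full 6-character detail. A "0" as the second
# character conventionally indicates a rural postal code in Canada.

def small_area_key(postal_code: str) -> str:
    code = postal_code.replace(" ", "").upper()
    rural = code[1] == "0"
    return code if rural else code[:3]

# Invented retail records: (postal code, annual sales in dollars)
records = [
    ("K1A 0B1", 500_000),   # urban -> grouped under "K1A"
    ("K1A 0B9", 250_000),   # urban -> same "K1A" group
    ("K0A 1L0", 120_000),   # rural -> kept at "K0A1L0"
]

estimates: dict[str, int] = defaultdict(int)
for postal_code, sales in records:
    estimates[small_area_key(postal_code)] += sales

# estimates == {"K1A": 750000, "K0A1L0": 120000}
```

In practice the published estimates combined survey and corporate tax data, but the geographic keying followed this urban/rural split.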
With the expansion of the use and availability of the Internet came new opportunities—not only for government agencies, but also for businesses—to carry out transactions with both their consumers and their suppliers. In August 2000, the agency released results from the first cross-economy survey on the use by businesses of information and communication technologies and electronic commerce. While virtually all public sector institutions were using the Internet, roughly half of all businesses were active online, with only 1 in 10 using the Internet to sell goods and services at the time.
The Survey of Financial Security
Funded through the Data Gaps II initiative, the 1999 Survey of Financial Security collected information that had not been available since 1984. It provided policy makers with information about at-risk groups, student loans and the distribution of wealth, and helped them understand how well Canadians were prepared to support themselves in the event of major changes such as a long-term illness, retirement or job loss. Although the survey was conducted again in 2005, permanent funding was not established, and it wound down when the Policy Research Data Group investment fund was reduced as a result of the 2007 Strategic Program Review.
Improving measures of trade
Until the mid-2000s, Statistics Canada was measuring export price indexes mainly by assuming they were equal to domestic price indexes multiplied by the currency exchange rate. Import prices were often measured by assuming they were equal to price indexes from the United States or other countries, again adjusted by an appropriate exchange rate. However, exporters and importers may charge more or less than these assumptions imply—they may "pass through" either more or less of any exchange rate change into their transaction prices. Thus, the agency's methodology was merely an approximation. As exchange rate volatility became more evident in the 2000s, the need for directly measured trade price indexes became more apparent, and the Bank of Canada raised concerns. Accordingly, Statistics Canada began to collect export and import prices directly from importers and exporters and used the resulting price indexes to deflate international trade flow data. The program started with a pilot survey in 2007 with long-term planning funding and eventually became base funded. It continued to expand by including a few additional product classes when funds allowed.
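The weakness of the old proxy method can be shown with a small numerical sketch. All figures below are invented for illustration: a proxy import price index is formed as a foreign index times an exchange rate index (full pass-through assumed), while a directly measured index under partial pass-through comes out lower, and the two imply different trade volumes when used as deflators.

```python
# Hypothetical illustration of the proxy method for import price
# indexes described above; every number here is invented.

def proxy_import_index(foreign_index: float, exchange_rate_index: float) -> float:
    """Old approach: assume full pass-through of exchange rate changes."""
    return foreign_index * exchange_rate_index / 100.0

# Suppose foreign prices rose 2% and the Canadian dollar depreciated 10%
foreign_index = 102.0          # foreign price index (base period = 100)
exchange_rate_index = 110.0    # CAD per unit of foreign currency, indexed

proxy = proxy_import_index(foreign_index, exchange_rate_index)  # 112.2

# With only partial pass-through (say 50%), importers absorb half of the
# exchange rate movement, so a directly measured index is lower.
pass_through = 0.5
direct = foreign_index * (1 + pass_through * (exchange_rate_index / 100.0 - 1))
# direct = 102 * 1.05 = 107.1

# Deflating the same nominal import value gives different volume estimates:
nominal_imports = 1000.0
volume_proxy = nominal_imports / (proxy / 100.0)    # about 891
volume_direct = nominal_imports / (direct / 100.0)  # about 934
```

The roughly 5% gap between the two deflated volumes is exactly the kind of error that grew as exchange rates became more volatile, motivating the move to direct price collection.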
While the 1987 bilateral agreement with the United States on exchanging import data had vastly improved both countries' bilateral trade data, there remained an issue with respect to exports to other countries, which were underreported. Statistics Canada worked jointly with the Canada Border Services Agency (CBSA) in the mid to late 2000s to address the issue, and, together, the agencies implemented an online reporting system to help track non-U.S. exports. In addition, the CBSA strengthened regulations requiring goods to be declared prior to export and increased its enforcement efforts, all of which contributed to reducing the underreporting.
Mad cow disease
Until 2003, when a breeder cow in Alberta tested positive for bovine spongiform encephalopathy (BSE), or mad cow disease, Canada was one of the largest exporters of beef in the world. Within hours of the news, about 40 countries, including the United States, imposed a ban on Canadian beef products, and the value of exports dropped to almost zero for about three months. Canada had historically relied on the slaughtering facilities of the United States, and, with the border closed to cattle, the Canadian facilities were unable to handle the increased demand. The first case of BSE in Canadian cattle had been detected in 1993, and 10 years after new monitoring measures had been implemented, the industry was devastated.
Statistics Canada was called upon to provide new data to the Natural Disaster Assistance team at Agriculture and Agri-Food Canada. A special survey was carried out to provide information about the size of the at-risk cattle population and of the population of slaughter-ready cattle that were typically exported to the United States. The financial and emotional fallout of the event for the industry was dramatic, with millions of dollars in revenue lost every day. The borders to the United States were reopened partially in 2005, but would not be fully opened until 2007. Statistics Canada tallied a loss of $2.5 billion in exports, $2 billion in gross domestic product, $5.7 billion in total outputs, $1 billion in labour earnings, and 75,000 jobs.
An environment statistics program takes shape
Environmental policy analysis took a downturn in Canada in the late 1980s and early 1990s as the federal government worked to eliminate its budget deficit. However, Canada's Kyoto Protocol commitment in December 1997 to reduce greenhouse gas (GHG) emissions to 6% below 1990 levels led to the realization that there was a lack of detailed information in this area. This resulted in the formation of a federal statistical working group in 1999 co-chaired by Statistics Canada and Natural Resources Canada to look into priority data needs. The agency received funding to expand and improve the industrial detail with respect to energy consumption through the Energy Statistics Program, as well as to undertake a survey of energy use of commercial and institutional buildings, and to conduct a feasibility study on apartment building energy use. Discussions also got underway to enable the collection of information on vehicle fuel use and farm energy conservation practices.
In 2000, the National Round Table on the Environment and the Economy led a government-appointed task force to develop a set of environmental sustainable development indicators. Statistics Canada was active on the task force, which recommended six indicators in its 2003 report on the Canadian Environmental Sustainability Indicators. Shortly thereafter, the government announced its intention to implement three of the six, and the agency worked collaboratively with Health Canada and Environment Canada to develop an implementation plan. The indicators covered air quality, water quality for the protection of aquatic life, and GHG emissions, with the first report released at the end of 2005. The second report on the Canadian Environmental Sustainability Indicators was released in 2006, showing, among other findings, that GHG emissions had risen by 27% between 1990 and 2004, exceeding the Kyoto Protocol target by 35%.
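The two percentages in that finding — 27% above 1990 levels, yet 35% above the Kyoto target — are consistent with each other, since the target itself was 6% below the 1990 level. A quick arithmetic check, with 1990 emissions indexed to 100:

```python
# Quick check of the emissions figures reported above (1990 = 100).
base_1990 = 100.0
kyoto_target = base_1990 * (1 - 0.06)   # 6% below 1990 levels = 94.0
emissions_2004 = base_1990 * 1.27       # 27% above 1990 levels = 127.0

excess_over_target = emissions_2004 / kyoto_target - 1
# 127 / 94 - 1 is roughly 0.351, i.e., about 35% above the Kyoto target
```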
In 2002/2003, the National Round Table on the Environment and the Economy requested that Statistics Canada submit a plan for an expanded set of structured environmental accounts. The agency's plan involved working with other departments to make use of existing environmental information and then to determine how to best fill the data gaps that remained, including through a series of new environmental surveys. The agency drafted a plan for the expansion of the System of National Accounts and developed modules on GHG emissions and the environment "industry" to add to existing surveys.
In 2005, the agency was selected ahead of several private companies to carry out a new survey in collaboration with Environment Canada to monitor progress toward the Kyoto Protocol goals on GHG emissions. The GHG reporting project was a high-profile initiative led by Statistics Canada in partnership with Environment Canada, Natural Resources Canada, provincial and territorial governments, and industry. The team developed an Internet-based reporting system harmonized to meet jurisdictional needs, avoid duplication, and minimize burden and cost. The survey collected details on emissions from large industrial facilities emitting more than 100,000 tonnes of GHGs each year. In partnership with Environment Canada and Alberta Environment, Statistics Canada released the first data from this new mandatory reporting system for emissions of six GHGs in 2006. The information contributed to Canada's international GHG reporting, as well as to setting new regulations, targets and timelines for reduction.
In 2006, the agency released results from the prototype Households and the Environment Survey, which was conducted as a Labour Force Survey supplement. It provided provincial-level tables, describing the environmental practices and behaviours of Canadian households. The survey was migrated to a platform based on the Canadian Community Health Survey in 2007, which was thought to be a more suitable frame that would enable joint analysis of environmental and health data.
Genetically modified organisms
Considerable debate came to the forefront in the 1990s and early 2000s about the health and safety implications of growing and consuming genetically modified organisms (GMOs). Modern genetic modification had begun in the mid-1970s, but concerns about the potential ramifications of the new technology were expressed almost immediately, and a voluntary moratorium was observed until the Asilomar Conference on Recombinant DNA in 1975. The conference drew up guidelines to ensure the safety of experiments using recombinant DNA technology. By the early 1990s, genetically modified food crops began to be consumed by the public; however, there was very little information available on the extent to which GMOs were produced in Canada. The extent of organic farming practices was also unmeasured. Information on both topics was collected through the addition of questions on existing agriculture surveys.
International technical assistance continues
As well as collaborating internationally on various initiatives to exchange ideas and co-develop processes or systems, Statistics Canada continued to be active in providing technical aid to the statistical administrations of other countries. International technical assistance was mostly funded by the Canadian International Development Agency (CIDA). Statistics Canada's technical assistance extended to countries including Argentina, Armenia, Bangladesh, Brazil, China, Colombia, Cuba, the Czech Republic, Eritrea, Georgia, Haiti, Hungary, Indonesia, Jamaica, Malaysia, the Philippines, Poland, Russia, Turkey and Zambia.
In 1996, the agency began a new international co-operation program with China's National Bureau of Statistics that focused on organizational development, market economy measurement and the building of technical capacity. This five-year, $9.4 million initiative was funded by CIDA to support Canada's foreign aid objectives in China and included training at various levels and in various subject matters. This was known as the Statistical Information Management Project (SIMP). In 1998, the two statistical agencies developed a set of guiding principles to ensure that projects would be pursued in a consistent and effective manner, with a particular emphasis on the sustainability of the activities. The first phase of SIMP was from 1996 to 2004, during which national accounts, household surveys, resource management and a survey skills development course were covered. In fact, Statistics Canada authored a comprehensive training manual on the basic concepts of survey methodology, management, operations, analysis and quality assurance for household surveys to train 30,000 core statistical workers throughout China. Another five-year co-operative effort, known as SIMP II, began in 2005. The second phase of the program was also funded by CIDA, as well as the Chinese Ministry of Commerce, and was aimed at particular social, economic and environmental statistical projects in line with China's international obligations, as well as the use of administrative information.
The work of the voluntary city groups continued, with the model for the first group—the Voorburg Group on Service Statistics—successfully copied into other domains, including the London Group on Environmental Accounting, the Canberra Group on Household Income Statistics, and the Ottawa Group on Price Indices. In the early 2000s, the London Group was focusing on the development of environmental accounts linked to the System of National Accounts and was drafting a revised version of the United Nations interim handbook on the system of integrated environmental and economic accounting, which was submitted to the United Nations Statistical Commission in 2002. At its 43rd session, which was held in 2012, the Statistical Commission adopted the System of Environmental-Economic Accounting 2012 Central Framework as the initial international statistical standard for environmental economic accounting.
In the early 2000s, the Swiss Federal Statistical Office and the Hungarian Central Statistical Office asked Statistics Canada to conduct peer reviews of their organizations. This was a new type of international work, with a focus on adaptability to evolving needs, effectiveness and credibility. The director of the International Monetary Fund, upon Dr. Fellegi's retirement in 2008, explained that "Ivan led the way in the evaluation of national statistical systems (e.g., those of Switzerland and Hungary) and the wider European statistical system. In the latter case, my own recent peer review of Eurostat drew heavily on Ivan's earlier work. By pointing out strengths and areas for improvement, evaluations help raise the bar on what an efficient and effective statistical system entails. This paved the way for the international statistical community to conduct such evaluations of member country statistical systems."
In 2006, the agency was in discussions with CIDA about a major Canadian international statistical capacity-building initiative that would see a number of countries engaged in long-term technical assistance, encourage other statistically developed countries to do the same, and establish a training institute for top-level managers of statistical offices on management techniques and other issues of particular importance to statistical offices.
A path of continuous improvement
Dr. Fellegi had informed the Privy Council Office in 2006 that he wanted to take his retirement in two years' time. His announcement to all staff at the agency was made on February 15, 2008. In June, he retired after 23 years of being Chief Statistician and 51 years after joining the agency in 1957. His retirement ceremony was broadcast on the desktops of employees, another first at Statistics Canada. Invited speakers gave testimonials, either in person or via video. International accolades were shared from the World Bank, the United Nations, the International Monetary Fund, the National Bureau of Statistics of China, the Organisation for Economic Co-operation and Development, and the International Statistical Institute, as well as the chief statisticians of numerous countries.
On that occasion, it was announced that, in Dr. Fellegi's honour, the boardroom on the executive floor of the R.H. Coats Building would bear his name. He was also awarded the title of Chief Statistician Emeritus by then-Prime Minister Stephen Harper, and he continues to come to his office to this day, 10 years after his retirement, to provide advice and lend a willing ear to anyone who seeks his counsel.
Under Dr. Fellegi's tenure, the agency had been ushered into an era of stability and confidence. He captained the ship that successfully weathered the budgetary tides while strengthening the agency's reputation at home and abroad. Because he believed in keeping the spirit of innovation alive even during lean years, the agency's research and analysis capacity grew and prospered. His legacy also included robust human resource strategies with a strong training program, a strengthened and integrated economic statistics program, a rich and outcome-focused social program, much-improved media and respondent relations, a strong statistical infrastructure with a robust Business Register, and stellar classification systems and methodological capacity. Statistics Canada had shifted into an international leadership role.
The agency has come a long way since the formation of the Dominion Bureau of Statistics in 1918. Over the course of its history, change has been experienced at times as a massive upheaval, at others as a slow and steady progression. You have no doubt heard the adage "history repeats itself." This is true, as well, in the life of a statistical agency. Sometimes change seems quite tangible, such as in the 1930s, when the bureau's resident inventors—A.E. Thornton and his assistant Fernand Bélisle—designed and built custom tabulation machines. However, growing information needs and constantly shifting technologies for gathering, compiling, analyzing and disseminating information are not new; the need to adapt to them has always been present. As the world around us evolves, so do the tools and processes harnessed to meet the statistical needs of an ever-more-quickly changing society.
"Change is a constant factor in a modern industrial society. The challenge for Statistics Canada is to monitor and report on the many aspects of change, providing an information base that equips Canadians to build on the past and shape the future." – 1984/1985 annual report