Chapter 3.2: Modernization of Information Technology and Informatics Services

Context

Successful statistical organizations base their operations on a cost-effective, solution-focused infrastructure that comprises data and information services, processing and analysis capabilities, and the underlying network and computing structure. Information technology (IT) is a strategic enabler of all modernization activities, such as process automation, method innovation, and information- and data-management capabilities.

Desired IT transformation activities fall into two broad categories. First, the streamlining of investments and the focus of IT activities on a core set of technologies and applications will maximize operational effectiveness. Second, IT will promote innovative approaches to creating statistical value from data through powerful processing, analysis, and dissemination techniques.

IT modernization is a key component of the Corporate Business Architecture (CBA) transformation initiative carried out by Statistics Canada (for details, refer to Chapter 3.1: Corporate Business Architecture). IT transformation involves platform standardization, data management, metadata-driven capabilities, and a centralized IT function.

This chapter describes the principles and the strategies behind Statistics Canada's IT-driven modernization strategy, and illustrates how the agency was able to implement them in the transformation process.

Strategies, mechanisms and tools

As Statistics Canada's experience shows, a successful IT-driven modernization strategy consists of the following seven elements:

  1. effective organization of IT functions and resources within the statistical office;
  2. comprehensive enterprise architecture that provides the necessary framework to align IT infrastructure, technologies and services with the overall conduct of the statistical business;
  3. use of common systems and tools;
  4. effective management of IT security;
  5. strong IT governance;
  6. integration of IT planning into the overall strategic planning process; and,
  7. comprehensive resource development strategy to ensure the availability of skilled IT resources over time.

1. Effective organization of IT functions and resources within the statistical office

In the context of modernization, a typical statistical organization will be required to perform some of the following IT functions internally:

  • Client relationship management (CRM)—approach whereby client satisfaction is assured through effective service management and communication.
  • Application development—delivery of new solutions and evolution of existing solutions, with a focus on quality, agility, and timely delivery.
  • Operations (IT Ops)—operation and support of integrated applications and infrastructure, including security, for production purposes.
  • Infrastructure management—function typically merged with IT Ops, although the engineering and evolution (design / build) could be managed separately.
  • Application portfolio management and project portfolio management—oversight (at a corporate IT level) of application portfolios and IT projects, including performance management.
  • Platform and component services—development and evolution of "platform as a service" offerings, such as database hosting, application hosting, web hosting, data analytics and business intelligence services, and integration platform services; these services are responsible for the development and support of statistical components (at Statistics Canada, these are known as "generalized systems").
  • Enterprise Architecture—alignment of business and IT strategy and projects through the creation, development, and enforcement of principles, standards, and architectural frameworks in the areas of business, information, application, technology, and security.

The above IT functions are organized around one of two models (or a combination of the two). The choice of model will reflect the statistical office's modernization strategy:

  • Decentralized organization—under this model, the IT function is embedded in business units so that it can develop and support local solutions. This approach provides localized and customized solutions to the business units, but may lead to redundant solutions and inflexibility to meet enterprise priorities.
  • Centralized organization—under this model, the IT function is centralized as an enterprise-wide service; it develops and supports solutions for all lines of business. Typical benefits include economies of scale, flexibility, and focus on enterprise, as well as line-of-business priorities.
  • Mixed model—it is possible to have a combination of both models that includes embedded IT units creating lightweight temporary or exploratory solutions and a centralized group providing foundational platforms. This model is sometimes known as "bi-modal IT."

As part of its CBA transformation, Statistics Canada created a centralized IT function serving the entire organization. Under this model, IT resources and software are considered corporate assets, and managed according to business priorities at the organization level. This was a shift from past practices, which involved a mix of core functions and extensively distributed IT functions embedded in the business.

Adopting this model provides economies of scale through the sharing of tools, IT practices, and capabilities across all business units; the organization thus avoids having local IT resources "locked" into specific business groups. IT specialists are deployed, in a flexible way, on priority projects funded through the Integrated Strategic Planning Process (for details, refer to Chapter 2.2: Integrated strategic planning). The result is greater flexibility and more effective use of resources.

A potential risk of this approach is that the IT teams become less closely connected to their business clients. To mitigate that risk, the IT function has established dedicated IT business managers for each line of business (called a Field within Statistics Canada). Statistics Canada's IT organization structure is shown below, in figure 3.2.1.

Figure 3.2.1: Statistics Canada's centralized IT organization

Description of Figure 3.2.1

This figure shows the composition of the Chief Information Officer's (Director General of the Informatics Branch) jurisdiction, which comprises five distinct sections:

  1. Collection Solutions
  2. Economic and Social Solutions
  3. Administrative and Dissemination Solutions
  4. IT Operations
  5. System Engineering

The first three sections (Collection Solutions, Economic and Social Solutions, Administrative and Dissemination Solutions) have two main functions: client relationship management and portfolio management, and application development and support.

The fourth section (IT Operations) has four functions: desktop support, partner (SSC) management, hosted platforms (database, application) and IT service management.

The fifth section (System Engineering) has four functions: generalized systems development and support, integration technology centre, enterprise architecture and IT project and application management.

As figure 3.2.1 shows, the client relationship management function and the application development function have been aligned with different business portfolios—Collection, Economic and Social, Corporate (Administrative) and Dissemination, IT Operations, and System Engineering. Key roles within these areas include the following:

  • Line-of-business (field) IT manager—responsible for client relationship management and for management of application development and support
  • IT Operations—responsible for desktop and infrastructure management (including partner management) and operations related to shared technologies (e.g., databases)
  • Generalized Systems—responsible for shared statistical functions
  • Integration Technology Centre—responsible for the integration platform and the service integration approach
  • Enterprise Architecture—responsible for ensuring the alignment of business and IT strategies and projects through the creation, development, and enforcement of principles, standards, and architectural frameworks in the areas of business, information, application, technology, and security
  • IT Project Office—responsible for IT project management, application portfolio management, and reporting activities

Each Field IT manager in Informatics works closely with his or her assigned line of business (Field) to manage the field's application portfolio and to oversee IT project-execution and delivery. Field IT managers perform this function by maintaining close contact with their business clients through participation in the Field Planning Board of each business area. These planning boards serve as the fora in which members review and prioritize investment, project, and operational elements. Participation in these boards ensures responsive delivery of IT solutions, as well as their maintenance, to the business.

Using the matrix management approach and structure followed at Statistics Canada (see Chapter 2.1: Organizational structure and matrix management), IT project delivery and operations are carried out by multidisciplinary work teams of IT specialists (with participation of business specialists and methodologists, as required). Each Field IT manager oversees a team of application development and support staff, with additional skills provided by cross-cutting IT functions, such as database hosting, system engineering, various technology centres (e.g., Microsoft, SAS, integration technologies), and generalized systems teams. The assignment of staff to project teams is a dynamic process based on project need and priority. A regular assignment-review process within Informatics, which involves Field IT-manager participation, helps make staffing decisions. Weekly Field IT-manager meetings and special sessions address planning, technology, and operational needs, and ensure ongoing alignment and integration.

2. Business–IT alignment—an enterprise architecture approach

In order to realize the benefits of the transformed IT organization, it is important that IT be aligned with the business and that it create, as much as possible, common solutions for common business activities across the range of business units (e.g., social statistics, economic statistics, national accounts, and census).

2.1 Generic models

An analysis model or framework sets out the relationship between the organization's business activities and its IT function. The international national statistical office (NSO) community (through the High-Level Group for the Modernisation of Official Statistics) has created various frameworks and tools to assist statistical organizations in adopting an overarching enterprise architecture. An enterprise architecture will enable organizations to streamline their business, their information resources, processes and capabilities, and their application, technology and security architecture, and to better understand the interaction between these functions.

2.1.1 Generic Activity Model for a Statistical Organization (GAMSO)

The first model is the Generic Activity Model for a Statistical Organization (GAMSO). This useful framework can assist statistical organizations with analyzing their overall operations by factoring out common business activities and identifying solution opportunities.

Figure 3.2.2: Generic Activity Model for a Statistical Organization

Source: UNECE, 2015b

Description of Figure 3.2.2

This graphic represents the Generic Activity Model for a Statistical Organization, which is split into four sections.

  1. Strategy and Leadership
  2. Capability Management
  3. Corporate Support
  4. Production

The top section (Strategy and Leadership) is split into three sub-categories: Define vision; Govern and lead; and Manage strategic collaboration and cooperation.

The first middle section (Capability Management) is split into Plan capability improvements, Develop capability improvements, Monitor capabilities and Support capability implementation. The second middle section (Corporate Support) is split into Manage business and performance, Manage finances, Manage human resources, Manage IT, Manage statistical methodology, Manage information and knowledge, Manage consumers, Manage data suppliers, Manage buildings and physical space and Manage quality.

The last section (Production) consists of the Generic Statistical Business Process Model.

The model shows the main business activities of the senior management of statistical organizations:

  • Strategy and Leadership—the business activities related to how strategy is defined, how leadership and governance are performed, and how the various strategic relationships within, and external to, an organization are managed
  • Capability Management—the business activities related to how continuous improvement and new opportunities or discontinuities (e.g., new systems development, research on big data) are identified, assessed, planned and implemented
  • Corporate Support—the business activities relating to the operation of the organization as a business or enterprise, including management of Human Resources, Finance, IT, Information, and Quality
  • Production—the business activities related to the continuous production of statistical products for stakeholders and clients.

2.1.2 Generic Statistical Business Process Model (GSBPM)

The Production activity area of the GAMSO has been elaborated as a reference process model, known as the Generic Statistical Business Process Model (GSBPM) (see figure 3.2.3). The GSBPM is a flexible tool for describing and defining the set of business processes and sub-processes needed to produce official statistics.

Figure 3.2.3: Generic Statistical Business Process Model (GSBPM)

Source: UNECE, 2013a

Description of Figure 3.2.3

This figure represents the GSBPM, together with the overarching Quality Management and Metadata Management processes. The model is organized into eight phases, each consisting of a sequence of sub-processes.

  1. Specify Needs. (The steps in order are to identify the needs, consult and confirm the needs, establish the output objectives, identify the concepts, check the data availability and prepare the business case.)
  2. Design. (The steps in order are to design the outputs, design the variable descriptions, design the collection, design the frame and sample, design the processing and analysis and design the production systems and workflow.)
  3. Build. (The steps in order are to build the collection instrument, build or enhance process components, build or enhance dissemination components, configure the workflows, test the production system, test the statistical business process and finalise the production system.)
  4. Collect. (The steps in order are to create frame and select the sample, set up the collection, run the collection and finalise the collection.)
  5. Process. (The steps in order are to integrate the data, classify and code, review and validate, edit and impute, derive new variables and units, calculate weights, calculate aggregates and finalise data files.)
  6. Analyse. (The steps in order are to prepare draft outputs, validate the outputs, interpret and explain outputs, apply disclosure control and finalise outputs.)
  7. Disseminate. (The steps in order are to update the output systems, produce dissemination products, manage release of dissemination products, promote dissemination products and manage user support.)
  8. Evaluate. (The steps in order are to gather evaluation inputs, conduct evaluations and agree on an action plan.)
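The phases and sub-processes listed above lend themselves to a simple machine-readable encoding, which is often the first step toward metadata-driven tooling. The following Python sketch is purely illustrative: the phase and sub-process labels are taken from the figure description, while the module layout, the helper function, and everything else are assumptions.

```python
# Minimal, illustrative encoding of (a subset of) the GSBPM phases and
# sub-processes described in figure 3.2.3. The helper function and module
# layout are hypothetical; only the labels come from the model itself.

GSBPM = {
    "Specify Needs": [
        "Identify needs", "Consult and confirm needs", "Establish output objectives",
        "Identify concepts", "Check data availability", "Prepare business case",
    ],
    "Collect": [
        "Create frame and select sample", "Set up collection",
        "Run collection", "Finalise collection",
    ],
    "Process": [
        "Integrate data", "Classify and code", "Review and validate",
        "Edit and impute", "Derive new variables and units",
        "Calculate weights", "Calculate aggregates", "Finalise data files",
    ],
    # ... remaining phases (Design, Build, Analyse, Disseminate, Evaluate) omitted for brevity
}


def find_phase(sub_process):
    """Return the GSBPM phase that contains the given sub-process label."""
    for phase, subs in GSBPM.items():
        if sub_process in subs:
            return phase
    return None


print(find_phase("Edit and impute"))  # -> "Process"
```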

2.1.3 Generic Statistical Information Model (GSIM)

Another useful tool is the Generic Statistical Information Model (GSIM). This model provides a standardized way to express (at a conceptual level) the information objects and their relationships in use within statistical production (see figure 3.2.4). This comprehensive model covers a number of domains within an agency. Perhaps the most familiar is the "Concepts" area, with its Variable, Concept, Unit and Population elements. Similarly, the "Structures" area relates to the structure of the data itself as it flows through production solutions and processes. The "Business" area addresses information objects representing elements of the program itself, such as Production Activity and Statistical Program. Finally, the "Production" area represents elements of the statistical production processes themselves, such as Process Step and Method.

The GSIM provides business and information architecture functions; it includes standardized "language elements" with which to describe the information structure at a conceptual level.

Figure 3.2.4: Generic Statistical Information Model (GSIM)

Source: UNECE, 2013b

Description of Figure 3.2.4

This figure represents the Generic Statistical Information Model (GSIM). It is separated into four blocks that fit together like pieces of a puzzle. Each block includes five sections.

  1. Business. (The sections within are Statistical Program, Statistical Support Program, Statistical Need, Business Process and Process Step.)
  2. Exchange. (The sections within are Information Provider, Information Consumer, Exchange Channel, Provision Agreement and Product.)
  3. Structures. (The sections within are Data Set, Data Structure, Information Resource, Referential Metadata Set and Referential Metadata Structure.)
  4. Concepts. (The sections within are Variable, Population, Concept, Unit and Statistical Classification.)
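At a conceptual level, a few of the GSIM objects and their relationships can be rendered as simple types. The sketch below is an informal illustration, not an official GSIM implementation; the object names come from the model, but the attributes and relationships shown are simplifying assumptions.

```python
# Illustrative (non-normative) rendering of a few GSIM objects as Python
# dataclasses. Object names (Concept, Population, Unit, Variable, DataStructure,
# DataSet) come from GSIM; attributes shown here are simplified assumptions.
from dataclasses import dataclass, field


@dataclass
class Concept:
    name: str            # e.g., "Annual income"


@dataclass
class Population:
    description: str     # e.g., "Canadian households, reference year 2016"


@dataclass
class Unit:
    identifier: str      # e.g., a household or enterprise identifier


@dataclass
class Variable:
    concept: Concept         # what is being measured
    population: Population   # about whom it is measured


@dataclass
class DataStructure:
    variables: list[Variable] = field(default_factory=list)


@dataclass
class DataSet:
    structure: DataStructure
    records: list[dict] = field(default_factory=list)


income = Variable(Concept("Annual income"), Population("Canadian households"))
structure = DataStructure(variables=[income])
dataset = DataSet(structure=structure, records=[{"Annual income": 52000}])
print(dataset.structure.variables[0].concept.name)  # -> "Annual income"
```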

2.1.4 Common Statistical Production Architecture (CSPA)

Finally, the Common Statistical Production Architecture (CSPA) brings together these existing frameworks and introduces the new frameworks related to Statistical Services. The CSPA allows for developing a harmonized top-level description of the "system" of producing statistics aligned with the modernization initiative. In addition, the CSPA gives users an understanding of the different statistical production elements (i.e., processes, information, applications, services) that make up a statistical organization and of how those elements relate to each other. It also emphasizes commonality by providing a common vocabulary with which to discuss implementations. This approach enables the vision and strategy of the statistical industry, by providing a clear, cohesive and achievable picture of what is required to get there (Endnote 1).

2.1.5 Ideal Enterprise Architecture

The integration of these four models provides a vision of an ideal enterprise architecture for the process, evolution and transformation of statistical production and corporate operations (see figure 3.2.5).

Figure 3.2.5: Vision of the future
Description of Figure 3.2.5

This graphic represents the vision for the future of Statistics Canada. The model is based on the GSBPM, and the survey production process follows its steps.

In order, the steps are:

  1. Specify Needs
  2. Design
  3. Build
  4. Collect
  5. Process
  6. Analyse
  7. Disseminate.

The next part is under the GSIM model.

Under Specify Needs, Design and Build, we have a Statistics Canada Design Platform. Under this, we have the Workflow and Project Management. It splits into Standard Metadata Catalogue and Process Building Blocks. After that, we have the Design Repository. The last step is Knowledge Management and Quality Management.

Under Design and Build, we have a Statistics Canada Design Platform. Under this, we have the Workflow and Project Management. It splits into Technology Building Blocks and Integration Platform (EAIP). After that, we have the Software Repository. The last step is Knowledge Management and Quality Management.

Under Collect, Process, Analyse and Disseminate, we have a Statistics Canada Delivery Platform. Under it, we have the Workflow and MIS. It splits into two sections, which are Task Services and Info Services. Under Task Services, we have Collection, Processing, Analyse and Dissemination. Under Info Services, we have BR, Tax, AR and Classification. Under those two sections, we have a Metadata and Data Repository. The last step is Knowledge Management and Quality Management.

The Goal is to have an integrated approach to design, development, and production.

The top of the diagram shows the key process steps included in the GSBPM. Notionally, survey production proceeds along these steps. These steps can be grouped into the phases commonly found in software development lifecycles: a "design" phase (shown in yellow), a "build" phase (shown in green), and a "run" phase (shown in purple).

Within the "Design" phase, the goal is to have survey design activities use standard metadata and reusable process building blocks in design activities. Designers should have access to past metadata, reusable data assets, and design elements (documents, artifacts) in creating or modifying their survey. Surveys should use standard information services, such as registers, administrative data sources, and statistical attributes from other surveys, common questionnaires, and classifications. Standardized approaches to various methodological functions would include items frequently found in Statistics Canada's Generalized Systems, including edit and imputation, and sampling.

Within the "build" phase, business and subject matter construction addresses the creation of collection instruments and other items using a common set of tools. Reusable technology building blocks, linked by means of an agency integration platform (the service-oriented architecture strategy), allow for creating new solutions or improving existing ones.

The actual production (operation) occurs in the final phase, the "run" phase, where solution components composed of platforms (e.g., collection) and information or task services operate together in an integrated fashion for delivering high-quality statistics efficiently and effectively.

One can apply this enterprise architecture vision to myriad components of the enterprise: the management of metadata, the use of digital workspaces, collaboration, the need for component-based approaches to assembling solutions, and the use of solution components from other statistical agencies. What is central to the enterprise architecture vision is that the business, subject-matter, methodology, operations, corporate, and IT functions all be connected in ways that reduce the silos in which they individually operate, and, in so doing, enable the delivery of more effective solutions.

2.2 Statistics Canada's Enterprise Architecture approach

The modernization process at Statistics Canada reflects the enterprise architecture approach. Statistics Canada's enterprise architecture follows the CBA principles. As shown in figure 3.1.1, the CBA transformation strategy includes the IT organization as part of "Generic Services" and "Enabling Informatics Systems and Infrastructure."

Statistics Canada also uses the GSBPM at the core of its CBA vision (the core set of process steps in the middle of the diagram). Governance and standardization efforts employ the GSBPM as an analysis framework to identify possibly redundant applications that may be suitable for elimination. The model also serves to assess the degree to which survey production areas have migrated to the standard applications, as well as the gaps and plans needed to complete the migration (as directed by the CBA).
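As a concrete illustration of this kind of analysis, the hypothetical sketch below groups an application inventory by GSBPM phase and flags phases served by more than one application as candidates for consolidation; the application names and the helper function are invented for the example.

```python
# Hypothetical example: using GSBPM phases as the frame for spotting
# potentially redundant applications. The inventory entries are invented.
from collections import defaultdict

inventory = [
    ("SurveyCollectTool-A", "Collect"),
    ("SurveyCollectTool-B", "Collect"),   # second collection tool: consolidation candidate
    ("EditImputeSystem", "Process"),
    ("TabulationSuite", "Analyse"),
]


def redundancy_candidates(apps):
    """Return GSBPM phases that are served by more than one application."""
    by_phase = defaultdict(list)
    for name, phase in apps:
        by_phase[phase].append(name)
    return {phase: names for phase, names in by_phase.items() if len(names) > 1}


print(redundancy_candidates(inventory))
# -> {'Collect': ['SurveyCollectTool-A', 'SurveyCollectTool-B']}
```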

The "Generic Services" component of the CBA, as illustrated above, focuses on common business activities associated with the Collect, Process, Analyze, and Disseminate elements of the GSBPM. By creating common business units to perform these activities on behalf of all survey areas, this model enables IT to create common solution platforms.

Statistics Canada also uses the GSBPM in its application portfolio management work to ensure that survey areas are using common platforms (with exceptions if warranted) and to plan future investment roadmaps. The CBA principles are applied as a basis for governance activities (as described in the governance section of this chapter).

2.3 Statistics Canada's Enterprise Anchor Model

As previously mentioned, the CBA transformation strategy focusses on standardizing and consolidating applications and operations in the IT area. Statistics Canada is using a common reference framework to categorize the applications and technologies in use. Statistics Canada's Enterprise Anchor Model (figure 3.2.6) describes the structure and integration of services and components. It serves as a framework for the analysis, design, implementation, and management of these elements.

Figure 3.2.6: Statistics Canada's Anchor Model
Description of Figure 3.2.6

This figure represents Statistics Canada's Anchor Model, which consists of five layers:

  1. Core Production Segments (which splits into four sections: Collection, Processing, Analysis and Dissemination)
  2. Core Statistical Services (which covers questionnaire design, coding, sampling, edit and imputation, estimation, tabulation and confidentiality)
  3. Core Information Services (which covers the address register, business register, geography, classification, metadata, admin/tax data and statistical data management)
  4. Core Technology Platforms (which covers databases, file services, documents and records management, application technology (.NET, Java, C#, etc.), office productivity tools, statistical analysis, business intelligence/data warehousing, and workflow and business process management systems)
  5. Core Infrastructure Services (which covers the data centre, network, email, archive and IT security services)

The model consists of a series of layers:

  • An Infrastructure Layer—core infrastructure services related to storage, informatics, and infrastructure security. In the Government of Canada, a government organization responsible for government-wide infrastructure (Shared Services Canada) provides these services. Email is included in this layer in the model, as a mail service hosted by Shared Services Canada.
  • A Core Technology Platforms layer—the various platforms that are part of the Technology Architecture
  • A Core Information Services layer—the statistical and corporate information services used across the organization (e.g., registers, metadata and data management, information management, classifications management)
  • A Core Statistical Services layer—common, shared statistical functions such as those found in Statistics Canada's Generalized Systems (e.g., Banff E&I, G-SAM, G-Export, etc.). Creating reusable services can be a strong contributor to organization success.
  • A Core Production Segments layer—platforms for the different solution segments supporting statistical production, including the collection platform, dissemination, business statistics processing, social statistics processing (common tools), and census. It also includes corporate services platforms used in running the business.

Statistics Canada uses this model to support its standardization activities and to maintain dialogue in the conduct of its statistical business. As illustrated in the organizational view (figure 3.2.1), there is a strong connection between the organization's structure and responsibilities and the platform components (production segments) at the top of the model.
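One lightweight way to make such a layered reference model usable in portfolio analysis is to record it as data, so that each application or service can be tagged with its layer. The sketch below simply encodes the layers and a few example entries from figure 3.2.6; the structure and the helper function are illustrative assumptions.

```python
# Illustrative encoding of the Anchor Model layers (figure 3.2.6), from the
# infrastructure layer up to the production segments. Entries are examples
# taken from the figure description; the structure itself is an assumption.
ANCHOR_MODEL = {
    "Core Infrastructure Services": ["Data centre", "Network", "Email", "Archive", "IT security services"],
    "Core Technology Platforms": ["Databases", "File services", "Application technology", "Business intelligence"],
    "Core Information Services": ["Address register", "Business register", "Classification", "Metadata"],
    "Core Statistical Services": ["Edit and imputation (Banff)", "Sampling (G-Sam)", "Estimation (G-Est)"],
    "Core Production Segments": ["Collection", "Processing", "Analysis", "Dissemination"],
}


def layer_of(component):
    """Return the Anchor Model layer in which a component is registered."""
    for layer, components in ANCHOR_MODEL.items():
        if component in components:
            return layer
    return None


print(layer_of("Business register"))  # -> "Core Information Services"
```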

2.4 Integrating solutions and components—CBA service approach

As part of its CBA transformation activities related to IT, Statistics Canada embarked on a new integration approach that uses a common integration platform and a series of "plug and play" components, known as the CBA service approach.

This approach (known in the industry as a service oriented architecture or SOA approach) allows solution developers to create reusable components (such as the Generalized Systems) and assemble them into new solutions for business clients in flexible and powerful ways. The integration platform shields the solution developers from the details of integration and underlying infrastructure, and reduces the complexity of communications between these components. Statistics Canada has an Integration Technology Centre, which provides a single point of expertise and platform support.
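To illustrate the "plug and play" idea, the sketch below defines a generic interface that a reusable statistical component could implement, together with a trivial in-process registry standing in for the integration platform. It is a schematic under assumed names, not a description of Statistics Canada's actual platform.

```python
# Schematic sketch of a service-oriented "plug and play" approach: reusable
# components implement a common interface and are reached through a simple
# registry standing in for the integration platform. All names and the
# interface itself are assumptions made for this example.
from abc import ABC, abstractmethod


class StatisticalService(ABC):
    """Common contract every reusable component exposes to the platform."""

    @abstractmethod
    def invoke(self, payload: dict) -> dict: ...


class CodingService(StatisticalService):
    def invoke(self, payload: dict) -> dict:
        # Toy behaviour: assign a placeholder classification code to each description.
        return {"codes": {d: "0000" for d in payload.get("descriptions", [])}}


class IntegrationPlatform:
    """Minimal stand-in for an integration platform: registers services and
    routes requests to them by name, hiding the components from callers."""

    def __init__(self):
        self._services = {}

    def register(self, name: str, service: StatisticalService) -> None:
        self._services[name] = service

    def call(self, name: str, payload: dict) -> dict:
        return self._services[name].invoke(payload)


platform = IntegrationPlatform()
platform.register("coding", CodingService())
print(platform.call("coding", {"descriptions": ["bakery", "sawmill"]}))
```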

3. Use of IT common systems and tools

3.1 Principles for designing generalized systems

Statistics Canada has developed a comprehensive set of generalized systems components over many years, as shown in table 3.2.1. This suite of tools provides a rich library of statistical functions for use in all survey areas for different surveys and methodologies. The agency's experience indicates that, when one embarks on the creation of reusable components, it is best to start by building a library of tools that can meet the basic, most common functions of the national statistical office. The GSBPM should be used to determine which statistical production sub-processes can be automated. The next step is to determine the requirements from the data producer community. To meet these requirements, off-the-shelf software should be affordable, reputable, easy to use, documented and supported, and should have the built-in functions that the work requires, including mathematical and statistical functions, as well as data management and metadata management capacity.

It may be necessary to develop common systems for more complex statistical functions, such as the following:

  • Data manipulation tools—moving data from one step to the next, manipulating files;
  • Mathematical and statistical functions—stratifying, determining sample size, selecting a sample, editing, imputation, weighting, parameter estimation, variance estimation, disclosure control, data analysis, mathematical modelling;
  • Automation of survey steps—data collection, coding; and
  • Dissemination, archiving and retrieval.

Once the organization's business requirements are clear, building a prototype will serve to determine how these requirements can be generalized to different situations and different applications. Lessons learned also illustrate how the "production" version of the software should be developed according to a set of standard programming protocols. This work should use shared utility functions, document the code, and follow a common naming convention for variables and modules. This encourages the development of single-function, reusable modules rather than one large software program.
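A single-function, reusable module of the kind described above might look like the following sketch, in which the naming convention, the function, and its behaviour are all hypothetical.

```python
# Hypothetical example of a single-function reusable module following a shared
# naming convention (here, a "g_" prefix) and documenting its inputs and
# outputs, rather than burying the logic inside one large program.


def g_impute_mean(values, missing=None):
    """Replace missing entries in `values` with the mean of the observed entries.

    Parameters
    ----------
    values : list
        Observed data, with `missing` marking unobserved entries.
    missing : object
        Sentinel used to flag missing entries (default: None).
    """
    observed = [v for v in values if v is not missing]
    if not observed:
        raise ValueError("cannot impute: no observed values")
    mean = sum(observed) / len(observed)
    return [mean if v is missing else v for v in values]


print(g_impute_mean([10.0, None, 14.0]))  # -> [10.0, 12.0, 14.0]
```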

3.2 Sharing generalized systems across the organization

Aligning generalized systems with processes or sub-processes of the GSBPM has not only facilitated their design and their integration into the data production process, but has also contributed to technology exchange worldwide. For example, various organizations have been using Statistics Canada's generalized systems through a licensing system. Users include the U.K. Office for National Statistics, Italy's National Institute of Statistics (Istat), the Australian Bureau of Statistics, the New Zealand Bureau of Statistics, the Croatian Bureau of Statistics, Lockheed Martin (United States), the U.S. Bureau of Labor Statistics, and the U.S. Bureau of the Census.

Table 3.2.1: Generalized systems used in Statistics Canada

Name of Generalized System | Description/Main functionalities | Scope (social and/or business surveys)
G-Link | Record linkage | Social
G-Sam | Stratification, allocation, sampling | Business
G-Code | Coding | Social
Banff | Edit and imputation | Mainly business; has been used for social
CANCEIS | Edit and imputation | Social
G-Est | Estimation and variance due to imputation | Business
G-Series | Time series adjustment | Social
G-Confid | Disclosure control (confidentiality) | Business
G-Tab | Disclosure control and tabulation | Social
G-Export (corporate tool) | CANSIM table production | Both social and business

Component sourcing decisions should be made by considering "buy", "borrow" and "build" options. If components are available for sharing with other agencies, this may provide a cost-effective way to build up a library of components. For example, Statistics Canada uses the Blaise platform (originally created in the Netherlands) for certain parts of its collection activities.

3.3 Technology standardization

To manage its technology and application standardization, Statistics Canada has adopted a strategy based on the technology brick approach from Gartner, an information and communications technology research and analysis company. Under this approach, the organization expresses its standard for a given technology and develops a set of evolutionary roadmaps in a concise format. Each Technology Brick has two key pages: the strategy page and the roadmap page. Figure 3.2.7 shows an example of each.

The first row of the strategy page provides a technology baseline view (what is currently in use) and a forecast of what will be in use in two years and in five years.

The "technology lifecycle current state" view consists of a series of "states":

  • "Emerging" identifies technology that is new to the organization and is under evaluation.
  • "Mainstream" includes technology that reflects the current recommended or mandated approach.
  • "Containment" indicates technology whose deployment is limited to specific uses only, either because it is a "boutique" or niche technology used in special cases or because it is on its way to retirement.
  • "Retirement" indicates technology that is in the process of being decommissioned.

There is a natural flow through these technology lifecycle states. Typically, a new technology appears in "emerging" and is brought into Statistics Canada for evaluation. If it has important business value, it transitions to the "mainstream" state for use by all; if it is a special-purpose, controlled-use technology, it moves to "containment." As the technology or version ages, it moves from "mainstream" to "containment"; this freezes further deployment of the technology in preparation for its move to "retirement," where the technology is removed from service. It is important that these transitions be clearly communicated to business and IT participants, so that solutions dependent on these technologies can move to new versions or technologies in a planned manner.
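The lifecycle flow described above can be captured as a small state machine. The sketch below is only an illustration, with transition rules inferred from the description of the states; it is not an actual Statistics Canada tool.

```python
# Illustrative state machine for the technology-brick lifecycle states
# described above. The allowed transitions are inferred from the text;
# the helper itself is hypothetical.
ALLOWED_TRANSITIONS = {
    "emerging":    {"mainstream", "containment"},  # adopted broadly, or limited to niche use
    "mainstream":  {"containment"},                # frozen ahead of retirement
    "containment": {"retirement"},                 # removed from service
    "retirement":  set(),                          # terminal state
}


def validate_transition(current, proposed):
    """Raise ValueError if the proposed lifecycle transition is not allowed."""
    if proposed not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"invalid lifecycle transition: {current} -> {proposed}")


validate_transition("emerging", "mainstream")     # fine: adopted for use by all
validate_transition("mainstream", "containment")  # fine: deployment frozen
# validate_transition("retirement", "mainstream") would raise ValueError
```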

Figure 3.2.7: Technology Brick—Strategy page
Description of Figure 3.2.7

This figure shows the strategy page of the Technology Brick. The top of the table identifies the name, the owner, the date and the version. In this specific example, the name is Database Management System (DBMS), the owner is Martin Carbonneau from ITOD, the date is January 1, 2016 and the version is Final 2.0. The table is then divided into three rows.

The first row, the technology baseline outlook, has three parts:

  1. The first part is the baseline which is the current inventory (current technologies or components in use). Applications listed in this box are: Oracle Ent 11R1, 11R2 and 12C R1; MS SQL Server Ent 2005, 2008 R1, R2; MS SQL Server Dev 2005, 2008 R1, R2; MS SQL Server Ent 2012; Microsoft SQL Server Express 2005; Microsoft SQL Server Compact v3.5; Sybase 15.5, 16.0; MySQL 5.6; Microsoft Access 2007, 2013; Firebird DBMS and Postgres Enterprise DB.
  2. The second part shows the 2-year outlook (technologies that may be used in the near term, tactical time frame, what is currently available). Applications listed in this box are: Oracle Ent 11g R2, 12c R1, R2; MS SQL Server Ent 2008 R2; MS SQL Server Ent 2012, 2014, 2016; My SQL 5.6 for DRUPAL and Microsoft Access 2013 for local use only.
  3. The third part shows the 5-year outlook (technologies to be used in the future, providing strategic advantage, also anticipated marketplace products). Applications listed in this box are: Oracle Ent 12c R1, R2; MS SQL Server Ent 2014, 2016 and My SQL 6.0.

The second row, the technology lifecycle current state, has four parts or states described below.

  1. The first state, the retirement (EA standing = legacy), represents technologies or components targeted for de-investment in the 5-year period. Applications listed in this box are: Oracle Ent 11R1; MS SQL Server Ent 2005, 2008 R1; MS SQL Server Std 2005, 2008 R1, R2; MS SQL Server Dev 2005, 2008 R1, R2; SQL Server Express 2005; Microsoft Access 2007 for local use only; Sybase 15.5; Firebird DBMS and Postgres Enterprise DB.
  2. The second state, the mainstream (EA standing = standard/prescribed), represents technologies or components targeted as primary deployment or investment for new or legacy migration. Applications listed in this box are: Oracle Ent 12c R1; MS SQL Server Ent 2012 and Microsoft Access v2013 for local use only.
  3. The third state, the containment (EA standing = exception), represents technologies or components targeted for limited investment during the 5-year period, such as boutique (niche) use or pending transition to retirement. Applications listed in this box are: Oracle Ent 11g R2; MS SQL Server Ent 2008 R2; My SQL 5.6 for DRUPAL; Microsoft SQL Server Compact v3.5 for CPI Hand held devices only; Sybase 16.0; MS SQL Server Developer 2012, 2014 (for installation on workstations only, and used by MSDN subscribers only, refer to the note in the box called opportunities in the third and last row of the figure) and Sybase 15.7, 16.0.
  4. The fourth and last state of the second row, the emerging (EA standing = under evaluation), represents technologies or components to be evaluated for future integration and use based on technology availability, business need (e.g. evergreening). Applications listed in this box are: My SQL 6; Oracle Ent 12c R2; MS SQL Server Ent 2014, 2016; Hadoop, etc. and Big data.

The third and last row of this figure, the planning analysis summary, is divided into three boxes:

  1. The first box represents the implications and risks and it contains three bullets which are:
    1. Version sprawl needs to be reduced, platforms consolidated to achieve lower TCO. In early 2016, Statistics Canada will have six SQL versions deployed with one version in N-3 (obsolete) and two versions in N-2 Retirement (obsolete as of 2017).
    2. Due to hardware restrictions within the existing Statistics Canada data centre, Oracle migrations from AIX to Linux are on hold.
    3. MySQL is a technology under exception for WCMS use only which is presently deployed to the corporate Web, NDM and ICOS.
  2. The second box shows the dependencies and it contains one bullet which is:
    1. Application stacks, hosting centre competencies, procurement, licensing agreements.
  3. The third box represents the opportunities and it contains three bullets which are:
    1. Develop a cost / performance justification template to rationalize tech selection.
    2. Alignment with End State Datacentre DBMS offerings.
    3. Installation of SQL Server Developer 2012, 2014 is restricted to MSDN subscribers for installation on workstations only. This product shall be used to allow developers access to various tracing and debugging features of SQL Server that are not otherwise available to them on hosted servers. No support or backups will be offered, so TFS or similar products should be used to prevent the loss of source code. This is meant to enhance, not replace, the use of corporate infrastructure (Development, Test, QA and Production).

The bottom of the figure informs the reader that the architecture planning period at Statistics Canada is 5 years.

The second page in the brick shows the plan for technology evolution for the specific brick in a roadmap format (see figure 3.2.8). The horizontal axis shows time in quarterly increments, while the vertical axis shows technology names and versions. The letters in the cells indicate the lifecycle state of the technology at a particular point in time. In the example, one can see that a technology starts in "Emerging" with an "E" and then transitions through the states described above. The legend is shown at the top of the page. There will be one or more roadmap pages for a given brick, according to the complexity of the brick.
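A roadmap of this kind is essentially a table of lifecycle states over fiscal quarters. The sketch below shows one assumed way to record and query such a roadmap; the transition points are taken from the SQL Server Ent 2014 example in the figure description, but the data structure and the function are hypothetical.

```python
# Hypothetical representation of a technology-brick roadmap: for each product,
# an ordered list of (fiscal quarter, lifecycle state) transition points.
# States: E = emerging, M = mainstream, C = containment, R = retirement.
ROADMAP = {
    "SQL Server Ent 2014": [
        ("2015/2016-Q1", "E"),
        ("2015/2016-Q4", "M"),
        ("2017/2018-Q4", "C"),
        ("2019/2020-Q1", "R"),
    ],
}


def state_at(product, quarter):
    """Return the lifecycle state of a product in a given quarter
    (quarters written as 'YYYY/YYYY-Qn' strings sort chronologically)."""
    state = None
    for start, code in ROADMAP[product]:
        if start <= quarter:
            state = code
    return state


print(state_at("SQL Server Ent 2014", "2016/2017-Q2"))  # -> "M"
```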

Figure 3.2.8: Technology Brick—Roadmap page
Description of Figure 3.2.8

This figure shows the roadmap page of the Technology Brick. The top of the table identifies the name, the date and the version. In this specific example, the name is Database Management System (DBMS) SQL Server DBMS, the date is January 1, 2016 and the version is Final 2.0. The top of the table also informs the reader of the technology product lifecycle: the lifecycle goes from E for emerging to M for mainstream to C for containment to R for retirement to X for end of life.

The first column of the table lists the technology product inventory (ordered from emerging to retirement). There is then one column for each fiscal year for a total of five fiscal years which are: 2015/2016, 2016/2017, 2017/2018, 2018/2019 and 2019/2020. Each fiscal year is then divided into four columns, one for each quarter of the year: Q1, Q2, Q3 and Q4.

Each row of the table then goes through the lifecycle (from E to M to C to R to X) of each technology product chronologically from quarter to quarter and from fiscal year to fiscal year. The technology products and their lifecycle are listed below:

  1. SQL Server Ent/Dev 2005: R from Q1 to Q2 of 2015/2016 and X in Q3 of the same fiscal year
  2. SQL Server Ent/Dev 2008 R1: R from Q1 to Q3 of 2015/2016 and X in Q4 of the same fiscal year
  3. SQL Server Ent 2008 R2: C from Q1 to Q3 of 2015/2016, R from Q4 of 2015/2016 to Q3 of 2016/2017 and X in Q4 of 2016/2017
  4. SQL Server Ent 2012: M from Q1 to Q3 of 2015/2016, C from Q4 of 2015/2016 to Q3 of 2017/2018, R from Q4 of 2017/2018 to Q3 of 2018/2019 and X in Q4 of 2018/2019
  5. SQL Server Ent 2014: E from Q1 to Q3 of 2015/2016, M from Q4 of 2015/2016 to Q3 of 2017/2018, C from Q4 of 2017/2018 to Q4 of 2018/2019 and R from Q1 to Q4 of 2019/2020
  6. SQL Server Ent 2016: E from Q4 of 2016/2017 to Q3 of 2017/2018 and M from Q4 of 2017/2018 to Q4 of 2019/2020
  7. Oracle Ent 11g R1: R from Q1 to Q3 of 2015/2016 and X in Q4 of the same fiscal year
  8. Oracle Ent 11g R2: C from Q1 of 2015/2016 to Q4 of 2016/2017, R from Q1 to Q3 of 2017/2018 and X in Q4 of 2017/2018
  9. Oracle Ent 12c R1: M from Q1 of 2015/2016 to Q3 of 2016/2017 and C from Q4 of 2016/2017 to Q4 of 2019/2020
  10. Oracle Ent 12c R2: E from Q2 of 2015/2016 to Q3 of 2016/2017 and M from Q4 of 2016/2017 to Q4 of 2019/2020
  11. MySQL 5.6: M from Q1 of 2015/2016 to Q4 of 2016/2017, C from Q1 to Q4 of 2017/2018, R from Q1 to Q2 of 2018/2019 and X in Q3 of 2018/2019
  12. MySQL 6: E from Q1 to Q3 of 2016/2017 and M from Q4 of 2016/2017 to Q2 of 2018/2019
  13. SAP (Sybase) ASE 15.5: R from Q1 to Q4 of 2015/2016 and X in Q1 of 2016/2017
  14. SAP (Sybase) ASE 16: M from Q1 to Q3 of 2015/2016 and C from Q4 of 2015/2016 to Q4 of 2019/2020
  15. SAP (Sybase) IQ 15.2: R from Q1 to Q4 of 2015/2016 and X in Q1 of 2016/2017
  16. SAP (Sybase) IQ 16.0: M from Q1 to Q3 of 2015/2016 and C from Q4 of 2015/2016 to Q4 of 2019/2020
  17. SAP (Sybase) Replication 15.2: R from Q1 to Q4 of 2015/2016 and X in Q1 of 2016/2017
  18. SAP (Sybase) Replication 15.7: M from Q1 to Q3 of 2015/2016 and C from Q4 of 2015/2016 to Q4 of 2019/2020
  19. Microsoft Access 2007 (local use only): R from Q1 to Q4 of 2015/2016 and X in Q1 of 2016/2017
  20. Microsoft Access 2013 (local use only): M from Q1 of 2015/2016 to Q4 of 2017/2018
  21. FireBird DBMS: R from Q1 of 2015/2016 to Q3 of 2016/2017 and X in Q4 of 2016/2017

Statistics Canada's Enterprise Application Registry also manages this information; the registry records tombstone data for all applications and technologies in use, including dependencies. This information serves as the common reference point for the various parts of the governance process, linking together IT operations, application development, application and technology portfolio management, and enterprise architecture.
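In practice, a registry of this kind can be as simple as a structured "tombstone" record per application, with links to the technologies it depends on. The sketch below is an assumed illustration, not the actual Enterprise Application Registry schema; field names and entries are invented.

```python
# Assumed illustration of application "tombstone" records with technology
# dependencies, of the kind a registry could use as a common reference point
# for governance. Field names and entries are invented for this example.
from dataclasses import dataclass, field


@dataclass
class ApplicationRecord:
    name: str
    business_portfolio: str                                  # e.g., "Economic and Social Solutions"
    gsbpm_phase: str                                          # e.g., "Process"
    technologies: list[str] = field(default_factory=list)    # links to technology bricks


registry = [
    ApplicationRecord("EditImputeSystem", "Economic and Social Solutions",
                      "Process", ["MS SQL Server Ent 2012", ".NET"]),
]


def applications_using(technology, records):
    """List applications that depend on a given technology (e.g., to plan a retirement)."""
    return [r.name for r in records if technology in r.technologies]


print(applications_using("MS SQL Server Ent 2012", registry))  # -> ['EditImputeSystem']
```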

4. Effective management of IT security

For a statistical agency, it is critical to protect data provided in confidence, in accordance with high standards. To ensure impartiality, it is also important that statistical indicators be protected until they are ready for general release to all. To ensure that critical survey and indicator results are provided in a reliable and continuous manner, it is important to protect the IT infrastructure from damage.

The following three elements are essential to effectively managing IT security:

  1. ensuring the confidentiality of the information;
  2. maintaining the integrity of information at rest (e.g., in a database) and in transit (e.g., moving between different applications over the network), so that data cannot be modified, intentionally or unintentionally, without permission; and
  3. ensuring the availability of solutions to meet business needs (e.g., the solution is running when the business requires it).

Effective IT security management requires a close partnership with the business. The business is responsible for determining the confidentiality of the information in compliance with agency and government standards. The business is also responsible for identifying business needs with respect to integrity and availability, e.g., the minimum tolerable solution outage in the event something goes wrong.

IT is responsible for providing user authentication and data protection functions in its software, hardware, and networks to meet those requirements. Key deliverables include user authentication (e.g., login controls), access control (e.g., ensuring that a user is authorized to access the data, typically through permissions), and reporting and logging (to identify intruders, report on incidents, and ensure overall operability of solutions). In the availability area, IT is responsible for ensuring that a robust infrastructure, with strong security protection features, is in place with sufficient performance and capacity to meet business needs.
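The sketch below illustrates, in very simplified form, the combination of access control and logging described here. It is a toy example with invented names, not an account of Statistics Canada's actual controls.

```python
# Toy illustration of access control plus logging: a user may act on a dataset
# only if an explicit permission exists, and every decision is logged so that
# incidents can be reported on. The permission store and names are invented.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

# permission store: (user, dataset) -> allowed actions
PERMISSIONS = {
    ("analyst_a", "business_survey_2016"): {"read"},
}


def authorize(user, dataset, action):
    """Check whether a user may perform an action on a dataset, and log the decision."""
    allowed = action in PERMISSIONS.get((user, dataset), set())
    if allowed:
        logging.info("GRANTED %s %s on %s", user, action, dataset)
    else:
        logging.warning("DENIED %s %s on %s", user, action, dataset)
    return allowed


authorize("analyst_a", "business_survey_2016", "read")   # granted
authorize("analyst_a", "business_survey_2016", "write")  # denied and logged
```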

An effective IT security solution depends on trained and skilled personnel, standard operating processes, clear identification of the sensitivity of data and information, a secure source of login and access permissions, and systems that integrate with these controls. For more information about IT security and other security measures taken by Statistics Canada, refer to Chapter 4.6: Respecting privacy and protecting confidentiality.

5. Strong IT governance

The governance function is a critical success factor in realizing IT business outcomes in line with the organization's strategy and its IT strategy. It consists of two components:

  1. a set of principles, standards, guidelines, reference architectures, and frameworks consistent with the goals and elements of the strategy, and effective communication with and education for all stakeholders
  2. an assurance function that ensures alignment with these standards through review of projects, activities, and designs; this must include approvals, exceptions, and performance management of the function itself.

An important consideration for effective IT governance concerns the guiding principles used for decision-making:

  • Responsibility – the individuals and teams responsible for performing the IT business activities
  • Accountability – the individual(s) and teams accountable for the results of the activities
  • Authority – the individual(s) authorized to direct or modify activities, including initiation, acceptance, or cancellation.

These elements are aligned to maximize the benefit of the governance activity. They are interrelated, and work in harmony with each other to ensure that the organization can meet its goals. Ambiguity frequently arises, particularly in the aftermath of a centralization exercise in which IT resources are merged as the organization moves from a distributed model to a centralized core IT function. In some instances, clients may feel (or act as though) they have retained functional authority over certain IT activities, when this may not actually be the case.

The success of the governance function also depends heavily on clarity about intended business outcomes, terms of reference, and decision rights, as well as on a clearly defined scope and the efficient flow of information. Statistics Canada strongly recommends an evidence-based approach: relevant documents (e.g., design, specification, assessment) demonstrating compliance and alignment must be provided. This becomes especially important when one is dealing with external service providers (either within government or externally) and the challenges associated with different business mandates, multiple locations, and communications.

Ideally, the governance function is dynamic and accompanied by a high degree of transparency and organizational engagement. Success stories of "risks averted" or "solutions enabled" clearly show the value of the function. Compliance as a goal in itself is rarely successful; it must be viewed as a means to a successful end, as set out in the IT and organization strategies.

Statistics Canada's IT governance approach rests on a core set of considerations:

  • Is there an overarching business transformation strategy to be followed (e.g., at Statistics Canada, this consists of the CBA and the CBA Service Approach)?
  • Are there overarching concerns regarding the IT strategy within Statistics Canada or in the broader government context (such as mandated technologies)?
  • Have appropriate trade-offs been made in deciding on technology diversity vs. additional cost?
  • Are there inherent risks with existing or proposed technologies?
  • Is Statistics Canada ready to adopt these technologies, or will it be ready at a future date? Do these technologies help position the organization to be ready for changes (e.g., Big Data, increased use of administrative data, use of mobile technology for interviewers and consumers)?

Effective governance should have strategic performance measures that show its effectiveness in helping the organization attain its corporate business goals.

The IT governance forum at Statistics Canada is the Information Technology Architecture Committee (ITAC). The Director General of the Informatics Branch and the Chief Informatics Officer co-chair this committee. Its overall mandate is to ensure that IT systems are developed using sound architectural principles and a standard set of tools and methods, so they will meet the business needs of the organization and respect the IT security policies of both the organization and of the Government of Canada. ITAC carries out the following activities:

  • serves as a forum on IT enterprise architecture, ensuring the strategic alignment of technologies, applications and processes to support Statistics Canada's programs and priorities
  • reviews, promotes and prescribes the framework of IT enterprise architecture, using technology bricks and systems roadmaps to maximize the re-use of generalized systems, common solutions, reusable components, and best practices;
  • reviews and authorizes the transition plans for decommissioning redundant or obsolete solutions and technologies, taking into account system dependencies
  • ensures that new IT systems being developed are compliant with standards by conducting technical and security reviews and by managing exceptions
  • monitors key indicators for IT services (i.e., incidents and availability) from both the Informatics Branch and Shared Services Canada
  • develops recommendations for referral to the Executive Management Board

Figure 3.2.9: Statistics Canada's governance structure

Description of Figure 3.2.9

This figure explains the constitution of the Executive Management Board.

The Executive Management Board is divided into six distinct branches:

  1. Other Management Committee
  2. Security Coordination Committee (which leads another branch called the Security Review Committee)
  3. IT Architecture Committee (which leads another branch called the Technology Review Committee; this branch controls the Security Review Committee)
  4. CBA Management Committee (which also leads the Technology Review Committee, like the IT Architecture Committee)
  5. Methods and Standards Committee
  6. Information Management Committee

Figure 3.2.9 shows some of the management committees at Statistics Canada and the relative position of ITAC. Detailed technical and security reviews occur at subcommittees that report to ITAC and, in the case of security, to the Security Coordination Committee. ITAC reviews, from an architectural and IT operations perspective, the projects and solutions that have been approved by the CBA Management Committee (investment, project oversight) and by other committees as required. Together, these reviews ensure that Statistics Canada runs on a standardized set of platforms, technologies, and common statistical and information services, in support of CBA goals. As with all Government of Canada IT functions, Statistics Canada's IT services also receive direction and guidance in the form of functional directives, action notices, and management accountability reporting requirements from the relevant central agency (Treasury Board Secretariat) and from the organization in charge of providing government-wide IT infrastructure services (Shared Services Canada).

6. Integration of IT planning into Integrated Strategic Planning

Experience at Statistics Canada has shown that an annual planning process involving senior managers from across the organization (director-general level and above) is valuable in terms of reassessing the feasibility of, and the resources required for, both short- and long-term priorities. Every year, organizations should take stock of their information technology, and discuss and prioritize any adjustments to plans or resource requirements (see Chapter 2.2: Integrated strategic planning).

An annual review process like this serves to ensure that current projects are adequately resourced and that the organization has the capacity to support priority projects, including the necessary IT resources. It is important to review priorities with senior managers across the organization with the aim of ensuring that human, financial and IT resources are aligned to support priorities and to discuss any reallocation of resources between areas. A multi-year plan with targets and outcomes should be established and monitored to reflect this alignment. Data on the effort in person-days or person-years required to conduct different processes is useful information for such planning.

Organizations should develop a long-term investment plan that takes into account the regular renewal of IT systems and redesigns for all major surveys. This will help maintain quality standards, as described in the technology bricks that make up the IT roadmap. Given the cyclical nature of statistical work, and the need to refresh IT technologies over time, it is important to create and document plans for IT systems maintenance and enhancements over a three- to five-year period. IT development should be managed separately from ongoing operations (see Chapter 2.4: Project Management Framework). Of particular importance is a clear articulation of the future financial consequences of all IT activities. The expected duration and operating costs over the life cycle of a new technology must be well understood and documented before its introduction. In addition, one should consider the points in time when refurbishment or replacement is likely to be required, and the estimated cost of doing so.
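As a simple illustration of articulating those future financial consequences, the sketch below totals assumed acquisition, operating, and refresh costs over a planning horizon; every figure and parameter in it is invented.

```python
# Simple, invented illustration of life-cycle costing for a technology over a
# multi-year planning horizon: acquisition in year 0, annual operations, and a
# mid-life refresh. All amounts are hypothetical and undiscounted.
def lifecycle_cost(acquisition, annual_operations, refresh_cost, refresh_year, horizon_years):
    """Total cost of ownership over the planning horizon (no discounting)."""
    total = acquisition + annual_operations * horizon_years
    if refresh_year <= horizon_years:
        total += refresh_cost
    return total


# e.g., $500k to acquire, $120k/year to operate, $150k refresh in year 3, 5-year horizon
print(lifecycle_cost(500_000, 120_000, 150_000, refresh_year=3, horizon_years=5))
# -> 1250000
```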

7. IT resourcing strategy

Given the IT-driven nature of statistical agencies, it is very important to have IT specialists who have knowledge of specialized technologies and are experienced in providing solutions that address the different challenges of statistical production and innovation. This requires tight coupling between the technology and solution standardization efforts, noted above, and the development of specialist skills in the relevant areas.

Central to an effective IT workforce is a strong set of core values at the individual and team levels. The IT function needs to ask the following: "What are the guiding values or principles under which the work is performed and the mandate fulfilled?"

The essential values of an effective IT function include the following:

  • Client focus – Ensure clients (partners) are effective in their work by providing reliable, high-quality, flexible, and timely solutions.
  • Continuous learning – At both a personal level and an organization level, continuously improve, using research, market roadmaps, organization roadmaps, and learning sources in a multidimensional way.
  • Teamwork – Recognize that the success or failure of solutions depends on multiple teams and spans analysis, design, construction, assembly and production.
  • Cost-effectiveness – Ensure an optimum use of the resources to deliver solutions, avoiding waste through redundancy, low quality, and misalignment.
  • Innovation – Build a creative and curious team that looks for new approaches and opportunities, and that shares and encourages the expression, exploration and adoption of ideas with business value.

In addition, the IT function needs to make effective use of permanent, full-time resources and contractors in order to meet business needs, such as agility, flexibility, and capacity. This capacity needs to be assessed on an ongoing basis to establish a balanced approach to developing, delivering, and supporting services.

A large full-time permanent workforce offers the benefits of continuity, control, and domain expertise, at the expense of headcount-related costs, which typically make up the majority of an IT budget. Most organizations contract external services as part of their operations, either to meet peak capacity demands (e.g., to conduct a census) or to address skills gaps. Contracts that address skills gaps should include a knowledge-transfer component if those skills are part of the long-term strategy. As a rule, the resourcing approach should focus internal resources on core business activities, i.e., statistical production as opposed to corporate operations, unless those operations are key enablers for the business. The same considerations apply to software as a service and to infrastructure as a service (public and private clouds).

A country's national statistical office may also want to create a plan outlining human resources needs for the next five years, and link it strategically to the business plan. This plan should cover skills requirements, the need to recruit specialized staff, training, career advancement opportunities, and ways to maintain a positive workplace. Increasingly complex IT environments and security risks result in new staff training, development, and retention requirements.

Before implementing major changes, it will be important to carry out human resources capacity planning: How many and what kinds of staff are required, at each work location, to undertake large projects with a fixed time period and to continue ongoing work? This involves carrying out a workforce analysis, setting targets for the recruitment of new personnel, and projecting the timing of retirements and other potential departures (see Chapter 2.5: Human resources planning and management). Widespread training on corporate tools and IT systems should be encouraged.
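
As an illustration of such a workforce analysis, the sketch below uses invented staff counts, attrition rates and demand figures to project IT capacity against demand and to show where recruitment targets would fall short.

```python
# Minimal sketch (invented figures): projecting IT staff capacity against demand
# to check whether recruitment targets are sufficient ahead of a major project.
current_staff = 250                      # full-time IT employees
expected_departures_per_year = 15        # retirements and other attrition
recruitment_target_per_year = 20         # planned hires
demand_person_years = {2024: 255, 2025: 265, 2026: 270}  # project + ongoing work

staff = current_staff
for year, demand in sorted(demand_person_years.items()):
    staff = staff - expected_departures_per_year + recruitment_target_per_year
    gap = demand - staff
    status = "shortfall" if gap > 0 else "sufficient"
    print(f"{year}: capacity {staff}, demand {demand}, gap {gap:+} ({status})")
```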

Key success factors

From the perspective of a national statistical office, an IT modernization initiative relies on two important success factors: (1) the creation of common processes, systems and tools, and (2) IT management practices that focus on corporate strategic planning and lifecycle management.

1. Creation of common processes, systems and tools

A key success factor in efficient IT management is the development of corporate generalized systems (for collection, processing, analysis, and dissemination). To this end, it is important to ensure that standard working procedures be developed for each business process and, if possible, that these be aligned with the GSBPM. Subsequently, common IT systems and tools should be created for multiple surveys. This maximizes re-use, and reduces the diversity of computer systems and applications that the national statistical office is required to support.
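
One way to check that common tools actually cover the business processes is to map them to GSBPM phases. The sketch below is a hypothetical illustration: the tool names are invented, and only the GSBPM phase names come from the model itself.

```python
# Minimal sketch (hypothetical tool names): mapping corporate generalized systems
# to the GSBPM phases they support, to check coverage and spot gaps.
GSBPM_PHASES = [
    "Specify Needs", "Design", "Build", "Collect",
    "Process", "Analyse", "Disseminate", "Evaluate",
]

generalized_systems = {
    "Electronic questionnaire platform": ["Collect"],
    "Edit and imputation engine": ["Process"],
    "Estimation system": ["Process", "Analyse"],
    "Dissemination database": ["Disseminate"],
}

# Flag phases with no common tool (candidates for new shared services).
covered = {phase for phases in generalized_systems.values() for phase in phases}
for phase in GSBPM_PHASES:
    if phase not in covered:
        print("No common tool yet for phase:", phase)
```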

In allocating IT resources, modernization projects should have a high priority. The development of a transition plan for moving existing surveys into the new business process model is essential, and monitoring of this plan is required. Spreading out the transition over time will distribute the workload and provide opportunities to learn from the transition of the first surveys.

2. IT management practices focussing on corporate strategic planning and IT lifecycle management

Statistics Canada has effective IT management practices in place to manage risks associated with aging applications and technologies. These practices are built on three pillars:

  • a long-term investment plan, as part of the integrated strategic plan, to support the changes required to maintain the continuity and quality of statistical programs, including refreshing IT technologies (software and hardware) before they become aging-IT risks
  • an ongoing application portfolio management process that evaluates each application's use, function, age, and technology risk. Inventorying and assessing the organization's application portfolio provides an effective means of evaluating the business and technical value of applications and of making informed decisions about investments in the areas of risk and of greatest opportunity for both IT and the business. Application portfolio management helps mitigate the risks associated with aging IT: it identifies which applications to keep, decommission, or modernize (a simplified scoring sketch follows this list). Continual updating of the application portfolio management system is essential to rationalizing the number of applications in use in the organization and to informing decisions about the organization's application decommissioning plan. Application portfolio management supports both the organization's enterprise architecture framework and the CBA principles.
  • an application retirement plan (known as the System Roadmap) with target decommissioning dates for obsolete IT technologies that vendors no longer support, or are expected to stop supporting, and for applications that become redundant once all surveys have migrated to the common solutions set out in the CBA. The lifecycle of each technology is documented as a technology brick.
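
As noted above, here is a simplified scoring sketch. The dimensions, scales and thresholds are assumptions for illustration only, not Statistics Canada's actual portfolio-evaluation model; they show how business value and technology risk scores might translate into keep, modernize, or decommission recommendations.

```python
# Minimal sketch (assumed dimensions and thresholds): classifying applications
# in a portfolio by business value and technology risk.
from dataclasses import dataclass

@dataclass
class Application:
    name: str
    business_value: int   # 1 (low) .. 5 (high), e.g., based on use and function
    technology_risk: int  # 1 (low) .. 5 (high), e.g., based on age and vendor support

def recommendation(app: Application) -> str:
    if app.business_value <= 2:
        return "decommission"   # little business value, regardless of risk
    if app.technology_risk >= 4:
        return "modernize"      # valuable, but built on aging technology
    return "keep"               # valuable and technically sound

portfolio = [
    Application("Legacy edit-and-imputation tool", business_value=2, technology_risk=5),
    Application("Corporate collection platform", business_value=5, technology_risk=2),
    Application("Survey-specific tabulation system", business_value=4, technology_risk=4),
]
for app in portfolio:
    print(app.name, "->", recommendation(app))
```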

Challenges and next steps

IT is always changing and evolving, which underscores the need to keep abreast of emerging technologies, methods and innovations. The pursuit of innovation in IT should be an essential part of a national statistical office's strategy to respond with greater agility to the emerging data needs of statistical programs. The issues faced by IT management in a national statistical office include the following:

  • difficulty in managing competing priorities of clients
  • uneven uptake of advances in technology
  • lack of standardization of commonly used survey functions
  • proliferation of similar systems built with a wide variety of technologies

Taking a CBA service approach addresses many of these challenges. However, implementation of such an approach is not without its own challenges. Here are some of the lessons learned from Statistics Canada.

1. Transition planning / decommissioning of legacy systems

Statistics Canada recommends a transition plan for migrating existing surveys to any new business process model or IT system. Spreading out the transition will distribute the workload more equitably and provide opportunities to learn from the transition of the first surveys. Planning the decommissioning of old IT systems, once all surveys are migrated to new IT systems, should also be part of the IT plans.

IT can lead, but senior management of the subject-matter divisions must commit to and support plans for the decommissioning. The vision in terms of the current "as-is" state and the proposed "to-be" state, along with the rationale for the change, should be documented, so that this information can be clearly communicated to all interested persons.

Decommission dates must be clearly communicated to, and agreed upon by, the business-line owners. The decommissioning of systems should generate efficiencies that the organization can reinvest to support, maintain and enhance new corporate IT systems. IT management should be proactive in developing and communicating a decommissioning plan for old systems, while new systems are being implemented.

Subject-matter business-line owners should play a key role in planning the decommissioning of disparate legacy systems as they are replaced by common solutions. The Informatics Branch should know and understand the following (a simple inventory sketch follows this list):

  • Who are the owners? (both business-line and system owners)
  • Who are all of the users of each system component?
  • What data need to be kept or migrated? What data are to be archived or deleted?
  • Where are the systems and data located? (mapping to servers)
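
To make these questions actionable, the answers can be captured as one inventory record per system component. The sketch below is a hypothetical structure; the field names and example values are assumptions, not an actual Statistics Canada inventory format.

```python
# Minimal sketch (hypothetical fields): one inventory record per system component,
# capturing ownership, users, data and location facts needed before decommissioning.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DecommissionRecord:
    component: str
    business_line_owner: str
    system_owner: str
    users: List[str] = field(default_factory=list)
    data_to_migrate: List[str] = field(default_factory=list)
    data_to_archive_or_delete: List[str] = field(default_factory=list)
    servers: List[str] = field(default_factory=list)   # where the system and data reside
    target_decommission_date: str = ""                 # agreed with the business-line owner

record = DecommissionRecord(
    component="Legacy survey processing system",
    business_line_owner="Business Statistics Division",
    system_owner="Informatics Branch",
    users=["processing analysts", "methodologists"],
    data_to_migrate=["2015-2016 microdata"],
    data_to_archive_or_delete=["pre-2010 working files"],
    servers=["prod-app-07", "prod-db-03"],
    target_decommission_date="2018-03-31",
)
print(record.component, "owned by", record.business_line_owner)
```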

2. Balancing the need for standardization and innovation

Standardization and innovation are often viewed as natural "enemies": some feel that a strong focus on standardization stifles innovation, or that innovation leads to ad hoc and chaotic approaches. To avoid this pitfall, it is very important to identify how innovation and standardization complement one another. Standardization is an important tool for creating cost-effective organizations, which, in turn, can free up more resources for innovation. Innovations can evolve into future standards as organizations experiment with new approaches and technologies to determine their value. Innovation should focus on new ideas with good business-value potential; ad hoc approaches to basic technology and practices should not be confused with high-value innovation. A stable, standardized platform gives the organization a base for exploring and evaluating innovative ideas, which can then lead to projects that implement a new approach.

Endnotes:

Endnote 1

United Nations Economic Commission for Europe, 2015a, pp. 8–9.

Bibliography

Gartner (2006). Enterprise Technology Architecture PowerPoint Templates. Gartner document no. G00144188. Consulted on the 11th of March 2016 and retrieved from https://www.gartner.com/doc/498997/toolkit-enterprise-technology-architecture-powerpoint

United Nations Economic Commission for Europe (2013a). Generic Statistical Business Process Model v.5.0. Prepared by the High-Level Group for the Modernisation of Official Statistics. Consulted on the 11th of March 2016 and retrieved from http://www1.unece.org/stat/platform/display/GSBPM/Generic+Statistical+Business+Process+Model

United Nations Economic Commission for Europe (2013b). Generic Statistical Information Model Version 1.1. Prepared by the High-Level Group for the Modernisation of Official Statistics. Consulted on the 11th of March 2016 and retrieved from http://www1.unece.org/stat/platform/display/gsim/Generic+Statistical+Information+Model

United Nations Economic Commission for Europe (2015a). Common Statistical Production Architecture v1.5. Prepared by the High-Level Group for the Modernisation of Official Statistics. Consulted on the 11th of March 2016 and retrieved from http://www1.unece.org/stat/platform/display/CSPA/CSPA+v1.5

United Nations Economic Commission for Europe (2015b). Generic Activity Model for Statistical Organizations Version 1.0. Prepared by the High-Level Group for the Modernisation of Official Statistics. Consulted on the 11th of March 2016 and retrieved from http://www1.unece.org/stat/platform/display/GAMSO/GAMSO+Home
