Statistics Canada

Innovation indicators: More than technology?


Archived Content

Information identified as archived is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.

by Michael Bordt, EASD, Statistics Canada

The third edition of the Oslo Manual poses several challenges to future innovation surveys: measuring organizational innovation and marketing innovation; coping with complex and multinational organizations; understanding innovation in services and low-tech manufacturing.

About this article
Background
Main messages
Summary
References
About the author

About this article

This article summarizes the main messages from the CEIES seminar on "Innovation indicators: More than technology?" held in Denmark, February 5-6, 2007.

Presentations and papers presented at all CEIES workshops as well as background information on the CEIES are available at: http://forum.europa.eu.int/Public/irc/dsis/ceies/library

Choose the folder "Seminars 31-40" and then "32nd CEIES Seminar".

Background

The CEIES is a European Union committee that reflects the opinion of European society at large on Community statistics. CEIES stands for Comité consultatif européen de l'information statistique dans les domaines économique et social; in English: 'The European Advisory Committee on Statistical Information in the Economic and Social Spheres'.

Part of the work program of the CEIES is to organise seminars on current topics. The topic of the 32nd seminar, held in Århus, Denmark on February 5-6, 2007, was 'Innovation indicators: More than technology?'. While the main purpose of the seminar was to advise Eurostat on the implementation of the third edition of the Oslo Manual (OECD/Eurostat 2005), many of the main messages were of broader interest.

The third edition of the Oslo Manual poses several challenges for future innovation surveys:

  • It broadens the definition of innovation from technological product and process innovation to include organizational and marketing innovation. The term "technological", that is research and development (R&D)-based, has been dropped;
  • It places a greater emphasis on linkages with other firms and institutions in the innovation process;
  • It provides advice on obtaining information from the appropriate level of the organization;
  • It recognizes the importance of innovation in less R&D-intensive industries such as services and low-technology manufacturing;
  • It places emphasis on the creation of sub-national innovation statistics.

Main messages

Producer ability to collect data – some experiences

Peter Teirlinck (Belgium) noted that weighting procedures that take item non-response into account greatly affect the results.
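The sensitivity Teirlinck describes can be illustrated with a minimal sketch. The data, strata and weights below are invented, and the adjustment shown is a generic weighting-class method (respondents' design weights are inflated so they also represent the non-respondents in their stratum), not the Belgian procedure:

```python
# Weighting-class adjustment for non-response (hypothetical data).
# Within each stratum, respondents' design weights are scaled up by
# (total weight of sampled units) / (total weight of responding units),
# so respondents also stand in for the units with missing answers.

def adjusted_estimate(units):
    """units: dicts with stratum 'h', design weight 'w', and innovator
    flag 'y' (None when the item is missing). Returns the weighted
    innovation rate after the non-response adjustment."""
    by_stratum = {}
    for u in units:
        by_stratum.setdefault(u["h"], []).append(u)
    num = den = 0.0
    for stratum in by_stratum.values():
        total_w = sum(u["w"] for u in stratum)
        resp = [u for u in stratum if u["y"] is not None]
        resp_w = sum(u["w"] for u in resp)
        if resp_w == 0:
            continue  # no usable responses in this stratum
        factor = total_w / resp_w  # non-response adjustment factor
        for u in resp:
            num += u["w"] * factor * u["y"]
            den += u["w"] * factor
    return num / den

sample = [
    {"h": "small", "w": 10, "y": 1},
    {"h": "small", "w": 10, "y": None},  # item non-response
    {"h": "small", "w": 10, "y": 0},
    {"h": "large", "w": 2, "y": 1},
    {"h": "large", "w": 2, "y": 1},
]
print(round(adjusted_estimate(sample), 3))  # → 0.559
```

With these invented numbers, simply dropping the unit with the missing item gives 14/24 ≈ 0.583, while the adjusted estimate is 19/34 ≈ 0.559 — the kind of shift in results that makes the choice of weighting procedure matter.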

Giulio Perani (Italy) provided an example of a two-tiered approach in which respondents at headquarters are asked to obtain information from establishments if they cannot answer on their behalf.

The Danish Survey of Innovation (presented by Peter Mortensen) asks respondents for innovation expenditures in each postal code.

Tomohiro Ijichi (Japan) noted that the results of the non-response analysis of Japan's Survey of Innovation 2003 showed that 23% of non-respondents did not respond because they were unfamiliar with the survey.

Lynda Carlson (USA) highlighted the importance of questionnaire testing and stakeholder consultations in the redesign of the US R&D survey. Such consultations had suggested changes in wording (but not necessarily in concept) from the OECD manuals.

Response unit; new to firm; knowledge management

The author's own presentation focused on "Response unit; new to firm, market and world; knowledge management". The presentation advised that the unit responding to the survey should be capable of responding on behalf of other levels of the organization. If the reliability of such a proxy response cannot be assured, approaches that survey multiple levels of large organizations should be used.

The Oslo Manual recommends substituting "new to the market" for "new to the country" as a measure of regional novelty of innovation. In testing the question for the Canadian Survey of Innovation 2005 (Statistics Canada 2005), it was found that "new to the market" was interpreted by smaller companies as the local market of that company. To avoid mixing local with national concepts of innovation, "new to the country" was retained.

Although some aspects of knowledge management were recommended in the Oslo Manual, the author suggested that the broader set of practices as piloted by Canada and several other OECD countries (see OECD/Statistics Canada 2003) be considered for future innovation surveys.

Data providers' response, ability and willingness

Patrick Corbel (France) showed how "vignettes" or realistic innovation stories helped focus the Oslo Manual revision discussions on including organizational and marketing innovation. The vignettes were adapted from actual write-in questions on the French Community Innovation Survey (CIS3).

Viggo Maegard (Danfoss A/S, Denmark) noted that for Danfoss, it was impossible to provide an estimate of in-country innovation activities. Part of the reason for this is that, while R&D was conducted in Denmark, a majority of the sales were outside the country. Because of these complexities, the company was averse to providing "rough estimates" of innovation expenditures.

Peter Mortensen (Denmark) noted the improvement in response rates with the shorter CIS4 questionnaire. Despite this, some items obtained poor item response rates. He suggested that innovation surveys should be linked with already-available administrative and survey data to reduce response burden.

Comparative analyses based on Community Innovation Survey (CIS) data

Staffan Laestadius (Sweden) noted that a pilot survey on innovation in low-technology industries showed substantial knowledge production in these industries, which are not R&D intensive.

Leo Hannes (Austria) found that it was possible to conduct some international sectoral comparisons for some select firm types, such as gazelles and eco-industries.

Heidi Armbruster (Germany) showed the benefits of a more detailed survey (and a longer time-frame) that obtains more detailed information on organizational innovation. She also emphasized the collection of information on the extent of use of a given practice as well as when the practice was first introduced. Such detail may be impossible to include on broader innovation surveys but may be useful for occasional focused surveys.

The revised Oslo Manual and its implementation into CIS

Frank Foyn (Norway) presented the results of successive Norwegian surveys of innovation. The proportion of product and process innovators changed little when the term "technological" was dropped but increased when the concepts of organizational and marketing innovation were added. Intuitively, the opposite effects should have been observed.

Carter Bloch (Denmark) urged analysts of innovation data to develop composite indicators that would be more useful for telling a comprehensive story.

Vincent Dautel (Luxembourg) reported that shortening the reference period from three years to two did, as expected, reduce innovation rates in services, low-technology industries and small firms.

Aavo Heinlo (Estonia) emphasized that the concepts of "new" and "developing" are not as clear for non-technological innovation (organizational and marketing) as for technological product and process innovation.

User needs for new indicators – as well as the existing

Reinhard Büscher (European Commission) noted that existing innovation indicators were already playing a major role in the European Innovation Scoreboard (InnoMetrics 2006). Such composite indicators are essential as communication devices even though analysts debate the meaning and validity of national all-industry aggregates.
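Composite scoreboards of this kind are typically built by rescaling each component indicator to a common range and averaging. The sketch below uses min-max rescaling with equal weights, a common scoreboard construction; the country names and values are invented:

```python
# Minimal composite indicator: min-max rescale each indicator across
# countries, then average the rescaled values with equal weights.
# (Invented data; a generic construction, not the official EIS method.)

def composite(scores):
    """scores: {country: {indicator: value}} -> {country: score in [0, 1]}."""
    indicators = sorted({k for v in scores.values() for k in v})
    lo = {i: min(v[i] for v in scores.values()) for i in indicators}
    hi = {i: max(v[i] for v in scores.values()) for i in indicators}
    out = {}
    for country, v in scores.items():
        rescaled = [
            (v[i] - lo[i]) / (hi[i] - lo[i]) if hi[i] > lo[i] else 0.0
            for i in indicators
        ]
        out[country] = sum(rescaled) / len(rescaled)
    return out

data = {
    "A": {"rd_intensity": 2.0, "innovator_share": 40.0},
    "B": {"rd_intensity": 1.0, "innovator_share": 60.0},
    "C": {"rd_intensity": 3.0, "innovator_share": 50.0},
}
print(composite(data))  # C ranks first with a score of 0.75
```

The equal-weight average is exactly what makes such indices easy to communicate and, as the analysts' debate suggests, easy to question: the ranking can change under a different normalization or weighting scheme.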

Anthony Arundel (UNU Merit) maintained that analysing more detailed breakdowns of innovation data (by size class, industry, innovation type and R&D performance) would highlight the "neglected innovators": firms that manage to innovate despite not performing R&D, being small or being in service sectors.

Sven Olaf Nås (Norway) suggested asking non-innovators, as well as innovators, all questions on innovation surveys. Such firm-level information is useful for understanding the reasons for non-innovation and for discovering innovators who fail to report their innovation activities.

Giulio Perani (Italy) suggested facilitating access to all micro-data by European researchers.

CIS 2008 and beyond

August Gotzfried (Eurostat) described efforts underway to test modules on knowledge management, user-driven innovation and eco-innovation (that is, emerging environmental technologies) for the upcoming CIS.

How far and fast can we go?

Fred Gault reviewed the outcome of the Blue Sky II conference (see IAB, Vol. 8, no. 3, December 2006) and the challenges it posed. Blue Sky II urged the S&T indicators community to develop indicators that tell a comprehensive story; move from activity measures to impact measures; coordinate, focus and synthesize; move from macro to micro analysis; and develop a "science of science policy".

In parallel, new modes of analysis were advocated that would develop micro-analytic and macro models, incorporate case studies and, most essentially, support understanding of the Big Picture. In terms of new indicators, participants recommended improvements in cross-cutting areas: human resource measures; classifications and guidelines; firm characteristics; and sustainable development.

Summary

The third edition of the Oslo Manual posed several challenges to measuring innovation. The seminar showed that the coordinated experiments and analyses of the broad stakeholder community go a long way toward meeting these challenges. Blue Sky II, however, poses further challenges that will inspire work over the next decade.

References

InnoMetrics. 2006. European Innovation Scoreboard 2006. (accessed March 1, 2007).

OECD/Eurostat. 2005. Oslo Manual—guidelines for collecting and interpreting innovation data, 3rd edition. Paris, France.

OECD/Statistics Canada. 2003. Measuring knowledge management in the business sector—first steps. OECD Catalogue no. 96 2003 021 P. Paris, France and Ottawa, Canada.

Statistics Canada. 2005. Survey of Innovation. (accessed March 1, 2007).

About the author

Michael Bordt is with the Environment Accounts and Statistics Division (EASD) at Statistics Canada. For more information about this article, please contact sieidinfo@statcan.gc.ca.