1 Introduction

Roos Haer and Nadine Meidert


With the growing number of internet users and the increasing availability of broadband, the use of Web surveys to collect data is proliferating at a rapid pace (Vicente and Reis 2010). The advantages of this survey mode have been well documented: Web surveys save considerable time and money compared to other modes. Alongside these positive aspects, however, there are methodological concerns that cannot be ignored if survey quality is to be guaranteed (Vicente and Reis 2010). These concerns are mainly about nonresponse and coverage. Although coverage is of less concern for surveys of specifically named persons, such as students, nonresponse remains a major concern in Web survey research (Crawford, Couper and Lamias 2001).

Web surveys are associated with relatively low response rates compared to other modes of survey research (Lozar Manfreda, Bosnjak, Berzelak, Haas and Vehovar 2008). Low response affects all types of Web surveys, from list-based samples to pre-recruited probability-based panels and opt-in or volunteer panels (Couper and Miller 2008, page 833). The nonresponse rate is the sum of those respondents who did not participate in the Web survey although they were invited, together with those respondents who broke off and dropped out prematurely. In other words, nonresponders are those respondents who do not view and answer all questions (Bosnjak and Tuten 2001). It is important to note that we use the terms 'dropout' and 'breakoff' as synonyms throughout this study. Nonresponse is of particular importance to researchers because the unknown characteristics and attitudes of nonrespondents may cause inaccuracies in the results of the study in question (Bosnjak and Tuten 2001). This problem poses a challenge for any survey mode, but in particular for Web surveys (Galesic 2006). In general, dropout rates in Web surveys may be as high as 80 percent, with an average of about 30 percent. For individually targeted Web surveys, these rates are lower, but still average about 15 percent (Galesic 2006, page 313; Peytchev 2009).

By far, the largest number of respondents who drop out do so on the initial page, the welcome screen (Couper 2008). On this splash screen, the invitee is reassured that they have arrived at the right place, is informed about the content of the survey and (depending on the country context) about their privacy rights, and is encouraged to proceed to the survey itself. Consequently, this particular page plays an important role in sealing the deal, that is, in turning the invitee into a respondent (Couper 2008, page 330). However, despite its importance, the influence of the welcome screen on breakoff rates has received little to no research attention (Couper 2008, page 330).

This is even more surprising considering the rich multimedia capabilities of Web surveys, which allow text to be supplemented with a variety of visual elements, such as color, graphics, typography, and animation. Experimental research has shown that the content of the text as well as these auxiliary features are potentially powerful tools for maintaining respondents' interest in the survey and for encouraging completion of the instrument (Couper, Traugott and Lamias 2001). Although respondents are exposed to these design features from the very first screen that they see, most experimental survey research to date has examined the influence of the design of the complete survey on breakoff rates, rather than the influence of specific layout and design features of the welcome screen.

With this in mind, the goal of this study is to systematically explore some of the factors connected to the welcome screen that may affect the decision to break off the survey prematurely. As such, this research falls into the category of research focused on how to increase response rates (Bosnjak and Tuten 2001). In identifying these factors, special attention is devoted to the use of color, the announced length of the survey, and variations in describing the privacy rights of the respondent. These three factors are standard elements of the welcome screen that are often discussed in textbooks but neglected in empirical research.

The remainder of the article is structured as follows. First, we draw on empirical evidence from Web surveys to develop our hypotheses about the influence of particular design features of the welcome screen on breakoff rates. Next, we describe the experimental study we conducted to test these hypotheses. Finally, we conclude with a discussion and implications for Web survey design.
