3 Research design and implementation
Roos Haer and Nadine Meidert
The experiments we describe here were included in a survey of first-year Bachelor's and Master's students at the University of Konstanz, conducted by the university's quality management unit. This unit was interested in why students choose to study in Konstanz. We designed the questionnaire in close cooperation with the quality management unit, whereas the various designs of the welcome screen were conceptualized solely by us.
The different designs of the welcome screen were added after the content of the study had been determined. We tested the three features in a 2 x 2 x 2 experimental design. Table 3.1 gives an overview of the control and treatment groups; see Appendix A for an example of one of the eight possible welcome screens.
Table 3.1 Overview of the control and treatment groups (2 x 2 x 2 design)

| Announced survey duration | Privacy rights via link, white background | Privacy rights via link, red background | Privacy rights on welcome screen, white background | Privacy rights on welcome screen, red background |
|---|---|---|---|---|
| Short (8 min) | short / white / link | short / red / link | short / white / screen | short / red / screen |
| Long (20 min) | long / white / link | long / red / link | long / white / screen | long / red / screen |
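To make the factorial structure concrete, the following minimal Python sketch enumerates the eight treatment cells of Table 3.1 (the variable names are ours, purely for illustration, and not part of the actual survey software):

```python
from itertools import product

# The three binary factors of the 2 x 2 x 2 design
durations = ["short (8 min)", "long (20 min)"]
colors = ["white", "red"]
privacy = ["via link", "on welcome screen"]

# Enumerate all eight treatment cells (cf. Table 3.1)
for i, (dur, col, priv) in enumerate(product(durations, colors, privacy), start=1):
    print(f"Group {i}: duration {dur}, {col} background, privacy rights {priv}")
```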
To test the influence of background color on the likelihood of breakoff, we selected two layouts: one with black text on a white background and another with black text on a red background. We are aware that red is not a very realistic background color. Nevertheless, given the mixed results of previous research, we chose this color as a most-likely case for breakoff directly after the welcome screen. Red is a saturated color with a long wavelength. In addition, it may negatively affect the emotional response of respondents, since it is commonly used as a warning signal. We acknowledge that our research design cannot determine which of the discussed mechanisms (i.e., wavelength, saturation, or emotional response) drives a possible effect. Nevertheless, it can offer first insights into whether the color of the welcome screen matters at all. Note that we verified that the background colors were displayed identically across different browsers.
To examine the effect of privacy rights on the breakoff rate, we again created two designs: one version in which the privacy rights were described in detail directly on the welcome screen, and another in which the privacy rights were only briefly mentioned and respondents could follow a Web link that opened a new window describing their privacy rights in the same detail as in the first version.
To test the effect of the perceived duration, i.e., the length of the Web survey announced on the welcome screen, we announced two different completion times. We used the pretest results as guidance for estimating the duration; they indicated that completing the survey took around 12 minutes on average. Consequently, in one version of the survey we informed the sampled persons that it would take only about 8 minutes to complete, which corresponds to the expected minimum completion time, whereas the other group of respondents was told it would take about 20 minutes. The time needed to complete the survey depended to a great extent on how many answers respondents gave, since the questionnaire contained many filters. The actual mean completion time was 17.81 minutes (with a standard deviation of 9.01), considerably higher than the pretest had indicated.
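Timing figures of this kind are derived from the recorded paradata; a minimal sketch of the computation (the completion times below are hypothetical, not the actual data):

```python
import statistics

# Hypothetical completion times in minutes; the real paradata are not reproduced here
completion_times = [9.5, 14.2, 17.8, 21.0, 26.3, 12.7, 19.4]

mean_time = statistics.mean(completion_times)
sd_time = statistics.stdev(completion_times)  # sample standard deviation
print(f"mean = {mean_time:.2f} min, sd = {sd_time:.2f} min")
```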
The invitation to the Web survey was sent to all 2,629 first-year students' university e-mail accounts (see Appendix B). We focused on this particular student population because we assume that they had not been frequently exposed to Web surveys from the university and were therefore more inclined to fill out such a survey without 'satisficing' (Toepoel, Das and Van Soest, 2008). Once students clicked on the survey link (n = 1,419), they were randomly assigned to one of the eight groups, with a minimum of 151 and a maximum of 185 students per treatment group. On average, there were around 177 students per treatment group, with a standard deviation of 8.5 (see Table 4.1 for the exact number of respondents per treatment group).
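The random assignment step can be sketched as follows (a minimal illustration assuming a simple uniform draw per respondent; the actual mechanism used by the survey software may differ):

```python
import random

NUM_GROUPS = 8  # the eight cells of the 2 x 2 x 2 design

def assign_group(rng: random.Random) -> int:
    """Assign one respondent to one of the eight treatment groups uniformly at random."""
    return rng.randrange(NUM_GROUPS)

rng = random.Random(2011)  # fixed seed so the illustration is reproducible
assignments = [assign_group(rng) for _ in range(1419)]  # 1,419 students clicked the link
counts = [assignments.count(g) for g in range(NUM_GROUPS)]
print(counts)  # group sizes should cluster around 1419 / 8, i.e. roughly 177
```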
Since the information provided in the invitation e-mail often overlaps with that provided on the welcome screen, we limited the instructions in the e-mail as much as possible to isolate the possible effect of the welcome screen. Furthermore, to keep the e-mail's layout as simple as possible, no HTML formatting was used. With the invitation e-mail, the students received a unique URL in which their personal password was integrated, which prevented multiple completions of the survey. These invitation e-mails were sent on November 8, 2011, after a qualitative pretest with 15 individuals who worked in the university administration, were students, or had experience with survey research. This pretest focused on technical aspects of the survey and on the wording of the text and the questions.
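A per-respondent survey link of this kind can be generated along the following lines (a hypothetical sketch: the base URL is invented and the actual Unipark mechanism is not documented here):

```python
import secrets

BASE_URL = "https://survey.example.org/s"  # hypothetical base URL, not the real one

def make_personal_link(student_id: str) -> tuple[str, str]:
    """Create a unique, hard-to-guess token and the personal survey URL for one student."""
    token = secrets.token_urlsafe(16)  # unguessable per-respondent password
    return token, f"{BASE_URL}?id={token}"

# The token would be stored with the student record; once the survey is completed,
# the token is invalidated so the link cannot be used a second time.
token, url = make_personal_link("student-0001")
print(url)
```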
Five days after the initial e-mail had been sent, a reminder was sent to those who had not yet participated in the survey. A final reminder was sent on November 18, 2011, not only to those who had not yet participated but also to those who had started but not completed the survey. The survey closed on December 5, 2011. Of the 2,629 students, 1,419 started the survey, of whom 1,118 completed it entirely. Completion means that the participant arrived at the last page. Since all important questions were implemented as forced-choice, item nonresponse is not an issue. These figures correspond to response rates of 43 percent using the American Association of Public Opinion Research (AAPOR) RR1 definition and 54 percent using AAPOR RR2, which takes partial responses into account (AAPOR, 2011).
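Both rates follow directly from the figures above; a minimal sketch of the computation (simplified AAPOR definitions, ignoring eligibility adjustments):

```python
sampled = 2629    # invited first-year students
started = 1419    # clicked the link and started the survey
completed = 1118  # reached the last page

rr1 = completed / sampled  # AAPOR RR1: completed interviews only
rr2 = started / sampled    # AAPOR RR2: completed plus partial interviews
print(f"RR1 = {rr1:.0%}, RR2 = {rr2:.0%}")  # RR1 = 43%, RR2 = 54%
```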
The Web survey was implemented using the Unipark program, an online survey software package that allows users to create Web surveys with minimal effort. The program allows for straightforward programming and, compared with other providers, is inexpensive for scientific use. Unipark is also rather flexible in several respects. For example, participants can interrupt filling in the questionnaire and continue later. Furthermore, the system records when participants break off, as well as the time they need to complete the questionnaire and individual screens. In addition, it allows sophisticated and non-standardized tools to be integrated into the questionnaire design. Despite the increasing use of mobile devices, we did not implement a mobile webpage, which proved to be a reasonable decision, as only 65 participants used mobile devices to fill in the questionnaire. However, these participants, although more likely to access the survey, were also more likely to break off (21 percent breakoff among participants without mobile devices versus 35 percent among those with mobile devices, p < 0.01).
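The reported significance level can be checked with a standard two-proportion z-test; the sketch below uses breakoff counts reconstructed approximately from the rounded percentages in the text (about 23 of 65 mobile and 284 of 1,354 non-mobile participants), so the exact figures are assumptions:

```python
from math import sqrt, erf

def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                       # pooled breakoff proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# ~35% breakoff among 65 mobile users vs. ~21% among 1,354 non-mobile users
z, p = two_proportion_ztest(23, 65, 284, 1354)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.01, consistent with the reported result
```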