Analysis
Results
All (3)
- 1. Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias (Archived)
  Articles and reports: 12-001-X200800110607
  Description: Respondent incentives are increasingly used as a measure to combat falling response rates and the resulting risk of nonresponse bias. Nonresponse in panel surveys is particularly problematic, since even low wave-on-wave nonresponse rates can lead to substantial cumulative losses; if nonresponse is differential, this may lead to increasing bias across waves. Although the effects of incentives have been studied extensively in cross-sectional contexts, little is known about cumulative effects across waves of a panel. We provide new evidence about the effects of continued incentive payments on attrition, bias and item nonresponse, using data from a large-scale, multi-wave, mixed-mode incentive experiment on a UK government panel survey of young people. In this study, incentives significantly reduced attrition, far outweighing negative effects on item response rates in terms of the amount of information collected by the survey per issued case. Incentives had proportionate effects on retention rates across a range of respondent characteristics and as a result did not reduce attrition bias in terms of those characteristics. The effects of incentives on retention rates were larger for unconditional than for conditional incentives, and larger in postal than in telephone mode. Across waves, the effects on attrition decreased somewhat, although the effects on item nonresponse and the lack of effect on bias remained constant. The effects of incentives at later waves appeared to be independent of incentive treatments and mode of data collection at earlier waves.
  Release date: 2008-06-26

- 2. Design effects for multiple design samples (Archived)
  Articles and reports: 12-001-X20060019256
  Description: In some situations the sample design of a survey is rather complex, consisting of fundamentally different designs in different domains. The design effect for estimates based upon the total sample is a weighted sum of the domain-specific design effects. We derive these weights under an appropriate model and illustrate their use with data from the European Social Survey (ESS). (A schematic form of this weighted-sum decomposition is sketched after this listing.)
  Release date: 2006-07-20

- Articles and reports: 12-001-X20050018093 (Archived)
  Description: Kish's well-known expression for the design effect due to clustering is often used to inform sample design, using an approximation such as b̄ in place of b. If the design involves either weighting or variation in cluster sample sizes, this can be a poor approximation. In this article we discuss the sensitivity of the approximation to departures from the implicit assumptions and propose an alternative approximation. (Kish's expression is recalled in the sketch after this listing.)
  Release date: 2005-07-21
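As a reading aid for the second item above (12-001-X20060019256): the decomposition its abstract describes can be written schematically as a weighted sum. The notation below is illustrative only; the domain index d, the domain-specific design effects, and the weights W_d are placeholder symbols, and the specific form of the weights is precisely what the article derives.

\[
\mathrm{deff} \;=\; \sum_{d=1}^{D} W_d \, \mathrm{deff}_d
\]

Here \(\mathrm{deff}_d\) denotes the design effect within domain \(d\) and \(W_d\) the weight attached to that domain in the total-sample estimate.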
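For the third item above (12-001-X20050018093): the "well-known expression" referred to in its abstract is Kish's standard design effect due to clustering, recalled below. The symbols follow common usage rather than the article's own notation: b is the cluster sample size and \(\rho\) the intraclass correlation coefficient.

\[
\mathrm{deff}_{c} \;=\; 1 + (b - 1)\,\rho
\]

When cluster sample sizes vary, the mean cluster size \(\bar{b}\) is commonly substituted for \(b\); as the abstract notes, this substitution can be a poor approximation if the design involves weighting or variation in cluster sample sizes.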
Stats in brief (0)
No content available at this time.
Articles and reports (3)
Journals and periodicals (0)
No content available at this time.