Dynata has tracked the impact of questionnaire length on fatigue and data quality for more than a decade and has found that the effects of increasing length have remained consistent over time:
- Response rate does not depend on interview length.
- Longer surveys do not necessarily mean increased drop-out; most drop-out occurs by the halfway stage, irrespective of interview length. However, drop-out among participants taking the survey on mobile phones does increase as length increases.
- Participants get fatigued, pay less attention to the task in hand and increase their speed of response as the interview progresses.
- Data quality suffers as interview length increases.
Although response rates have plummeted since online research began almost 20 years ago, drop-out rates have remained remarkably constant. This shows, perhaps, that people have the same level of attention span, or at least the same level of commitment to complete a task once they have started.
Whether a survey is short (about 10 minutes) or long (about 30 minutes), most of the drop-out occurs by the halfway point of the survey. We have noted in recent years that people are completing surveys more quickly than they used to. This may be due to increased bandwidth leading to faster page-load times, or to increasing familiarity with the survey environment on the part of many participants (knowing where the buttons are and the like). By employing a block-rotation design, we see that as the same block of questions is moved further back in the study, the time taken to complete it gradually reduces. This could be due to increased familiarity with the question set, but we were able to show that at least some of the increased speed was due to fatigue.
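The block-rotation design mentioned above can be sketched as follows. This is a minimal illustration of the general technique, not Dynata's actual implementation, and the block names are invented:

```python
# Sketch of a block-rotation design: every participant sees the same
# question blocks, but the starting block rotates by participant, so
# each block appears in each position equally often across the sample.
# Block names are illustrative only.

BLOCKS = ["A", "B", "C", "D"]

def rotated_order(participant_index, blocks=BLOCKS):
    """Return the block order for one participant, rotated by index."""
    k = participant_index % len(blocks)
    return blocks[k:] + blocks[:k]

# Across four participants, block "A" occupies each of the four
# positions exactly once, which lets position effects (fatigue) be
# separated from block content.
for i in range(4):
    print(i, rotated_order(i))
```

Because each block's content is held constant while its position varies, any change in completion time for the same block can be attributed to its position in the survey rather than to the questions themselves.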
One piece of research that demonstrates this is a series of questions using a slider bar. The slider is pre-positioned at the mid-point, so a respondent can click "next" without moving it and still leave some data behind. Results show that the likelihood of not touching the slider increased the further into the questionnaire the question was encountered.
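The underlying check is simple to express: count how often the slider was left at its default value, grouped by where the question appeared. The sketch below uses invented response data purely for illustration:

```python
# Sketch of the slider-bar check: measure how often respondents left
# the slider at its default mid-point, by question position.
# The responses below are made up for illustration.

DEFAULT = 50  # slider pre-positioned at the mid-point of a 0-100 scale

# (position_in_survey, slider_value) pairs — hypothetical data
responses = [
    (1, 72), (1, 50), (1, 31), (1, 65),
    (4, 50), (4, 50), (4, 48), (4, 50),
]

def untouched_rate(responses, position):
    """Share of answers at `position` still sitting at the default."""
    vals = [v for p, v in responses if p == position]
    return sum(v == DEFAULT for v in vals) / len(vals)

print(untouched_rate(responses, 1))  # early in the survey -> 0.25
print(untouched_rate(responses, 4))  # late in the survey  -> 0.75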
In another test, one block of questions about vacations was shown only to people who said they had been on a vacation. Since participants were randomly assigned to see this block either early or late in the survey (1st, 2nd, 3rd or 4th position), we would expect the incidence of qualification to be the same wherever the block was positioned, and the same in both the long and short surveys. This was not the case. The highest incidence of qualification for the vacation block was 68%, which occurred in the short survey when the block was positioned first. Qualification declined at each position further into the questionnaire, finally reaching 50% when the block was positioned 4th. The same phenomenon occurred in the long survey: qualification started at 64% and finished at 47%.
Data quality can also be measured by answers to open questions. There should be no real difference in the number of words or characters used in the same open-ended question, whether it is asked first or last. We found that in the long survey the number of characters typed decreased as the survey progressed.
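The open-end comparison amounts to comparing average answer length for the same question asked early versus late. A minimal sketch, with invented answers:

```python
# Sketch of the open-end length check: compare characters typed in the
# same open question when it appears early vs. late in the survey.
# The answers below are invented for illustration.

early_answers = ["I liked the beach and the food was great", "Lots of sun"]
late_answers = ["ok", "fine"]

def mean_chars(answers):
    """Average number of characters across a set of open-end answers."""
    return sum(len(a) for a in answers) / len(answers)

# Fatigue shows up as shorter answers later in the questionnaire.
print(mean_chars(early_answers) > mean_chars(late_answers))  # True
```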
Participants (especially online panelists) tend to keep going on to the end of any survey, however long. Is it their fault that they are tired and cannot think as clearly as we would like them to?
In the “old days” of telephone and face-to-face interviewing, the interviewer would hear when participants became tired and started to satisfice. They would take pity on them, perhaps dropping out of interview mode for a moment or two, providing a mental break before going back to the task in hand. In online self-completion mode this type of encouragement is absent.
Dynata’s original research into fatigue effects in long surveys was conducted in 2004. When we repeated the study in 2009, the same results were found – fatigue and satisficing increased later in the survey, especially after the 15-20-minute mark.
Dynata suggests a number of strategies to reduce questionnaire length, including:
1. Remove or shorten questions wherever possible. If the answer never changes, does the question need to be asked? Can a question be asked of only some of the respondents?
2. Use factor analysis to reduce the number of items in grids.
3. Break the questionnaire into chunks and piece the data together after fieldwork.
4. Use the demographic and other information the sample provider holds to avoid re-asking for this information.
If researchers work to keep surveys shorter, it will not only help ensure response quality but will also make for more motivated and responsive participants. For more research on survey fatigue and how to improve your surveys, contact us.