Posted by John Kleeman
This is the third and last post in my “Agree or disagree” series on writing effective attitude surveys. In the first post I explained the process survey participants go through when answering questions and the concept of satisficing – where some participants give what they think is a satisfactory answer rather than stretching themselves to give the best answer.
In the second post I shared these five tips based on research evidence on question and survey design.
Tip #1 – Avoid Agree/Disagree questions
Tip #2 – Avoid Yes/No and True/False questions
Tip #3 – Each question should address one attitude only
Tip #4 – Minimize the difficulty of answering each question
Tip #5 – Randomize the responses if order is not important
Here are five more:
Tip #6 – Pretest your survey
Just as with tests and exams, you need to pretest or pilot your survey before it goes live. Participants may interpret questions differently than you intended. It’s important to get the language right so that each question triggers the right judgement in the participant. Here are some good pre-testing methods:
- Get a peer or expert to review the survey.
- Pre-test with participants and measure the response time for each question (shown in some Questionmark reports). A longer response time could be connected with a more confusing question.
- Allow participants to provide comments on questions they think are confusing.
- Follow up with your pretesting group by asking them why they gave particular answers or asking them what they thought you meant by your questions.
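If you export per-question response times from your pilot, a simple way to spot possibly confusing questions is to flag any question whose average time is well above the others. Here is a minimal sketch (the question names, timings and the 1.5× threshold are hypothetical illustrations, not a Questionmark feature):

```python
from statistics import mean, median

def flag_slow_questions(times_by_question, factor=1.5):
    """Flag questions whose mean response time is well above the median
    of the per-question means -- a possible sign of confusing wording."""
    means = {q: mean(times) for q, times in times_by_question.items()}
    baseline = median(means.values())
    return sorted(q for q, m in means.items() if m > factor * baseline)

# Hypothetical pilot timings (seconds per participant) for three questions
pilot = {
    "Q1": [8, 10, 9, 11],
    "Q2": [25, 30, 28, 27],  # much slower: wording may need review
    "Q3": [12, 9, 10, 13],
}
print(flag_slow_questions(pilot))  # -> ['Q2']
```

Whatever threshold you choose, treat flagged questions as candidates for follow-up interviews rather than automatic rewrites.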
Tip #7 – Make survey participants realize how useful the survey is
The more motivated a participant is, the more likely he or she is to answer optimally rather than just satisficing and choosing a good enough answer. To quote Professor Krosnick in his paper The Impact of Satisficing on Survey Data Quality:
“Motivation to optimize is likely to be greater among respondents who think that the survey in which they are participating is important and/or useful”
Ensure that you communicate the goal of the survey, and make participants feel that completing it will benefit something they believe in or value.
Tip #8 – Don’t include a “don’t know” option
Including a “don’t know” option usually does not improve the accuracy of your survey. In most cases it reduces it. To those of us used to the precision of testing and assessment, this is surprising.
Part of the reason is that providing a “don’t know” or “no opinion” option allows participants to disengage from your survey and so diminishes useful responses. Also, people are better at guessing or estimating than they think they are, so they will tend to choose an appropriate answer if they do not have an option of “don’t know”. See this paper by Mondak and Davis, which illustrates this in the political field.
Tip #9 – Ask questions about the recent past only
The further back in time they are asked to remember, the less accurately participants will answer your questions. We all have a tendency to “telescope” the timing of events and imagine that things happened earlier or later than they did. If you can, ask about the last week or the last month, not about the last year or further back.
Tip #10 – Trends are good
Error can creep into survey results in many ways. Participants can misunderstand the question. They can fail to recall the right information. Their judgement can be influenced by social pressures. And they are limited by the choices available. But if you use the same questions over time with a similar population, you can be pretty sure that changes over time are meaningful.
For example, if you deliver an employee attitude survey with the same questions for two years running, then changes in the results to a question (if statistically significant) probably mean a change in employee attitudes. If you can use the same or similar questions over time and can identify trends or changes in results, such data can be very trustworthy.
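To check whether a year-over-year change like this is statistically significant, a standard approach is a chi-square test on the two sets of counts. As an illustration with made-up numbers (120 of 400 employees agreeing last year vs. 160 of 400 this year), here is a self-contained sketch using the chi-square test of independence for a 2×2 table:

```python
from math import erfc, sqrt

def two_proportion_chi_square(agree1, total1, agree2, total2):
    """Chi-square test (1 degree of freedom) for whether the proportion
    of participants choosing a response differs between two survey runs."""
    a, b = agree1, total1 - agree1   # year 1: agree / not agree
    c, d = agree2, total2 - agree2   # year 2: agree / not agree
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # Survival function of the chi-square distribution with 1 df
    p_value = erfc(sqrt(chi2 / 2))
    return chi2, p_value

# Hypothetical data: 120/400 agreed last year, 160/400 this year
chi2, p = two_proportion_chi_square(120, 400, 160, 400)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p < 0.05, so the shift is significant
```

With small samples or very lopsided proportions, a more careful test (such as Fisher’s exact test) would be the safer choice.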
I hope you’ve found this series of articles useful. For more information on how Questionmark can help you create, deliver and report on surveys, see www.questionmark.com. I’ll also be presenting at Questionmark’s 2016 Conference: Shaping the Future of Assessment in Miami April 12-15. Check out the conference page for more information.