When was the last time you got asked to fill out a survey? Probably not that long ago. Did you do it?
I was talking to a friend who said that in the past couple of days, she had been asked to do three surveys, all from different suppliers...all asking for fifteen minutes of her time. And she said,
"I know people who love surveys but I’ll be honest with you, sometimes I find them a little tiring! But now the shoe's on the other foot. I've been asked to write a survey!"
We live in an information society and crave knowledge about our attitudes, behaviours and communities.
Surveys are an effective way of gathering this information and can reveal opportunities, reduce uncertainties and identify the performance and potential direction of our companies. For better or for worse, they are everywhere.
Over the years, I have discovered that there can be a lot more to surveys than meets the eye. There’s good reason that research methodology is studied at a tertiary level. It's not as straightforward to develop a good survey as you might think.
So I asked my friend if she would share what she has just learned about survey question development. With her permission, I've combined her new knowledge with everything I've learned over the years about what makes a survey question effective or ineffective.
And I'm sharing it with you because you never know when it’ll be your turn.
It’s hard to know where to start. Things like defining your objectives, delivery, structure and font are important, but I’m going to jump right into the meaty bit and tell you what both my friend and I have learned about writing survey questions.
WRITING YOUR QUESTIONS
Like I said, I have done a lot of surveys and although that doesn’t tell me how to write a question, I do know when I find a question frustrating and what I wish the author had thought of before they wrote the question.
Here are some things you need to think about when writing research questions:
Know your audience – Your audience is donating their time, unless of course they’re procrastinating…in which case they’re still donating their time, but they’re probably happier to do it.
Whatever the case, their time is precious so don’t waste it. Aim to make your survey around 15 minutes; any longer and you risk survey fatigue.
Make the survey relevant. Someone in Sydney is not going to care about the daily commute on Melbourne transport.
Make sure they have the knowledge to answer your questions. It’s unlikely that an employee from finance will be aware of the busy periods on the retail floor. It’s possible, but unlikely.
Target your audience by selecting participants who care about the results.
Be concise and express ideas simply – Now you know your audience, write your questions in a way that won’t frustrate them and you’ll be more likely to get a response.
Don’t ask unnecessary questions. If you have an email address, then you probably don’t need a mobile number or postal address. Unless you want to send them a thank-you gift in which case tell them that.
Reciprocity is a powerful tool. If you give your audience something, they are more likely to return the favour – by completing the survey.
Use simple language and stay away from jargon. This is not the time for lyrical flights of fancy. Explain sufficiently but don’t over explain. The following is a true case from the NBC/Wall Street Journal (1).
Example 1 - A Bit More Information Changes The Whole Picture...
The first question doesn’t define ‘government entitlements’. Without this knowledge the respondents don’t know what the cuts relate to, but as the majority are in favour of reducing the budget deficit they agree to cuts.
The second question explains the programs that are likely to be cut. Now the respondents know how they are to be affected and are strongly opposed to cuts.
In this case under explaining produced opposing results and compromised data.
Avoid bias and leading questions – This is a tough one because as the author you have a certain idea of how you want your responses and resulting data to look. So it’s hard not to word the question in a way that leads the respondent to support your hypothesis.
Asking ‘How much did you like the movie?’ shows a bias towards liking the movie even when you include a ‘not at all’ category. It would be more neutral to ask ‘How did you feel about the movie?’.
‘Do you think that research scientists are underpaid?’ reveals that the author thinks the scientists are underpaid and wants the respondent to agree.
No matter how satisfying it would be to show your survey sponsor a report saying scientists are underpaid, it would be inaccurate and, ironically, in a weird, convoluted way, potentially a justification for lower pay (because you will have just demonstrated that you didn't know how to write a good survey question. Think about it...).
Best not to get on that downward spiral. Keep it accurate from the start and if it’s hard to identify your own bias get someone else to do it for you by pre-testing your survey.
Avoid double barrelling – Double barrelling is when two thoughts are expressed in the same question, such as ‘How well do you understand your pay and benefit entitlements?’ with the responses on a scale from ‘very well’ to ‘not well at all’.
In this instance a respondent may understand their pay but not their benefits, but the responses don’t allow that distinction.
It is tempting to double-barrel, especially when the questions are similar, as it looks like it reduces the number of questions and saves the respondents time – but it doesn’t.
It introduces ambiguity and frustration and increases the likelihood that the respondent will dump the survey.
Complex ideas need to be broken down into multiple questions, with one concept per question.
Formatting with a matrix as demonstrated below is an easy way to help the respondent answer similar questions.
Example 2: Make It Easy For Respondents To Answer Similar Questions
Please rate your experience with the following:
Precision – Precision ensures that every respondent interprets the question in the same way.
Instead of asking ‘How many?…’ ask ‘How many times in the last week?’, which has the additional benefit of giving you more measurable data.
A dichotomous question such as ‘Do you like orange juice?’ could be turned into an ordinal-type question that ranks different variables such as taste, texture, nutritional content or price.
‘Do you watch TV regularly?’ has similar issues. What does ‘regularly’ mean? And is a downloaded TV series or movie the same as ‘TV’? Precision is king. When in doubt, get someone to review and advise you.
Risky words – Sartre wrote that ‘Words are loaded pistols’. His meaning is beyond the scope of this blog, but a literal interpretation could imply that words have the potential to cause damage.
For our purpose, there are certain risky words probably best avoided. The words could/should/might may seem interchangeable.
However, swapping them around in a question such as ‘The Supreme Court could/should/might change the limits on free speech in light of terrorist activities’ would yield markedly different responses.
Strong words such as ‘prohibit’ or ‘oppressive’ should be avoided as should words with high-emotional connotations such as in the question ‘Do you think terminally ill people suffering from cancer should be released from their pain?’.
Exhaustive and mutually exclusive responses – Just as your questions need care, so do your responses. An exhaustive answer range is one that always gives the respondent an option. We never want them to think, “my answer isn’t here.” This problem is easily solved by including an “other” option, with or without a text box.
Example 3 - Provide A Way To Make All Possible Choices Available
Mutually exclusive responses guide the respondent in such a way that they can only choose one answer. No answers overlap. In Example 4, Set 1 is confusing as those aged 25 can legitimately choose one of two answers. Set 2 shows a truly mutually exclusive range of answers.
Example 4 - Make Your Potential Response Choices Mutually Exclusive
Just remember that the aim of the game is to make it as easy as possible for your audience to select an answer.
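If you're building numeric answer ranges by hand, a quick script can catch overlaps and gaps before your respondents do. This is just a minimal sketch – the age brackets below are illustrative, not taken from any real survey:

```python
# Illustrative sketch: check that numeric answer ranges are
# mutually exclusive (no overlaps) and exhaustive (no gaps).
# The age brackets are made-up examples.

def check_ranges(ranges):
    """ranges: list of (low, high) tuples with inclusive bounds, sorted by low."""
    problems = []
    for (lo1, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        if lo2 <= hi1:
            problems.append(f"overlap: {lo1}-{hi1} and {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            problems.append(f"gap: nothing covers {hi1 + 1}-{lo2 - 1}")
    return problems

# Overlapping brackets, like the confusing set: a 25-year-old fits twice.
set1 = [(18, 25), (25, 35), (35, 45), (45, 55)]
# Mutually exclusive brackets: every age has exactly one home.
set2 = [(18, 25), (26, 35), (36, 45), (46, 55)]

print(check_ranges(set1))  # reports three overlaps (at 25, 35 and 45)
print(check_ranges(set2))  # [] – no problems
```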
Accommodate human nature – Human nature can be capricious, but when it comes to surveys it’s pretty predictable. Avoid extremes in your answer range. Make sure your typical response is in the middle of the answer range not at the end.
Pre-testing can help identify the typical response. Question ordering can influence how a person answers a question, which can be minimised by using the randomisation feature offered by many online survey tools.
Response ordering can also impact the answers as it’s common for respondents to pick the first answer. Bias comes into responses just as it does with questions as demonstrated in the table below.
Example 5 - Watch Out For Bias
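For non-ordinal options (where the order carries no meaning), the randomisation idea can be sketched in a few lines. This is an illustrative example, not any particular survey tool's implementation, and the option names are made up:

```python
# Minimal sketch of per-respondent response-order randomisation,
# to spread out the tendency to pick the first answer shown.
# Option names are hypothetical; don't shuffle ordered scales like Likert items.
import random

def randomised(options, keep_last=None):
    """Return options in random order; optionally pin one (e.g. 'Other') to the end."""
    pool = [o for o in options if o != keep_last]
    random.shuffle(pool)
    if keep_last is not None:
        pool.append(keep_last)
    return pool

options = ["News", "Drama", "Sport", "Comedy", "Other"]

# Each respondent sees a different ordering, with 'Other' always last.
print(randomised(options, keep_last="Other"))
```

Many online survey tools offer this as a built-in setting, so in practice you would tick a box rather than write code – but it helps to know what the box is doing.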
…and finally – Allow the respondent the opportunity to give their feedback in an open text box. It probably won’t be completed, but I know a lot of people who get annoyed if it’s not there.
I did mention that there is more to surveys than meets the eye, and my friend and I still have much more to share.
We're looking into information about samples, pilots, analysis and timing - giving a survey to a mother with a full shopping trolley and three screaming toddlers at the checkout will not promise a high response rate. Nor will giving a student a survey during an exam period.
However, just as there is survey fatigue, I believe there is blog fatigue so I will save those ideas for another post. The only thing I want to add is that despite finding surveys tiring, my friend now more fully appreciates the effort and thought that goes into developing those surveys and is determined to make more of an effort to complete them. And in her words:
"As they say, knowledge is the basis of tolerance."
Have you ever seen a poorly worded survey question and gone "what the ????" Or had to answer that poorly worded question, like my friend did, and then worried yourself sick because you were scared it would result in the wrong outcome or choice being made for you or someone you love?
If so, I would love to hear your story. I am working on an eBook that will help people ask clearer survey questions and I would be grateful if I could include stories that will help illustrate why good survey questions matter.
If you would like to share, send me an email by clicking on the link below.
All the best.
Director, West Island Digital
(1) Scheuren, F., from What Is a Survey, American Statistical Association. https://www.whatisasurvey.info/chapters/chapter1.htm