Surveys: The Perils & Pitfalls And How To Overcome Them
You’d think that putting together a survey would be easy, right? Just ask the questions you feel need answering, decide on the format and software, and choose where to post it. So easy, in fact, that according to vovici.com, as of 2013 American adults received an estimated 7 billion surveys each year. I receive a lot, but certainly not that many. That’s almost 20 million a day, for heaven’s sake! Perhaps that’s because I’m actually a Brit living in the USA? Hmmm.
If they really mean all American adults combined, rather than each one, I’m sure I receive several times more than their implied average of fewer than 30 per adult per year.
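The arithmetic behind those numbers is easy to check. A minimal sketch, assuming a US adult population of roughly 245 million in 2013 (that population figure is my assumption, not from the original estimate):

```python
# Rough check of the vovici.com estimate: 7 billion surveys per year.
surveys_per_year = 7_000_000_000
adults = 245_000_000  # assumed approximate US adult population, 2013

per_day_millions = surveys_per_year / 365 / 1_000_000
per_adult_per_year = surveys_per_year / adults

print(round(per_day_millions, 1))    # ~19.2 million surveys a day
print(round(per_adult_per_year, 1))  # ~28.6 surveys per adult per year
```

Both figures line up with the claims above: just under 20 million a day, and just under 30 per adult per year.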
I get one:
- Almost every time I buy something online, unless it’s from Amazon
- Every time I ask a question or have any customer support dealings
And many of the social media sites and blogs I frequent also hope I will impart my opinions. Seems like barely a day goes by without me being asked to fill in at least 2 or 3.
And there we have just one of the problems that every would-be surveyor must contend with:
How do I get people who are inundated with surveys, to take mine?
Seems like the most important survey question is:
What would make you take my survey?
Why Do People Create Surveys?
According to snapsurveys.com, the 4 main reasons are the following:
- Uncover answers – At least a little obvious, right?
- Evoke discussion
- Base decisions on objective information
- Compare results
Nothing particularly earth-shattering there, but they do give a little background on each. The most notable aspect relates to point 3:
Don’t rely on “gut feelings” to make important business decisions.
Your knowledge and instincts can only take you so far. And at some point, even running things by your trusted friends and peers will not be sufficient.
The Pitfalls Of Running Surveys
Well, beyond the already established point that it won’t be as simple as you’d hope to get responses, there are far more ways to get it wrong than to get it right!
Here’s a quick Infographic from Survey Monkey regarding the preferences of their survey takers:
- How much can you hope to learn from multiple-choice questions?
- How much might you learn if virtually nobody answers a survey that includes a lot of open-ended questions?
Avoid These Mistakes In Your Surveys
There are so many….
1. Don’t ask leading questions!
As detailed here on cnbc.com, the UK was originally going to ask a simple question in its Brexit Poll:
“Should the United Kingdom remain a member of the European Union?”
This was obviously a leading question, but it would certainly not have been unusual.
Imagine how different “Should the United Kingdom leave the European Union?” might have been.
In the end, they put two check boxes: one alongside “Remain a member of the European Union” and the other by “Leave the European Union”.
This led nobody, and whatever anyone’s thoughts might be regarding the result, it was certainly not influenced by the way the question was asked.
2. Apply The ‘Flip’ Test
The trap the UK pollsters avoided is stumbled into in an absurdly high percentage of ‘yes or no’ questions. I urge everyone thinking of asking such a question to consider the following:
- Would the result be different if I flipped the question?
- Exactly how much use are results that are slanted towards a particular response?
Sure, it’s a cheap pollster trick. It’s also something that businesses pay good money for every day. I don’t doubt that in some instances, the leading language is aimed at producing results that are favourable to a pitch. I suspect that in the case of small businesses and amateur pollsters, it is nothing more than an error of inexperience. So I also caution people who look at poll results to help them make up their own minds on something to apply the flip test themselves.

For ‘Yes or No’ survey questions, always employ the ‘flip’ test. If the results would be different, change the question!
3. Not All Answers Are Valuable!
It’s really important to focus on who is taking your survey. Are they in your niche? Or do they take surveys just for the gifts or prizes you are offering?
More about that later….
Interpreting and Presenting Survey Numbers
If you look at the Survey Monkey Infographic above, you will note that:
- 45% aren’t willing to spend more than 5 minutes on a survey
- 87% “prefer” multiple-choice answers
Does either number tell the whole story?
- How many of that 45% will not spend more than 3 minutes on a survey, for example?
- And of the 87% who “prefer” multiple-choice, how many would respond to at least an occasional text question?
Back in March 2014, I asked a number of questions regarding an Infographic by Zendesk. I invite you to read the entire article, which was titled “Here’s How To Find The Right Context From Survey Results [Infographic]”. (Forgive the multi-coloured text!). Here’s the Infographic….
The second set of numbers stated that:
- 18% said they would be more willing to take surveys if they are short
- 13% would do so if rewarded
- 10% said they should be interesting
Obviously, respondents are picking one option from each multiple-choice question. Nobody is suggesting that 90% of people are happy to do boring surveys, or that 82% would be happier if surveys were longer! Respondents were given a drop-down list and asked to select one option from each.
Targeting Specific Responders
We really want our survey to have meaningful results. OK, that’s a slightly redundant point, as who would take the time to create and distribute a survey if that wasn’t their goal? Let me rephrase slightly: our intention is to give our readers more of what they come to Curatti (or any similar site) for.
Coming up with the questions was painstaking. Do we ask confining multiple-choice questions, which more people are likely to respond to but will ultimately be of less value? Or do we mix in some text boxes and let you tell us in your words?
We looked at perhaps buying an audience from Survey Monkey. At $1 a response for anything up to 15 questions, it isn’t cheap, but it’s also not expensive. Look a little closer, though, and this is the cost only when you accept a broad demographic.
If you narrow the audience down to a specific area of employment, the cost per response rises to $11. And depending on which box(es) you check, they may not have enough people available to guarantee even 100 responses! Other targeting criteria each add to the base price:
- Age range (75 cents extra per responder)
- Politics ($1 extra per responder)
- Internet usage (50 cents extra per)
- Pet ownership (50 cents)
- How often you exercise ($1)
And quite a few others that can run up to $5 extra per response.
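To see how quickly those surcharges compound, here’s a minimal sketch of the pricing as we saw it. The criterion names and the idea of simply summing surcharges are my assumptions for illustration; they are not SurveyMonkey’s actual pricing API:

```python
# Hypothetical cost-per-response sketch using the surcharges quoted above.
BASE_BROAD = 1.00        # broad demographic, per response
BASE_EMPLOYMENT = 11.00  # narrowed to a specific area of employment

SURCHARGES = {
    "age_range": 0.75,
    "politics": 1.00,
    "internet_usage": 0.50,
    "pet_ownership": 0.50,
    "exercise_frequency": 1.00,
}

def cost_per_response(base, criteria):
    """Base price plus one surcharge per extra targeting criterion."""
    return base + sum(SURCHARGES[c] for c in criteria)

# e.g. employment targeting plus age range and internet usage:
print(cost_per_response(BASE_EMPLOYMENT, ["age_range", "internet_usage"]))  # 12.25
```

Even two extra criteria push the per-response cost well past twelve times the broad-demographic price, which is why we weighed this so carefully.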
If $1 per responder could have given us reliable responses, we’d probably have done it. But how much value can people in entirely unrelated fields provide with their answers?
We’re not going to talk about cars, restaurants or pharmaceuticals, for example. So why ask people whose main expertise is in those areas, which areas of Digital Marketing they want us to write more about? If they are doing the surveys for a financial reward – however small – mightn’t that make all of the results useless?
Who Is Responding?
What percentage of your following responded?
Without background data on the respondents, do their answers have any context and therefore any relevance, whatsoever?
Are the respondents representative of the broader public?
Are they even representative of your broader audience?
And I offered these suggestions:
Know the background of the person you are asking – THIS DOESN’T HAVE TO INVOLVE ASKING THEM MORE QUESTIONS!
It could, however, mean a controlled release, with the survey sent directly to specific people, as opposed to just posting a link… or
Create multiple, possibly exact copies of your survey and
Direct different people to different links
It’s a Wrap!
I have it on very good authority that planning, executing and analyzing a scientific survey will run you somewhere in excess of $15,000. So those are for the big boys.
For the rest of us, we must make do with spending way more time than we could possibly have imagined, devising our own questions and making the most of the responses we get.
Expect trial-and-error. Expect fewer responses than you hoped for. Actually, better yet, hope, rather than expect, that you will have results you can learn from and pass on to others.
It’s here that we used to offer you the chance to take our survey. But although quite a number of people looked, few actually took it. It basically proved to us that surveys need to be short. Very, very short.
Have you had good or bad experiences with your own Surveys? (We’d love it if you would share them!)
Do you have any questions or thoughts on the above?
In general, can you offer advice to anyone looking to create their own survey?
Image attribution: Lead/Featured: Copyright: ‘http://www.123rf.com/profile_file404‘ / 123RF Stock Photo