What it is about is an online customer feedback survey (see image below – click to enlarge) that I have seen lately as a Barclays customer. The survey was conducted by Foviance, on behalf of Barclays. I was invited to complete the survey when, as a business customer, I came to the end of an online banking session.
Now, the first time I saw it, my first impressions of the survey were:
- First question equals 3% complete
- Some quick maths later, I deduced that the survey would be approximately 33 questions long
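The back-of-the-envelope arithmetic above can be sketched as follows. This assumes the progress indicator increments linearly with each question, which the survey itself doesn't confirm:

```python
def estimate_survey_length(questions_answered, percent_complete):
    """Estimate total questions from a linear progress indicator.

    Assumes the 'percent complete' figure grows linearly with the
    number of questions answered (an assumption, not a given).
    """
    return round(questions_answered / (percent_complete / 100))

# One question in, the progress bar read 3% complete:
estimate_survey_length(1, 3)  # roughly 33 questions
```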
At that point, I didn’t continue. When I saw it the second time I did exactly the same thing – closed the window.
Now, I am not suggesting that will be the case for everyone. You'd have to ask Foviance about the response rates for their survey methods, and I will reach out to them to try to find out.
However, research by SurveyMonkey from the end of 2010, reported in Does Adding One More Question Impact Survey Completion Rate?, analysed response and drop-off rates across 100,000 random surveys conducted by their users to better understand the relationship between drop-off rates and survey length. According to SurveyMonkey:
“As expected, the more questions per survey, the higher the respondent drop-off rate from start to finish. However, as can be seen in the graph below, the relationship between survey length and drop-off rate is not linear. Data suggests that if a respondent begins answering a survey, the sharpest increase in drop-off rate occurs with each additional question up to 15 questions. If a respondent is willing to answer 15 questions, our data suggests that the drop-off rates for each incremental question, up to 35 questions, is lower than for the first 15 questions added to a survey. For respondents willing to answer over 35 questions in a survey, our data suggests they may be indifferent to survey length, and are willing to complete a long survey (within reason of course—we limited our analysis to surveys with 50 questions and below—we didn’t tackle the really, really long surveys we’ve seen this time around).”
For me, the most interesting comment in their findings was:
“the relationship between survey length and drop-off rate is not linear.”
In another piece of research, SurveyMonkey also found that:
“In addition to the decreased time spent answering each question as surveys grew in length, we saw survey abandon rates increase for surveys that took more than 7-8 minutes to complete; with completion rates dropping anywhere from 5% to 20%. The tolerance for lengthier surveys was greater for surveys that were work or school related and decreased when they were customer related.”
Therefore, when thinking about surveying our customers, we need to take the length of our surveys into account if we want to increase response rates.
I’ve written about surveys and their impact before, in Do customer surveys do more harm than good? and Customer surveys, low response rates and staff targets, and, I must admit, I am a fan of shorter rather than longer surveys.
However, thinking back to my response to the Barclays/Foviance survey, I realised that different people will respond to different things in different ways, and that their response will be governed by, amongst other things, their personality type.
This got me thinking.
What if we added a question to the front of our surveys that asked our customers whether they would like to take a short survey or a longer one? Would that increase response rates?
Personally, I think I would be more inclined to answer a short survey rather than a long one. But, I also recognise that not everyone is like me and that longer surveys ‘fit’ better with other people.
So, maybe we could improve response rates by adding a question that gives our customers a choice about the length of survey they would like to take?
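A minimal sketch of what that choice might look like in practice. The question sets, prompt wording, and function names here are purely illustrative, not taken from any real survey:

```python
# Hypothetical question sets -- a short core set and a longer
# extended set that includes the core questions.
SHORT_QUESTIONS = ["How satisfied are you with today's session?"]
LONG_QUESTIONS = SHORT_QUESTIONS + [
    "How easy was it to find what you needed?",
    "How likely are you to recommend us to a friend or colleague?",
]

def pick_question_set(choice):
    """Serve the question list matching the respondent's up-front choice."""
    return LONG_QUESTIONS if choice == "long" else SHORT_QUESTIONS

# A respondent who opts for the short version sees just one question:
len(pick_question_set("short"))  # 1
```

The design choice here is that the short set is a strict subset of the long one, so every respondent answers the core questions and the longer survey simply adds depth for those willing to give it.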
What do you think?
Thanks to andrechinn for the image.