July 7, 2012

Barclays have been in the news a lot lately but this post is not about their transgressions.
What it is about is an online customer feedback survey (see image below – click to enlarge) that I have seen lately as a Barclays customer. The survey was conducted by Foviance, on behalf of Barclays. I was invited to complete the survey when, as a business customer, I came to the end of an online banking session.
Now, when I saw this for the first time, my first impressions of the survey were:
- First question equals 3% complete
- Some quick maths later and I deduced that the survey would be approximately 33 questions long
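That "quick maths" can be checked in a couple of lines. This is just a back-of-the-envelope sketch; the 3% figure from the progress bar is the only input.

```python
# If answering the first question moves the progress bar to "3% complete",
# the implied total number of questions is roughly 1 / 0.03.
progress_after_first_question = 0.03
estimated_questions = round(1 / progress_after_first_question)
print(estimated_questions)  # 33
```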
At that point, I didn’t continue. When I saw it the second time I did exactly the same thing – closed the window.
Now, I am not suggesting that that will be the case for everyone. Youβd have to ask Foviance about their response rates for their survey methods and I will reach out to them and try and find out.
However, research by SurveyMonkey done back at the end of 2010 and reported in Does Adding One More Question Impact Survey Completion Rate? analysed response and drop-off rates across 100,000 random surveys conducted by their users to better understand the relationship between drop-off rates and the length of surveys. According to SurveyMonkey:
“As expected, the more questions per survey, the higher the respondent drop-off rate from start to finish. However, as can be seen in the graph below, the relationship between survey length and drop-off rate is not linear. Data suggests that if a respondent begins answering a survey, the sharpest increase in drop-off rate occurs with each additional question up to 15 questions. If a respondent is willing to answer 15 questions, our data suggests that the drop-off rates for each incremental question, up to 35 questions, is lower than for the first 15 questions added to a survey. For respondents willing to answer over 35 questions in a survey, our data suggests they may be indifferent to survey length, and are willing to complete a long survey (within reason of course – we limited our analysis to surveys with 50 questions and below – we didn’t tackle the really, really long surveys we’ve seen this time around).”
For me, the most interesting comment in their findings was:
“the relationship between survey length and drop-off rate is not linear.”
In another piece of research, SurveyMonkey also found that:
“In addition to the decreased time spent answering each question as surveys grew in length, we saw survey abandon rates increase for surveys that took more than 7-8 minutes to complete; with completion rates dropping anywhere from 5% to 20%. The tolerance for lengthier surveys was greater for surveys that were work or school related and decreased when they were customer related.”
Therefore, when thinking about surveying our customers we need to take into account the length of our surveys if we want to increase response rates.
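The pattern SurveyMonkey describes can be illustrated with a toy piecewise model. The per-question drop-off rates below are illustrative assumptions, not SurveyMonkey's published figures; only the shape (steep up to 15 questions, gentler up to 35, roughly flat beyond) comes from their findings.

```python
def expected_completion_rate(num_questions: int,
                             steep: float = 0.02,    # drop per question, Q1-15 (assumed)
                             gentle: float = 0.005,  # drop per question, Q16-35 (assumed)
                             flat: float = 0.0) -> float:
    """Toy model: completion rate falls fastest over the first 15 questions."""
    steep_qs = min(num_questions, 15)
    gentle_qs = max(0, min(num_questions, 35) - 15)
    flat_qs = max(0, num_questions - 35)
    return 1.0 - steep_qs * steep - gentle_qs * gentle - flat_qs * flat

print(round(expected_completion_rate(10), 2))  # 0.8
print(round(expected_completion_rate(33), 2))  # 0.61
```

With these made-up rates, a 33-question survey like the Barclays one would lose almost 40% of the respondents who started it.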
I’ve written about surveys before and their impact in Do customer surveys do more harm than good? and Customer surveys, low response rates and staff targets and, I must admit, I am a fan of shorter rather than longer surveys.
However, thinking back to my response to the Barclays/Foviance survey, I realised that different people will respond to different things in different ways. The things that will affect their response will be governed by, amongst other things, their personality type.
This got me to thinking.
What if we added a question to the front of our surveys that asked our customers if they would like to take a short survey or a longer one? Would that increase response rates?
Personally, I think I would be more inclined to answer a short survey rather than a long one. But, I also recognise that not everyone is like me and that longer surveys ‘fit’ better with other people.
So, maybe we could improve response rates by adding a question that gives our customers a choice about the length of survey they would like to take?
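One way to picture the idea: a single upfront question routes the respondent to a short or a long question set. The question texts and counts below are purely illustrative, not from any real survey.

```python
# Hypothetical routing: the respondent's answer to one upfront question
# decides which question set they see.
SHORT_QUESTIONS = [
    "What worked well for you during this session?",
    "What didn't work for you?",
    "What changes would you like to see?",
]
# Placeholder items standing in for a longer questionnaire.
LONG_QUESTIONS = SHORT_QUESTIONS + [f"Detailed question {i}" for i in range(1, 31)]

def select_survey(prefers_short: bool) -> list[str]:
    """Return the question list matching the respondent's stated preference."""
    return SHORT_QUESTIONS if prefers_short else LONG_QUESTIONS

print(len(select_survey(True)))   # 3
print(len(select_survey(False)))  # 33
```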
What do you think?
Thanks to andrechinn for the image.
40 Comments
Hello Adrian
You make a great point. And I wonder why the survey cannot be really simple. For example, as regards your banking session:
What worked well for you during this session?
What didn’t work for you?
What changes would you like to see?
Any other ideas on how we can improve, do better?
Maz
Hi Maz,
Thanks for that. I, like you, prefer shorter surveys and agree with your suggested questions. However, the challenge with them is that they ask for too much qualitative data, which doesn’t lend itself to making pretty charts :-) but it might increase the response rate.
Adrian
Hi Adrian
For once I disagree with you – this *is* another Barclays transgression! 15 minutes for a customer survey? Really? And I note that their consultants are equally culpable, it was probably their idea.
However, frivolity aside, what the SurveyMonkey research doesn’t fully reveal is that there are other factors at work as well as length. You make a great point about personality type, and it would be interesting to try your experiment. But I have to say I have never heard anyone ask for a longer survey. Ever. Well, not a customer anyway – I’ll accept that people inside a business (and especially the consultants they use, who have a vested interest) try and shoe-horn in as many as they can, so there’s clearly a tension there.
Other factors that can skew results are:
– if there’s a prize or incentive, respondents will throw in any old answers just to get to the end, which means the results aren’t trustworthy or actionable;
– if the questions are chosen poorly, it creates a negative impression of the company in the mind of the customer (conversely, well chosen questions can improve the company’s standing), so ask only about things that are important to the customer;
– a poor user experience can lead to frustration where previously there was none (so don’t use multi-page surveys or multi-part SMS surveys).
From what I’ve learnt, Maz’s suggestion would be a popular length for a satisfaction survey. Perhaps the appropriate ‘extra question’ might then be ‘Would you also like to complete a longer market research survey?’
Hi Guy,
Actually, I agree that it is another transgression but I didn’t want to pour fuel onto an already large fire.
I think you make really good points none of which I can argue with.
In terms of the additional question then if it is a means to show business that people want shorter surveys then why not use it as it could be a very effective tool for change. Don’t you think?
Adrian
Adrian,
good post, and interesting question.
I won’t try to prove how smart I am about surveys by telling you some theory about what works and what doesn’t like the other comments. The question you asked is what are my thoughts on using a single question upfront to query the respondent on what length they want.
There are two problems with this, from my POV:
1. the systems that collect the data are not set up to work this way — theoretically… the people who choose the shorter survey will do so in a different mindframe from those with longer surveys (there was a study done about length of surveys and animosity/attitude towards answers that suggested that shorter surveys get better (more positive) responses due to mindframe) and thus the responses are not going to be comparable (the problem develops over time, when you are comparing results from a survey with 80% short answers and one with 50% short answers, for example). this bias will make it hard to keep a historical perspective on the data
2. i am going to guess that you are suggesting that the rest of the questions are not important enough to get the customers’ opinions on — then why have them? if you think a short survey gives you sufficient data, why not just use short surveys with everyone?
there are other issues, but I don’t want to sound preachy here :-)
bottom line, make shorter surveys and have everyone be happy and give you better scores — and then follow up with a select group that agreed to do so… instead of giving them options of surveys beforehand, use a trimmed, good survey that gives you 80%+ of what you need, then ask as a last question if you can follow up for more details — and do so (in exchange for a token prize, not disclosed in either survey).
there is a lot more to surveys than asking and reporting – that is where most people fail: planning.
One more point that was brought up to me by someone who read my above comment — I probably expressed myself poorly when I referred to the other comments. I did not mean anything by my statement that they were using theories — was trying to make a distinction between my answer and theirs – and probably chose the wrong words…
My apologies if I offended or disparaged anyone – not my intention.
Hi Estaban,
Thanks for that. No offense taken I am sure. Looks like I wrote something that people feel strongly about. A robust exchange of views is always a healthy thing. Thanks for contributing to the discussion.
Cheers,
Adrian
Hi Estaban,
Thanks for sharing your thoughts on my post. I agree that shorter surveys with follow-up are the way to go. However, my underlying frustration is with businesses that persist with long surveys and poor response rates. As in my response to Guy’s comment, I was wondering whether, if the additional question showed businesses that people want shorter surveys, it could be a very effective tool for changing mindsets and how they approach surveys. Perhaps that’s naive and a bit cheeky? :-)
Adrian
Hi Adrian,
I kind of agree that the response rate is related to the length of the survey. I had the same experience with the Barclays survey. I was about to answer it, but as soon as I saw that there were more than 30 questions I closed it, and I would assume that this happens to many of us.
Thanks for sharing this post. Good stuff on it!
Hi Karol,
Thanks for dropping by and sharing your perspective. I’m glad it was not just me that thought the Barclays survey was too long.
Funnily enough, I shared this post with the folks at Foviance who run the survey for Barclays and I still haven’t heard back from them. Curious!
Adrian
Hi Adrian
I had similar surveys ostensibly from BMW and British Airways but actually organised by Icon Added Value and GfK NOP. Both were infeasibly long, especially when you consider the demographic and likely attention span of their target customers.
I, too, contacted both survey organisers, because I thought the surveys reflected badly on their clients’ brands. I, too, did not receive the courtesy of a reply or acknowledgement. Yes the big brands have editorial control, but they are not being well served by their consultants either in their advice or their treatment of the customers and that behaviour deserves to be called out.
Guy, couldn’t agree more.
I wonder if you get to charge more for a longer survey? :-)
Adrian
Excellent post Adrian, we have implemented a Customer Sure programme (the fantastic resources set up by Guy Letts actually) to make sure that our clients and candidates are able to tell us exactly how we are communicating with them.
This has been a great success for us!
I couldn’t agree more about HUGE surveys! I am happy to help but not that much!
Hey Dan,
Thanks for your input and for the big shout out for Guy and the folks at CustomerSure.
Adrian
Hi Adrian,
I always get survey requests from the various services I use. Take PayPal, for example, which likes to push a set of questions into my inbox by sending a link that redirects me to a survey to improve their services.
But I always ignore those requests because they require full attention, whereas I would prefer to answer them if they offered simple yes/no options to choose from.
Hi Robinsh,
Thanks for your comment. It seems that keeping things short and sweet is the consensus that is coming through and is the way to go with surveys.
Thanks for adding your perspective,
Adrian
Adrian,
Personally I think it is what they do with the answers that counts, not the number of questions.
Though I guess the more questions the more confused (and easy to duck) the analysis will be
James
James, you’re absolutely right.
I read a great line by Bruce Temkin this week “Feedback is cheap, but [customer-insightful] actions are precious.” http://experiencematters.wordpress.com/2011/06/16/9-net-promoter-score-nps-recommendations/
It’s why I favour transactional feedback (and follow-up action) over mass surveys when it comes to customer satisfaction. Mass surveys in my experience are ineffective in improving the experience of individual customers. Yet we are won or lost one at a time, depending on the actions (or lack thereof) that we each experience as customers.
To anyone looking for customer survey or customer feedback software, unless it’s purely for market research and business planning, I would advise looking for one that has integrated action capabilities. Otherwise, as Adrian indicates, it’s the research company who benefits from the exercise rather than the customers and your own business.
Hi James, Guy,
I agree with the points that you both make. The thing that bugs me is best encapsulated by a saying from Goethe when he said: “Please forgive me for writing such a long letter. I did not have time to write a short one.”
If we believe that it is indeed harder to make something shorter and more concise, then it is the lack of effort and care shown by businesses that bombard their customers with long surveys that frustrates me.
Adrian
Adrian, thanks for the tweet and happy to comment. I used to run Foviance so I have to declare an interest ahead of writing.
Once someone agrees to click the ‘agree’ button on a survey then there is [of course] a relationship between the length of the survey and the abandonment rate. However, there is also a relationship between the usefulness of the information and the length of the survey.
NPS fans will say that one question can give valuable information and achieve high completion rates but NPS is not very useful if you want to find out something specific (that is not whether you will be getting a recommendation). Most companies already wrap a number of other questions around NPS for that very reason.
In my opinion, the correct length of a survey is the one that achieves the statistically significant number of completions required and provides the insight needed whilst not disenfranchising customers. That is generally my customers’ requirement and I like to keep them happy.
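Paul's point about a statistically significant number of completions can be made concrete with the standard sample-size formula for estimating a proportion, n = z^2 * p * (1 - p) / e^2. The confidence level and margin of error below are conventional defaults, not figures from this discussion.

```python
import math

def required_sample_size(margin_of_error: float = 0.05,
                         z: float = 1.96,  # z-score for 95% confidence
                         p: float = 0.5) -> int:  # worst-case proportion
    """Minimum completed responses needed for the given margin of error."""
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

print(required_sample_size())      # 385
print(required_sample_size(0.03))  # 1068
```

The implication for survey length is that a shorter survey with a higher completion rate can reach the required number of completions from far fewer invitations.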
Hi Paul,
Thanks for your comment and for entering the debate.
I agree with your observations about length, abandonment rate and NPS. With the post I was seeking, through an observation, to explore if there was a better way of conducting such surveys by adding a question.
The particular challenge with the survey in question, however, was not just its length but, now I come to think about it, the fact that there was no question upfront asking if I would agree to take a survey. Rather, the survey just popped up and started at the end of a banking session. This happened twice.
Unfortunately, once this happened they lost me.
If they had asked permission upfront then perhaps things may have been different.
Adrian