Sean Ellis, the marketing guru behind Dropbox and other successes, advises clients that “The most important question on a survey is, ‘How would you feel if you could no longer use this product?’” He goes on to quantify the responses: if more than forty percent of respondents say they would be “very disappointed,” the product should go viral and be a great success. Conversely, if fewer than ten percent say this, the company or product will have a hard time gaining traction in the marketplace.
What other questions could we wrap around this critical one to form a great survey, one short enough to earn responses yet powerful enough to inform our marketing effort, not to mention our R&D and production efforts?
Using Sean again as a source, we might ask: “How did you discover our company?” and provide several checkbox answers, including ‘friend or colleague.’ Here again, more than forty percent checking that box is a sign of an effective viral marketing effort. Then ask: “Have you recommended our company to anyone?” Offer just ‘yes’ and ‘no’ as possible answers, and look for more than fifty percent ‘yes’ responses.
And there is always the great open-door question: “Would it be OK if we followed up by email to ask for clarification of one or more of your responses?” If more than fifty percent say “yes,” you have a real hit on your hands. It means you can draw on these respondents for case studies and marketing quotes in the future.
Keep your survey very short to ensure a large number of responses. But do include at least one specific question about your product to be sure the respondent is an actual customer.
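For readers who want to see how these benchmarks might be applied once responses come in, here is a minimal sketch in Python. The question labels, answer strings, and sample data are hypothetical placeholders, not part of any survey tool; only the thresholds are the ones cited above.

```python
# A sketch (not from the article) of tallying survey responses against
# the benchmarks discussed above. All field names and data are hypothetical.
from collections import Counter

# Each response is a dict keyed by hypothetical question labels.
responses = [
    {"disappointment": "very disappointed", "discovered_via": "friend or colleague",
     "recommended": "yes", "follow_up_ok": "yes"},
    {"disappointment": "somewhat disappointed", "discovered_via": "search engine",
     "recommended": "no", "follow_up_ok": "no"},
    # ... more responses ...
]

def share(responses, question, answer):
    """Fraction of respondents who gave `answer` to `question`."""
    counts = Counter(r[question] for r in responses)
    return counts[answer] / len(responses)

# Benchmarks drawn from the article: 40% "very disappointed," 40% discovered
# via a friend or colleague, 50% "yes" on the recommendation and follow-up questions.
benchmarks = {
    ("disappointment", "very disappointed"): 0.40,
    ("discovered_via", "friend or colleague"): 0.40,
    ("recommended", "yes"): 0.50,
    ("follow_up_ok", "yes"): 0.50,
}

for (question, answer), threshold in benchmarks.items():
    pct = share(responses, question, answer)
    status = "meets" if pct >= threshold else "below"
    print(f"{question} = '{answer}': {pct:.0%} ({status} the {threshold:.0%} benchmark)")
```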
Great proof that survey structuring is an art, not a science. I believe this is the piece practitioners have been waiting for to read and understand. Thanks for a great post.
Terrific ways to differentiate customer “needs” vs. “wants” — if only more companies would ask this question rather than just assuming they know the answer. A brief but important article.
Great question! My only thought is that customer surveys don’t always reflect true behavior. Doesn’t actual behavior in response to a price increase or a change in features reflect the truth? Isn’t a test with a small portion of our audience a better gauge?
Really enjoying your pearls of wisdom!