
Surveys Gone Sideways: Avoid These Common Sampling Pitfalls

By Adam Berinsky
Survey size isn’t everything. Without clean sampling, your data-driven decisions could be headed in the wrong direction.

The Gist

  • Targeting matters most. Surveying the wrong audience can lead to misleading data and poor decisions, even if the questions themselves are solid.

  • Weighting shapes reality. Overrepresenting certain groups, like college grads, can skew results even when other variables are adjusted.

  • Bias is everywhere. Incentives, vendor methods and self-selection all affect who answers your marketing survey and who doesn’t.

Political pollsters have every incentive to get their survey samples right. 

In a close election, a miss of even a couple of points can determine whether a pollster correctly predicts the winner. And a high-profile whiff can lead to mockery in the press, damage a pollster’s reputation and even dry up future work. Yet, given three chances, pollsters underestimated support for Donald Trump in 2016, 2020 and 2024.

This wasn’t the result of some nefarious conspiracy. And although marketers sometimes struggle to design effective surveys, the question in this case (“Who do you plan to vote for?”) could not possibly have been more straightforward. Rather, pollsters consistently failed to collect survey samples that properly mirrored the electorate.

There’s an important lesson here for marketers. Even the most carefully crafted questions won’t yield useful insights if the wrong people are answering them. Watch out for these common sampling errors that could be sabotaging your marketing survey data. 


You Don’t Define Your Target

Companies would never aim their marketing at an audience of “everyone.” The bright colors of a Froot Loops box are meant to appeal to children, while luxury car ads are aimed at status-conscious people who can afford a Lexus or a BMW. 

Yet when it’s time to survey prospective customers, marketing teams have a perplexing tendency to cast a wide net. This temptation to “go broad” may stem from the assumption that more data is always better. But if the data isn’t coming from the people most likely to buy your product, it’s useless. In fact, it can be actively harmful, causing companies to change their offerings based on input from people who were never going to buy from them in the first place.

Related Article: How to Create Effective Customer Surveys and Obtain Actionable Insights

You Don’t Weight by the Right Variables

This was one of the killers in those election polls, especially in 2016. Pollsters overrepresented college graduates in their samples, and that skew turned out to matter even when samples were weighted for other factors like income and ethnicity. The same issue shows up in marketing surveys, where people with college degrees are more likely to answer questions than those without.

This may be due to the more stable living situations of highly educated individuals (making them more reachable), past exposure to surveys and questionnaires, a higher degree of trust or even a sense of altruism. 

Really, the reason doesn’t matter, but the result does. If you’re oversampling college graduates, your data is likely skewed. 
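
To make the math concrete, here’s a minimal sketch of this kind of weighting in Python with pandas. Every number here is hypothetical, including the assumed 38/62 population split: each respondent’s weight is their group’s population share divided by its sample share, so overrepresented college grads count for less and underrepresented non-grads count for more.

```python
import pandas as pd

# Hypothetical sample: 7 of 10 respondents are college grads,
# but assume only 38% of the target population is.
responses = pd.DataFrame({
    "college_grad": [True] * 7 + [False] * 3,
    "would_buy": [1, 1, 0, 1, 0, 1, 1, 0, 0, 1],
})
population_share = {True: 0.38, False: 0.62}  # assumed, for illustration

# Weight = population share / sample share for each group.
sample_share = responses["college_grad"].value_counts(normalize=True)
responses["weight"] = responses["college_grad"].map(
    lambda grad: population_share[grad] / sample_share[grad]
)

raw = responses["would_buy"].mean()
weighted = (responses["would_buy"] * responses["weight"]).sum() / responses["weight"].sum()
print(f"Raw: {raw:.0%} | Weighted: {weighted:.0%}")  # Raw: 60% | Weighted: 48%
```

Production weighting schemes typically balance several variables at once (often through raking), but the principle is the same, and the gap matters: the raw 60% and the weighted 48% would point you toward very different decisions.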

Related Article: Designing Customer Surveys Without Causing Customer Fatigue

You Don’t Consider Self-Selection Bias

In a perfect world (for those running surveys, at least), you could compel every person in your selected sample to answer your questions. But in reality, you get what you get, and the people who self-select into your sample may be quite different from the group as a whole. Something as small as an incentive can make a big difference.

It’s a silly example, but you wouldn’t want to offer a Starbucks gift card as a reward for taking a survey about whether people like coffee. Here’s a more realistic one: a marketing survey offering a charity-linked incentive is likely to draw a vastly different sample from one with a lottery-based award. Be mindful of whom you’re attracting and how that may influence your results.

Related Article: Stop Survey Begging: 4 Tips to Improve Your Customer Feedback

Checklist: Avoiding Common Survey Sampling Errors

Use this guide to vet your survey design and ensure valid, actionable insights.

  • Have you clearly defined your target audience? Surveying too broadly leads to irrelevant or misleading results.

  • Are you weighting your data by key demographic factors? Failing to do so can skew insights, especially with overrepresented groups like college grads.

  • Have you evaluated how incentives may affect self-selection? Incentives can attract non-representative participants, distorting outcomes.

  • Have you verified your vendor’s sampling methods? Cheap responses often come with poor data quality or sampling bias.

  • Have you considered the impact of non-respondents? Ignoring who’s not responding hides key behavioral insights.

You Don’t Question Your Vendor

Online survey tools make it easy for polling firms to get quick, low-cost responses. I’ve seen rates as low as 50 cents per response, compared to $20 and higher for traditional phone surveys. Any vendor is going to claim that their samples are robust and statistically sound, but marketers must conduct their own due diligence. Dig into the sampling data and make sure it avoids the pitfalls we’ve discussed so far. If your vendor doesn’t have a plan to avoid oversampling college graduates, heavy technology users or people outside your target demographic, find one that does. 


Related Article: Why AI Can’t Fix a Vague Survey Strategy

You Don’t Account for the Unmeasurable

One of the most important pieces of information from a survey will never show up in the response data: who chose not to take it. Again, looking back at political polls, we can see how important it is to account for this. Even after pollsters adjusted for everything else, they were never quite able to correct for the fact that Trump supporters simply don’t like answering poll questions.

You’ll never know exactly who is opting out of your marketing surveys, why they’re opting out or how they would have answered your questions. But by doing your best to analyze non-responses, you can give your organization one more valuable tool to help draw conclusions from your survey data. 
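
One simple starting point, assuming you keep a record of everyone you invited (the column names and segments below are hypothetical), is to compare response rates across the segments you care about:

```python
import pandas as pd

# Hypothetical invitation list and the IDs of those who responded.
invited = pd.DataFrame({
    "customer_id": range(1, 11),
    "segment": ["new"] * 5 + ["loyal"] * 5,
})
responded_ids = {2, 6, 7, 8, 9}
invited["responded"] = invited["customer_id"].isin(responded_ids)

# A large gap between segments means your results lean heavily
# on one group's voice while the silent group goes unheard.
print(invited.groupby("segment")["responded"].mean())
# loyal    0.8
# new      0.2
```

This won’t tell you how non-respondents would have answered, but it does tell you whose answers you’re missing, which is often enough to temper your conclusions.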


About the Author
Adam Berinsky

Adam Berinsky is the Mitsui Professor of Political Science at MIT and serves as the director of the MIT Political Experiments Research Lab (PERL). In addition, he is a Faculty Affiliate at the Institute for Data, Systems, and Society (IDSS) and lead instructor of the MIT Professional Education course, “Effective Communication through Surveys and Market Research.”
