We humans like to think we’re entirely rational beings, but the truth is that none of us are logical a hundred percent of the time.

Even Apollo—the Greek god of logic and rational thought—did all sorts of irrational things, like when he challenged humans to music contests or tried so hard to seduce Daphne that she turned into a laurel tree to escape him. (Myths are weird.)

Our irrationality makes us human. And part of that irrationality is the thousands of biases running through our heads, influencing our thinking, forming our beliefs, and affecting every minute decision and judgment we make daily.

Bias is everywhere. And the most significant form of it in surveys and research is response bias. It sways results and undermines the quality of the data you collect, most of the time without you even knowing it.

It’s impossible to altogether remove biases. They’re like mosquitoes in the Australian summer: you can’t ever really get rid of them; you can just do your best to minimize their impact.

In this comprehensive guide, we outline the steps you can take to make sure you are doing everything within your power to avoid response bias and gather accurate data.

What is response bias?

Response bias is a term for when respondents don’t tell you the full truth when answering your survey questions. This leads to inaccurate data and makes it difficult to garner useful insights for your business.

They might misguide you about what they think of a product, hide their true beliefs, or answer ‘yes’ to every question just because they can’t be bothered to fill out your boring survey.

Usually it happens because questions are phrased in a certain way, or because a certain response might be considered more socially acceptable.

Most often, it occurs when people are asked to self-report on behaviors in surveys and structured interviews, but it can also be a result of bad form design.

Biases aren't always intentional. They can be accidental or subconscious and caused by a bunch of internal and external factors, including:

  • How you phrase questions
  • How you conduct the survey
  • Survey design/format
  • The researcher’s demeanour (in person)
  • A respondent’s desire to be a good subject
  • Fatigue
  • Plain ol’ boredom

Chances are you’ve shown some response bias of your own when filling out online forms. Ever just clicked “Strongly Agree” on every question to get it done? Or told a white lie to make yourself sound better, cooler, smarter?

It’s safe to say most of us have. And while, as respondents, we don’t think about the effects our answers have, the reality is they dramatically affect the integrity of the data collected on the other end.

There’s not much you can do about the biases folks bring to your form, whether it’s in-person or online. It’s beyond your scope. But in a lot of cases, response bias isn’t your respondents’ fault.

Nine times out of ten, your survey structure and questions lead them to answer the way they do: the survey might be too long, too ugly, or written in a style that steers them towards specific answers.

It’s up to you to encourage more honest and accurate answers by wording and formatting your surveys the right way. The rest is up to fate or, to keep the Greek mythology ball rolling, the Fates.

How survey bias affects your data

Surveys and questionnaires are valuable ways for your business to gather data and opinions from your target audience. But both tools rely on honesty.

When done well, surveys allow you to make educated assessments of public opinion, gauge satisfaction and get valuable feedback to improve your products, services, or your business as a whole.

But when the data and opinions are false—when respondents exhibit too much bias—it takes their power away, like tossing a brick of Kryptonite onto Superman’s lap. Then your survey isn’t worth the paper it’s printed on or the form builder you used to create it.

An example of the importance of surveys

A good example is the Hollywood film industry. Following the widespread adoption of polls and surveys in media and politics, studio executives in the late 1930s hired the famous pollster George Gallup to develop a survey methodology for movies.

Gallup and his staff conducted more than 5,000 surveys for over a dozen studios and producers. Susan Ohmer says these surveys “influenced the casting, narrative structure, and promotional campaigns of films ranging from Gone with the Wind to Disney’s Cartoons.”

To this day, just about every part of every major movie is still influenced by surveys, from Avengers: Infinity War to the latest Star Wars. Why rely on intuition when you can learn exactly what type of stories, actors, jokes, relationships, and even endings people want to see?

That’s not to say surveys are the sole driver of ideas—they’re not. But they are used to supplement the work done by creatives and are hugely important in the creation of successful movies and television shows.

Imagine if Disney or Netflix didn’t control for response bias when surveying their audiences. They would be getting a bunch of false data that led them to make stuff that no one wanted. No Tiger King. No true crime documentaries. No Mandalorian. The horror!

This is why you want to adopt the right survey methodology and have processes in place to minimize response bias as much as possible—to get insights that are actually accurate and valuable for your business.

6 types of response bias

Understanding how response bias happens is the first step to minimizing its effects. Let’s take a look at the most prevalent forms of bias so you can recognise each one and stop them from spoiling your survey results.

1. Acquiescence bias

Acquiescence bias (also known as “agreement bias”) is a type of bias where people tend to agree with a statement or answer “yes” to a yes/no question regardless of what they believe.

Why does this happen? Because subconsciously or not, most folks like to be seen as polite and likeable (and our memories are super unreliable). People also tend to look for information from their own experiences to support a positive response.

Lawyers deal with this often when questioning a witness. If they ask a question like “Did the man have a blue shirt on?”, even if the person doesn’t remember, data tells us people are more likely to say yes than no.

The same goes for questions that ask respondents to agree or disagree with a statement: they are more likely to agree.

So what can you do about it? Well, the best way to avoid acquiescence bias is to include contradictory statements to test the accuracy of the answers. Present one statement, then follow up later with one that contradicts it. The ol’ switcheroo.

If the respondent agrees with both contradictory statements, it tells you that their responses aren’t entirely accurate. Find this is happening often? Think about how you could reframe the question to make it more engaging and personal.
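One simple way to put this into practice: if you export your responses, you can automatically flag anyone who agrees with both halves of a contradictory pair. Here’s a rough sketch in Python; the column names and the 1-5 agreement scale are made-up assumptions for illustration.

    # Flag respondents who agree with both halves of a contradictory
    # statement pair -- a hint they're acquiescing rather than reading.
    # Column names and the 1-5 agreement scale are illustrative assumptions.

    AGREE_THRESHOLD = 4  # 4 = "agree", 5 = "strongly agree" on a 1-5 scale

    # Each tuple pairs a statement with its contradiction.
    contradictory_pairs = [
        ("supports_longer_opening_hours", "supports_shorter_opening_hours"),
    ]

    responses = [
        {"id": 1, "supports_longer_opening_hours": 5, "supports_shorter_opening_hours": 4},
        {"id": 2, "supports_longer_opening_hours": 5, "supports_shorter_opening_hours": 1},
    ]

    def flag_acquiescent(rows, pairs, threshold=AGREE_THRESHOLD):
        """Return the ids of respondents who agree with both statements in any pair."""
        flagged = []
        for row in rows:
            for first, second in pairs:
                if row.get(first, 0) >= threshold and row.get(second, 0) >= threshold:
                    flagged.append(row["id"])
                    break
        return flagged

    print(flag_acquiescent(responses, contradictory_pairs))  # -> [1]

If the same respondents keep getting flagged, that’s your cue to reframe the questions or shorten the survey.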

2. Demand characteristics bias

The term ‘demand characteristic’ was coined by researchers in psychology who found their subjects were forming ideas about an experiment’s purpose and subconsciously adjusting their behaviours to fit that interpretation.

This bias applies to any form of research. Whether it’s an experiment, a survey or a questionnaire, people tend to try to figure out what its goal is so they can give the “right” answer within the context of your study.

There are a few main demand characteristics that can influence responses:

  • Rumours. Respondents may hear information (true or false) about the survey or questionnaire from outside sources. For example, a friend might tell them they received a gift card for filling out a customer service survey with positive answers.
  • Setting. The location where the survey is conducted can have an influence, as can the brand or company running it. If respondents are asked to complete a survey at Harvard University, chances are they’ll be influenced by the institution itself.
  • Communication. Any kind of communication between participants and the person conducting the survey (verbal or non-verbal) can influence how they respond. Even joking about punishing them for the “wrong” response can skew their answers.

Prominent psychologists Thomas Cook and Stephen Weber wrote that participants tend to adopt one of four roles. These roles were coined in reference to experiments, but they apply equally to surveys and questionnaires. They are:

  • The good participant: The participant tries to understand the goal of the experiment and give you the "right" answer because they don't want to "ruin" the survey.
  • The negative participant: The participant attempts to destroy the credibility of the study by intentionally answering incorrectly (also known as the 'screw-you effect').
  • The faithful participant: The participant follows instructions to a tee.
  • The apprehensive participant: The participant is so worried about how their responses might be evaluated that they answer in whatever way they think makes them look best, rather than truthfully.

Short of giving up, how do you avoid demand characteristics bias? Well, the best thing to do is reduce human contact (since COVID-19 we're all used to that now anyway).

Conducting surveys face-to-face often leads the surveyor to unintentionally reveal things through body language, reactions, or just small talk. By using online surveys you can take out the human element and knock demand characteristics bias on the head.

3. Extreme and neutral responses

We all know the story of Goldilocks and the Three Bears: Goldilocks pops into a house and tries the porridge (a weird thing to do). The first bowl’s too hot, the second is too cold, and the third one is juuuuust right.

Think of extreme response bias as the first two bowls of porridge: it’s when respondents give only super positive or super negative responses, with no middle ground.

A neutral response is when people choose whatever the middle option is, which, while not as extreme, is equally unhelpful for your survey results.

Both of these biases happen a lot when surveys ask people to rate something on a scale, such as a semantic differential or Likert scale that asks customers to rate customer service from 1-10 or from “very unsatisfied” to “very satisfied”.

There can be a few reasons for these types of answers, including education, culture, indifference towards the survey or the wording of your questions. In our experience (and we deal with lots of surveys) most often it comes down to plain ol’ boredom.
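If you export your results, these patterns are also fairly easy to spot automatically. Here’s a rough sketch in Python; the 1-10 scale and the cutoffs are assumptions for illustration, not hard rules.

    # Flag respondents who answer every scale question at the extremes
    # (1-2 or 9-10) or always sit at the midpoint (5-6) on a 1-10 scale.
    # The scale and cutoffs are illustrative assumptions.

    def classify_responder(ratings):
        """Label a list of 1-10 ratings as 'extreme', 'neutral' or 'ok'."""
        if all(r <= 2 or r >= 9 for r in ratings):
            return "extreme"
        if all(5 <= r <= 6 for r in ratings):
            return "neutral"
        return "ok"

    survey_results = {
        "respondent_1": [10, 1, 10, 10],  # all extremes
        "respondent_2": [5, 6, 5, 5],     # sits on the fence every time
        "respondent_3": [7, 4, 8, 6],     # looks like genuine variation
    }

    for respondent, ratings in survey_results.items():
        print(respondent, classify_responder(ratings))

A handful of flagged respondents is normal; a flood of them usually points back at the survey itself.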

To avoid extreme or neutral response bias, try to keep surveys short. The longer your survey goes, the more likely respondents are to get fatigued or start to lose interest (just like a TV show or a Christopher Nolan movie).

Try to cut down the number of available responses for scale questions. You might think that giving more options is helpful, but too many choices just make people freeze up. It's a delicate balance that pre-testing can help you find.

Most importantly, use interactive elements throughout your survey to make sure respondents are engaged. Use button animations, customise colours and fonts, insert images and videos, or use tools like Paperform’s Guided Mode to make a distinct, immersive experience users will love.

4. Social desirability bias

In the words of George Costanza, “we live in a society, you know?” And as part of a society, most people want to be seen as someone who thinks, says and does what’s socially desirable.

This type of response bias refers to when people respond to a survey with answers that they feel are desirable, and avoid giving ‘undesirable’ answers, no matter their feelings on a subject.

At first glance it sounds a bit like acquiescence bias, but the difference is that it’s not about appeasing the interviewer—it’s about living up to societal standards and cultural expectations.

For example, most of us are conditioned from childhood to believe that drug use isn’t acceptable. If you were to ask “Do you think it’s acceptable to take recreational drugs?”, most folks would say no, despite what they may actually think or do.

People recognise what the socially appropriate answer is, and give that regardless of their real opinion. It goes without saying that this kind of false reporting impacts the value and usability of the data you’ve collected.

To avoid social desirability bias, try to validate respondents’ answers with multiple questions. Ask contradictory follow-ups to help you recognise when responses are inaccurate, and make sure your questions don’t lead subjects towards any particular answers.

Above all, do your best to make respondents feel that there are no “right” answers. This can be done by adding a short disclaimer at the start of the survey. You could also make surveys anonymous, which is a simple way to encourage honest responses.

5. Question order bias

Survey bias isn’t just about what questions you ask or the way you ask them; it can also be about the order you ask them in. There are two types of question order bias: contrast and assimilation.

Contrast effects are where the order of questions leads to greater differences in responses. Assimilation effects are where responses are more similar as a result of their order.

Contrast effects are exemplified in a recent study, commissioned by the Board of Social Services in Denmark, of people who receive government grants. Because policy-makers rely on satisfaction surveys to inform budgets and decisions, the researchers wanted to find out how question order affects satisfaction levels.

They found that when asked about overall satisfaction before any specific services, people expressed far lower satisfaction levels. When asked about specific services first, overall satisfaction was much higher—even in people with high education levels and experience with survey research methods.

So what’s this mean for us non-academics? It means respondents tend to provide answers consistent with their prior responses. That’s why it’s so important to think about the way you’re designing your surveys, rather than just slapping a couple of questions together and calling it a day.

Even basic questions can influence each other. For example, if you ask people what type of soft drink is their favourite and then ask what their favourite drink is, most folks will name a soft drink.

This is called priming: getting people to subconsciously think about something in order to influence their answers. It’s used in all sorts of devious ways, mostly by shoddy lawyers in TV shows, and it’s something you want to avoid.

There are three main ways to combat question order bias:

  • Start broad. In most instances it helps to start with general questions that become more specific as the survey progresses. It lets respondents get comfortable, adds a flow to the survey and cuts down on response bias.
  • Randomise. Randomising the order of unrelated questions is a simple way to reduce bias. Only randomise questions that don’t need to be asked together, like “what is your favourite colour?” and “what is your favourite video game?”.
  • Group related questions. Respondents expect related questions to appear in the same sections. But that doesn’t mean they have to be in the same order. You can group related questions and change the order within that block to reduce bias and still keep things organised (there’s a rough sketch of how this can work just below).
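To make the last two points concrete, here’s a rough sketch in Python of how shuffling can work without splitting up related questions. The questions are invented for illustration, and this is just a picture of the idea rather than any particular form builder’s feature.

    import random

    # Keep related questions together, but shuffle the order of the groups
    # and the order of the questions inside each group.
    # All questions are invented for illustration.

    question_groups = [
        ["How satisfied are you with our support team?",
         "How quickly was your last support request resolved?"],
        ["What is your favourite colour?"],
        ["What is your favourite video game?"],
    ]

    def randomised_order(groups, seed=None):
        """Shuffle group order and within-group order without splitting any group."""
        rng = random.Random(seed)
        shuffled = [list(group) for group in groups]
        rng.shuffle(shuffled)
        for group in shuffled:
            rng.shuffle(group)
        return [question for group in shuffled for question in group]

    for question in randomised_order(question_groups, seed=42):
        print(question)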

Randomisation is the most effective way to reduce question order bias, but it’s by no means a one-size-fits-all solution. Sometimes questions need to be asked in a certain order to make sense, or to achieve your goals.

Trust your instincts. Most of the time an awareness of potential response biases is the first step to resisting them.

6. Non-response bias

Last, but definitely not least, is non-response bias. Contrary to what it looks like, this isn't actually the opposite of response bias. It's a term used for when respondents are either 1) unable or 2) unwilling to respond to your survey.

Usually this is for a few reasons:

  • Their opinions are out of line with the target audience
  • The survey didn’t reach the right respondent
  • The wrong audience was targeted

Most of the time non-response bias (also known as participation bias) happens because your survey is poorly constructed, or because the wrong people were targeted.

For example, a survey asking teenagers about the best cigarette brand isn’t going to get many responses unless you hop in a DeLorean and travel back to 1985.

With online surveys often there’s a simpler reason for non-response bias: spam. If you send an email and it gets sent to the spam folder, potential respondents won’t see it and their lack of reply will affect your data.

Non-response bias leaves you in quite the pickle. If you're not careful it can skew the results of your research, lead to inconclusive findings, and play all sorts of games with your estimates, since your sample ends up smaller than expected.

To steer clear of it, make sure surveys are brief, easy to respond to, and actually sent to the correct audience (not to their spam folder). Use closed-ended questions that are more straightforward to fill out, and avoid double-barrelled questions, as they can discourage people from answering.

How to avoid response bias in your surveys

We’ve been through the major response biases and how you can avoid them. But let’s take a look at further strategies you can use to keep your survey data bias-free.

1. Word your questions carefully

In the movie 300, the Spartan King, Leonidas, tells a Persian messenger to “choose your next words wisely”. The same applies to your surveys. Question wording is one of the most important parts of the survey creation process.

The way you word your questions can sway responses and kill any chance you have of gathering accurate data. When conducting survey research it’s super common to accidentally use leading questions and subconsciously influence results.

Compare, say, “How would you rate our customer support?” with “How disappointed were you with our customer support?”. The first puts the onus on the respondent, while the second assumes respondents are already feeling a certain emotion and encourages them to respond negatively. This is a common form of response bias, and it’s easily fixed with a slight wording adjustment.

Another common mistake is making assumptions about survey respondents. For example, if you were to ask “what gym do you train at?” you assume that everyone taking your survey goes to the gym. This alienates respondents and will lead to a lower response rate.

A great way to navigate this issue is to use question logic. Logic allows you to adjust questions based on previous answers, so your survey adapts to your respondents.

You could first ask “Do you go to the gym?” and set it up so only those who answer “yes” are asked which gym they train at. Conditional logic like this is easy to set up in Paperform.
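If you’re curious what that logic boils down to, here’s a rough sketch in Python. The questions and the yes/no flow are invented for illustration; it isn’t how Paperform implements logic under the hood, it just shows the branching idea.

    # A minimal sketch of question logic: only ask the follow-up
    # when the previous answer makes it relevant. The questions are
    # invented for illustration.

    def run_survey():
        answers = {}
        goes_to_gym = input("Do you go to the gym? (yes/no) ").strip().lower()
        answers["goes_to_gym"] = goes_to_gym

        # Non-gym-goers never see the follow-up, so they're never
        # pushed into giving an answer that doesn't apply to them.
        if goes_to_gym == "yes":
            answers["gym_name"] = input("Which gym do you train at? ")

        return answers

    if __name__ == "__main__":
        print(run_survey())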

2. Offer the right answer options

If your survey uses multiple choice questions or includes predefined answers, you’ll want to be careful about the options you give. One of the biggest mistakes folks tend to make is not offering enough response options.

This might not sound like a big deal, but when people don’t see an answer that they connect with, they tend to pick any old answer from the options. This is a form of extreme responding—and leads to inaccurate survey responses.

Say you’re doing market research to gauge the most popular social media platform among 18-24 year-olds, but you leave out Twitter.

That means people who would answer “Twitter” will either become nonrespondents or be forced to select the answer closest to what they really want to say. It’s a lose-lose for you and your respondents.

To avoid this happening, try to offer enough answer choices. This is easy for straightforward questions, but if things are more complex or broad, consider adding an “Other” option with a spot for respondents to enter text of their own.

Where possible, you can also add an “I don’t know” option. Some participants won’t have an adequate answer to your question, and rather than forcing them to give an inaccurate response, allow them the option of saying they haven’t got a clue.

As well as inviting more truthful responses, this can also be a valuable data set of its own. If you see a bunch of “I don’t know” responses popping up, it could lead you to a completely unique data point you didn't foresee.

3. Do your homework

Before you do any kind of survey you need to do your research. Put together all the information, opinions and ideas on the topic that you can, so the questions you ask are as relevant as possible.

A great way to make sure you’re asking the right things is to look up old surveys and see what previous questions have been asked on the subject. Look at what was (and wasn’t) covered, check the results and sample size, and see how you can approach the topic from a different angle.

You’ll also need to work out who your demographic is. This influences what you ask and how you ask it. Ask yourself a few questions:

  • Who is my audience?
  • What do they have in common?
  • What do I want to learn from them?
  • How can I best gather that information?

Don’t look for validation for what you already believe. Just as survey respondents have their biases, you have yours too. Approach any survey you conduct as a scientist approaches an experiment: with an open mind and an acceptance that what you believe could be wrong.

4. Make your surveys anonymous

Making your survey anonymous is a shortcut to more accurate data, particularly if you’re asking sensitive questions about personal beliefs and behaviors.

By making a survey anonymous, your respondents automatically feel comfortable providing more truthful responses. Just make sure, if you go this route, to be clear about their anonymity, and to never betray the trust of respondents.

This is an ideal strategy for things like customer satisfaction surveys, and surveys about workplace culture or employee feedback. None of these really require personal identification and respondents feel safer knowing they won’t be judged for their answers.

Over to you

Respondents walk into every survey with some element of bias. They are like the cowboy in an old western wandering into a saloon, except it’s not a pistol they’re wielding, it’s biases of every shape and size.

But doing what you can to reduce the likelihood of bias on your end makes a world of difference to the accuracy of your results. After all, even the best survey is only as good as the insights that it provides.

Why not get started creating a survey with Paperform today? Build one from scratch or pick from 600+ templates and get started instantly. We’ve got all the tools you need to create engaging, response bias-free surveys that people enjoy filling out.

Give it a go today with our 14-day free trial—no credit card required.

About the author

Jack Delaney
Content Manager

Jack is Paperform's Content Manager, based in Sydney, Australia. He loves hard-boiled crime fiction, Michael Mann movies and coffee as black as midnight on a moonless night.