Seneca, C. (2020, September 17). How to Break Out of Your Social Media Echo Chamber. Retrieved September 30, 2020, from https://www.wired.com/story/facebook-twitter-echo-chamber-confirmation-bias/

Recently, a friend opened a poll on social media, asking "How do I escape the liberal echo chamber?" To explain: these past few months have been characterized by social media posts about modern societal issues, such as Black Lives Matter and the detainment of undocumented individuals, and by efforts to educate followers on these matters. The frequency of these posts increased significantly after the death of George Floyd; they aim to spread awareness and information in the hope of changing someone's opinion on these social issues. With the 2020 presidential election approaching, many of these posts have been geared toward voting: in particular, whom to vote for and why. In such a polarized election and environment in the U.S. today, these posts support either Joe Biden or Donald Trump, urging followers to vote for one or the other. My friend's question points to an interesting and accurate observation: we engage in an "echo chamber." More often than not, the posts we share are already common knowledge to their viewers, who are already planning to vote for the specified candidate. She asks how we can escape these chambers so that activist posts reach those who don't share our views, and therefore make a real impact. This idea of social media echo chambers is detailed in this article from Wired. The article explains how social media perpetuates confirmation bias, "the natural human tendency to seek, interpret, and remember new information in accordance with preexisting beliefs" (Seneca). In other words, we like to see what we already believe.
In terms of social media, companies like Twitter, YouTube, and Instagram use algorithms to push topics and information we already interact with in order to maximize advertisement engagement. This creates echo chambers, as we're rarely presented with new or contrary information. It encourages polarization in the country and circumvents educational or political efforts like those commonly posted on social media. To avoid the echo chamber, the article suggests we seek "disconfirmation" in order to experience a more realistic online society and expose ourselves to the alternative viewpoints that do, in fact, exist outside the chamber (Seneca). This article strongly reminds me of the power, trends, and effects of social networks, specifically the idea of positive and negative relationships between nodes, or people. Large balanced networks consist of either one large group of mutual friends, or two large friend groups with a negative relationship between them. In addition, the Strong Triadic Closure property asserts that when a node has strong ties to two others, those two will tend to "close" the triangle by forming a tie of their own. Both of these ideas translate to social media networks and the seeking of confirmation. On social media, we typically see only the posts of people we have a positive relationship with, and we tend to befriend others who are similar to us. This forms a friendship group, possibly even a connected component. Aside from the rare bridge to the other cloud of thought (and if strong triadic closure holds, any such bridge is a weak tie), we don't experience a flow of ideas from opposing groups. Network theory clearly explains why we only see information on social media that we already agree with, and why we're so often "friends" with those who share our beliefs. Finding a bridge to an opposing ideology is difficult, but necessary if one truly hopes to sway minds with their posts.
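The triadic closure idea above can be made concrete. The sketch below is illustrative only; the tiny friendship network and its names are invented for the example. It scans an undirected graph for "open triads" — pairs of people who share a mutual friend but are not yet connected themselves — which are exactly the edges triadic closure predicts will form.

```python
from itertools import combinations

# Hypothetical friendship network: node -> set of friends (undirected ties).
# All names and edges are invented for illustration.
friends = {
    "ana":  {"ben", "cara"},
    "ben":  {"ana", "cara", "dev"},
    "cara": {"ana", "ben"},
    "dev":  {"ben"},
}

def open_triads(graph):
    """Return pairs that share a mutual friend but are not themselves
    connected -- the edges triadic closure predicts will appear."""
    predicted = set()
    for node, nbrs in graph.items():
        # Check every pair of this node's neighbors for a missing tie.
        for b, c in combinations(sorted(nbrs), 2):
            if c not in graph[b]:
                predicted.add((b, c))
    return predicted

print(open_triads(friends))  # {('ana', 'dev'), ('cara', 'dev')}
```

Both ana and cara share the mutual friend ben with dev, so those are the ties the model expects to close; once they do, the whole component becomes one tightly knit cluster — the seed of an echo chamber.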
Confirmation bias is our tendency to cherry-pick information that confirms our existing beliefs or ideas. Confirmation bias explains why two people with opposing views on a topic can see the same evidence and come away feeling validated by it. This cognitive bias is most pronounced in the case of ingrained, ideological, or emotionally charged views. Failing to interpret information in an unbiased way can lead to serious misjudgments. By understanding this, we can learn to identify it in ourselves and others. We can be cautious of data that seems to immediately support our views. When we feel as if others “cannot see sense,” a grasp of how confirmation bias works can enable us to understand why. Willard V. Quine and J.S. Ullian described this bias in The Web of Belief as such:
Experimentation beginning in the 1960s revealed our tendency to confirm existing beliefs rather than question them or seek new ones. Other research has revealed our single-minded tendency to cling to and reinforce the ideas we already hold.
Like many mental models, confirmation bias was first identified by the ancient Greeks. In The History of the Peloponnesian War, Thucydides described this tendency as such:
Our use of this cognitive shortcut is understandable. Evaluating evidence (especially when it is complicated or unclear) requires a great deal of mental energy. Our brains prefer to take shortcuts. This saves the time needed to make decisions, especially when we’re under pressure. As many evolutionary scientists have pointed out, our minds are unequipped to handle the modern world. For most of human history, people experienced very little new information during their lifetimes. Decisions tended to be survival based. Now, we are constantly receiving new information and have to make numerous complex choices each day. To stave off overwhelm, we have a natural tendency to take shortcuts. In “The Case for Motivated Reasoning,” Ziva Kunda wrote, “we give special weight to information that allows us to come to the conclusion we want to reach.” Accepting information that confirms our beliefs is easy and requires little mental energy. Contradicting information causes us to shy away, grasping for a reason to discard it. In The Little Book of Stupidity, Sia Mohajer wrote:
How Confirmation Bias Clouds Our Judgment

The complexity of confirmation bias arises partly from the fact that it is impossible to overcome without an awareness of the concept. Even when shown evidence contradicting a biased view, we may still interpret it in a manner that reinforces our current perspective. In one Stanford study, half of the participants were in favor of capital punishment and the other half were opposed to it. Both groups read details of the same two fictional studies. Half of the participants were told that one study supported the deterrent effect of capital punishment and the other opposed it; the remaining participants read the opposite information. Either way, the majority of participants stuck to their original views, pointing to the data that supported those views and discarding the data that did not. Confirmation bias clouds our judgment. It gives us a skewed view of information, even when that information consists only of numerical figures. Understanding this cannot fail to transform a person's worldview, or rather, our perspective on it. Lewis Carroll stated, "we are what we believe we are," but it seems that the world is also what we believe it to be. A poem by Shannon L. Alder illustrates this concept:
Confirmation bias is somewhat linked to our memories (similar to availability bias). We have a penchant for recalling evidence that backs up our beliefs. However neutral the original information was, we fall prey to selective recall. As Leo Tolstoy wrote:
Why We Ignore Contradicting Evidence

Why is it that we struggle to even acknowledge information that contradicts our views? When first learning about the existence of confirmation bias, many people deny that they are affected. After all, most of us see ourselves as intelligent, rational people. So how can our beliefs persevere even in the face of clear empirical evidence? Even when something is proven untrue, many entirely sane people continue to find ways to mitigate the subsequent cognitive dissonance. Much of this is the result of our need for cognitive consistency. We are bombarded by information. It comes from other people, the media, our own experience, and various other sources. Our minds must find means of encoding, storing, and retrieving the data we are exposed to. One way we do this is by developing cognitive shortcuts and models, which can be either useful or unhelpful. Confirmation bias is one of the less helpful heuristics that exists as a result. The information we interpret is influenced by our existing beliefs, meaning we are more likely to recall it. As a consequence, we tend to see more evidence that enforces our worldview. Confirmatory data is taken seriously, while disconfirming data is treated with skepticism. Our general assimilation of information is subject to deep bias. Constantly evaluating our worldview is exhausting, so we prefer to strengthen it instead. Plus, holding conflicting ideas in our heads is hard work; it's much easier to focus on just one. We ignore contradictory evidence because it is so unpalatable to our brains. According to research by Jennifer Lerner and Philip Tetlock, we are motivated to think critically only when held accountable by others. If we are expected to justify our beliefs, feelings, and behaviors to others, we are less likely to be biased toward confirmatory evidence. This is less out of a desire to be accurate, and more the result of wanting to avoid negative consequences or derision for being illogical.
Ignoring evidence can sometimes be beneficial, such as when we side with the beliefs of others to avoid social alienation.

Examples of Confirmation Bias in Action

Creationists vs. Evolutionary Biologists

Evolutionary biologists have used the fossil record to show that evolution has occurred over millions of years. Meanwhile, some creationists view the same fossils as having been planted by a god to test our beliefs. Others claim that fossils are proof of the global flood described in the Bible. They ignore evidence that contradicts these ideas and instead interpret it as confirming what they already think.

Doomsayers

In When Prophecy Fails, Leon Festinger explained the phenomenon this way:
Music

Confirmation bias in music is interesting because it is actually part of why we enjoy it so much. According to Daniel Levitin, author of This Is Your Brain on Music:
Witness the way a group of teenagers will act when someone puts on "Wonderwall" by Oasis or "Creep" by Radiohead. Or how their parents react to "Starman" by Bowie or "Alone" by Heart. Or even their grandparents to "The Way You Look Tonight" by Sinatra or "Non, je ne regrette rien" by Edith Piaf. The ability to predict each successive beat or syllable is intrinsically pleasurable. This is a case of confirmation bias serving us well: we learn to understand musical patterns and conventions, and we enjoy seeing them play out.

Homeopathy

Homeopathy itself dates back to the eighteenth-century physician Samuel Hahnemann, but one of its most famous modern defenses came from Jacques Benveniste, a French researcher studying histamines. Benveniste became convinced that as a solution of histamines was diluted, its effectiveness increased, due to what he termed "water memory." His tests were performed without blinding, leaving the results open to a placebo effect. Benveniste was so certain of his hypothesis that he found data to confirm it and ignored data that did not. Other researchers repeated his experiments with appropriate blinding and showed Benveniste's results to be false. Many of the people who worked with him withdrew from science as a result. Yet homeopathy's supporters have only grown in number, clinging to any evidence that supports homeopathy while ignoring evidence that does not.
Scientific Experiments

Confirmation bias is problematic in research as well. Inadequate research programs can continue past the point where the evidence points to a false hypothesis, wasting a huge amount of time and funding. We must not take science at face value, and we must be aware of the role of biased reporting.
Conclusion

This article can provide an opportunity for you to assess how confirmation bias affects you. Consider looking back over the previous paragraphs and asking:
Being cognizant of confirmation bias is not easy, but with practice it is possible to recognize the role it plays in the way we interpret information. You need to seek out disconfirming evidence. As Rebecca Goldstein wrote in Incompleteness: The Proof and Paradox of Kurt Gödel:
To learn more about confirmation bias, read The Little Book of Stupidity or The Black Swan. Be sure to check out our entire latticework of mental models.