
A case for ambivalence: how can we escape our echo chambers?

By Arielle Domb
20th Jan 2021

In a world of polarised opinions and fake news, can we trust our own beliefs?

Two years ago, an 18-year-old high school student was giving a graduation speech to his fellow students. Ben Bowling, the valedictorian, decided to share an inspirational quote, which he attributed to Donald Trump:

“‘Don’t just get involved. Fight for your seat at the table. Better yet, fight for a seat at the head of the table.’” 

The crowd went wild with applause. Located in Pineville, Kentucky, Bell County High School had an overwhelmingly Republican student body; two years earlier, Trump had received 82% of the county's vote.

But then something interesting happened. Amidst the crowd’s laughter, Bowling corrected himself:  “Just kidding,” he said. “That was Barack Obama.” 

The crowd quietened. Some continued laughing, someone booed, others said nothing at all.

Badges of identity

Bowling’s joke reveals something pertinent about how our beliefs operate. Asked afterwards by the Courier Journal about the misattribution, Bowling confirmed that the moment was intended to be light-hearted. “I just thought it was a really good quote,” he explained. “Most people wouldn’t like it if I used it, so I thought I’d use Donald Trump’s name. It is southeastern Kentucky after all.”

While we may imagine ourselves as actively making decisions about what we do and don’t stand for, our beliefs are actually more rigid than we might like to think. This is because our beliefs are an integral part of our identities. They signal to ourselves and others what kind of person we are, and which groups we belong to. “To change your convictions means changing the kind of person you want to be. It means changing your self-identity. And that is not just hard, it is scary,” explains philosopher Michael Patrick Lynch. Straying from our beliefs feels uncomfortable, as if we are betraying others and dislodging a vital part of ourselves. As Lynch puts it: “[n]o one wants to crush their self-image, betray their tribe and be voted off the island.”

To change your convictions means changing the kind of person you want to be. It means changing your self-identity. And that is not just hard, it is scary.

A one-way mirror

But our desire to surround ourselves with people who see the world exactly as we do can become dangerous. While associating with people who share our view of the world is good for our egos and gives us a sense of belonging, it can be detrimental when our views are merely reinforced without challenge.

When like-minded individuals can communicate in spaces where they are insulated from divergent views (like a Facebook news feed), social scientists describe them as being in an echo chamber. As evoked by the metaphor, individual voices bounce off each other and become stronger, amplifying and reinforcing biases within the group.

In the past decade, sociologists have observed how computer algorithms play a part in exacerbating this process. In his book, The Filter Bubble: What The Internet Is Hiding From You, Eli Pariser explains how invisible algorithms predict and reflect back the interests of their users. He cites an example of two people searching for “BP” (British Petroleum) on Google. While one search brought up news related to investing in the company, the other user was shown news about a recent oil spill.

Filter bubbles, as he describes them, serve as “a kind of invisible autopropaganda […] indoctrinating us with our own ideas, amplifying our desire for things that are familiar”. Without knowing it, we can become imprisoned within our own echo chambers: ensnared by information that already aligns with our beliefs, pushing us further away from those who disagree with us. 

More noise, less impact

Social media has put this process on steroids. As NPR’s social science correspondent Shankar Vedantam explains in an episode of Hidden Brain, “it isn’t enough to say your opponents are wrong, you have to say: they are reprehensible.” We see extreme expressions of cultural tribalism all over the internet. Vicious Twitter wars play out every day, sometimes leading to real-life repercussions, like last week’s storming of the Capitol building.

Psychologists have revealed that there is actually an evolutionary reason why we get angry at each other. Outrage, and our ability to dole out punishment, deters wrongdoing, helping to regulate the behaviour of a group. In fact, outrage serves such a useful evolutionary function that we derive pleasure from punishing others, activating areas of the brain associated with reward. But while this instinct evolved in small groups, where the cost of inflicting punishment had to be weighed against the risk of retaliation, we can now aggressively engage with strangers on the other side of the world with little to no physical repercussions.

Social media platforms exploit these vulnerabilities in our brain circuitry, shaping the way we communicate with each other through rewards such as likes and retweets. Because the business model of these platforms is geared around attention and engagement, we are incentivised to attune our language to engage other users as much as possible.

A study conducted by NYU psychology professor Jay Van Bavel found that every moral or emotional word used in a tweet increased its retweets by 15–20%. So if a tweet is packed with emotionally charged language, the message is more likely to spread.

While it might spread further, it has less impact. The study revealed that while outrage leads to more engagement, the people liking and sharing these messages usually hold those political views already. In other words, outrage is very effective at spreading a message within an echo chamber, but it doesn’t necessarily lead to more change.

Every moral or emotional word used increased retweets by 15–20%.

“In some ways, this makes intuitive sense,” says Vedantam. “When was the last time you changed your mind because someone screamed at you?” 

In contrast, Van Bavel’s team found that tweets without moral and emotional language were better able to start a conversation with people from other political persuasions. This is not to say that outrage isn’t important. Anger can be an incredibly productive force for change, as we’ve seen recently with the #MeToo movement and Black Lives Matter. But for a lot of issues, it’s worth thinking about how we are communicating with one another. What are we trying to accomplish? Are we trying to appeal to those who already agree with us, or are we trying to genuinely change minds?

A world constructed from the familiar is the world in which there’s nothing to learn.

It’s also worth considering what our opponents can teach us. Are we engaging with diverse viewpoints, or are we hearing the same views echoed over and over again? Are we listening to people with different political persuasions, or merely shutting them down? Because if we really believe in our beliefs, there’s no use cutting ourselves off from the very minds we want to change, isolating ourselves in a vacuum of like-minded people, devoid of divergence and difference. As Eli Pariser puts it, “a world constructed from the familiar is the world in which there’s nothing to learn.” If our intention is to have an impact on the world, we cannot do so without expanding the parameters of our conversations and engaging with those who see things differently.

How can we escape our echo chambers?

  • Learn about our biases. Research shows that learning about cognitive biases can help us overcome them. We become better at noticing ways our thought patterns are unconsciously influenced, and can then take action to outsmart these habitual responses.
  • Reflect on our convictions. Are we agreeing with something because we’ve truly considered it, or does it merely chime with our pre-existing beliefs? Taking a moment to step back and consider this in isolated instances can help us get better at discerning between tribal values and reasoned beliefs.
  • Pay more attention to our sources. Where are we getting our information? Is it corroborated by evidence? We can reduce the effect of confirmation bias by getting better at analysing the reliability of the sources from which we form our beliefs.
  • Seek out disconfirmation. Listen to the arguments of those who hold differing views, and consider how they would critique our own beliefs. By actively searching for information that contradicts our beliefs, we gain a more nuanced understanding of complex issues.
  • Be empathetic. If we want to change minds, we need to actively listen to opposing views. As behavioural strategist Will Hanmer-Lloyd puts it: “If people feel respected and trusted they are more likely to listen; and if they can find out on their own, then they will have time to process and engage with it without feeling defensive.”