Views expressed in this blog are mine alone and are not intended to represent Williamson County policy nor intended as legal advice.
We’ve all seen some pretty wild speculation about the source, validity, and proper response to COVID. Some of it is patently absurd (it’s a master plan by Bill Gates and the Chinese to implant us all with 5G tracking chips). Some of it is cleverly cloaked in reasonable terms invoking the names of distinguished research facilities and universities (gargle with warm water and check for lung function by holding your breath). Still other claims are peddled so fast and furiously over social and broadcast media that their source, veracity, and worth are lost.
The question that always comes to mind for me is how are people taken in by disinformation? How does an otherwise reasonable person share posts and tweets that have little to no foundation in truth? Who, ultimately, do we believe?
As I am wont to do in these situations, I did some research. In March of 2019, the US State Department published a study entitled “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age”. While this document has an obvious focus on foreign actors, it does provide some solid information on how and why people respond to disinformation.
There are psychological factors that grease the success of disinformation: the need to belong, the appeal of simple answers to complex topics, and the satisfying feeling of exposing a hidden truth.
The need to belong is a prominent factor. Many studies have shown that people seek a feeling of community through identity, commonality of belief, and social belonging. Sounds familiar, doesn’t it? Most of us have been in a situation – the first day at school, a new job, a new city – where we find ourselves adrift from the social order around us. We are, as human beings, distinctly social animals. Even those of us who are introverts still need to belong; we just look for it in differing amounts and types.
The Internet and social media have gone a long way towards creating community across widespread distances. Indeed, it’s one of the things that the Internet does well. It brings together people who might otherwise never be able to meet someone with a shared interest, similar identity, or belief.
The report asks, “So how does this apply to the resonance of disinformation and propaganda?” In large part, it’s confirmation bias. We are drawn to information that supports what we already believe. That’s information that connects with us on a personal basis and supports the picture of ourselves we want to project. So, we seek out sources that support our systems of belief and avoid or mock those that conflict with our beliefs. We share what we find with our social media connections to enhance our social standing and public face.
Social media, in particular Facebook, is very effective in creating an echo chamber effect. We only hear and see what we already believe. This bolsters our need to belong because we are “surrounded” by people who share our beliefs. In an article published in May 2016, the Wall Street Journal created a way for people to view a single topic in a Red v Blue fashion. It’s an eye-opening chance to see how different a typical news feed is depending upon its political stance.
What it shows is that social media targets what you see based upon what it thinks your interests are. The algorithms Facebook uses have changed multiple times since that article was published, but their results have changed little. Take a look at “Ads” under Settings on your Facebook account. It’s divided into Your Interests (15 subcategories), Advertisers and Businesses (5 subcategories), Your Information (2 subcategories), Ad Settings (which allows you to control who can see your info), Ad Topics, and a section on how Facebook Ads work. Ever wondered why you’ll move from Google to Facebook and see ads for things you just searched? In part, it’s because of your Ad Settings (Google has its own settings). Your Interests are things that Facebook thinks you’ll want to see posts about. It’s these interests that are shared with advertisers and exploited by people who spread disinformation.
How does this all tie into the need to belong? We treat the social networks that support our beliefs and identity as news sources. Basically, we’re more likely to share something from a trusted friend or someone we have something in common with. But that news source is only as good as the person posting it. If they share unverified or low-quality information, we become more susceptible to disinformation.
The takeaway here is the old adage: “trust but verify”.
The next thing the study talks about is the “firehose of information”. We are inundated with information in our daily lives – social media, cable news, email, online news of all kinds. Sorting through that information is a daunting task. “The volume of information, combined with people’s limited attention spans, also makes it difficult to discern between high- and low-quality information, creating an environment in which disinformation has the potential to go viral”.
Bluntly put, due to the massive volume of news, we tend to ignore what we don’t understand and dismiss what contradicts our biases.
We also, as human beings, tend to want answers in uncertain situations. It’s part of the reason you’ll see news media come up with reasons for what’s going on long before the experts will say anything. It’s not that the experts are hiding anything; it’s that the very nature of complex events requires complex understanding and responses. Those don’t come quickly. We want answers, and we want them quickly. So quickly that we’ll seize upon anything that fits our underlying criteria. All too often, this promotes black-and-white thinking, extremism, and polarization.
So, on the one hand, we have an incredible volume of information available to us. On the other, we have a natural tendency to want clear-cut answers in a crisis. Together with cognitive bias, this creates a sense of naïve realism. The study defines this as a belief “that [an individual’s] perception of reality is the only accurate view, and that those who disagree are simply uninformed or irrational.”
How do we combat our natural tendencies and the purveyors of disinformation?
Fact-checking has a limited effect. In one study, “researchers assessed 330 rumor threads on Twitter associated with nine newsworthy events … to understand how people interact with rumors on social media. Their analysis determined that users are more active in sharing unverified rumors than they are in later sharing that these rumors were either debunked or verified. The veracity of information, therefore, appears to matter little. A related study found that even after individuals were informed that a story had been misrepresented, more than a third still shared the story”.
Why are we less likely to send out a correction to our previous post? It’s the same reason that newspapers, when they print retractions, generally bury them somewhere in the middle of section A. No one wants to admit that they were wrong or taken in by false information. Worse, the correction may contradict our biases, so we dismiss it as false or misleading.
It takes a concerted effort to be a watchdog. It requires both a healthy sense of skepticism and the time and knowledge to verify before smashing the share button. Despite the long-term effort to make us doubt the news media, we still seek out information through television, print, or online news sources. It’s just that today, more than ever before, we can find sources that match our biases, no matter what they may be. And that effort to increase our distrust of the media means we only trust the sources that meet our expectations. If you watch Fox News, you don’t trust MSNBC, and vice versa. This distrust of media outlets isn’t healthy skepticism. It’s far more akin to polarization.
Healthy skepticism, particularly regarding social media, means we should know a little bit about the media landscape. We need to know the veracity and bias of a source before we choose to share. The problem is that this takes time and effort, neither of which is generally associated with our online lives. Part of the appeal of social media is the ease with which we can communicate and share information. Slowing down for a couple of minutes to check out a headline is not how we’ve been trained to use it. And that’s the crux of the problem.
We can no longer simply trust our news sources to provide us with the unvarnished truth. Back in the day, when I got my degree in Journalism, the Holy Grail of the journalism world was objectivity. We were trained to be objective, to make every effort not to insert our own beliefs and agenda into what we wrote. The facts, and just the facts, were all that was needed. Our job was to provide information and let the reader draw their own conclusions. Opinion was restricted to personal columns and the Op-Ed page.
That’s not true anymore.
In this time of COVID-19, we’ve been inundated with conflicting messages. We want quick, simple answers to a massively complex problem. We are required, now more than ever, to be skeptical of what we hear and think critically before responding. Our very lives, the lives of our families and communities, depend upon it. With a pandemic raging around us, we cannot afford to be complacent and trusting of our sources just because we have relied on them in the past.
What we can and must do requires us to be active participants.
Be skeptical of everything you see on social media that claims to be news.
Be aware of your biases and those of your sources.
Be knowledgeable about sources that are clickbait.
Think before you share a post:
Is it credible?
What is the source?
Is this newsworthy or sensationalism?
If I copy and paste this headline into Google, who else is reporting the same thing? (Never click the link; that’s what they want.)
Does this support a hidden agenda?
Be critical of those who share disinformation:
Give them accurate information.
Put them on a 30-day snooze or mute.
Report them if they continue.
Our democracy requires our participation to survive. To participate, we must be well-informed. To be well-informed, we must be skeptical and critical in our consumption of media. We must work, every day, to understand our biases, to address our prejudices, and to understand the world around us, not just from our own perspective, but from that of others.