Confessions of a Former Conspiracy Theorist

Recently, my eleven-year-old daughter and my nine-year-old son approached me, brandishing a dollar bill. Wide-eyed, they pointed to the pyramid and the eye on the back of it and informed me that this was an insignia of the Illuminati. I laughed a bit uncomfortably and asked where they had heard that term. The answer? YouTube videos. I can honestly say I saw that one coming. According to a 2013 World Economic Forum report, 48 hours’ worth of content is uploaded to YouTube every minute. That’s a lot of “information.”

A similar incident happened over Easter weekend, but it was more alarming. The holidays are a time of fun and family togetherness. This Easter, for me, it was also a time of having strange arguments with my family, namely, my father. As it turns out, my father believes the Earth is flat. He also believes in FEMA camps, the Illuminati “culling the herd,” and chemtrails. With what I felt was a strange glimmer in his eyes, he told me about how amazing YouTube is, where he is able to learn these so-called truths from Average Joes publishing videos on their own channels.

Yes, amazing, Dad. Thank heavens for YouTube.

One of the most alarming and unnerving aspects of social media is the way rumors and questionable or false information, particularly conspiracy theories, spread like wildfire. According to the World Economic Forum, massive digital misinformation is one of the main risks for modern society. “Our hyper-connected world could… enable the rapid viral spread of information that is either intentionally or unintentionally misleading or provocative, with serious consequences” (“Digital Wildfires…”). This is a valid concern. Why does misinformation travel so far and so effectively, and what, if anything, can be done to stop it?

Look at how much traction Dr. Andrew Wakefield’s later-disproven anti-vaccine study gained; remnants of it are still swirling across social media today. Almost twenty years later, despite the study being completely debunked, some parents are still unwilling to vaccinate their children for fear of autism or other reactions. After the story became widespread, it caused panic among parents, “driving MMR uptake percentages down as much as 30 percent and increasing measles cases more than 18 times” in the U.K. (McIntosh White 79). This is just one example of the harm that can come from the spread of false information. It can literally lead to a public health crisis.

How does this type of information spread so prolifically? We can thank the World Wide Web. As of 2013, Facebook had a reach of more than 1 billion active users, and Twitter, 500 million (“Digital Wildfires…”). Before, to have the world as a potential audience, one would likely need to be published in some way. Now, anyone with Wi-Fi can speak their piece loud and clear to everyone else with a connection. Logical fallacies and confirmation biases abound. People with common beliefs tend to gather together and reinforce one another’s beliefs, forming a virtual echo chamber. A 2015 research article that studied the Facebook habits of 1.2 million people found that “polarized communities” developed, in which people generally consumed only the information that most reinforced their beliefs:

Everyone on the Web can produce, access, and diffuse contents actively participating in the creation, diffusion, and reinforcement of different narratives. Such a large heterogeneity of information fostered the aggregation of people around common interests, worldviews, and narratives (Bessi et al. 2).

I suppose it is not hugely surprising that people prefer to hear the things they already believe to be true. People are creatures of comfort, after all.

Another phenomenon that digs conspiracy theorists deeper into their theories is confirmation bias. “A confirmation bias is a type of cognitive bias that involves favoring information that confirms previously existing beliefs or biases” (Cherry). Essentially, when a person believes something is true, they pay closer attention to the facts and events that support that belief and tune out anything that does not. This runs rampant in conspiracy-theorist circles. People will dig up articles and stories that support their beliefs and accept them as confirmation, while disregarding anything that contradicts them. Look at the issue of gun control. A staunch believer that the government wants to take away everyone’s right to bear arms will cite stories of guns saving lives, and of politicians who favor gun control, as evidence, but will not consider statistics showing that guns are responsible for many accidental deaths. They will generally write that information off as part of the conspiracy and discredit it.

But what about when false information spreads and then is corrected? Surely, when people learn that what they believed was based on false premises, they would be willing to reconsider, right? Not necessarily. Sources say that not only does debunking usually fail to dispel misinformation, but it can actually make people believe the false information even more strongly, a phenomenon called the “reinforcement effect” (Bessi et al. 1).

This was the case at my father’s house over Easter. I was definitely lacking a strong counter-argument that day, as I had not come to celebrate the holiday prepared to give a speech detailing evidence of life on a round planet. But for anything I did say to him (“What about the photos from the Space Station?” “That’s what they want you to believe, Elizabeth; we have never been to space.”), he had a strong counter that this was what I was supposed to believe, as determined by some group of elites. I see political and scientific arguments devolve this way on social media, too. According to Bessi et al., “Conspiracy theses tend to reduce the complexity of reality by explaining significant social or political aspects as plots conceived by powerful individuals or organizations.” Basically, people can explain away any plot holes with the conspiracy itself.

People often take this stance in arguments that reject scientifically supported facts. Take, for instance, people who argue there is in fact a cure for cancer, one which is being withheld by the government and “Big Pharma.” Never mind that those very employees of the government and the pharmaceutical companies, or their loved ones, could also get cancer, and they too would stand to benefit from the eradication of the disease. No way, it’s a conspiracy. There is always an explanation, and anyone who doesn’t believe it too is one of the brainwashed “sheeple.”

So what is the answer to this problem of rampant misinformation being accepted as truth? Some people might argue for some form of censorship. I feel there is no way that would be acceptable. There are both practical and ideological reasons this is a bad idea. It violates our First Amendment rights, and it takes away our ability to make personal decisions. It would be difficult to establish a standard for what to censor that is thorough, fair, and consistent. Things would slip through, too, and kids would still sneak around, find the forbidden material, and be exposed anyway. Censorship is just not the way to solve this problem. Knowing how to grapple with new information is the key. Educating people in media and information literacy is clearly the answer.

Teaching people how to process information in a smart way is something that would make a difference. It did for me. The funny thing about me is that the apple doesn’t fall far from the tree. Not only is my father a conspiracy theorist, but I once was too. It’s something I don’t always tell people because it is embarrassing, but I think it is a testament to how easy it is to get pulled into the world of conspiracies and fear.

Back when I first got on Facebook in 2009, I fell into a bad virtual crowd. I’d always had some amount of cynicism about politics, as many people do. Some friends close to me posted a lot of links about the terrible things our government was inflicting upon the nation, the world even! The more I saw, the worse it got. I was appalled. FEMA camps were being built across the country; Obama was not really born in the U.S. and wanted to declare martial law, become President forever, and take our guns. Gay rights battles like the right to marriage were merely distractions meant to create conflict among the people of the country so that the government could grab more of our rights away from us. As I mentioned, vaccines were evil, injecting our babies with cancers and deadly toxins, and I even began to wonder if I had done the wrong thing by having my own kids vaccinated.

It was so fast and so easy to become swept up in this way of thinking. I do not consider myself a stupid individual, but there I was, believing unreliable sources that all cited one another, with little more than anecdotal data to back up most of the claims they made. I fell into the cycle of logical fallacies and confirmation biases. I began to have rifts with my other friends, who could not believe the things I was sharing on Facebook. When they tried to argue with me, I would sneer and say those words: “That’s what they want you to think.” I had a comeback for any argument, no matter how well thought out it seemed, no matter how uncomfortable it actually made me feel. I got pretty deeply dug into this viewpoint.

So what was it that brought me back from the deep end? For me, it was going back to school. At that point, my career in food service had stalled out, and life had led me to the conclusion that school would be beneficial to me. Out of my element and nowhere near my usual comfort zone, I enrolled in classes for the spring. I was taking English 1551 that semester, and I learned how to write a research paper, something I had never done before. Research was new to me, and it was fascinating to look up a topic in the library database and find all the journal articles about it.

The professor talked to us about ways to tell whether a source was reliable and usable in a paper, and I began to realize that I had not been getting good information. The websites I had been reading used heavily biased language, obviously written to sway a person’s sentiment in a certain direction or to excite readers by reaffirming their already-held beliefs. The quality of the writing was often poor, with bad grammar and spelling. The people who wrote the articles had no real credentials in the areas they were discussing, if any credentials at all. There was never any actual evidence-based, scientifically gathered data; just a circle of confirming sources, anecdotal evidence, and arguments full of logical fallacies.

I realized by the time that I finished that first paper that I had been wrong. I had let myself get duped into believing things that could actually be harmful to me or other people. I had learned how to better check my sources. I had learned information literacy.

There are different terms to describe this type of thinking – information literacy, media literacy, news literacy, critical thinking. They all fall under the same umbrella of learning to assess information in an intelligent way. “People deal with information constantly during work, leisure, civic, social, and academic activities, and they need to be able to decide the validity of information given, the bias of the conveyer of the information, and the meaning of that information” (Martin 268).

Efforts to teach information literacy have been somewhat scattered since they began in the 1970s. And when efforts have been made, they were often “protectionist,” an approach “which sees media education primarily as a way to protect children from bad messages” (Heins and Cho 5). No consistent nationwide curriculum for information literacy or critical thinking has been put in place in the U.S. These skills go largely untaught, and the kids grow into adults who share conspiracy theories on Facebook and make YouTube videos about the Walmart in your hometown being turned into a FEMA camp.

A few months ago, I let myself get sucked into a Facebook thread argument. I can’t even remember what it was about. But the gentleman began to post links in the thread, and they were to sources with names like Abovetopsecret.com, Commonsenseconspiracy.com, and Beyondnews.com. I have since learned not to get pulled into debates like this. People who think this way turn everything you say into a reaffirmation of their beliefs; they argue with unreliable sources, logical fallacies, and anecdotal evidence. I told the man that he needed to learn how to better discern accurate sources and linked him to a page from a university writing department on how to do so. I’m certain he did not take my advice.

I absolutely believe if most people were taught how to tell the difference between reliable and unreliable sources, they too would come to expect higher quality from their information. It is what helped me. When I was a conspiracy theorist, there was nothing anyone could say to me in any argument that would change my mind. I had to learn to ask the right questions and then find the answers in an intelligent way. I had to learn to listen, with an open mind, to information that conflicted with what I wanted to believe.

Adding media/information literacy training to a child’s basic education here in the U.S. would make a difference. It might take a while to manifest on social media, and by then, who knows what medium will be the new big thing? Facebook might go the way of Myspace, once the biggest social media website, and some new phenomenon will have taken its place. But whatever the medium, teaching people how to tell reliable sources from unreliable ones would help make social media a smarter place. More importantly, it could make the world a better place.

I’m not sure how to handle things with my father. He is nearly 70 years old now, and I’m not sure there is any way to teach him to think differently at this point. Sometimes, you have to know when to cut your losses and move on. I’m trying to work on media literacy with my kids, though. We talk about things like who publishes those YouTube videos and how the creators’ beliefs might influence what they say. I plan to teach them about using good sources and triangulating the information they find with other sources to see if it is consistent.

Sometimes, trying to fight this battle on social media feels like facing a dragon I cannot slay. Often, I bite my tongue and try to pick my battles wisely. Sometimes I post links with checklists for how to analyze media and sources. I’m not sure they get read, as they are not as flashy as the conspiracy theory headlines. I often get a snarky comment about the MSM (mainstream media). To those who believe these theories, I am probably a sell-out. I lost some of my friends when I came around and stopped believing. But my other friends, the ones who liked me enough to see past my silly little phase, have welcomed me back to logic with open arms and slightly smirky smiles. We talk about it as that time I lost my mind. It is embarrassing that I fell for it, but it is also a testament to how ordinary people become believers in conspiracy theories. Hopefully, as social media ages, it also matures somewhat, and eventually people will learn how to use it better, with more wisdom.

Works Cited

Bessi, Alessandro, Mauro Coletto, George Alexander Davidescu, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi. “Science vs Conspiracy: Collective Narratives in the Age of Misinformation.” PLOS ONE, 23 Feb. 2015. Web. 6 Apr. 2016.

Cherry, Kendra. “Why We Favor Information That Confirms Our Existing Beliefs.” About.com Health. About Health, 27 Oct. 2014. Web. 19 Apr. 2016.

“Digital Wildfires in a Hyperconnected World.” Global Risks 2013. World Economic Forum, 2013. Web. 6 Apr. 2016.

Heins, Marjorie, and Christina Cho. “An Alternative to Censorship.” Fepproject.org. Free Expression Policy Project, 2003. Web. 24 Feb. 2016.

Martin, Crystle. “An Information Literacy Perspective On Learning And New Media.” On The Horizon 19.4 (2011): 268-275. Education Research Complete. Web. 24 Feb. 2016.

McIntosh White, Judith. “Sabotaging Public Engagement with Science: Missing Scientific Principles in Newspaper Stories about the Wakefield MMR-Autism Controversy.” Romanian Journal of Journalism & Communication, 2012. Web. 6 Apr. 2016.
