The Problem with Fake News

Part I: Disinformation and Misinformation: Why Do They Spread So Quickly?

The term “fake news” has become a common response when a person hears news he or she doesn’t want to believe. Although media experts initially defined fake news as “factually false information, delivered in the context of a supposedly true news story, that is deliberately designed to deceive readers or viewers,” today the term also includes disinformation campaigns delivered through the Internet and on social media (“Fake News on Social Media”).

Before getting into a discussion of the causes of fake news, it is important to understand the difference between misinformation and disinformation. Valerie Strauss, writing for the Washington Post, defines misinformation as “false information that is spread, regardless of whether there is intent to mislead.” Disinformation, on the other hand, has a very different meaning: “deliberately misleading or biased information; manipulated narrative or facts; propaganda” (Strauss). To sum up the difference, viewers must look at intent.

For example, if a person spreads misinformation, he or she most likely believes that it is true. Disinformation, however, is purposely crafted and spread with the intent to mislead others. The distinction can be confusing because disinformation can easily become misinformation, depending on who is sharing the information and why it is being shared. If a politician purposely spreads false information, that is disinformation. But if an individual sees this information, believes it to be true, and then shares it with friends, it is misinformation.

Although the idea of fake news or disinformation isn’t new, the problem is becoming increasingly prevalent with the popularity of the Internet and social media. Many factors contribute to the growth of disinformation and misinformation, including troll farms, the structure of social media, and human nature.

Cause #1 — Troll Farms

One reason disinformation spreads so quickly today is that Internet search engines and social media sites have become the targets of organized groups that create and spread disinformation online.

Known as “troll farms,” these groups exist in many countries, but most notably in Russia. For example, the Internet Research Agency, based in Olgino, Russia, employed roughly 200–300 people. According to the report of Special Counsel Robert Mueller, this Kremlin-linked troll farm reached up to 126 million Americans on Facebook during the last presidential election through fraudulent accounts, groups, and advertisements (qtd. in Snider).

These troll farms generate online traffic aimed at influencing public opinion and spreading both misinformation and disinformation. Russians were sent to the United States to build the infrastructure that allowed these farms to create social media accounts that looked as if they originated in America and were owned by Americans.

Clint Watts, a senior fellow at the Foreign Policy Research Institute, asserts that the goal of these troll farms is to “create divisive wedges, pitting Americans against each other. They create these conflicts and report on them in an overt way on state-sponsored Russian media about how unstable America is. The goal is to undermine democracy. So you want America to look unstable and Americans not to trust each other” (qtd. in Snider).

This video explains how Russian troll farms attempted to interfere in the 2016 U.S. election.

Cause #2 — Social Media


As Robert Mueller’s investigation suggested, troll farms effectively exploited political tensions in the United States to spread disinformation over social media sites such as Facebook and Twitter. Posts on specific, divisive, and controversial topics inflame disagreement among viewers. The resulting polarization makes viewers easier to manipulate. Troll farms can also exploit the algorithms social media sites use to select content. Because these algorithms insulate viewers from opposing viewpoints, users may never have a chance to see the other side of an issue. This is called the “echo chamber effect”: viewers who see only one opinion will begin to believe it is the dominant political opinion, even if it isn’t.

This video explains how Facebook plans to fight misinformation in the 2020 election.

Another problem with social media and disinformation is the debate over whether sites like Facebook are publishers or platforms. If these sites are considered publishers, they can be held legally responsible for the content their users post. If they are merely platforms, they are not responsible for any content published on their sites. Facebook identifies itself to the public as a “platform”; in court proceedings, however, it has described itself as a “publisher.” Controlling the spread of fake news on social media will require answering this question.


Cause #3: Human Nature

A final reason why misinformation and disinformation spread so quickly boils down to human nature. A recent study of disinformation showed that people tend to share disinformation 70 percent more often than they share factual information (Soroush). The study, conducted by a group of MIT scientists and published in the journal Science, found that “Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information” (Soroush). The researchers also found that it took the truth six times as long as a false statement to reach 1,500 people. More surprisingly, these falsehoods were spread not by bots but by ordinary people.


Understanding the terminology and some of the reasons why disinformation and misinformation are spreading so quickly over social media is the first step in addressing the problem. The effects of disinformation campaigns are dangerous as the next section of this report will illustrate.



A decade ago, Facebook was seen as the platform that would help spread democracy across the globe. Google was making access to information easier than ever, and social media bloggers and reporters were encouraging protests in countries from Iran to Tunisia. By 2017, however, it had become clear that the Internet was not improving democracy. The same social media platforms that helped galvanize protesters in 2011 were now harming democracy by spreading misinformation, disinformation, and propaganda. Through the repetition of “big lies,” susceptible people can be persuaded to accept dangerous ideas.

The Center for Information Technology and Society at UC Santa Barbara states that disinformation is dangerous because these campaigns distract people from other important issues that are never solved, they intensify social conflict to “undermine people’s faith in the democratic process and people’s ability to work together,” and they “undermine the functioning of democracy globally.”

Disinformation Disrupts and Detracts

This video illustrates the effect of fake news on the real world.

First, disinformation and fake news are designed to disrupt American life and distract citizens from more important issues. These campaigns have been used to fuel conspiracy theories after major tragedies, such as mass shootings. Russian agents and other political operatives fed fake news to conspiracy theorists, usually claiming that the shootings were staged by gun control fanatics. People then shared these messages over social media, even when they knew the stories were not real. Because people look for conspiracies in every tragedy, these theories have spread through social media after many mass-casualty events, including the Boston Marathon bombing and the Sandy Hook massacre, and the campaigns have succeeded in distracting Americans from the real issues of gun safety and stricter gun control laws.

Disinformation Can Intensify National Conflicts

Second, disinformation campaigns can use fake news to intensify national conflicts in the United States. Politically motivated fake news can be spread by foreign governments, American political groups, and conspiracy theorists. Although each of these groups may have different motivations, the result is the same: fake news spread to intensify social conflict.

The ads these campaigns produce target controversial hot-button issues such as race, Black Lives Matter, gun control, and immigration. The Russians even used Facebook events to organize protests and counterprotests over the same issue, resulting in Americans fighting one another over controversial topics. These events were made up so people would “believe them, show up, and make trouble” (“Danger”).

Disinformation Threatens the Democratic Process

This video explores the effect of fake news on politics.

Finally, disinformation threatens the democratic process, as was clearly seen in the 2016 election. Since over 62% of Americans receive their news from social media, it is extremely important to ensure that this news is not fake (Schiffrin).

For example, Russia’s Internet Research Agency proved remarkably effective at intensifying social conflicts during the 2016 election. The troll farm purposely spread fake news that favored Donald Trump and discredited Hillary Clinton, using paid Facebook ads to turn Americans against each other. The Center for Information Technology and Society stated that “The U.S. Congressional Intelligence committees responsible for investigating fake news have released 3,500 of these advertisements to the public.” This suggests a well-orchestrated campaign that affected the United States election results.

Fake news undermines trust in real news and in the government. The implications of this for democracy are enormous. Schiffrin argues that “Democracy rests upon the assumption of an educated populace; this is part of why public education is so important.” For Americans to participate in democracy, they need to understand the issues; otherwise their voting decisions may be arbitrary or based on propaganda or pandering. Schiffrin continues: “If misinformation and fake news campaigns truly do frustrate citizens’ attempts to educate themselves — or, even worse, actively manipulate citizens into believing false information — then the very foundations of democracy are at risk.”


Disinformation campaigns are growing more sophisticated: trolls and bots are being replaced by human beings who amplify these messages and even generate new campaigns on their own. This is especially dangerous because these people believe the disinformation they are spreading. Understanding the changing nature of these campaigns is critical to preparing for the next wave of disinformation, including campaigns targeting the upcoming election.

