The “Fake News Media”

Social Submission Under the Age of Misinformation

Image Caption: A person in blue jeans and a striped top sitting on a beige couch, holding up a newspaper with the headline “FAKE NEWS” in bright red letters on the front.

Introduction

After the 2016 election, a brand-new phrase dominated public discussion: ‘fake news.’ No one was certain what it was supposed to mean; however, as time went on it became so popular that it entered the public lexicon. Part of the difficulty in discussing ‘fake news’ was the definition itself. Although mudslinging and publicity stunts had been around for centuries, this new phrase pointed to a particular moment in time and a particular understanding of technology to explain its spread. Many factors were eventually blamed for the outcome of the 2016 election, but the idea of ‘fake news’ has never been far from the minds of the public. This essay will attempt to draw a clear definition of what ‘fake news’ is, examine the layers of problems within its folds, and offer possible solutions at this point in time. There is no clear-cut way to solve the issue of misinformation; however, by concentrating on specifics such as emotional reactions, critical-thinking education, the spread and filtering of facts, and groupthink echo chambers, the solutions detailed here may present the beginnings of a resolution that citizens, private companies, and governments alike can act on.

Defining the Problem

One of the primary issues with the term ‘fake news’ is the words themselves. Although the phrase existed well before the world of Donald Trump, since the 2016 election the concept of ‘fake news’ and its effects have whipped the public imagination into a frenzy (Center for Information Technology and Society). It is therefore vitally important that this paper clearly define the term and lay out the boundaries of this essay’s scope. To that end, this paper will use the term ‘fake news’ to denote “fabricated information that mimics news media content in form but not in organizational process or intent” (Lazer et al. 1094). The information covered by this definition is specifically manufactured to cause harm or disorder by sowing confusion about its credibility. The definition also includes social manipulation that attempts to destabilize traditional social and political structures. With these specifications laid out, this essay will break down the main subfields of the ‘fake news’ phenomenon and attempt to present counteractive measures that can be taken in the future.

Human beings are, by nature, emotional. This can be positive, as it helps build relationship bonds, social connections, and communities. However, if emotions are manipulated, the results of that misdirection can be disastrous. Strong emotions, such as fear, have the power to dampen rational thought and lead to high-arousal states that can be unpredictable. In fact, the human fear response can be triggered both by personal experience and through empathy with written and spoken accounts (Smithsonian). When this happens, reason and truth can be lost in the confusion of excitement, and decisions can be made without the context of the reality of the situation. Indeed, similar to fear, “anger can facilitate belief in falsehoods, which might be ‘especially troubling given that anger also depresses information seeking and increases selective exposure’” (Scheufele and Krause 7665). Both anger and fear can lead to similar ‘fight or flight’ reactions that tend to feed off each other and further escalate situations. As there is a very fine line between these related reactions, any piece of media or event that triggers either or both of these response centers can provoke unreasonable backlash that does not follow discernible patterns.

Many issues widely discussed on the internet, and perhaps later dubbed ‘fake news,’ are items that elicit strong emotions, either in the content itself or in reactions to the information presented. High levels of emotion are entertaining to interact with and provide a welcome distraction for viewers. However, “while each of these factors — context, distraction, social learning — have potential to influence the way we experience fear, a common theme that connects all of them is our sense of control” (Smithsonian). When viewers feel they are participating in, and potentially in control of, content, the result can be a feeling of invincibility about their decisions. This mentality is incredibly dangerous, as it can lead to quick decisions made without adequate proof or sound judgment. Human beings often struggle to contain their emotional reactions to events, which blurs the line between what is true and what a person believes is true.

One of the largest difficulties in combating the rise of the ‘fake news’ phenomenon is human capacity itself. The human race is more connected, and moves more quickly, than at any point in its history. Humans were not built to change so fast, and so they have been coping with rapid advancement by applying old patterns and hoping those patterns will adequately compensate. Disastrously, traditional knee-jerk emotional reactions cannot keep up with the advances in the way society interacts at large. Emotional stimuli delivered through these platforms cannot be appropriately categorized, and as such they lead to further confusion and controversy. In fact, recent studies have shown that current students fail to “recognize the possible biases of politically charged tweets…distinguish between a news story and news-like advertisement… [and] one in four (23%) American adults admitted to sharing misinformation via social media” (Scheufele and Krause 7664). American adults struggle to tell truth from falsehood because they lack ingrained tools for distinguishing the two at such a fast pace without bringing their own emotional reactions into play. Without an analytical and balanced assessment of the information they share, they can spread ‘fake news’ through either ignorance or willful negligence of their own gate-keeping abilities.

In addition, there is a large gap between the public’s general knowledge and its epistemic beliefs. Regrettably, this gap can partly be explained by confusion over how truth itself is defined. A common narrative in current media is the presentation of ‘both sides,’ no matter the content of each argument. There is no assessment of the factual backing behind individual disputes; instead, credit is given to any proposed ‘opposing view.’ As further explained by Scheufele and Krause,

“People who ‘[put] more faith in their ability to use intuition to assess factual claims than in their conscious reasoning skills’ are particularly likely to support conspiracy theories, whereas people who believe that empirical evidence is needed to validate truth claims exhibit the opposite tendency” (7663).

Many people are more likely to settle on the most expedient and convenient understanding of an event than to think critically about every detail. It is far easier for members of the public to form their opinion from their preconceptions and biases about an event than it is to change those perceptions. In these cases, if the public’s idea of an event does not correspond to the objective truth, people are less likely to listen to impartial facts than to conform to their preconceived narrative. Such individuals may feel that their understanding of how the world functions is an evenhanded truth and require extensive reasoning to overturn their original conceptions.

Citizens struggle to tell truth from falsehood, and that struggle can turn into confusion when presumably trustworthy sources are instead suppliers of social manipulation. Indeed, “citizens can be uninformed and misinformed all at once… and these factors may influence each other” (Scheufele and Krause 7662). Without a proper understanding of the mechanisms controlling their behavior and reactions, citizens are highly unlikely to change their nature. When there are so many conflicting sources of misinformation and misguidance, it is difficult to identify the true culprit.

Image Caption: A protester holds up a sign in a crowd. The sign is yellow with the caption “STOP talking” encased in a red stop sign.

‘Fake news’ itself has no power without a vehicle to spread its message. Typically, that vehicle is a social media site, where rumors spread easily with little fact-checking. In fact, “about 47% of Americans overall report getting news from social media often or sometimes, with Facebook as, by far, the dominant source” (Lazer et al. 1096). These platforms accentuate a push for increased interaction rates, commonly achieved through outlandish ‘click-bait’ titles, captions, or images. The more ‘likes’ and shares a post receives, the more positive attention it earns, whether through revenue or promotion by the media service itself. Unfortunately, increased recommendation does not come with increased scrutiny of the facts presented in the material. Unlike traditional media, outlets that post ‘fake news’ “lack the news media’s editorial norms and processes for ensuring the accuracy and credibility of information” (Lazer et al. 1094). With no editing, and complete reliance on the social media algorithm, posters may resort to salacious claims and bold lies to boost interaction with no regard for the actual effects their claims have.

Facebook, along with other common social media platforms such as Twitter and YouTube, has built a model that incentivizes rapid, viral social interaction. In fact, “their business model relies on monetizing attention through advertising. They use complex statistical models to predict and maximize engagement with content” (Lazer et al. 1096). The more interaction each post, video, or photo receives, the more advertising revenue it brings in. Although this may be advantageous in traditional marketing schemes, the very features that make the internet so effective can also be the downfall of this type of strategy.

As more people spread false reports, more are exposed and may fall victim to the scam held within. Without proper training, those who encounter fake retellings of events will react on first instinct and share their own thoughts, further feeding the cycle. Because of this, “the average American encountered between one and three stories from known publishers of fake news during the month before the 2016 election” (Lazer et al. 1095). ‘Fake news’ often takes the form of titles and content meant to draw the reader in specifically for an emotional reaction, and because politics is a highly divisive topic, it is also one of the most effective ways to stir a response from individuals. In addition, the hypervigilance of internet users trying to stay on top of all information at all times only adds to the chaos. As further explained, false “information on Twitter is typically retweeted by many more people, and far more rapidly, than true information, especially when the topic is politics” (Lazer et al. 1095). Again, information spreads not because of the truth behind it but because of the emotions it stirs up in individuals, which provoke a reaction and interaction with the material. True information is typically presented in a format that is not as entertaining or emotionally charged, and thus elicits less of a reaction from audiences. In order to combat this peculiarity, this paper will dive further into the intricacies that govern group patterns of ‘fake news’ and information sharing.

An additional phenomenon adding to the struggle for truth on social media networks is the combination of groupthink and the echo-chamber effect. Essentially, these terms describe the tendency of social groups, in the real world or online, to form tight bonds and to develop an isolating narrative that those in the group exaggerate into a false account of events. Through this isolation, groups refine their ideas of the outside world until those ideas get out of hand and spiral into conspiracies full of untruths that everyone in the faction is pressured into believing. These dynamics work by “isolating subpopulations and catering to their idiosyncratic opinions, often giving people the illusion that they are in the ideological majority” (Cybenko 3). When detached cliques have no connection to the outside world other than their own assembly, they tend to believe that they alone hold the truth and that those outside the group must agree with them.

For this paradox to take hold, the group must have pieces of information that reaffirm their pre-held beliefs or ideas. If information received from the outside world does not reaffirm their conclusions, it is immediately discredited. In addition, “individuals are more likely to accept information that appears to follow a logical narrative, that comes from a source they perceive to be ‘credible’” (Scheufele and Krause 7664). Although these sources may not be credible in the grand scheme, as long as a source confirms the biases of the individuals in the group on a consistent basis, it will be considered credible. This is also why many conspiracy theories are compelling. Typically, conspiracies tell irresistible stories with a clear, easy-to-follow plot line, which real-life events seldom provide. Without education in critical reading and thought, many individuals lack the tools necessary to distinguish between a good story and the truth.

Potential Solutions

Image Caption: A person with short, black hair stands in a concrete lot with grass behind them staring deeply into the camera. The person is holding a newspaper that is burning in their hands.

One of the first steps that can be taken to prevent the spread of ‘fake news’ is to de-emphasize immediate emotional reactions to controversial topics. Such a campaign would be multifaceted, as it would have to encourage disidentification with partisan beliefs. The easiest way to accomplish this would be through social rejection of emotionally led information, for example by “disincentivizing expressions of partisan anger and outrage so they cannot be leveraged by disinformation campaigns to exacerbate biased assimilation of information” (Scheufele and Krause 7665). This would be a highly effective strategy for slowing the back-and-forth of partisan politics that only incentivizes further politicization and escalation.

Another potential approach to defusing political emotion would move in the opposite direction. Numerous studies have shown that late-night satirical comedy television is an effective tool for citizens to become better educated about the world around them while also thinking critically about the material being discussed. Because a majority of traditional media coverage is bland, unentertaining, and time-intensive to consume, satirical political comedy would provide a welcome alternative and adversary to the extremism in ‘fake news.’ Interestingly, the research shows that late-night television “represent[s] authentic (real) discourse that breaks through the shell of the real (fake) news revealing layers of social construction, empty symbolism, and simulacra — thus positively affecting the traditional coverage and political discourse” (Amarasingam and McChesney 81). This strange dichotomy of ‘fake’ real news prepares viewers to see all media with a critical eye, priming the brain to question preconceived notions of events presented in other forms of media. While keeping the same emotional tone that appeals to audiences, satirical political comedies effectively combat the underlying layers of fear and anger that would normally bring citizens to a standstill.

Although this next step may seem fairly obvious, its consequences reach across the entire ‘fake news’ system. One of the major faults of the current education system is the lack of critical-thinking skills taught to students at all levels across the United States. As previously discussed, critical-thinking skills are essential for discerning the true from the false; however, many students leave the current US education system without ever developing that ability. This gap is most visible in underserved areas, as “quality scientific information is not only more likely to reach more educated and higher-income audiences, but, when it does, the ability of citizens with higher socioeconomic status to process new information more efficiently can further widen existing gaps” (Scheufele and Krause 7667). Those with less traditional education, commonly lower-income individuals, are less likely to be taught the critical-thinking skills required to take part in civil discussion.

Some feel that these skills are not always necessary for lower-income individuals; however, these populations are also the least aware of their own lack of knowledge. In fact, those with low levels of traditional knowledge are also less likely to defer to scientific judgment, relying instead on their own nonexpert intuition (Scheufele and Krause 7663). Unless the education system provides preventative measures for the individuals most susceptible to ‘fake news’ influence, America will continue to face emotionally driven political decisions and chaos. It is vital that critical thinking and traditional scientific information continue to be taught in schools in an accessible way so that everyone has the same opportunities and the same understanding of how the world works.

There are multiple steps that can be taken to improve the overall filtering of traditional and social media so that the spread of ‘fake news’ is not incentivized. One straightforward option is to “adjust [the] models to increase emphasis on quality information” (Lazer et al. 1096). Instead of focusing on quantity, which is very common for current social media networks that rely on interaction for ad revenue, there should be a move toward the quality of the information being presented. As in traditional media, a focus on the quality of the content itself would increase the chance that good material is endorsed rather than controversial content with no real value.
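To make the contrast concrete, the minimal sketch below compares a purely engagement-driven feed ranking with one that also weights an estimated source-quality score. The `Post` fields, the scoring formula, and the 0.6 weight are illustrative assumptions for this essay, not a description of any platform’s actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # hypothetical expected clicks/shares, 0-1
    source_quality: float        # hypothetical credibility score, 0-1

def engagement_rank(posts):
    """Rank purely by predicted engagement -- the status quo described above."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def quality_weighted_rank(posts, quality_weight=0.6):
    """Blend engagement with an estimated source-quality score.

    The 0.6 weight is an arbitrary illustration of 'increasing emphasis on
    quality information' (Lazer et al.), not a recommended value.
    """
    def score(p):
        return (1 - quality_weight) * p.predicted_engagement + quality_weight * p.source_quality
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Outrageous claim you won't believe", 0.95, 0.10),
        Post("Careful report on local election results", 0.40, 0.90),
    ]
    print([p.title for p in engagement_rank(feed)])        # click-bait first
    print([p.title for p in quality_weighted_rank(feed)])  # quality source first
```

With the same two posts, the engagement-only ranking surfaces the click-bait item first, while the quality-weighted ranking promotes the credible report, which is the kind of re-emphasis the quoted recommendation describes.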

In addition, efforts can be made to filter content with more specificity. There have been multiple instances, especially in recent news, of satirical or informational material being classified as ‘harmful’ despite its original truth-promoting purpose. As further explained, “a more fine-grained classification of information by intent might be especially beneficial in identifying truly fake news from closely related information such as satire and opinion news” (Sharma et al. 21:35). Although this approach can sometimes backfire and clear blatantly racist, homophobic, and hateful content under the cover of ‘opinion,’ this clarification can better specify for censors what content counts as true satire. In fact, to combat this ‘opinion’ phenomenon, “some recent works have considered classification of fake vs. satire news and fake vs. hyperpartisan news, and we believe that this is an important direction for the future” (Sharma et al. 21:35). These further clarifications would better direct filtering software so that more controversial material is not flagged for its subject matter itself but for the quality of the discussion around the content.
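The toy sketch below illustrates the general shape of the intent-based classification Sharma et al. survey (fake vs. satire), using an ordinary text classifier. The handful of training sentences and their labels are invented placeholders; a real moderation system would need large labeled corpora, richer features, and human review, so this is only a sketch of the idea, not a working detector.

```python
# A minimal sketch of fake-vs-satire classification, in the spirit of the
# fine-grained, intent-based classification described by Sharma et al.
# The tiny training set below is entirely made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Secret cure hidden by doctors, share before it is deleted",        # fake
    "Shocking proof the election was decided by a secret cabal",        # fake
    "Area man heroically finishes entire to-do list, nation stunned",   # satire
    "Local cat elected mayor, promises naps for all",                   # satire
]
train_labels = ["fake", "fake", "satire", "satire"]

# TF-IDF features feeding a simple logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

# Predict the intent label for a new, unseen headline.
print(model.predict(["Doctors don't want you to see this miracle cure"]))
```

The point of separating the labels is exactly the one made above: a filter trained to distinguish intent can leave genuine satire and opinion alone while concentrating scrutiny on material that imitates news in order to deceive.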

The last, and most difficult, solution to implement would be to break up groupthink dynamics and prevent echo chambers from forming. Unfortunately, the current setup of many social networks encourages groupthink by recommending only what the platform believes to be directly relevant to a user’s interests. This system creates isolation from diverse thinking and resistance to other points of view. Indeed, “individual misperceptions emerge in group-level processes as part of social networks, and they are embedded in and shaped by societal dynamics (such as political events or campaigns)” (Scheufele and Krause 7667). Intense self-isolation and reluctance to open up strengthen echo chambers and encourage the growth of extremist views without restraint. Instead, social networks should encourage users to branch out and try new ideas, with a focus on the quality of the entertainment offered rather than the quantity. In fact, “diversity of thought and opinion is valued in modern society. Often called ‘cognitive diversity,’ it can counter groupthink and enables better decision making” (Cybenko 3). Human beings are incredibly social and, as previously discussed, easy to manipulate. Even simple exposure to new ideas through media can open the door to less intolerance and more inclusion in social groups.

Conclusion

There is no single answer to the behemoth that is ‘fake news.’ A multitude of interconnected components must all be resolved before the primary work of untangling true from false can begin. Despite this, there is a path to change the way society understands and processes truth. Some measures can be taken by government entities, such as requiring more specific filtering of social media or increasing funding for critical-thinking education. However, a majority of the work that will bring a real solution will have to be done by individuals. There needs to be a change in how citizens educate themselves about the news and how they participate in society as a whole. These conclusions inherently go against the nature of human beings and how they have grown to depend on social groups. Nonetheless, if humans are to continue to mature in an increasingly stimulating world, they must take steps to better consume the media they take part in. There must be concentrated efforts by all individuals to think critically about the media they consume, prevent emotions from clouding their judgment, educate themselves on current scientific findings, and prevent echo chambers by promoting diversity in their circles of relationships. Without these changes, even if governments or private companies take steps, there will be no advancement at the baseline level. For the system of media and communication to truly change, there needs to be an effort from the top down and the bottom up.

References

“A Brief History of Fake News.” Center for Information Technology and Society, UC Santa Barbara, https://www.cits.ucsb.edu/fake-news/brief-history. Accessed 9 June 2019.

Amarasingam, Amarnath, and Robert W. McChesney. The Stewart/Colbert Effect: Essays on the Real Impacts of Fake News. McFarland & Company, 2011. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/bayloru/detail.action?docID=699252.

Cybenko, A. K., and G. Cybenko. “AI and Fake News.” IEEE Intelligent Systems, vol. 33, no. 5, Sept. 2018, pp. 1–5. IEEE Xplore, doi:10.1109/MIS.2018.2877280.

Lazer, David M. J., et al. “The Science of Fake News.” Science, vol. 359, no. 6380, Mar. 2018, pp. 1094–96. doi:10.1126/science.aao2998.

Scheufele, Dietram A., and Nicole M. Krause. “Science Audiences, Misinformation, and Fake News.” Proceedings of the National Academy of Sciences, vol. 116, no. 16, Apr. 2019, pp. 7662–69. doi:10.1073/pnas.1805871115.

Sharma, Karishma, et al. “Combating Fake News: A Survey on Identification and Mitigation Techniques.” ACM Transactions on Intelligent Systems and Technology, vol. 10, no. 3, Apr. 2019, pp. 1–42. Crossref, doi:10.1145/3305260.

Smithsonian. “What Happens in the Brain When We Feel Fear.” Smithsonian, https://www.smithsonianmag.com/science-nature/what-happens-brain-feel-fear-180966992/. Accessed 10 June 2019.
