Abstract
When addressing the spread of misinformation, online media literacy programs have mostly emphasized an approach that ignores the influences of individual emotions and social goals. When online, however, users' emotions are manipulated and users are steered toward fulfilling social needs, which distracts from the obligation to fact-check. A metacognitive approach would provide a more complete online media literacy, helping users become aware of the social and emotional basis for their online behaviors, with the goal of becoming mindful of the information they access and share.
Keywords
Cognition, Emotion, Metacognition, Mindfulness, Misinformation, Online Media Literacy, Social Motivation
Introduction
Online media literacy programs have mostly emphasized evaluating the “facts, sourcing, and verifiability” of online content (Currie Sivek, 2018). This strategy ignores the influences of social motivations and emotions. However, emotions are manipulated online (Grabe & Myrick, 2016; Laybats & Tredinnick, 2016; Jones et al., 2016; Boler, 2019; Boler & Davis, 2018; Middaugh, 2019), and users are directed to meet social needs when online (Lee & Ma, 2012; Leung, 2013; Ng & Zhao, 2020). As a result, users are less likely to focus on their obligation to evaluate content before sharing (Pennycook et al., 2021). There have been some calls for media literacy to incorporate the role that feelings play when online. For example, Middaugh (2019) calls for a media literacy that pays attention to both “emotional and factual elements.” While researchers are not making the same plea for social motivations, there is some evidence showing how feelings motivate online behaviors that serve social needs (Brady et al., 2017; Majmundar et al., 2018; Boler & Davis, 2018). One approach, based on a suggestion by Currie Sivek (2018), is to teach mindfulness techniques so that users become aware of the constant, unconscious interplay of “emotion and cognition” when “processing news.” A metacognitive process could help people become aware of the social and emotional basis for their online behaviors, with the goal of becoming mindful of the information they access and share when online.
Background
Misinformation is harmful to civil society because “inaccurate information, rumors, and conspiracy theories” (Barzilai & Chinn, 2020) can make it difficult for citizens to meet their basic obligations to be informed, make decisions, and be aware of how those decisions affect others (Middaugh, 2018). Misinformation also travels faster online than through mass media. Before the internet, citizens accessed news that was fact-checked and filtered through print and broadcast news organizations (the legacy press). Through the internet, citizens can access and share a virtually unlimited amount of information, including news provided by the legacy press, unvetted information from well-intentioned non-experts, opinions, and content that can be purposely misleading, factually incorrect, or even outright false. Examples of the latter include disinformation, fake news (fabricated stories made to look like real news stories), and astroturfing (fake grassroots campaigns using online personas made to appear as if they are real people).
Much of this information is curated via algorithms designed to generate clicks, shares, and likes regardless of the veracity of the content. When individuals click, share, and comment, they become secondary gatekeepers because they decide which content is worthy of being consumed by others (World Economic Forum, 2013; Hermida, 2020; Singer, 2014). The spread of misinformation online has been blamed, for example, for vaccine hesitancy (Gyenes & Mina, 2018), rejection of climate change evidence (Lewandowsky et al., 2012), harmful personal decisions during the COVID-19 pandemic (Satariano, 2020), and the widespread belief that the result of the 2020 election was illegitimate (Seitz, 2022; Frenkel, 2020).
In response to the need to help individuals cope with their role as informational gatekeepers, online media literacy has focused on an approach that helps them verify and fact-check information they acquire online before sharing it. However, that route does not account for the emotions and social motivations people experience when online. A possible solution to improve information sharing is a metacognitive process that helps people become aware of their social and emotional state of being before sharing information.
Current Online Media Literacy
Online media literacy largely focuses on an approach that doesn’t seem to account for the role that emotions and social goals play in online engagement. For example, Valtonen et al. (2019) found that online literacy programs focused on some form of critical thinking, and Currie Sivek (2018) argued that such curricula tend to underscore evaluating the “facts, sourcing, and verifiability” of online content (see Figure A below). Barzilai and Chinn (2020) organized educational responses into “four lenses” based on a model of epistemic thinking. However, there is evidence that emotions and social motivations also play a role in online behavior.
The Role of Emotions and Social Motivations in Online Behavior
Research in the fields of neuroscience and psychology shows that emotions contribute to our decisions, judgments, and choices generally. Neuroscientists have used the somatic marker hypothesis (Damasio, 1994, 1999; Reimann & Bechara, 2010) to explain that role. According to the somatic marker hypothesis, when humans experience something, changes occur in their somatic or visceral state (for example, heart rate, body temperature, and hormonal changes) that are marked by the body as positive or negative feelings. When they later encounter (or imagine) a similar situation, that somatic marker is automatically and non-consciously recalled, and their evaluation of the situation is driven by the associated feeling. Psychologists have developed the “affect heuristic” (Slovic et al., 2002) and the risk-as-feelings hypothesis (Loewenstein et al., 2001) to explain that role. According to the affect heuristic, when humans encounter stimuli, such as words, images, sounds, or smells, they immediately experience a positive or negative feeling, and the decisions they make are biased by the feelings associated with those stimuli. According to the risk-as-feelings hypothesis, cognition and emotion play complementary yet independent roles in risk-related decisions because each responds to different inputs: while rational thinking considers probabilities, emotions respond to the vividness and proximity of possible consequences.
Other researchers have discussed the role that emotions play online specifically. Jones et al. (2016) argue that “viral content…evokes high-arousal emotions such as joy or fear.” Laybats and Tredinnick (2016) contend that emotional content has a greater likelihood of being shared or opened “before rational second thoughts” kick in.
Middaugh (2018) found that while users understand how to find reliable information when online, “factual accuracy” is rarely considered when sharing online content. Not only does emotion play a role in our online behaviors, it can also make it difficult to discern fact from fiction. For example, Martel et al. (2020) found that a reliance on emotions increases the likelihood of “incorrectly perceiving fake news as true.”
Social motivations also play a role. LaRose et al. (2001) reported that social outcomes (interactions, communications), among others, positively correlated with internet usage. Lee and Ma (2012) discovered that people share news online to socialize (build and maintain relationships) and to seek status. Finally, Ng and Zhao (2020) learned that both surveillance and prosociality motivated online behaviors. This focus on social goals can distract us from our obligation to fact-check. For example, Pennycook et al. (2021) posit that while we want to share accurate information, the “online context” focuses our attention on social goals such as to “attract and please followers/friends or to signal one’s group membership.”
Often, though, the roles of social needs and emotions work together. Brady et al. (2017) found that “moral-emotional” words increased the diffusion of online messaging that was “bounded by social membership.” Majmundar et al. (2018) discovered that people retweet to “show approval, argue, gain attention.” Similarly, Boler and Davis (2018) recognized that people may gain a sense of “esteem and belonging” when receiving “likes” for their online posts and comments, creating an “affective feedback loop” that encourages people to stay online longer and to return online frequently.
A More Complete Online Media Literacy
Recently, some researchers have called for media literacy to include the role that feelings play when online. For example, Middaugh (2019) argues for a media literacy that pays attention to both “emotional and factual elements.” This call is reinforced by the lack of attention to feelings in media literacy education handbooks: Boler (2019), in a review of nine media literacy handbooks, found that of the four that reference emotions at all, only two suggest that users need to be self-aware of the ways their emotions are triggered. While researchers are not making the same plea for social motivations, as previously mentioned (see Brady et al., 2017; Majmundar et al., 2018; Boler & Davis, 2018), people are motivated to fulfill social needs when online. However, from a fact-checking standpoint, there is room for optimism. For example, Fazio (2020) observed that users will pause to consider whether a news headline is factual if reminded to do so, and Pennycook et al. (2021) found that when users are asked to rate the truthfulness of a headline, their focus shifts to accuracy for subsequent headlines.
If the goal is an online media literacy that includes not just evaluating trustworthiness, but also awareness of the feelings and social motivations that may crowd out our need to think critically (see Figure B), a metacognitive approach may be the solution.
Metacognition occurs when we think about our own thinking, and includes interactions among “metacognitive knowledge, experiences, goals and actions” with regard to nearly any cognitive task, including “attention, memory, problem solving, [and] social cognition” (Flavell, 1979). Therefore, a metacognitive approach to online media literacy would involve thinking about the factors that help us become better at discerning the trustworthiness of content we consume and share online, including an awareness of the feelings and social motivations that distract from and/or inform our ability to think critically (see Figure C). For example, while users may want to share accurate information, they may not be aware that the online context focuses their “attentional spotlight” instead on social motivations (Pennycook et al., 2021), that social media platform algorithms serve highly emotive content to encourage us to engage (click, comment, like, share) without thinking (Kozyreva et al., 2020), or that feelings themselves can subconsciously influence our decisions and behaviors (Damasio, 1994, 1999; Reimann & Bechara, 2010; Slovic et al., 2002; Loewenstein et al., 2001). A metacognitive process empowers users to bring these factors to the surface, making them less likely to distract from the fact-checking task. Further, in keeping with the findings of both Fazio (2020) and Pennycook et al. (2021), users are prompted to think about whether they paused to ask if the content they encountered was factual. By repeatedly applying this process to their own online experiences over multiple days, users can tone down the social and emotional distractions and leave some space for fact-checking.
Figure C: Metacognitive Approach to Online Media Literacy
Works Cited
Anderson, M., & Jiang, J. (2018, May 31). Teens, social media and technology 2018. Pew Research Center. https://www.pewresearch.org/internet/2018/05/31/teens-social-media-technology-2018/
Barzilai, S. & Chinn, C.A. (2020) A review of educational responses to the “post-truth” condition: Four lenses on “post-truth” problems, Educational Psychologist, 55:3, 107-119
Boler, M. (2019, November). Digital disinformation and the targeting of affect: New frontiers for critical media education. Research in the Teaching of English, 54, 187–91.
Boler, M., & Davis, E. (2018). The affective politics of the “post-truth” era: Feeling rules and networked subjectivity. Emotion, Space and Society, 27, 75–85. https://doi.org/10.1016/j.emospa.2018.03.002
Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318. https://doi.org/10.1073/pnas.1618923114
Currie Sivek, S. (2018). Both facts and feelings: Emotion and news literacy. Journal of Media Literacy Education, 10(2), 123–138. https://doi.org/10.23860/JMLE-2018-10-2-7
Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of consciousness (1st ed). Harcourt Brace.
Dunlosky, J., & Metcalfe, J. (2009) Metacognition. Sage Publications, Inc.
Fazio, L. K. (2020). Pausing to consider why a headline is true or false can help reduce the sharing of false news. The Harvard Kennedy School (HKS) Misinformation Review, 1(2).
Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906–911.
Frenkel, S. (2020, November 23). How misinformation ‘superspreaders’ seed false election theories. The New York Times. https://www.nytimes.com/2020/11/23/technology/election-misinformation-facebook-twitter.html
Grabe, M. E., & Myrick, J. G. (2016). Informed citizenship in a media-centric way of life: Informed citizenship. Journal of Communication, 66(2), 215–235. https://doi.org/10.1111/jcom.12215
Gyenes, N., & Mina, A. X. (2018, August 30). How misinfodemics spread disease. The Atlantic. https://www.theatlantic.com/technology/archive/2018/08/how-misinfodemics-spread-disease/568921/
Hermida, A. (2020). Post-publication gatekeeping: The interplay of publics, platforms, paraphernalia, and practices in the circulation of news. Journalism & Mass Communication Quarterly, 97(2), 469–491.
Howell, L. (2013). Digital wildfires in a hyperconnected world. Global Risks 2013. World Economic Forum. https://reports.weforum.org/global-risks-2013/risk-case-1/digital-wildfires-in-a-hyperconnected-world/ (accessed June 25, 2021).
Kozyreva, A., Lewandowsky, S., & Hertwig, R. (2020). Citizens versus the internet: Confronting digital challenges with cognitive tools. Psychological Science in the Public Interest, 21(3), 103–156.
LaRose, R., Mastro, D., & Eastin, M. S. (2001). Understanding internet usage: A social-cognitive approach to uses and gratifications. Social Science Computer Review, 19(4), 395–413. https://doi.org/10.1177/089443930101900401
Laybats, C., & Tredinnick, L. (2016). Post truth, information, and emotion. Business Information Review, 33(4), 204–206. https://doi.org/10.1177/0266382116680741
Leung, L. (2013). Generational differences in content generation in social media: The roles of the gratifications sought and of narcissism. Computers in Human Behavior, 29(3), 997–1006. https://doi.org/10.1016/j.chb.2012.12.028
Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127(2), 267–286. https://doi.org/10.1037/0033-2909.127.2.267
Majmundar, A., Allem, J.-P., Boley Cruz, T., & Unger, J. B. (2018). The why we retweet scale. PLOS ONE, 13(10), e0206076. https://doi.org/10.1371/journal.pone.0206076
Martel, C., Pennycook, G., & Rand, D. G. (2020). Reliance on emotion promotes belief in fake news. Cognitive Research: Principles and Implications, 5(1), 47. https://doi.org/10.1186/s41235-020-00252-3
Middaugh, E. (2018). Civic media literacy in a transmedia world: Balancing personal experience, factual accuracy and emotional appeal as media consumers and circulators. Journal of Media Literacy Education, 10(2), 33–52.
Middaugh, E. (2019). More than just facts: Promoting civic media literacy in the era of outrage. Peabody Journal of Education, 94(1), 17–31.
Ng, Y.L., & Zhao, X. (2020). The human alarm system for sensational news, online news headlines, and associated generic digital footprints: A uses and gratifications approach. Communication Research, 47(2), 251–275.
Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855), 590–595.
Reimann, M., & Bechara, A. (2010). The somatic marker framework as a neurological theory of decision-making: Review, conceptual comparisons, and future neuroeconomics research. Journal of Economic Psychology, 31(5), 767–776.
Satariano, A. (2020, August 17). Coronavirus doctors battle another scourge: Misinformation. The New York Times. https://www.nytimes.com/2020/08/17/technology/coronavirus-disinformation-doctors.html
Seitz, A. (2022, April 23) In election misinformation fight, ’2020 changed everything’ Associated Press. https://apnews.com/article/2022-midterm-elections-voting-rights-technology-business-social-media-f5ba340c7a98f6f058fb3afac74a26bb
Singer, J. (2014). User-generated visibility: Secondary gatekeeping in a shared media space. New Media & Society, 16, 55–73.
Slovic, P., Finucane, M., Peters, E., & MacGregor, D. G. (2002). The affect heuristic. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 397–420). Cambridge University Press.