Abstract
Teenagers consume news via social media, along with entertainment, sports, and user-generated content (UGC). Politicians and institutions use platforms to promote themselves, but these same platforms are used to spread misinformation. Rapid advancements in Artificial Intelligence (AI) blur the lines between the physical and the digital, making it difficult for young media consumers (often younger than 13) to verify information or, more seriously, to distinguish commercial content from reports on world developments or conflicts. In the context of mobile phones, news literacy is essential for young people to develop criteria so that they do not perceive war as a branding campaign.
Keywords
Artificial Intelligence, News Platforms, Young Media Consumers, Understanding Conflicts

Teenagers get most of their messages, photos, videos and news on their mobile phones. According to a recent survey, teenagers in the US-based younger Gen Z cohort (14–19) consume more news than their adult Gen Z (20–25) counterparts in the form of news alerts and social media feeds, which keep them abreast of social issues and current events and offer immediate, interactive and engaging news experiences (Westcott et al., 2022). In addition to local and global news, their feeds include entertainment, sports, opinions on social issues, and other content from various sources, including user-generated content, which they consider more authentic than mainstream media (Chryssanthopoulou, 2023). The growing power of influencers led the White House to hold a briefing for leading social media content creators on the crisis in Ukraine in 2022, providing authentic and timely information that they could share with their audiences (Palmer, 2023); further, it is no secret that political institutions use paid influencers for campaigns (Wheaton, 2024). In this context, teenagers use their mobile phones to learn about events happening around the world, including wars.
Artificial Intelligence (AI) is employed in many areas of the news industry, for example, newsgathering (gathering information, generating ideas and narrative structures, identifying trends, supporting investigations, tracking issues, extracting information or content); news production (content creation, editing, structuring and formatting for different platforms, creating text, images and video, targeting content to different audiences); and news distribution (personalizing content and format, marketing, engaging audiences, understanding user behaviour, managing subscriptions, etc.) (Beckett, 2022). With the rapid advancements in AI technologies, the boundaries between physical and digital realities are becoming increasingly blurred, making it more challenging for younger media consumers to verify the truth about events in their country and around the world (Hasse et al., 2019). By “younger,” we refer to both tweens and teens, as the minimum age of 13 is frequently disregarded.
News on the mobile
In this landscape, it is important to examine the behaviours and attitudes of young people and adolescents as news consumers. We must consider the ways in which they are exposed to the news (Hendrickx, 2023), the absence of news literacy scaffolding from their environment (school, family, university) (McGrew et al., 2018), and the lack of knowledge and skills to locate verified content from trusted sources (Madden et al., 2017; Swart, 2021), in an environment where generative AI pushes consumers to increase consumption of lower-value products, i.e. misleading content (clicks on fake news) (Sandrini & Somogyi, 2023), and where algorithms limit exposure to opposing news and, consequently, increase polarization (Levy, 2021). Despite easy access to information, a teenager can grow up in a void of socially significant content regarding how society is organized or their opportunities for civic participation. In addition, research on motivated reasoning (Kahneman, 2011; Redlawsk, 2002) suggests that online social attitudes can be influenced by emotional reactions, beliefs and needs, for example, the motivation to “belong to a group”; hence the systematic interaction with “friends” that leads to adopting opinions in “echo chambers” (Clark & Van Slyke, 2010). People tend to share misinformation because they focus on factors other than accuracy; when readers’ attention shifts towards accuracy, the quality of the news they share increases (Pennycook et al., 2021).
Teenagers today strengthen their identity and develop attitudes and beliefs in digital spaces. Participation is especially important for them, and a sense of socio-political empowerment is linked to their self-esteem and well-being. In digital communities, young people adopt and express opinions on private or public matters; construct personal, social and political identities; give meaning to social or political concepts; decide on issues of their communities; express themselves and amplify their voice; participate in advocacy and “activism” practices; learn how to participate in public debate (Cho et al., 2020).
Despite private ownership and privacy issues, social media platforms guide what many people see, share or like and have become the new public commons for youth, reshaping how they form identities, communicate and engage with their communities, and learn, connect and advocate. By offering creative spaces and entertaining curated content, TikTok has drawn a massive younger audience. There, young people find communities with shared interests and a means to amplify their voices on various matters: from climate change to racial justice, from fashion to ideologies, from politics to mental health awareness. A platform’s algorithm can thus shape young people’s understanding of the “world out there.”
In many European countries, young people must make informed civic decisions at an early age, as they are eligible to vote before turning 18 (e.g., in Greece, Belgium, Germany, Malta, and Austria). Our research analyzing voting age and social media (Chryssanthopoulou, 2023) shows that the main concern of young people and teenagers is news reliability, accuracy and truth; they highlight critical thinking and the need for news literacy, emphasizing the pedagogical dimension of journalism. This emerging media audience that votes before finishing school needs to be able to process and evaluate elaborate information and communication strategies.
Consuming war scenes
Social media platforms are causing concern due to the overwhelming volume of AI-driven news and engagement techniques. At the same time, the battleground has changed: digital warfare is intertwined with traditional (tangible) techniques (e.g. physical warfare, military actions), creating confusion about what a “conflict” is and extending the battle beyond the field. This development has influenced our very understanding of conflicts as well as the way we learn about them. A virtual-first world has become the new normal, with the ease and accessibility of online experiences enhancing hybrid user journeys, where digital interactions are prioritized or even preferred over face-to-face ones, blending online (virtual) with offline (physical) experiences.
TikTok became much more popular once the war in Ukraine began, even among news organizations like the BBC; hence the label “the first TikTok War.” However, influencer videos received more engagement and reached a larger audience than journalists’ videos, as first-person narration and self-performance are more likely to go viral than news and explanation videos. Many such videos come from anonymous sources, while numerous others are posted by “wartime influencers” who have become TikTok celebrities; this is easy because the platform’s algorithmic recommendation engine, the “For You” tab, serves content based on users’ interests rather than on accounts’ follower counts. Overall, this platform’s portrayal of the war differs greatly from that of traditional media: it combines tragedy and comedy, campaigns and propaganda, and fact with fiction (Ertuna, 2023).
The use of technology to connect the digital and physical worlds provides users with unique “phygital” interactive experiences; the marketing term describes combining digital and physical interactions to increase customer convenience and satisfaction. Technologies like marketing automation, geolocation, beacons, wearable devices, IoT sensors, and smart mirrors enhance the value of various channels, allowing businesses to predict customer engagement, serve needs, and offer upselling opportunities. Phygital strategies work well because they are instant, connected, and engaging.
Naturally, war consumption has become phygital too. On the one hand, digital warfare is being integrated into traditional approaches, which makes it more difficult to discern what is real and what is not. On the other hand, screens mediate multiple daily activities, including violence: the struggle spreads beyond the territory in which it is taking place, into homes and communities all around the world via the internet. The comfort and accessibility of the internet promote virtual experiences, whether buying products, following an influencer, or watching the latest war updates from Ukraine or the Middle East. We turn to our screens not only to explore beauty products but also to watch the latest war unfold on TikTok. Recontextualized content, gamification, memes, altered videos from conflict zones, and entirely fabricated content are packaged to capture attention, generate views, and drive engagement.
TikTok videos employ various memetic techniques and templates: lip-syncing, duets, or point-of-view; this “gamifies” the user’s experience of watching war reels. The platform can also add affective audiovisual content, transforming users’ performances and experiences (Cervi & Divon, 2023). In this environment, sense-making collapses, and users lose sight of the goal and focus on the medium. The importance of the “why” of the action is overshadowed by the “how” of acting; thus, the goal of “information” is lost and the distinction between reality and falsity is blurred. News becomes distorted, and trust and cohesion are undermined.
From the declaration of war on a foreign country to an invasion of the Capitol, social media provide breaking news, user content and “theories.” Do we really see soldiers dancing in this TikTok video? Are these women really injured, or are they actors? Does our emotional state change when we watch breaking news of a missile strike, and then scroll down to our favourite YouTuber’s new reel? Can young people process what is happening in the physical world when everything looks like a TV show? Do they see a riot as just another branding exercise, with its memes, comments and hashtags by “militarized” media users? Is TikTok real life? The distinction becomes increasingly blurred: the more algorithmic feeds provided by advertising-supported corporations shape our view of reality, the more those feeds become our reality. Who is responsible for leading us into this confluence?
A Wall Street Journal experiment in 2023 showed TikTok’s potential to serve unfiltered, even graphic content to young media consumers; gore content, while sometimes blurred or not shown at all, was often described explicitly and in detail. War meme hashtags can reveal hundreds of amateur creations vying for attention; even after being flagged, they are rarely removed. Many of these videos lack context: they are simply presented, with or without tags, in isolation. What should the spectator make of them in the absence of background or information about the precise event? Photos turned into memes take on a meaning of their own for young audiences; for example, Zelensky dressed as Captain America in Ukrainian colours becomes a fictional hero rather than a real politician.
A child’s reactions to such content, coupled with various “campaigns” encouraging donations to help those in need or hashtags that draw attention to trivial details of operations, can vary widely. Initially, they may feel emotionally moved and engaged, but after encountering hundreds of similar posts, they risk falling into cynical apathy toward global events.
The digital space is today a hotbed of contention, with cyber-attacks, disinformation campaigns, and espionage, giving rise to cognitive warfare: the ability to exploit vulnerabilities of the human brain and hack it using various methods, for example, information manipulation, disinformation, cybernetics, psychology, or social engineering (Hung & Hung, 2022). Different actors have different methods for “brain hacking”: some countries focus on technology and the “weaponization of biotechnologies and neuroscience” (Giordano, 2017; Rickli & Ienca, 2021), while others fold cognitive warfare into informational warfare.
So, how do young people understand war as content? How do they experience their “customer journey” when they scroll down to more videos? How do they respond to TikTok remixing content, recontextualizing images, adding audio, combining self-performance within trends, facilitating affective audio networks and mediating emotional responses?
Physical crises are reimagined online through the lens of platforms and influencers, out of context, by affective publics (Bruns & Burgess, 2015; Marwick & Boyd, 2011; Papacharissi & Oliveira, 2012), particularly young people. Reimagined wars are mediated by both professionalized and amateur content creators, with the former focusing more on self-performance and engagement and the latter on virality and specific aspects of the conflict, facilitating emotionally contagious effects on media consumers. Intriguingly, both amateurs and professionals adopt a person-centred approach to delivering material: they borrow from journalistic practice and influencer industry standards, tailor the presentation to the platform’s emotive style, and use media remixes and person-centred narratives to game the algorithm and build authentic audience relationships. Qualitative research on social media hashtags suggests that influencer accounts provide more information and testimonial content and have a bigger impact than mainstream media (Sidorenko Bautista et al., 2023).
Social media in warfare
TikTok videos change the interpretation of war, shifting between serious journalistic work and user-created parodies. The dynamics of media coverage during crises include the interaction of journalistic methodology, government policies, and public sentiment. The potential to influence public opinion during times of national upheaval gives rise to ethical issues when reporting on conflict and terrorism. The “affective economy” of war is shaped by platform design and its feedback-driven algorithm.
More research is needed into affective audio networks, political communication on TikTok, and topic distributions. TikTok’s impact on young people’s social media experiences will continue to pique the curiosity of communication researchers. We must investigate users’ abilities to conceive, conceptualize, and envisage current warfare in an ever-changing stream of varied content, as well as the influence of “digital infowar.” We must evaluate how user contributions might add to shared understandings of war while inspiring political action and a desire for change. Furthermore, we need to study users’ online conduct as a contribution to the “war feed,” in the sense that it develops narratives and public views about the conflict (Yarchi & Boxman-Shabtai, 2023).
The rise of social media in warfare, driven by large platforms, has transformed how wars are fought and perceived. In conflicts like those in Ukraine and the Middle East, it has swayed public opinion, exposed human rights violations, challenged narratives, and documented combatants’ experiences. However, social media platforms are also being weaponized for disinformation, propaganda, and psychological warfare, leading to claims of unjust treatment and censorship.
MIL to the rescue
Social media is increasingly used in military tactics for geopolitical influence, through hacking, bot wars, and the spread of disinformation, while content moderation is weaponized to silence dissenting voices. Most scholarly studies of media trust on TikTok have focused on professional journalists and news organizations, yet these are just a small subset of the many individuals who feed the platform’s media ecosystem. Citizen journalism plays an important role in creating and sharing war-related content. According to uses and gratifications theory, TikTok influencers can develop deep relationships with their followers, with high potential for audience loyalty and parasocial interaction.
Studies analyzing TikTok videos reveal the presence of far-right extremism on the ByteDance platform, which has not implemented Terms of Service that prevent provocative, antagonizing, harassing, distressing, or violent content, or postings deliberately intended to scare, embarrass, or upset people, or to threaten physical violence (Weimann & Masri, 2023). This issue must be addressed at the political and practical levels through national regulation and education. Furthermore, the use of AI to invent evidence justifying action in these combat zones has become all too widespread and must be addressed: AI can set a hazardous precedent by participating in unjustified aggression or legitimizing an invading force during a conflict.
News literacy (NL) is critical to democratic life and trust in institutions. Under the broader umbrella of media and information literacy (MIL), NL examines how the world and society are mediated and empowers citizens to participate more actively and judiciously in the democratic process and political life (Ashley, 2020).
As our world becomes more polarised, the present war narratives on social media will shape future forms of information warfare. Researchers must build critical approaches to war and media technologies while considering issues in digital literacy and conflict resolution. Partnering with digital platforms, holding open discussions on information warfare, checking online sources, and fostering digital literacy on social media are essential. As global tensions rise and information warfare continues in predictable and malevolent ways, there is a growing need to strengthen genuine MIL and NL policies that root out manipulation and keep the continual conversation focused on ideas of compassion, aid, and humanity.
References
Beckett, C. (2022). New powers, new responsibilities: A global survey of journalism and artificial intelligence. LSE Blogs [online].
Bruns, A., & Burgess, J. (2015). Twitter hashtags from ad hoc to calculated publics. Hashtag publics: The power and politics of discursive networks, 13-28.
Cervi, L., & Divon, T. (2023). Playful activism: Memetic performances of Palestinian resistance in TikTok #Challenges. Social Media + Society, 9(1), 20563051231157607.
Cho, A., Byrne, J., & Pelter, Z. (2020). Digital civic engagement by young people: Rapid analysis. UNICEF Office of Global Insight and Policy.
Chryssanthopoulou, K. (2022). Fake News Deconstructed: Teens and Civic Engagement: Can Tomorrow’s Voters Spontaneously Become News Literate? In The Palgrave Handbook of Media Misinformation (pp. 45-62). Cham: Springer International Publishing.
Chryssanthopoulou, K. (2023). Facts, Opinions, and News. Media Literacy, Equity, and Justice.
Clark, J., & Van Slyke, T. (2010). Beyond the echo chamber: Reshaping politics through networked progressive media. The New Press.
Ertuna, C. (2023). “TikTokisation” of the War. Mapping Lies in the Global Media Sphere.
Freimann, C. (2024). Citizen War Journalism on TikTok: A reception study about young adults’ trust in war content on the example of alternative reporting on the Israel-Gaza conflict.
Gray, J. E. (2021). The geopolitics of “platforms”: The TikTok challenge. Internet Policy Review, 10(2), 1-26.
Hasse, A., Cortesi, S., Lombana-Bermudez, A., & Gasser, U. (2019). Youth and artificial intelligence: Where we stand. Berkman Klein Center Research Publication, (2019-3).
Hendrickx, J. (2023). The rise of social journalism: An explorative case study of a youth-oriented Instagram news account. Journalism Practice, 17(8), 1810-1825.
Hung, T. C., & Hung, T. W. (2022). How China’s cognitive warfare works: A frontline perspective of Taiwan’s anti-disinformation wars. Journal of Global Security Studies, 7(4), ogac016.
Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
Karalis, M. (2024). Fake leads, defamation and destabilization: how online disinformation continues to impact Russia’s invasion of Ukraine. Intelligence and National Security, 39(3), 515-524.
Levy, R. E. (2021). Social media, news consumption, and polarization: Evidence from a field experiment. American Economic Review, 111(3), 831-870.
Madden, M., Lenhart, A., & Fontaine, C. (2017). How youth navigate the news landscape. Knight Foundation.
Marwick, A. E., & Boyd, D. (2011). I tweet honestly, I tweet passionately: Twitter users, context collapse, and the imagined audience. New Media & Society, 13(1), 114-133.
McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 46(2), 165-193.
Palmer, A. W. (2023). How TikTok became a diplomatic crisis. International New York Times.
Papacharissi, Z., & Oliveira, M. F. (2012). Affective news and networked publics: The rhythms of news storytelling on #Egypt. Journal of Communication, 62(2), 266-282.
Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855), 590-595.
Primig, F., Szabó, H. D., & Lacasa, P. (2023). Remixing war: An analysis of the reimagination of the Russian–Ukraine war on TikTok. Frontiers in Political Science, 5, 1085149.
Redlawsk, D. P. (2002). Hot cognition or cool consideration? Testing the effects of motivated reasoning on political decision making. Journal of Politics, 64(4), 1021-1044.
Rickli, J. M., & Ienca, M. (2021). The security and military implications of neurotechnology and artificial intelligence. Clinical Neurotechnology Meets Artificial Intelligence: Philosophical, Ethical, Legal and Social Implications, 197-214.
Sandrini, L., & Somogyi, R. (2023). Generative AI and deceptive news consumption. Economics Letters, 232, 111317.
Sidorenko Bautista, P., Alonso López, N., & Paíno Ambrosio, A. (2023). TikTok as a new paradigm for information in the Ukrainian War. A study from the West of the initial coverage of the conflict through this platform. Estudios sobre el mensaje periodístico, 29(3).
Weimann, G., & Masri, N. (2023). Research note: Spreading hate on TikTok. Studies in Conflict & Terrorism, 46(5), 752-765.
Westcott, K., Arbanas, J., Arkenberg, C., Auxier, B., Louks, J., & Downs, K. (2022). Digital media trends: Toward the metaverse. Deloitte.
Wheaton, S. (2024). Paid influencers in the Parliament election. Politico, May 23, 2024.
Yarchi, M., & Boxman-Shabtai, L. (2023). The Image War Moves to TikTok: Evidence from the May 2021 Round of the Israeli-Palestinian Conflict. Digital Journalism, 1-21.