Abstract
Drawing on examples of Bitcoin and climate disinformation, this article demonstrates why Big Tech algorithms have a significant environmental impact and how media literacy educators can respond. Big Tech algorithms reinforce the economic models of surveillance and carbon capitalism, which are dependent on two forms of extractivism: data harvesting and resource extraction. To encourage a holistic environmental analysis of algorithms, ecomedia literacy’s four zone approach enables an investigation from the perspectives of ecoculture, political ecology, ecomateriality, and lifeworld. For media literacy educators, the challenge is to develop curricula and methods that address these different standpoints, which can include critical media literacy, design justice, civic media literacies, news and misinformation literacies, and ethical algorithm audits.
Keywords
AI, Surveillance Capitalism, Algorithms, Ecomedia Literacy
Introduction
Algorithms at the core of Big Tech are epistemological because they codify the kinds of information and media content that count as knowledge; they also normalize surveillance and carbon capitalism as the global economic status quo (Brevini & Murdock, 2017; Couldry & Mejias, 2019; Zuboff, 2019). “Algorithms provide a structure that frames—and constrains—how we express ourselves. They are a way of seeing and acting in the world…” (Ridley & Pawlick-Potts, 2021, p. 2). For example, the epistemology of frictionless online shopping supported by recommendation algorithms and their vast data networks increases demand for fast and convenient consumerism, normalizing a fossil fuel culture of consumption, toxic material supply chains, and waste as an acceptable cost of doing business (Corbett, 2021). The physical infrastructure of data processing requires massive server farms, which cannot exist without a steady supply of fossil fuel energy, natural resource extraction, and e-waste. This supply chain is predicated on designating disposable populations (those deemed unworthy of health and safety protections) and sacrifice zones (ecosystems that can be destroyed for the sake of the global economy).
Given the important role algorithms have in defining our relationship with the environment (while also constituting their own environment), it’s worth asking, are algorithms good for the planet?[1] The methodology of ecomedia literacy can help us answer this. Ecomedia literacy explores how all media have an environmental mind/footprint. This follows from the basic premise from ecomedia studies that media are of and about the environment, meaning that a holistic analysis of media requires examining both their material and symbolic characteristics (see López et al., 2023). As formulated by López (2021), ecomedia literacy should approach any study of media from four different zones: ecoculture, political ecology, ecomateriality, and lifeworld.[2] Ecoculture refers to the shared beliefs that are conveyed through multimodal language (symbols and discourses). Political ecology is the study of how economic and power structures design systems and produce impacts on the environment, including the production of ideologies and material goods. Ecomateriality is the realm of the environmental and material conditions of media, be it the physical properties of a medium (i.e., paper in books or magazines) or of gadgets (i.e., chemicals, glass, plastic, metals, etc.). Lifeworld is the ecology of perception and sensemaking, and how media affect our sense of place, time, and space. All these standpoints are applied to the object of analysis, which can be a media text, gadget, platform, or hyperobject (a dispersed phenomenon like the internet).
Starting with the example of Bitcoin for framing how information and communication technology (ICT) impacts the environment, this chapter then pivots and offers a roadmap to both the content and configuration of climate disinformation to help guide media literacy educators to understand how Big Tech algorithms affect the environment. We offer multiple approaches that include the ability to debunk and decipher automated climate disinformation, recognize and respond to the material infrastructure of disinformation, and engage in eco-citizenship to reform and change the system so that it affords climate solutions.
Bitcoin: A Primer on Algorithmic Environmental Impacts
To give one extreme example of this kind of analysis applied to algorithms, consider Bitcoin, a cryptocurrency based on blockchain digital ledgers. From the political ecology perspective, its design rewards increased energy consumption. A miner's chance of "minting" new Bitcoin is proportional to how much energy is consumed (mostly fossil fuels) solving the complex mathematical puzzles necessary to validate and record new transactions (and collect transaction fees). Bitcoins are awarded to whoever first performs this validation function, so there is a race to deploy bigger and faster server farms to perform as many transactions as possible in the shortest amount of time. In addition, the emergence of "crypto colonialists"–tech-savvy groups setting up shop in economically disadvantaged regions around the world–further exacerbates existing global power hierarchies.
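The energy dynamic described here follows from Bitcoin's proof-of-work design, which can be illustrated with a toy sketch (this is not the actual Bitcoin protocol, which applies double SHA-256 to block headers against a numeric target): miners brute-force a "nonce" until a hash meets a difficulty condition, so each increment of difficulty multiplies the expected number of guesses, and hence the energy burned.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash of (data + nonce) starts with
    `difficulty` zero hex digits -- a toy version of Bitcoin's puzzle."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed guess is wasted computation (and energy)

# Raising difficulty by one hex digit multiplies the expected number of
# guesses -- and thus the electricity consumed -- by 16.
nonce = mine("example transactions", 4)
print(f"puzzle solved with nonce {nonce}")
```

Because whoever solves the puzzle first wins the reward, rational miners keep adding hardware until electricity costs eat the margin, which is why energy use scales with Bitcoin's price rather than with the number of transactions.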
The ecocultural standpoint recognizes how, like other cryptocurrencies and blockchain technologies such as NFTs, Bitcoin is promoted as something revolutionary, innovative, and disruptive. As part of the emerging Web 3.0, Bitcoin is hyped, along with other blockchain technologies, as a magical solution for economic woes and a route to financial independence. In the Global South it is touted as an alternative to the governing structures of finance and banking, which are rightfully seen as oppressive and corrupt. Bitcoin can create a sense of ownership that bypasses national currencies and tax authorities. In general, cryptocurrencies are marketed as empowering, inevitable, and a solution to social problems, reflecting the rhetoric of modernity that values technological progress and individualism as signs of civilizational development. But far from revolutionary, cryptocurrencies are controlled by a small group of coin holders (called "whales") and are completely unregulated, so they are subject to extreme market fluctuations and scams. From the lifeworld standpoint, the negative personal and social impacts of Bitcoin are akin to those of gambling and addiction.
Viewed from the zone of ecomateriality, Bitcoin's technical infrastructure carries a huge environmental burden. Blockchain technologies require massive server power and constant equipment upgrades, since efficiency and speed make crypto-mining cheaper. Bitcoin produces 22 to 22.9 million metric tons of CO2 emissions each year—equivalent to the CO2 emissions from the energy use of 2.6 to 2.7 million homes for one year. The immense amount of electricity consumed annually by Bitcoin (estimated to be as much as Argentina uses in a year) could help push global heating past the 1.5-degree Celsius goal set by the Paris climate accords. Even though the increased use of electric vehicles has prevented some 50 million tonnes of CO2 emissions, Bitcoin alone cancels out roughly half of that gain. To fulfill sustainability aims and to reduce stress on its grid infrastructure, in 2021 China expelled Bitcoin miners (who previously accounted for 60% of the world's total). Other environmental impacts include the e-waste generated by Bitcoin machines (11.5 kilotons a year) and the water needed to cool servers (Cho, 2021). As Bitcoin grows and its price increases, it demands more work from servers to solve algorithmic puzzles across the system. Proof-of-stake algorithms promoted by Ethereum can supposedly reduce the energy consumption of cryptocurrencies by 99.95%, but scholars note that energy waste in Bitcoin is a design feature, not a bug. Indeed, research shows that energy-efficient code is unlikely to be adopted voluntarily, and it will take global governance and regulation to rein in Bitcoin's environmental impacts (Howson, 2022).
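As a rough sanity check, the relationship between these figures can be laid out explicitly. This back-of-the-envelope sketch uses only the tonnage values reported by Cho (2021) as cited in this section:

```python
# Back-of-the-envelope check of the figures cited from Cho (2021).
btc_emissions_mt = (22.0, 22.9)   # Bitcoin CO2 emissions, megatonnes per year
ev_savings_mt = 50.0              # CO2 avoided so far by electric vehicles, megatonnes

for emissions in btc_emissions_mt:
    ratio = emissions / ev_savings_mt
    print(f"Bitcoin emits {ratio:.0%} of the CO2 that EVs have avoided")
# -> roughly 44-46%, i.e., Bitcoin alone cancels out about half of the
#    climate benefit attributed to electric vehicles
```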
The cryptocurrency culture of get-rich-quick schemes and faith in technological progress drives a system that demands high energy use, gadget consumption, and e-waste. This system is designed and driven by values that are environmentally and socially destructive: though promoted as a social good, it evades social responsibility while facilitating illegal activities (such as drug or arms trafficking), stressing power grids, and driving ecological damage. Bitcoin is an extreme example of algorithmic impacts on the environment, but it enables us to see how we can approach algorithms holistically with a method like ecomedia literacy.
Algorithms of Climate Disinformation
This article now shifts its primary focus to another kind of environmental impact of algorithms: climate disinformation. Although different from Bitcoin in significant ways, the same ecomedia literacy framing applies. The fossil fuel industry and its PR apparatus game social media algorithms by driving climate disinformation to produce a landscape of climate inaction and denial. The frictionless nature of social media means that most climate disinformation goes unchecked or is unverified. Much of it gets distributed as fake climate news and propaganda via AdTech (the automated system of online advertising), targeted ads, troll bots (automated user accounts), and re-distributed content from a closed rightwing influencer ecosystem. Algorithms also direct users to extremist media producers and amplify content that generates outrage and conspiracy theories, which often aligns with climate denial and fossil fuel interests. The goal is to inhibit climate action by confusing and influencing the public and policy makers by reducing climate literacy, disrupting a coherent narrative to solve the crisis, increasing political polarization, deflecting blame and responsibility, and promoting climate silence (López, 2022a). Facilitating this system is the underlying infrastructure of surveillance capitalism’s data colonialism and AdTech, which have a massive carbon footprint (Brevini, 2022; Couldry & Mejias, 2019; Cucchietti et al., 2022).
The central problem is that society depends on trust and a coherent narrative to solve the climate crisis. The prevalence of disinformation has a chilling effect that creates a spiral of climate silence, leading to low trust in journalism, science, and academia.
Ecoculture: Delay and Denial
The increased importance of algorithms, which determine the kind of information and news people are exposed to when they use social media or perform searches, has created an information environment where serious, professional climate journalism competes with concerted disinformation campaigns that algorithms amplify. Climate disinformation is associated with "skepticism, contrarianism, and denial," which is often ascribed to the rightwing, whereas "climate alarmism" tends to be associated with leftwing discourses (Treen et al., 2020). Whereas past science denial movements were catalyzed solely by experts in the field, Al-Rawi et al. (2021) assert that climate disinformation is now an integral tool through which skepticism, denial, and contrarianism spread. While misinformation is widely understood as information that is simply false or incorrect, disinformation is "explicitly false or misleading information" (Benkler et al., 2018, p. 32) or "accurate information deliberately presented in such a way as to be misleading" (Treen et al., 2020, para. 12).
The more colloquial term, "fake news," carries Trump-era connotations, often applied to any concept or event deemed unfavorable, regardless of its truth or falsity. The abuse and wide-ranging application of the term has made it practically meaningless, muddying the discussion of real and substantial problems with news media and propaganda, particularly in how the climate crisis is communicated and covered by journalists (Hertsgaard, 2021). In the context of examining the environmental impacts of algorithms, "fake climate news" is used here to describe deliberate climate disinformation and propaganda designed to reinforce rightwing ideology about the market economy and to create confusion about climate science in order to prevent industry regulation (López & Share, 2020). It is a form of network propaganda, which refers to "the ways in which the architecture of a media ecosystem makes it more or less susceptible to disseminating these kinds of manipulations and lies" (Benkler et al., 2018, p. 24). Algorithms are central to this system of climate disinformation.
Students can learn how to identify the main tactics of fake climate news, which are a mix of denial and delay. Big Carbon—defined here as a network of extractors, producers, refiners, and distributors of coal, gas, and petroleum—promotes fake climate news to achieve several objectives. First and foremost, these companies must maintain their social license to operate through reputation management. To deflect criticism of their business practices, they maintain a "denial space" around climate science to forestall present and future regulation of their industry. They drive disinformation to convince the public that climate action is pointless and to distract from questions of blame and responsibility. The net result is reduced climate literacy, political polarization, the silencing of scientists and scholars, and the disruption of a coherent narrative of climate action. A politics of disorientation and misperception makes it difficult for many to tell the difference between real and false information (López, 2022a).
The main tactic is to position global heating as a theory/belief, but not fact, by creating the impression that there is debate over the science. But given this increasingly untenable argument—refuted by a global scientific consensus and rising public alarm—they are now touting adaptation and the so-called benefits of a warming world (reflected in phrases such as, “The Earth is greening” or “CO2 is the gas of life”). They also confuse weather and climate, so that in winter it can be claimed that there is no global warming. They reframe environmental arguments around jobs and economic development and argue for energy “independence” (i.e., domestic fossil fuel production over dependence on foreign sources). They also frame climate change advocates as irresponsible extremists.
Though climate science denial remains a strong belief within the ideologically aligned rightwing mediasphere, it is losing legitimacy in broader, global policy debates. As part of an evolving response, there are emerging discourses of solutions denial and delay. Solutions denial goes something like: human-produced greenhouse gasses are not causing global warming; climate impacts are not bad; climate solutions won’t work; clean energy is unreliable and will lead to shortages. They fear-monger about potential disruption caused by climate action and argue that policies like the Green New Deal are too expensive and will hurt lower-income populations, leading to “energy poverty.” They assert climate science or scientists are unreliable by touting conspiracy theories that climate change is a hoax to get funding. Another conspiracy story they promote is that global elites are using climate action as a trojan horse for “green tyranny” (mimicking the great reset rhetoric of Q-Anon conspiracies). Discourses of delay are grounded in climate “inactivism” by promoting individual change, not system change, and pushing non-transformative solutions that don’t disrupt the status quo. They redirect responsibility to consumers, migrants, and lower-income regions of the world. Cynically, they promote the idea that collective climate action is not possible and that only the fossil fuel industry is a responsible actor (savior) working on solutions and technological fixes (Lamb et al., 2020; Shenker, 2021).
Lifeworld: Epistemic Crisis
Algorithms amplify disinformation, catalyzing polarization, public confusion, and political inaction. Combined with our inherent cognitive biases, this leads to what has been deemed an "epistemic crisis": in a post-truth world, conspiracy theories triumph over scientific consensus. Eli Pariser (2012) coined the term "filter bubble," which refers to the ways algorithms curate content based on past searches and isolate individuals from ideologically dissimilar content, leading to homophily, polarization, and echo chambers. In surrounding themselves with others who share their underlying belief systems, ideologies, and social norms, individuals confirm their own biases to the point that climate skepticism, contrarianism, and denial become inextricable facets of their political identity.
Due to the framing effect, people remember disinformation once it spreads, making it very difficult to debunk (Lakoff, 2004). As the stolen election lie spread by Trump demonstrates, the illusory truth effect means that the more someone is exposed to a falsehood, the more likely they are to believe it (Resnick, 2017). This is reinforced by anchoring, the tendency to rely on the first piece of information offered, and in-group bias, which leads people to favor the opinions and beliefs of members of their own group. Unfortunately, prior knowledge does not make people immune.
Political Ecology: The Infrastructure of Disinformation Algorithms
The roots of disinformation can be traced back to money. Corporate and philanthropic actors (Donors Capital Fund, ExxonMobil Foundation, Koch Affiliated Foundations, Vanguard Charitable Endowment Program, Mercer Family, etc.) finance conservative foundations (International Climate Science Coalition, Global Climate Coalition, Information Council for the Environment) and industry trade groups (Manufacturers' Accountability Project, Alliance of Automobile Manufacturers, ALEC, American Petroleum Institute), which in turn fund the production of climate disinformation by political and religious organizations, astroturf organizations (fake environmentalists), libertarian and conservative think tanks, and contrarian scientists. Disinformation is then disseminated through a rightwing influencer echo chamber consisting of conservative and rightwing media (Infowars, Breitbart, True Pundit, Gateway Pundit, One America News, talk radio, etc.), far-right internet subcultures (4chan/8chan, dark web), skeptical bloggers, and politicians (Trump, etc.). It finally infiltrates public discourse through more mainstream channels, such as Tucker Carlson and Laura Ingraham on Fox News, who amplify climate denial, skepticism, and contrarianism to wider audiences, causing an even broader circulation of climate disinformation. Overall, this trolling and disinformation apparatus is more likely to drive the climate narrative than journalism (Treen et al., 2020).
Algorithms are strategically deployed throughout this process. First is the way social media amplify hate and controversy to generate clicks and keep people on their platforms, a kind of "outrage" economy of affect and attention (Benkler et al., 2018). Marwick and Lewis (2017), who extensively researched how internet subcultures influence media, explain how these subcultures are expert at manipulating algorithms. Through "attention hacking," far-right groups "increase the visibility of their ideas through the strategic use of social media, memes, and bots—as well as by targeting journalists, bloggers, and influencers to help spread content" (p. 1). This exploits the fact that the "media's dependence on social media, analytics and metrics, sensationalism, novelty over newsworthiness, and clickbait makes them vulnerable to such media manipulation" (p. 1). As a result, "Media manipulation may contribute to decreased trust of mainstream media, increased misinformation, and further radicalization (red-pilling)" (p. 1). Overall, this attention hacking strategy feeds into the broader Big Carbon goal of perception hacking: creating doubt about its role in the climate crisis.
The spread of disinformation is often described in epidemiological terms; those who are susceptible to disinformation through homophily-based social connections can in turn infect other people. This so-called epidemic takes place primarily on platforms like Facebook and Twitter through feedback loops driven by algorithms. Augmenting astroturfers, opinion leaders, contrarian scientists, and bloggers, bots (automated user accounts) commonly tamper with algorithms to encourage a wider spread of disinformation (Treen et al., 2020). This can include troll army and swarm attacks orchestrated by fake news factories and foreign troll farms against climate scientists, the creation of botnets and sockpuppets, and clickbait memetic warfare (McKew, 2018).
As if these feedback loops were not dangerous enough, the weak infrastructure set in place to discourage such polarization is abysmally unreliable. Facebook engages in what it calls "downranking," suppressing content on topics like nudity, war, and other sensationalist material in the hopes of reducing polarization. But according to Gizmodo, Facebook has a laissez-faire attitude about climate denial (Kahn, 2021). In one example, during the February 2021 power outages in Texas, 99% of climate disinformation went unchecked (Friends of the Earth, 2021). Overall, ten publishers have been shown to be responsible for 69% of digital climate change denial content, with 92% of the most popular articles carrying no label about climate crisis misinformation (Center for Countering Digital Hate, 2021). According to a 2021 report, there were between 818,000 and 1.36 million views of climate misinformation on Facebook every day, but only 3.6% were fact checked (Stop Funding Heat, 2021). In 2021, the American Petroleum Institute inundated Facebook with ads targeting the budget reconciliation debate's climate initiatives, sending users to its "Energy Citizens" page and thanking politicians like Senator Joe Manchin for being a "Champion of American Made Energy" (InfluenceMap, 2021).
In the case of Twitter, there are several examples of algorithm manipulation. After CNN's climate change town hall in 2019, there was a surge of trollbot activity on Twitter, originating from sites known to be unreliable or to have repeatedly violated Twitter's terms of service (Lavelle, 2019). In the period leading up to the US withdrawal from the Paris climate agreement, a quarter of all climate-related tweets were produced by bots, the majority of which supported President Trump's decision and spread disinformation about "fake science"; such bot activity can be amplified through Twitter's promoted tweets option, which allows accounts to boost their posts (Milman, 2020). An analysis of 300,000 tweets between January 2016 and May 2021, which included commonly used denier hashtags such as #climatechangehoax, #climatechangeisfake, and #climatecult, confirmed an evolving strategy from outright science denial to attacking solutions, creating fear, and waging culture war misinformation.
AdTech, the ecosystem of advertising technology integral to surveillance capitalism, is the backbone of online advertising, yet it is little known because it is opaque and mostly unregulated. AdTech is the means by which advertising targets users and distributes content, and it is one of the primary means of financing disinformation. This form of algorithmically based "surveillance advertising" was a $763.2 billion industry in 2021, automating where ads are placed through microtargeting and generating revenue for websites that feature disinformation. In 2020, 97.9% of Facebook's and 80% of Google's global revenue was generated from advertising, and by 2022 Facebook, Google, and Amazon were expected to account for 80% to 90% of the market outside China (Cucchietti et al., 2022). AdTech companies such as Google Ads offer categories like gambling or adult content that advertisers can choose to exclude from their inventories, but there are no categories for disinformation sites, and AdTech companies don't want to decide what is or isn't disinformation (Kelly, 2021). Check My Ads Institute is a watchdog group that monitors and fights the spread of disinformation in AdTech; it advocates for defunding disinformation by targeting national advertisers to make sure they are not monetizing disinformation sites. As we will see below, AdTech is also a driver of greenhouse gas emissions. Those who profit from climate disinformation include not only the social media companies that sell the advertisements but also Big Carbon and associated industries (such as automobile manufacturers) that fund these disinformation campaigns.
Ecomateriality: Environmental Impacts of Climate Denial Algorithms
Algorithms that amplify climate disinformation are an expression of surveillance capitalism's extractive economy and supply chain. Fossil fuel extraction is part of a materials economy that includes the other natural resources needed for the cloud and our gadgets, such as rare earth minerals and metals. Thus, it is necessary to think more broadly about how surveillance capitalism impacts the environment and people, especially through the lens of water. Surveillance capitalism's sensors, machine intelligence capabilities, and platforms depend on a massive material infrastructure, often referred to as the "cloud." Energy is used to power and cool the cloud's servers, so much so that the cloud has a greater carbon footprint than the entire global airline industry (Dillon, 2022). A single data center can consume the equivalent electricity of 50,000 homes, and this excludes the energy used to manufacture machines (referred to as emergy, or embedded energy), the energy consumed by users' devices, and the trafficking of content across the internet. Fossil fuels, the primary source of climate emissions, account for 64% of the cloud's current energy mix. Overall, IT's energy footprint is 7% of global power consumption. In addition, land clearing for data centers destroys wildlife habitats and reduces the capacity of land to absorb CO2 (Brevini, 2022).
Water consumption, contamination, and waste are fundamental to digital media technologies. In China, for example, device manufacturing and recycling degrade local ecosystems; half of China's water is now unusable because of pollution. Mining rare earth metals also contaminates groundwater through processes like acid mine drainage. When it comes to e-waste, rainwater dissolves chemicals and hazardous toxic metals such as chromium, lead, mercury, iron, zinc, copper, and cadmium, which end up in deep aquifers and in surface waters such as ponds, rivers, and lakes. Contaminated irrigation water and toxic dust impact soil, with toxins finding their way into plants, animals, and humans as they concentrate in the food chain. Microplastic fibers are now detected in snow and rain across the globe.
Advertising algorithms drive increased consumerism, which impacts the climate, but they also produce emissions directly. A 2016 study estimated the global carbon emissions of online advertising at 60 megatons. To put this in perspective, if Netflix adapted its platform into an ad-based service, much like Spotify, global video ads would increase by 10%, leading to an extra 23.76 million tons of emitted carbon dioxide (Pärssinen et al., 2018). AdTech has a large carbon footprint, with energy subsidized by users' individual devices running cookies in the background. An analysis of AdTech's carbon footprint found that:
Despite being created and stored in the user’s device, tracking technologies are mostly undetectable to the average user, which makes extracting large amounts of user data a relatively easy task. Moreover, despite their “invisibility” and relatively small size, tracking technologies are responsible for triggering millions of algorithmic processes that ultimately facilitate trading in data on a global scale, nurturing an ever-growing ecosystem that densely relies not just on exploiting user data but also on sucking out the power of the user’s device to actually function. (Carbolytics, n.d., para. 4)
On average, there are 97 trillion browser-based cookies per month, producing 11,442 metric tons of CO2 emissions monthly. This excludes emissions from apps that track user data.[3]
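The scale of these numbers is easier to grasp per cookie. The following is a minimal sketch using only the Carbolytics figures quoted above:

```python
# Scale check on the cookie-tracking figures cited above (taken as given).
cookies_per_month = 97e12        # browser-based cookies, monthly average
co2_tonnes_per_month = 11_442    # metric tons of CO2 per month

grams_per_cookie = co2_tonnes_per_month * 1e6 / cookies_per_month
tonnes_per_year = co2_tonnes_per_month * 12

print(f"{grams_per_cookie * 1000:.3f} mg CO2 per cookie")  # ~0.118 mg
print(f"{tonnes_per_year:,} t CO2 per year")               # 137,304 t
```

Each individual cookie is a fraction of a milligram of CO2, which is precisely why the footprint stays invisible to users; it is the trillions of monthly invocations that add up to six figures of tonnage per year.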
When it comes to water, climate disinformation is part of a feedback loop: it leads to climate inaction, which means ongoing and rising CO2 emissions. While increased global heating threatens water supplies, so does the expansion of surveillance capitalism’s supply chain. Big Tech’s partnership with the fossil fuel industry to make extraction more efficient through AI development exacerbates environmental impacts. This whole system is made possible by designating disposable populations and sacrifice zones.
Media Literacy, Algorithms, and the Planet
This article demonstrates why the conversation about AI and algorithms emerging in media literacy must include the climate crisis. To reiterate, what's at stake is what the sixth assessment report of the Intergovernmental Panel on Climate Change (IPCC) summarizes as the many dangers that impact all of us: no region in the world will be untouched by the climate emergency; half the global population is highly vulnerable; major ecosystems are losing their capacity to absorb CO2; and global emissions must peak within three years before being cut drastically. Of interest to media literacy, this is the first IPCC report to cite misinformation as a major hurdle to climate action: "Addressing these risks have been made more urgent by delays due to misinformation about climate science that has sowed uncertainty, and impeded recognition of risk (high confidence)"; and "Despite scientific certainty of the anthropogenic influence on climate change, misinformation and politicization of climate change science has created polarization in public and policy domains in North America, particularly in the US, limiting climate action (high confidence)" (quoted in López, 2022c).
As we have seen in the cases of Bitcoin and climate disinformation, algorithms have a significant environmental dimension. Ecomedia literacy's four zone approach enables the examination of algorithms from the perspectives of ecoculture, political ecology, ecomateriality, and lifeworld. For media literacy educators, the challenge is to develop curricula and methods that address these different standpoints. Educational and reform approaches emerging from the study of climate disinformation include inoculating against misinformation; responding to mis/disinformation with facts and correct information; early detection of malicious accounts; and the use of ranking and selection mechanisms (Treen et al., 2020). However, we should not rely fully on "responsibilization," which puts the onus of the climate disinformation problem on individuals. The political ecology of ICTs driving climate disinformation—surveillance and carbon capitalism—has to be engaged. Kuntsman and Rattle (2019) outline the need for critical voices within the spaces where decisions about algorithms and advertising take place. Additionally, they encourage bringing the discussion of climate and energy into the center of digital media and communication studies.
Examples of climate disinformation on Facebook and Twitter like those discussed in this article could be explored through ethical algorithm audits performed by students. Ethical algorithm audits are defined as “assessments of the algorithm’s negative impact on the rights and interests of stakeholders, with a corresponding identification of situations and/or features of the algorithm that give rise to these negative impacts” (Brown et al., 2021, p. 2). Put into the context of the climate crisis, this would entail an expanded definition of stakeholder interests and rights, especially when considering extractivism and its impact on disposable populations and sacrifice zones. This would include the e-waste workers in Ghana whose labor is excluded from how we conceive the digital economy, yet is certainly part of what constitutes “digital labor” (Iheka, 2021).
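As one concrete classroom exercise, an ethical algorithm audit can begin with a simple quantitative metric. The sketch below is hypothetical (the `Post` records and field names are invented for illustration); it computes the share of posts coded as climate misinformation that carry a platform fact-check label, the kind of coverage statistic reported by watchdog groups cited earlier. A real student audit would gather posts through a platform's research API or manual coding.

```python
# A minimal sketch of one audit metric: what share of posts coded as
# climate misinformation carry a platform fact-check label.
# The sample data below is invented purely for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    is_misinfo: bool   # coded by the student audit team
    has_label: bool    # whether the platform applied a fact-check label

def label_coverage(posts: list[Post]) -> float:
    """Fraction of misinformation posts that received a label."""
    flagged = [p for p in posts if p.is_misinfo]
    if not flagged:
        return 0.0
    return sum(p.has_label for p in flagged) / len(flagged)

sample = [
    Post("CO2 is the gas of life", True, False),
    Post("Climate change is a hoax", True, True),
    Post("IPCC releases AR6 report", False, False),
    Post("Wind turbines froze Texas", True, False),
]
print(f"{label_coverage(sample):.0%} of misinformation posts labeled")  # 33%
```

Comparing such a coverage rate against a platform's stated moderation policy gives students a concrete way to document the gap between policy and practice, which is the core move of an ethical algorithm audit.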
Considerable resources in media literacy have been devoted to understanding advertising, but the opacity of AdTech’s mechanisms of surveillance capitalism and data colonialism represents a challenge for educators. We need to draw attention to their carbon emissions, but also to how AdTech subsidizes climate disinformation and promotes climate-destroying consumerism. The examination of surveillance capitalism’s political ecology requires methods like critical media literacy, which help identify economic and power structures and reveal how they produce and disseminate fake climate news. The hope is that educators engaged in news literacy, propaganda, fake news, and mis/disinformation will see the value of working specifically on fake climate news to center the climate crisis as an important theme in their work.
Jocelyn Miller et al. (2021) have made an important case for combining media and science literacy to tackle climate and COVID-19 disinformation. The ecomediasystem model developed by López (2021, 2022b) can assist in addressing climate disinformation from a systems perspective by examining the structure of the media and tech industries. Methods of eco-citizenship that build on Paul Mihailidis’ model of civic media literacy (Gordon & Mihailidis, 2016; Mihailidis, 2019) should expand to include efforts to respond to the climate crisis. The design justice methodology (Costanza-Chock, 2020) can be used to help students envision green tech that doesn’t harm marginalized communities.
The climate crisis is not fake news. It is bad news that needs to be addressed. The growth of climate disinformation, and the confusion, polarization, and inaction that accompany it, threatens our civilizational project. In his book Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation, Andrew Marantz (2019) writes, “We like to assume that the arc of history will bend inexorably toward justice, but this is wishful thinking” (p. 4). The environmental damage of surveillance and carbon capitalism is under-theorized in media literacy scholarship, so as educators begin responding to the important role of algorithms, it’s essential that the climate crisis be centered in the discussion. To answer this article’s central question, it is now clear that Big Tech algorithms are not good for the planet. Algorithms are not neutral but designed. While AI is touted as a solution to many of our pressing environmental problems, Big Tech’s partnership with Big Carbon to accelerate and make extractive technologies more efficient is a step in the wrong direction (Brevini, 2022). The social and environmental problems that stem from our global system of capitalism cannot be solved by the same ecomodernist thinking that generated them in the first place. Given our predicament, ecomedia literacy and design justice will be essential for identifying and proposing solutions.
References
Al-Rawi, A., O’Keefe, D., Kane, O., & Bizimana, A.-J. (2021). Twitter’s fake news discourses around climate change and global warming. Frontiers in Communication, 6, 729818. https://doi.org/10.3389/fcomm.2021.729818
Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.
Brevini, B. (2022). Is AI good for the planet? Polity Press.
Brevini, B., & Murdock, G. (Eds.). (2017). Carbon capitalism and communication: Confronting climate crisis. Palgrave Macmillan.
Brown, S., Davidovic, J., & Hasan, A. (2021). The algorithm audit: Scoring the algorithms that score us. Big Data & Society, 8(1), 205395172098386. https://doi.org/10.1177/2053951720983865
Carbolytics. (n.d.). Retrieved June 5, 2022, from https://carbolytics.org/
Center for Countering Digital Hate. (2021). The toxic ten: How ten fringe publishers fuel 69% of digital climate change denial. Center for Countering Digital Hate.
Cho, R. (2021, September 20). Bitcoin’s impacts on climate and the environment. State of the Planet. https://news.climate.columbia.edu/2021/09/20/bitcoins-impacts-on-climate-and-the-environment/
Corbett, J. B. (2021). Communicating the climate crisis: New directions for facing what lies ahead. Lexington Books.
Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. The MIT Press.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
Cucchietti, F., Moll, J., Esteban, M., Reyes, P., & García Calatrava, C. (2022, February 16). Carbolytics: An analysis of the carbon costs of online tracking. Carbolytics.
Dillon, G. (2022, January). Predictions 2022: The environmental impact of digital advertising. ExchangeWire. https://www.exchangewire.com/blog/2022/01/21/predictions-2022-the-environmental-impact-of-digital-advertising/
Friends of the Earth. (2021). Truth and lies: 4 days of Texas-Sized disinformation. Friends of the Earth. https://foe.org/news/facebook-study-climate-disinformation/
Gordon, E., & Mihailidis, P. (Eds.). (2016). Civic media: Technology, design, practice. The MIT Press. https://doi.org/10.7551/mitpress/9970.001.0001
Hertsgaard, M. (2021, September 2). Why won’t US TV news say ‘climate change’? The Guardian. https://www.theguardian.com/environment/2021/sep/02/us-media-hurricane-ida-climate-change
Howson, P. (2022, March 31). Bitcoin: Greenpeace says a code change could slash cryptocurrency energy use – here’s why it’s not so simple. The Conversation. http://theconversation.com/bitcoin-greenpeace-says-a-code-change-could-slash-cryptocurrency-energy-use-heres-why-its-not-so-simple-180264
Iheka, C. N. (2021). African ecomedia: Network forms, planetary politics. Duke University Press.
InfluenceMap. (2021, September 30). Part of the American Petroleum Institute’s Facebook strategy to undermine climate initiatives in the budget reconciliation package involves its ‘Energy Citizens’ page. It is running hundreds of ads targeting individual members of Congress. Here are some examples https://t.co/28i54bowLZ [Tweet]. @InfluenceMap. https://twitter.com/InfluenceMap/status/1443560418958614528
Kahn, B. (2021, October 26). The climate denial is coming from inside Facebook’s house. Gizmodo. https://gizmodo.com/the-climate-denial-is-coming-from-inside-facebooks-hous-1847939802
Kelly, M. J. (2021, March 24). How two women are taking on the digital ad industry one brand at a time. The Mozilla Blog. https://blog.mozilla.org/en/internet-culture/interviews/nandini-jammi-claire-atkin-check-my-ads/
Kuntsman, A., & Rattle, I. (2019). Towards a paradigmatic shift in sustainability studies: A systematic review of peer reviewed literature and future agenda setting to consider environmental (un)sustainability of digital communication. Environmental Communication, 13(5), 567–581. https://doi.org/10.1080/17524032.2019.1596144
Lakoff, G. (2004). Don’t think of an elephant!: Know your values and frame the debate. Chelsea Green.
Lamb, W. F., Mattioli, G., Levi, S., Roberts, J. T., Capstick, S., Creutzig, F., Minx, J. C., Müller-Hansen, F., Culhane, T., & Steinberger, J. K. (2020). Discourses of climate delay. Global Sustainability, 3, e17. https://doi.org/10.1017/sus.2020.13
Lavelle, M. (2019, September 16). “Trollbots” swarm Twitter with attacks on climate science ahead of UN summit. Inside Climate News. https://insideclimatenews.org/news/16092019/trollbot-twitter-climate-change-attacks-disinformation-campaign-mann-mckenna-greta-targeted/
López, A. (2021). Ecomedia literacy: Integrating ecology into media education. Routledge.
López, A. (2022a). Gaslighting: Fake climate news and Big Carbon’s network of denial. In J. McDougall & K. Fowler-Watt (Eds.), Palgrave handbook of media misinformation. Palgrave MacMillan.
López, A. (2022b). Weak, resilient, or collapsing news ecosystems? Three scenarios for the future of environmental news. In R. Anderson, N. Higdon, & S. Macek (Eds.), Censorship and the global crackdown. Peter Lang Publishing.
López, A. (2022c, April 12). Just look up!: The Sixth IPCC Report, polar temperature spikes, and the slap seen around the world. International Council for Media Literacy. https://www.ic4ml.org/post/just-look-up-the-sixth-ipcc-report-polar-temperature-spikes-and-the-slap-seen-around-the-world
López, A., Ivakhiv, A., Rust, S., Chu, K., Chang, A., & Tola, M. (Eds.). (2023). Routledge handbook of ecomedia studies. Routledge.
López, A., & Share, J. (2020). Fake climate news: How denying climate change is the ultimate in fake news. Journal of Sustainability Education, 23. http://www.susted.com/wordpress/content/blog-post-fake-climate-news-how-denying-climate-change-is-the-ultimate-in-fake-news_2020_04/
Marantz, A. (2019). Antisocial: Online extremists, techno-utopians, and the hijacking of the American conversation. Viking.
Marwick, A., & Lewis, R. (2017). Media manipulation and disinformation online. Data & Society Research Institute.
McKew, M. (2018, October 3). Brett Kavanaugh and the Information terrorists trying to reshape America. Wired. https://www.wired.com/story/information-terrorists-trying-to-reshape-america/
Mihailidis, P. (2019). Civic media literacies: Re-imagining human connection in an age of digital abundance. Routledge.
Miller, J., Rost, L., Bryant, C., Embry, R., Iqbal, S., Lannoye-Hall, C., & Olson, M. (2021). Media literacy in the age of COVID and climate change. The Science Teacher, 88(6). https://www.nsta.org/science-teacher/science-teacher-julyaugust-2021-0/media-literacy-age-covid-and-climate-change
Milman, O. (2020, February 21). Revealed: Quarter of all tweets about climate crisis produced by bots. The Guardian. https://www.theguardian.com/technology/2020/feb/21/climate-tweets-twitter-bots-analysis
Pariser, E. (2012). The filter bubble: How the new personalized web is changing what we read and how we think. Penguin Books.
Pärssinen, M., Kotila, M., Cuevas, R., Phansalkar, A., & Manner, J. (2018). Environmental impact assessment of online advertising. Environmental Impact Assessment Review, 73, 177–200. https://doi.org/10.1016/j.eiar.2018.08.004
Resnick, B. (2017, October 5). The science behind why fake news is so hard to wipe out. Vox. https://www.vox.com/science-and-health/2017/10/5/16410912/illusory-truth-fake-news-las-vegas-google-facebook
Ridley, M., & Pawlick-Potts, D. (2021). Algorithmic Literacy and the role for libraries. Information Technology and Libraries, 40(2). https://doi.org/10.6017/ital.v40i2.12963
Shenker, J. (2021, November 11). Meet the ‘inactivists’, tangling up the climate crisis in culture wars. The Guardian. https://www.theguardian.com/environment/2021/nov/11/inactivists-tangling-up-the-climate-crisis-in-culture-wars-manston-airport-kent
Stop Funding Heat. (2021). #InDenial – Facebook’s growing friendship with climate misinformation. Stop Funding Heat.
Treen, K. M. d’I., Williams, H. T. P., & O’Neill, S. J. (2020). Online misinformation about climate change. WIREs Climate Change, 11(5). https://doi.org/10.1002/wcc.665
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
[1] This is a riff on Brevini’s (2022) important book, Is AI Good for the Planet?
[2] These terms can be simplified to culture, political economy, materiality, and lifeworld.
[3] You can see this visualized in the Carbolytics project by artist Joana Moll (https://carbolytics.org/).