International Council for Media Literacy

Bridging Academia to Action

The Human Algorithmic Question: A Social Justice Imperative

September 1, 2022, by Melda N. Yildiz


A Letter from the Guest Editor, Melda N. Yildiz, EdD

As artificial intelligence (AI) systems and machine learning algorithms are increasingly used to automate decision-making, from education to the economic sector, their impact on democracy and social justice is inevitable. In her book Weapons of Math Destruction, Cathy O'Neil (2016) reveals real-life stories of algorithmic injustice, highlighting troubling examples in which the reality of algorithmic decision-making falls short of our expectations. O'Neil shows how big data is used in domains ranging from employment to the justice system, and how it can threaten democracy and lead to inequity (O'Neil, 2016).

According to a Greenlining Institute report (2021), "Algorithmic bias occurs when an algorithmic decision creates unfair outcomes that unjustifiably and arbitrarily privilege certain groups over others." Drawing on big data, AI can predict our life expectancy from our zip codes and economic status. The report outlines how algorithms are used to decide who gets "access to affordable credit, jobs, education, government resources, health care and investment" (Greenlining Institute, 2021).

Algorithms harness data to influence decisions ranging from prison sentencing to which customers banks offer the best interest rates and who gets into which college. For example, "automated risk assessments used by U.S. judges to determine bail and sentencing limits can generate incorrect conclusions" that result in targeted groups receiving longer prison sentences and paying higher bail amounts (Turner Lee, Resnick, & Barton, 2019; Mayer-Schonberger & Cukier, 2013; Sumpter, 2018).

The authors in this journal bring perspectives from their research studies and professional experiences. To better protect the next generation from the intended or unintended consequences of AI, we need to start a dialogue between the public and private sectors to rethink and reengineer the future of AI and machine learning so that they do not perpetuate social injustice and historical inequality. To eliminate bias in algorithms, all stakeholders, from programmers to elected officials, must collaborate to identify, mitigate, and remedy its impacts on our lives.

In 2018, Mbadiwe examined ProPublica's statistics on racial disparities, the "fairness paradox," and "machine bias" (Angwin, Larson, Mattu, & Kirchner, 2016): "Don't blame the algorithm — as long as there are racial disparities in the justice system, sentencing software can never be entirely fair" (Mbadiwe, 2018). Only if we first address algorithmic bias in areas like education, employment, and housing do we have a chance to close the racial wealth gap and health disparities (Greenlining Institute, 2021).

In these journal articles, the authors question how our online searches and purchase histories are being used, and how we leave a trail of data every time we pass through a toll booth or use a credit card. Beyond injustices at home, Sharkey (2020) points to the importance of international humanitarian law, especially for mending injustices and upholding human rights in conflict zones, to prevent "biased algorithms being used for the technologies of violence" (Sharkey, 2020).

In this journal, the authors highlight the importance of being proactive about the ethical use of AI and machine learning in automated decision-making tools. Media literacy education provides a framework for examining the unrepresentative and incomplete data, misinformation, and disparities that reflect historical inequalities. Most importantly, we media educators collaborate to educate our students and communities to take action to eradicate social injustice, one data point at a time.

References

Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

Greenlining Institute. (2021). Algorithmic bias explained. Retrieved from https://greenlining.org/wp-content/uploads/2021/04/Greenlining-Institute-Algorithmic-Bias-Explained-Report-Feb-2021.pdf

Mayer-Schonberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work and think. Boston, MA: Houghton Mifflin Harcourt.

Mbadiwe, T. (2018). Algorithmic injustice. The New Atlantis. Retrieved from https://www.thenewatlantis.com/publications/executive-summary-of-algorithmic-injustice

O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. London, England: Penguin Books.

Sharkey, N. (2020). Algorithmic injustice: Mend it or end it. Heinrich-Böll-Stiftung. Retrieved from https://www.boell.de/en/2020/02/13/algorithmic-injustice-mend-it-or-end-it

Sumpter, D. (2018). Outnumbered: From Facebook and Google to fake news and filter-bubbles — the algorithms that control our lives. London, England: Bloomsbury Publishing.

Turner Lee, N., Resnick, P., & Barton, G. (2019). Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms. Brookings Institution. Retrieved from https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/

Additional Resources

Thomas, C., Jr. (2019, July 22). Tech bias and algorithmic discrimination [Video]. TEDxRochester. Retrieved from https://www.youtube.com/watch?v=N9XaLNfExgM

Coded Bias [Documentary film]. Retrieved from https://www.codedbias.com/

Algorithms rule us all [VPRO documentary]. Retrieved from https://www.youtube.com/watch?v=NFF_wj5jmiQ

Forbes. (2020). Biased algorithms learn from biased data: 3 kinds biases found in AI datasets. Retrieved from https://www.forbes.com/sites/cognitiveworld/2020/02/07/biased-algorithms/?sh=5ba335c876fc

Bloomberg. (2020). What are algorithms and are they biased against me? Retrieved from https://www.bloomberg.com/news/articles/2020-12-11/what-are-algorithms-and-are-they-biased-against-me-quicktake

Chicago Booth Center for Applied Artificial Intelligence. (2021). Algorithmic bias playbook. Retrieved from https://www.chicagobooth.edu/-/media/project/chicago-booth/centers/caai/docs/algorithmic-bias-playbook-june-2021.pdf

Melda N. Yildiz, Global Scholar

Melda N. Yildiz is a global scholar, teacher educator, instructional designer, and master gardener. She served as a Fulbright Scholar in Turkmenistan (2009) and Azerbaijan (2016), and as a Fulbright Specialist in Kenya (2022), teaching and conducting research on integrating media education into P-16 classrooms. Yildiz has authored, published, and presented on topics including STEM education, media and information literacy, instructional technology, and multicultural and global education. She received her Ed.D. from the University of Massachusetts Amherst in Math, Science, and Instructional Technology and an M.S. from Southern Connecticut State University in Instructional Technology. She majored in Teaching English as a Foreign Language at Bogazici University in Turkey.

Published in The Journal of Media Literacy, Human AI Issue (Editorials). Tags: Media Literacy Education, Social Justice, Algorithmic Bias, Machine Learning

