A Letter from Guest Editor, Melda N. Yildiz, EdD
As artificial intelligence (AI) systems and machine learning algorithms are used to automate decision-making processes in sectors from education to the economy, they inevitably affect democracy and social justice. Cathy O’Neil (2016) reveals real-life stories of algorithmic injustice in her book, “Weapons of Math Destruction,” highlighting troubling examples in which the reality of algorithmic decision-making falls short of our expectations. O’Neil shows how big data is used in domains ranging from employment to the justice system, and how it can threaten democracy and lead to inequity (O’Neil, 2016).
According to the Greenlining Institute (2021), “Algorithmic bias occurs when an algorithmic decision creates unfair outcomes that unjustifiably and arbitrarily privilege certain groups over others.” Drawing on big data, AI can predict our life expectancy from our ZIP codes and economic status. The report outlines how algorithms are used to decide who gets “access to affordable credit, jobs, education, government resources, health care and investment” (Greenlining Institute, 2021).
Algorithms harness data to influence decisions ranging from prison sentencing to which customers banks offer the best interest rates and who gets into which college. For example, “automated risk assessments used by U.S. judges to determine bail and sentencing limits can generate incorrect conclusions” that result in targeted groups receiving longer prison sentences and paying higher bail amounts (Turner Lee, Resnick, & Barton, 2019; Mayer-Schönberger & Cukier, 2013; Sumpter, 2018).
The authors in this journal bring perspectives from their research studies and professional experiences. To better prepare our next generation for the intended or unintended consequences of AI, we need to start a dialogue among the public and private sectors to reengineer and rethink the future of AI and machine learning and to eradicate social injustice and historical inequalities. To eliminate bias in algorithms, all stakeholders, from programmers to elected officials, need to collaborate to identify, mitigate, and remedy its impacts on our lives.
Mbadiwe (2018) examined ProPublica’s statistics on racial disparities and “machine bias” (Angwin, Larson, Mattu, & Kirchner, 2016) and described a “fairness paradox”: “Don’t blame the algorithm — as long as there are racial disparities in the justice system, sentencing software can never be entirely fair” (Mbadiwe, 2018). Only by first addressing algorithmic bias in areas like education, employment, and housing may we have a chance to close the racial wealth gap and health disparities (Greenlining Institute, 2021).
In these journal articles, authors question how our online searches and purchase histories are being used and how we leave traces of data when we pass through a toll booth or use our credit cards. In addition to domestic injustices, Sharkey (2020) points out the importance of international humanitarian law, especially for mending injustices and protecting human rights in conflict zones, to prevent “biased algorithms being used for the technologies of violence” (Sharkey, 2020).
In this journal, the authors highlight the importance of being proactive about the ethical use of AI and machine learning in automated decision-making tools. Media literacy education provides a framework for examining the unrepresentative and incomplete data, misinformation, and disparities that reflect historical inequalities. Most importantly, we media educators collaborate to educate our students and communities to take action and eradicate social injustice one data point at a time.
References
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias. ProPublica. Retrieved: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Greenlining Institute. (2021). Algorithmic bias explained. Retrieved: https://greenlining.org/wp-content/uploads/2021/04/Greenlining-Institute-Algorithmic-Bias-Explained-Report-Feb-2021.pdf
Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work and think. Boston, MA: Houghton Mifflin Harcourt.
Mbadiwe, T. (2018). Algorithmic injustice. Retrieved: https://www.thenewatlantis.com/publications/executive-summary-of-algorithmic-injustice
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. London, England: Penguin Books.
Sharkey, N. (2020). Algorithmic injustice: Mend it or end it. Retrieved: https://www.boell.de/en/2020/02/13/algorithmic-injustice-mend-it-or-end-it
Sumpter, D. (2018). Outnumbered: From Facebook and Google to fake news and filter-bubbles — the algorithms that control our lives. London, England: Bloomsbury Publishing.
Turner Lee, N., Resnick, P., & Barton, G. (2019). Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms. Retrieved: https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/
Additional Resources
Thomas, C., Jr. (2019, July 22). Tech bias and algorithmic discrimination. TEDxRochester. Retrieved: https://www.youtube.com/watch?v=N9XaLNfExgM
Coded Bias. Retrieved: https://www.codedbias.com/
Algorithms rule us all – VPRO documentary. Retrieved: https://www.youtube.com/watch?v=NFF_wj5jmiQ
Forbes. (2020). Biased Algorithms Learn From Biased Data: 3 Kinds of Biases Found In AI Datasets. Retrieved: https://www.forbes.com/sites/cognitiveworld/2020/02/07/biased-algorithms/?sh=5ba335c876fc
Bloomberg. (2020). What Are Algorithms and Are They Biased Against Me? Retrieved: https://www.bloomberg.com/news/articles/2020-12-11/what-are-algorithms-and-are-they-biased-against-me-quicktake
Chicago Booth Center for Applied Artificial Intelligence. (2021). Algorithmic bias playbook. Retrieved: https://www.chicagobooth.edu/-/media/project/chicago-booth/centers/caai/docs/algorithmic-bias-playbook-june-2021.pdf