Feedback Loops: Algorithmic Authority, Emergent Biases, and Implications for Information Literacy

Ian O'Hara


Algorithms have become increasingly ubiquitous in modern, technologically driven society. Algorithmic tools embedded in search systems to “enhance” the user experience during information seeking raise problematic epistemological concerns. These algorithms are developed and injected into search tools by human beings who, consciously or not, impart biases into the information retrieval process. These search tools have become our primary arbiters of knowledge and have been granted relatively unmitigated sovereignty over our perceptions of reality and truth. This article raises broader awareness of how the bias embedded within these algorithmic systems structures users’ perception and knowledge of the world, preserving traditional power hierarchies and the marginalization of specific groups of people, and examines the implications of algorithmic search systems for information literacy instruction from a critical pedagogical perspective.




Copyright (c) 2021 Ian O'Hara

This work is licensed under a Creative Commons Attribution 4.0 International License.