It is now widely acknowledged that Google and Facebook shape the way we see the world. Matthew Reidsma, web services librarian at Grand Valley State University in Michigan, would like us to be equally wary of the "discovery systems" used in libraries, such as Ex Libris' Summon and Primo, OCLC's WorldCat Discovery and EBSCO's EDS. His new book, Masked by Trust: Bias in Library Discovery (Litwin Books), provides many sobering examples of the results produced by these "seemingly objective search tools".
A search for "mental illness", for example, led straight to a Wikipedia article on the controversial 1961 book by the psychiatrist Thomas Szasz, The Myth of Mental Illness, which might suggest to users that "the topic they are studying is nothing more than a myth". Searches for "9/11" offered as their top result a book arguing that "the official story can't possibly be true", while another search for "September 11" made no reference to the 9/11 attacks but flagged up the date as "the first day of the Coptic calendar"! A request for information about "women in prison" produced results about "women in prison films", an exploitation genre that sheds little light on the real experiences of female prisoners. Furthermore, the service offered as a related topic "sex in films", a subject that shares only the single word "in" with the original search.
Reidsma cites the notorious case when Google Photos automatically labelled images of two black friends as "gorillas", its algorithm somehow "dredg[ing] up hundreds of years of institutionalized racism". Some of the discovery systems he examined also seemed to have prejudices built into them. One search for "immigrants are" returned three results: a book title, a search on whether "immigrants are good for the economy" and the straightforward "immigrants are bad". The autosuggest for "Asians are" came up solely with "Asians are good at math". Perhaps oddest of all, a question about the Catholic practice of "lay investiture" offered as related topics "Fuck" and "Gay".
In trying to explain why our algorithms sometimes "spit out hate and bias", Reidsma points to the lack of diversity among engineers and "a culture that glorifies efficiency above all else". Even where companies agree to put things right, they treat any problems as "bugs" requiring technical fixes rather than as touching on deeper ethical issues. When a search for "stress in the workplace" returned a Wikipedia article on "women in the workplace", for example, a manager at Ex Libris explained to the author that this was the result of a phrase-matching system (and that, therefore, typing "heroes in the workplace" would also have directed users to "women in the workplace"). What this failed to address was the possible impact on young women, in "a culture that devalues their contributions in the workplace both culturally and monetarily", of their "see[ing] working women equated with stress in the workplace in a supposedly objective, neutral tool that everyone in the University tells them will give them the most objective, scholarly information".