BBC News has just published a piece tackling the fact that Google has been prominently displaying Holocaust-denial content.
Shockingly, they also uncover other examples, such as the one in the screenshot above (apologies for reproducing it), in which Google's Knowledge Graph displays highly offensive content.
In the BBC piece, I argue both that algorithms are not neutral and that Google occupies a position of immense power because of the huge amount of digital content it mediates. As such, Google needs to take responsibility for these results. It can't simply point to the algorithm and say 'well, this wasn't our intent' (an argument that I make more directly in this recent piece: 'Let's make platform capitalism more accountable').
For people interested in the provenance of some of Google’s more questionable search results, please read these recent articles that Heather Ford and I have been working on:
Ford, H., and Graham, M. 2016. Semantic Cities: Coded Geopolitics and the Rise of the Semantic Web. In Code and the City. eds. Kitchin, R., and Perng, S-Y. London: Routledge. 200-214.
Ford, H., and Graham, M. 2016. Provenance, Power, and Place: Linked Data and Opaque Digital Geographies. Environment and Planning D: Society and Space. doi:10.1177/0263775816668857 (pre-publication version here).
Graham, M. 2015. Why Does Google Say Jerusalem is the Capital of Israel? Slate.com. Nov 30, 2015.
See also, my earlier post on this topic: ‘On how Google frames, shapes and distorts how we see the world‘
Note: This post was originally published on Mark Graham's blog. It might have been updated since then in its original location. The post gives the views of the author(s), and not necessarily the position of the Oxford Internet Institute.