Auditing for Transparency in Content Personalization Systems

Published on
15 Oct 2016
Written by
Philip Howard

Do we have a right to transparency when we use content personalization systems? Building on prior work on discrimination detection in data mining, I propose algorithm auditing as a compatible ethical duty for providers of content personalization systems to maintain the transparency of political discourse. I explore barriers to auditing that reveal the practical limits of service providers' ethical duties. Content personalization systems can function opaquely and resist auditing. However, the belief that highly complex algorithms, such as bots using machine learning, are incomprehensible to human users should not be an excuse to surrender high-quality political discourse. Auditing is recommended as a way to map and redress algorithmic political exclusion in practice, although the opacity of algorithmic decision making poses a significant challenge to its implementation.

Download here.

Mittelstadt, B. (2016). Automation, Algorithms, and Politics| Auditing for Transparency in Content Personalization Systems. International Journal of Communication, 10, 12.
