Responsible News Recommender Systems (ReNewRS)

Key Idea

Do online news recommender systems promote social polarization or even radicalization? This project investigates the influence of algorithmic news selection on shaping public opinion.

As information is increasingly consumed online, news consumption is shaped to a large extent by recommender systems that curate a selection of news tailored to the individual user. Whether individuals thereby become trapped in so-called “filter bubbles”, in which they only encounter information that reinforces their existing opinions while conflicting information is filtered out, has been at the center of controversy. This raises the question of whether news recommender systems contribute to polarizing or even radicalizing their users. The project examines the emergence of filter bubbles through news recommender systems and their effects on users’ opinion formation in a series of experimental studies.

To gain insights beyond a single discipline, the project is carried out as an interdisciplinary cooperation between media and communication studies, data science, and business informatics. Substantive knowledge from media and communication studies forms the foundation for the analysis of user effects, while the technical implementation requires expertise from data and computer science. By using actually running recommender systems, the project holds a pioneering position in the field of online media effects research.

To be able to trace potential polarizing effects, we will select news topics that allow participants to take a clear position on a divisive issue and end up in what could be called a “filter bubble”. Based on these topics, we will draw article samples from a range of news sources with different political slants, which will then be used to run different versions of a prototypical news recommender system. Comparing the outcomes of different recommendation algorithms will enable us to examine whether and how news recommender systems should be configured to mitigate potential polarizing effects and to enable a more balanced opinion formation. From these empirical insights, guidelines for responsible news recommender systems will be derived.
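The kind of comparison described above can be illustrated with a minimal sketch. The toy articles, the slant scores, the similarity rule, and the diversity-aware re-ranking below are all illustrative assumptions, not the project's actual data or implementation; the sketch only shows how two recommender configurations can be contrasted on the political spread of what they recommend.

```python
# Hypothetical toy setup: articles as (id, political slant in [-1, 1]).
# All values and scoring rules are illustrative assumptions.
ARTICLES = [
    ("a1", -0.8),
    ("a2", -0.4),
    ("a3", 0.0),
    ("a4", 0.5),
    ("a5", 0.9),
]


def similarity_recommender(user_slant, articles, k=3):
    """Recommend the k articles closest to the user's estimated slant
    (a caricature of a purely personalizing recommender)."""
    return sorted(articles, key=lambda a: abs(a[1] - user_slant))[:k]


def diversified_recommender(user_slant, articles, k=3, lam=2.0):
    """Greedy re-ranking that trades relevance (closeness to the user)
    against distance to already selected articles (a simple MMR-style rule)."""
    chosen, pool = [], list(articles)
    while pool and len(chosen) < k:
        def score(a):
            relevance = -abs(a[1] - user_slant)
            diversity = min((abs(a[1] - c[1]) for c in chosen), default=0.0)
            return relevance + lam * diversity
        best = max(pool, key=score)
        chosen.append(best)
        pool.remove(best)
    return chosen


def slant_spread(recs):
    """Range of political slant covered by a recommendation list:
    one crude proxy for how 'bubbled' the selection is."""
    slants = [a[1] for a in recs]
    return max(slants) - min(slants)
```

For a strongly left-leaning toy user (slant -0.7), the similarity recommender picks only nearby articles, while the diversified variant covers a wider slant range; comparing such outcome measures across algorithm versions is the basic logic of the experimental contrast.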


The project is funded by the German Baden-Württemberg Stiftung (BW-Stiftung) under the funding scheme “Responsible Artificial Intelligence”, with a duration from August 2020 to November 2022.

Project Partners and Personnel


Key Outcomes


Open Data: