Responsible News Recommender Systems (ReNewRS)
Key Idea
Do online news recommender systems promote social polarization or even radicalization? This project investigates how algorithmic news selection shapes public opinion.
As news is increasingly consumed online, what people read is shaped to a large extent by recommender systems that curate a selection of articles tailored to the individual user. Whether individuals thereby become trapped in so-called “filter bubbles”, in which they are exposed only to information that reinforces their existing opinions while conflicting information is filtered out, has been the subject of intense controversy. This raises the question of whether news recommender systems polarize or even radicalize their users. In a series of experimental studies, the project examines whether news recommender systems create filter bubbles and how these affect users’ opinion formation.
To gain insights beyond a single discipline, the project is carried out as an interdisciplinary cooperation between media and communication studies, data science, and business informatics. Expertise from media and communication studies forms the foundation for the analysis of user effects, while the technical implementation draws on data and computer science. By employing actually running recommender systems, the project takes a pioneering role in the field of online media effects research.
To trace potential polarizing effects, we will select news topics that allow participants to take a clear position on a divisive issue and potentially end up in what could be called a “filter bubble”. Based on these topics, we will draw article samples from a range of news sources with different political slants, which will then be used to run different versions of a prototypical news recommender system. Comparing the outcomes of different recommendation algorithms will enable us to examine whether and how news recommender systems should be configured to mitigate potential polarizing effects and to enable more balanced opinion formation. From these empirical insights, guidelines for responsible news recommender systems will be derived.
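To make this kind of comparison concrete, the following minimal sketch contrasts a purely similarity-based recommender with a slant-aware re-ranking on a hypothetical article pool. It is only an illustration of the type of algorithmic configurations being compared, not the actual ReNewRS prototype; all article data, slant scores, and the `diversity_weight` parameter are assumptions made for this example.

```python
# Illustrative sketch (not the project's system): compare a purely
# similarity-driven recommender with a slant-aware re-ranking on toy data.
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    keywords: set   # bag of topical keywords
    slant: float    # hypothetical political slant in [-1.0 (left) .. 1.0 (right)]

# Hypothetical article pool on a single divisive topic.
POOL = [
    Article("Op-ed strongly in favour",  {"policy", "support", "benefits"},  0.8),
    Article("Report leaning in favour",  {"policy", "support", "economy"},   0.4),
    Article("Another supportive report", {"policy", "support", "jobs"},      0.6),
    Article("Supportive commentary",     {"policy", "benefits", "jobs"},     0.7),
    Article("Balanced explainer",        {"policy", "economy", "critics"},   0.0),
    Article("Report leaning against",    {"policy", "critics", "costs"},    -0.4),
    Article("Op-ed strongly against",    {"policy", "costs", "protest"},    -0.8),
]

def relevance(article: Article, history: list) -> float:
    """Jaccard overlap between article keywords and the user's reading history."""
    profile = set().union(*(a.keywords for a in history))
    union = article.keywords | profile
    return len(article.keywords & profile) / len(union) if union else 0.0

def recommend(history: list, k: int = 2, diversity_weight: float = 0.0) -> list:
    """Rank unseen articles by relevance; optionally reward articles whose
    slant differs from the user's reading history (a simple diversity-aware variant)."""
    mean_slant = sum(a.slant for a in history) / len(history)
    def score(a: Article) -> float:
        return relevance(a, history) + diversity_weight * abs(a.slant - mean_slant)
    unseen = [a for a in POOL if a not in history]
    return sorted(unseen, key=score, reverse=True)[:k]

if __name__ == "__main__":
    # A user who has so far only read articles in favour of the policy.
    history = [POOL[0], POOL[1]]
    for w in (0.0, 0.5):
        recs = recommend(history, diversity_weight=w)
        avg = sum(a.slant for a in recs) / len(recs)
        print(f"diversity_weight={w}: {[a.title for a in recs]} (mean slant {avg:+.2f})")
```

In this toy setup, the purely similarity-based ranking recommends further articles with the same slant as the reading history, whereas the slant-aware variant surfaces the balanced and counter-attitudinal pieces; the project's experiments compare configurations of this kind on real article samples and real recommender implementations.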
Funding
The project is funded by the Baden-Württemberg Stiftung (BW-Stiftung) under its “Responsible Artificial Intelligence” funding scheme and runs from August 2020 to November 2022.
Project Partners and Personnel
- University of Mannheim, Chair of Data Science
- University of Mannheim, Institute for Media and Communication Studies
- FIZ Karlsruhe, Information Service Engineering
- KIT Karlsruhe, Information & Market Engineering
Key Outcomes
Publications:
- Andreea Iana, Mehwish Alam, Heiko Paulheim: A Survey on Knowledge-Aware News Recommender Systems. Accepted in: Semantic Web Journal.
- Mehwish Alam, Andreea Iana, Alexander Grote, Katharina Ludwig, Philipp Müller, Heiko Paulheim: Towards Analyzing the Bias of News Recommender Systems Using Sentiment and Stance Detection. WWW (Companion Volume) 2022: 448–457.
- Katharina Ludwig, Alexander Grote, Andreea Iana, Mehwish Alam, Heiko Paulheim, Harald Sack, Christof Weinhardt, Philipp Müller: Divided by the Algorithm? The (Limited) Effects of Content- and Sentiment-Based News Recommendation on Affective, Ideological, and Perceived Polarization. Accepted in: Social Science Computer Review.
- Katharina Ludwig, Philipp Müller: Does social media use promote political mass polarization? A structured literature review. In: B. Krämer & P. Müller (eds.), Questions of Communicative Change and Continuity. In Memory of Wolfram Peiser (pp. 118–166). Baden-Baden: Nomos.
Open Data:
- GeNeG: German News Knowledge Graph. https://doi.org/10.5281/zenodo.6039372
- NeMig – A Bilingual News Collection and Knowledge Graph about Migration. https://doi.org/10.5281/zenodo.7442424