Are filter bubbles really to blame for social and political polarisation?
Many media professionals and news consumers fear that algorithmic personalisation ends up trapping readers in so-called filter bubbles. However, our review of the related research suggests that these concerns may be exaggerated.
Since the internet activist and publisher Eli Pariser popularised the concept of the digital filter bubble in 2011, many people have grown suspicious of the increasing influence of algorithms.
Although algorithmic curation has contributed significantly to the success of YouTube, Facebook, Amazon, Twitter and other platforms, voices including the former US president Barack Obama and the German President Frank-Walter Steinmeier have expressed concern over the negative effects this technology can have on societies.
The common fear is that recommendation engines serve users ever more of the same kind of content in continuous feeds, creating ‘echo chambers’ that amplify the already powerful confirmation bias and block out other perspectives that could challenge users’ opinions or at least put them into context.
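To make the feared mechanism concrete, the toy simulation below (purely illustrative; the topics, weights and click behaviour are invented) shows how a recommender that optimises only for similarity to past clicks can gradually narrow a user’s topic exposure.

```python
# Toy simulation of the feared feedback loop: a recommender that only
# rewards similarity to past clicks gradually narrows what a user sees.
# Illustrative only; topics, weights and behaviour are made up.
import random
from collections import Counter

TOPICS = ["politics", "sports", "culture", "science", "economy"]

def recommend(click_history, n=10, personalisation_weight=0.8):
    """Pick n articles, biased towards the topics the user clicked before."""
    if not click_history:
        return random.choices(TOPICS, k=n)
    counts = Counter(click_history)
    # Blend the user's historical topic profile with a uniform distribution.
    weights = [
        personalisation_weight * counts[t] / len(click_history)
        + (1 - personalisation_weight) / len(TOPICS)
        for t in TOPICS
    ]
    return random.choices(TOPICS, weights=weights, k=n)

clicks = ["politics"]  # the user starts with a single political story
for day in range(30):
    feed = recommend(clicks)
    clicks.append(random.choice(feed))  # one click per day from the biased feed

print(Counter(clicks))  # after a month, politics tends to dominate the history
```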
The research conducted during the CPN project, in the form of user surveys and expert interviews, showed that many news professionals and readers are likewise concerned about personalisation, suspicious of its potential to create ideological filter bubbles.
The echo chamber thesis seems largely founded on the inherent tribalism of social media. Social networks not only let us stay connected with old classmates and far-away relatives, but also create communities of like-minded people, for better or worse. Moreover, social media companies are still struggling to respond to the spread of false information, extremist content and outright hate speech on their platforms.
Segregation vs. viewpoint diversity
Given the widespread concerns, surprisingly few scientific studies on filter bubbles are available, but the research that does exist draws a far more nuanced picture of the issue than many might imagine.
Many fear that audiences are dividing into entirely separate news media diets along lines of political allegiance, with the United States and its infamous blue and red news feeds often cited as an example. Yet recent research shows that only a small portion of the American population is trapped in ideological echo chambers created by partisan media, with some studies putting the figure as low as 8%.
In other Western countries, media consumption is even less segregated. In Germany, the different political figures and their supporters predominantly consume and share content from the same few mainstream sources. Even in the pre-Brexit UK, only a rather small number of people were at risk of being caught up in informational filter bubbles.
While many of today’s social and political tendencies affecting online discourse are alarming, social media sites can still act as spaces in which people encounter highly diverse information sources and are exposed to viewpoints they disagree with, rather than remaining trapped in the cosy ideological and cultural bubbles of their real-world social interactions. Age also seems to be a factor: older populations show the highest levels of political polarisation despite the low level of social media use in that age group.
Even members of extremist groups do not exist in hermetically sealed informational bubbles; on the contrary, they acknowledge that other opinions exist and can take a certain pride in ignoring or contesting mainstream beliefs.
Product mix effect
Beyond social media, a 2018 study questions the supposedly polarising effects of algorithmic filtering in the Google News feed: the news search results within the sample group were relatively similar, even though certain news publications had a significantly greater presence than others.
As for news applications, other research has shown that users of personalised news apps view a higher number of sources and news categories, with no change in the amount of partisan content they report.
One important factor is serendipity, also known as the ‘product mix effect’, which keeps a consumer interested in the offered content while allowing the algorithm to evolve. First identified in retail, the product mix effect may influence our media diet just as it influences our shopping behaviour.
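As a rough illustration of this idea, the sketch below (a hypothetical helper, not the implementation of any real recommender) reserves a fixed share of each feed for items from outside the user’s usual profile.

```python
# A minimal sketch of the 'product mix' idea: reserve a fixed share of each
# feed for items outside the user's personalised picks, so the feed stays
# engaging while the algorithm keeps learning about new interests.
# Function and parameter names are illustrative, not from any real system.
import random

def mix_feed(personalised_items, catalogue, feed_size=10, serendipity_share=0.2):
    """Fill most of the feed from personalised picks, the rest from the wider catalogue."""
    n_serendipity = int(feed_size * serendipity_share)
    n_personalised = feed_size - n_serendipity
    feed = list(personalised_items[:n_personalised])
    # Draw the remaining slots from items the personalisation would not have surfaced.
    outside = [item for item in catalogue if item not in personalised_items]
    feed += random.sample(outside, min(n_serendipity, len(outside)))
    random.shuffle(feed)
    return feed
```

In a production system the ‘outside’ items would of course be ranked rather than sampled at random, but the basic trade-off between relevance and exposure is the same.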
In a recent study, a research team from the University of Amsterdam tested the effects of content personalisation within a single-source content setting. The results indicated that, when applied to mainstream journalistic content, all common algorithmic filtering approaches produced recommendations that did not differ from human editorial recommendations in terms of diversity.
The study focused on diversity in terms of topics as opposed to diversity in terms of ideological viewpoints. This is a novel approach to diversity representation that might be more suitable for the European context with its highly pluralistic political and media landscapes.
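For readers who want to experiment with this notion, one simple way to quantify topic diversity in a list of recommendations is the normalised Shannon entropy of the selected articles’ topics. The sketch below is illustrative and not necessarily the metric used in the Amsterdam study.

```python
# Normalised Shannon entropy over article topics: 0.0 means every
# recommendation covers the same topic, 1.0 means topics are evenly spread.
# Illustrative metric, not taken from any particular study.
import math
from collections import Counter

def topic_diversity(topics):
    """Return a score from 0.0 (single topic) to 1.0 (evenly spread topics)."""
    counts = Counter(topics)
    total = len(topics)
    entropy = -sum((c / total) * math.log(c / total) for c in counts.values())
    max_entropy = math.log(len(counts)) if len(counts) > 1 else 1.0
    return entropy / max_entropy

editorial = ["politics", "culture", "sports", "science", "politics"]
algorithmic = ["politics", "politics", "economy", "culture", "science"]
print(topic_diversity(editorial), topic_diversity(algorithmic))
```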
Algorithmic transparency
While information diversity is certainly important for any society, the availability of counter perspectives in news and opinion pieces alone may not be the universal antidote to social fragmentation and political polarisation. A number of studies suggest that exposure to opposing views might, in fact, increase political polarisation.
Overall, it is too early to assess the exact impact technology has on political opinion formation: the era of algorithmic filtering is still in its early stages, and much more research is needed to understand how content personalisation affects societies’ cultural, geographic and social patterns. But as application developers and media professionals, we nevertheless need to take seriously any potential risks related to new technologies, and strive to educate and empower our readers to become proactive and critical news consumers.
Similarly, the way we write about new technologies should reflect their realistic abilities and limitations. Neither an algorithm nor an editor will ever be able to include every perspective in a curated media feed. On the other hand, we also need to create the best possible user experience to compete with popular applications. While social media sites may have gained popularity as go-to news sources, actual news content on those platforms remains scarce.
Fortunately, the number of approaches that aim to bring diversity, or at least transparency, to news recommendation applications is growing; see, for example, the BBC’s work on public service algorithms and proposals to treat diversity as a design principle for news recommenders. For now, we may have more questions than answers about the relationship between algorithms and user behaviour, but such resources can point us in the right direction.