Online Echo Chambers Won’t Be Solved by Smarter Algorithms


The problem is bigger than getting better information in front of those who need to see it.


 

At the heart of social media lies a conspicuous paradox. While sites like Facebook have created more opportunities for human connection, their algorithms tend to silo us in homogeneous spaces. The potential for engagement has never been greater, but the personalized newsfeed largely ends up reinforcing our own biases and ideological orientation. The more we tell Facebook and Google what we want to see through our clicks and shares, the more they skew our feeds to serve up more of the same — be it cat videos or conspiracy theories. In the context of our political environment, this has mostly served to activate our tribalist impulses and sort us into angry, poorly informed partisans.

The obligation shared by Facebook and other social media proprietors to blunt the insularity of their platforms — otherwise known as the viewpoint diversity problem — remains a pressing topic for our time. Efforts to surface more cross-cutting news and crack down on fake news stories are democratically necessary, as is the increased pressure on these companies to monitor how their algorithms are being used and the degree to which they can be exploited by bad actors. But missing from the constant chatter about echo chambers and filter bubbles is something that won't be solved by smarter algorithms. The problem is not just that people aren't exposed to information that challenges their worldview; it's that many would refuse to read and engage even if they were.

I’ve observed this for years in the debate over climate change. Presenting someone who doubts the science with accurate information rarely leads to a discussion of the merits of what was presented. Rather, it often devolves into a dispute over the merits of the source, followed by a repetition of the same shopworn misinformation that kicked off the exchange. A point-by-point debunking is only as useful as the recipient’s willingness to hear it. And if the link you offer raises partisan alarms, it’s liable to be dismissed out of hand. The Washington Post, The Guardian, Skeptical Science, hell, even NASA — it doesn’t matter. Once a particular source has been branded as suspect, that source is off the table for consideration.

I stopped engaging with a certain family member on the issue once I realized he wouldn’t read the articles I was sending even under threat of Guantanamo. For those for whom a source’s credibility has become a measure of its ideological compatibility, attempts to enlighten are futile. The mulish resistance to outside ideas and the mainstream press has grown more acute in the Trump years — with open disdain for punditry coming from the top — but it was a virtual inevitability given the tribal nature of human behavior and the cognitive value we attach to closely held beliefs.

Algorithmic approaches can only do so much to compensate for shortcomings written into our psychology, or to dislodge the rigid belief structures of low-information consumers. Facebook and others could quite easily tune their news algorithms to ensure climate deniers are shown more factual stories in their feeds. But this does nothing to ensure those same individuals won’t scroll right past them after seeing the headline and associated source. After all, safely ignoring information that would trigger cognitive dissonance is precisely what generates echo chambers in the first place.

Liberals and conservatives alike are susceptible to such effects, and there’s more we all can do to cultivate a less partisan online experience. But for the people most in need of a richer, more learned atmosphere — those given to grand conspiracy theorizing and science denial, for instance — revamping the newsfeed is beside the point. That a particular headline or story agrees with the narrative they’re prepared to imbibe is all that matters.

How might we effect change in the short term? For people who take their cues from vested authorities, nothing will change unless those authorities begin to change their messaging. In the arena of climate change, this shift in signaling must come from conservative elites: from the Sean Hannities and Rush Limbaughs of the denial-o-sphere, from the opinion pages of the Murdoch press and Breitbart and WUWT, and anyplace else that doesn’t value peer review, expertise, or intellectual honesty.

A sea change could also come about following a dramatic overhaul of who conservatives consider credible and what ideas they embrace. As David Roberts has argued, once rejecting climate science, socialized medicine, LGBT acceptance, and the like is no longer part of the conservative playbook, those positions will quickly lose their salience. Unfortunately, this almost surely won’t happen so long as the incentive structures remain so perversely misaligned — not just for individuals, who risk interpersonal blowback for deviating from their ideological cohort, but for partisan elites and the media behemoths they represent.

Looking ahead, perhaps nothing short of a new intellectual renaissance will do: a revitalization of our innate curiosity and our unquenchable desire to understand the world around us. Only a society-wide recommitment to fact-conscious living, to free inquiry, to educational improvement, to acquiring knowledge for its own sake, and to objectivity and fairness can counter the blind partisanship and enculturated ignorance that so define our times.

This is why our Great Problem is not something Facebook, Google, and Big Data can solve alone. It’s not just about getting information in front of those who need to see it. It’s about fundamentally reshaping the epistemic habits of ordinary Americans — a generational task if ever there was one.


 

This post was featured on HuffPost’s Contributor platform.

Image credit: Stephen Lam/Getty Images