The Reason Your Feed Became An Echo Chamber — And What To Do About It

The algorithms that serve up what you like can often create closed loops of their own, packed only with people who agree with you already.

(Hiroshi Watanabe/Getty Images)

At the outset, the Internet was expected to be an open, democratic source of information. But algorithms, like the kind used by Facebook, instead often steer us toward articles that reflect our own ideological preferences and search results usually echo what we already know and like.

As a result, we aren’t exposed to other ideas and viewpoints, says Eli Pariser, CEO of Upworthy, a liberal news website. Pariser tells NPR’s Elise Hu that as websites get to know our interests better, they get better at serving up content that reinforces those interests, while filtering out the things we generally don’t like.

“What most algorithms are trying to do is to increase engagement, increase the amount of attention you’re spending on that platform,” he says. And while it’s nice that we have an instrument to help us cope with the fire hose of information supplied by the Internet, that algorithm also carries some downsides. “The danger is that increasingly you end up not seeing what people who think differently see and in fact not even knowing that it exists.”

It’s what Pariser calls a “filter bubble.” And it’s something he tried to break out of himself, chronicling that experience in the book The Filter Bubble: What the Internet Is Hiding from You. The results were, well, mixed.

“I was medium-successful,” Pariser says. “It’s hard, and that’s partly because we know the people that we know, and those tend to be slanted in one ideological direction or another so you have to really work to find people who think differently.”

The difficulty of bursting this filter bubble extends to matters of race too, as NPR’s Gene Demby noted in an interview with Weekend Edition.

“When you see poll numbers about the vast space between the way people of color feel about policing — or any number of issues around equality — and where white people stand on those issues, it can be explained in part by the fact that we are not having the same conversations,” Demby said.

One way to get on the same page? Maybe it’s time to get off social media, Demby said — at least for a little bit.

“You would think that social media would bridge a bunch of divides, right?” Demby said. “But maybe the ideal way these conversations need to happen is one-on-one with people who are equally vested in the relationship between the two people.”

Still, Pariser says, you can try to beat the algorithms to see the other side. Just make sure you actually want to read those opposite viewpoints, because the algorithms can tell when you don’t.

“I think these algorithms are very good at seeing are you following someone but never listening to them, or are you actively engaging and talking to them?” Pariser says. “So for me, one of the best things has been actually seeking out and finding folks who don’t think like me who I’m genuinely interested in, as people and thinkers.”

Copyright 2016 NPR. To see more, visit NPR.