At a recent Technology, Entertainment and Design (TED) conference, speaker Eli Pariser discussed the nature of information distribution on the contemporary internet, describing it as an increasingly personalized and largely invisible process.
Speaking particularly of Facebook and Google, Pariser identified two of the internet’s most prominent information brokers as companies increasingly engaged in what he calls “information filtering” — information catered and delivered to users according to their specific tastes.
“There’s this kind of shift in how information is flowing online, and it’s invisible,” Pariser said at the conference. “If we don’t pay attention to it, it could be a real problem.”
Pariser’s warning comes after his discovery that Facebook, well known for tailoring advertisements to a user’s specific interests and ranking friend “relevancy” by levels of interaction, is now silently editing information out of users’ feeds by algorithm.
As Pariser frames it, a self-professed progressive, he began to notice a gap in conservative opinion within his flow of Facebook information.
“I noticed one day that the conservatives had disappeared from my Facebook feed,” he said.
He went on to discover that Facebook, via algorithmic “gatekeepers,” is systematically removing from a given “feed” any opposing, conflicting, or other information it deems contrary to the user’s tastes or political alignment.
“This moves us very quickly to a world where the internet is showing us things we want to see, not necessarily what we need to see,” Pariser elaborated, expressing concern that Facebook and others are catering too greatly to an instant-gratification, consumerist society.
“Facebook isn’t the only place doing this invisible algorithmic editing of the web,” he said.
Expanding the issue beyond just Facebook, Pariser argued that, “There are a whole host of companies doing this kind of personalization … different people get different things.”
Even Google no longer yields identical information, Pariser claimed, citing a keyword search experiment in which different friends searching the same term turned up radically different results.
“There are 57 signals that Google looks at to personally tailor your query results,” he said. “There is no standard Google anymore.”
When asked whether he was aware of the information filtering and editing taking place on popular sites like Google and Facebook, senior political science major Baldeep Sidhu was taken aback at the realization.
“I’m really surprised,” Sidhu said when confronted with the idea that he was being silently censored. “I understand why they would do that, but I think that’s really dangerous because if you can’t get information that is external to your vices it creates bias.”
Sidhu was not alone in his surprise. Senior political science and environmental policy double major Baxter Boeh-Sobon was equally taken aback to learn that Facebook and others were silently editing his information flow.
“It appeals to your comfort zone,” Boeh-Sobon said after learning about the practice. “But you are being limited. The reason I hadn’t realized it I guess was because I feel so comfortable with the information I see.”
Sophomore electrical engineering major Edwin Wong also saw it as a possible form of censorship when presented with the idea.
“I can see it as a possible subliminal manipulation method,” he said.
No doubt these Davis students are not alone in their realization. Many presumably have no idea that Facebook, Google and others are silently creating information bubbles catered to each individual user.
“If algorithms are going to curate the world for us, then we need to make sure they’re not just keyed to relevance,” Pariser said. “We need to make sure that they also show us things that are uncomfortable, or challenging, or important. The best editing gives us a bit of both. Some information vegetables, some information dessert.”
JAMES O’HARA can be reached at firstname.lastname@example.org.