Eli Pariser
THE FILTER BUBBLE
What the Internet Is Hiding from You
To my grandfather, Ray Pariser, who taught me that scientific knowledge is best used in the pursuit of a better world. And to my community of family and friends, who fill my bubble with intelligence, humor, and love.
A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.
Mark Zuckerberg, Facebook founder
We shape our tools, and thereafter our tools shape us.
Marshall McLuhan, media theorist
Few people noticed the post that appeared on Google's corporate blog on December 4, 2009. It didn't beg for attention: no sweeping pronouncements, no Silicon Valley hype, just a few paragraphs of text sandwiched between a weekly roundup of top search terms and an update about Google's finance software.
Not everyone missed it. Search engine blogger Danny Sullivan pores over the items on Google's blog looking for clues about where the monolith is headed next, and to him, the post was a big deal. In fact, he wrote later that day, it was "the biggest change that has ever happened in search engines." For Danny, the headline said it all: "Personalized search for everyone."
Starting that morning, Google would use fifty-seven signals (everything from where you were logging in from to what browser you were using to what you had searched for before) to make guesses about who you were and what kinds of sites you'd like. Even if you were logged out, it would customize its results, showing you the pages it predicted you were most likely to click on.
Most of us assume that when we google a term, we all see the same results: the ones that the company's famous PageRank algorithm suggests are the most authoritative based on other pages' links. But since December 2009, this is no longer true. Now you get the result that Google's algorithm suggests is best for you in particular, and someone else may see something entirely different. In other words, there is no standard Google anymore.
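To make the shift concrete, here is a deliberately simplified sketch of the idea. This is not Google's actual algorithm: the page names, signal values, weights, and scoring function below are all invented for illustration. It only shows how a ranker that blends a shared relevance score with per-user click signals can hand two people two different orderings for the same query.

```python
# Illustrative sketch only: the pages, signals, and weights are invented,
# not a description of Google's real ranking system.

# A shared relevance score every user would see in a non-personalized ranking.
UNIVERSAL_RELEVANCE = {
    "bp.com/investor-relations": 0.78,
    "news.example.com/gulf-oil-spill": 0.75,
    "bp.com/press-release": 0.70,
}

def personalized_rank(universal, user_signals):
    """Blend a shared relevance score with per-user signals.

    user_signals maps a page to a guess at how likely this user is to
    click it, based on past behavior (a stand-in for the "fifty-seven
    signals" mentioned above).
    """
    scored = {
        page: 0.6 * relevance + 0.4 * user_signals.get(page, 0.0)
        for page, relevance in universal.items()
    }
    return sorted(scored, key=scored.get, reverse=True)

# Two users, two click histories, two different "first pages" for the same query.
investor = {"bp.com/investor-relations": 0.9, "bp.com/press-release": 0.6}
news_reader = {"news.example.com/gulf-oil-spill": 0.95}

print(personalized_rank(UNIVERSAL_RELEVANCE, investor))
print(personalized_rank(UNIVERSAL_RELEVANCE, news_reader))
```

The point of the sketch is only this: once per-user signals enter the score, two people typing the same term no longer share a results page.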
It's not hard to see this difference in action. In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term "BP." They're pretty similar: educated, white, left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw news. For one, the first page of results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP.
Even the number of results returned by Google differed: about 180 million results for one friend and 139 million for the other. If the results were that different for these two progressive East Coast women, imagine how different they would be for my friends and, say, an elderly Republican in Texas (or, for that matter, a businessman in Japan).
With Google personalized for everyone, the query "stem cells" might produce diametrically opposed results for scientists who support stem cell research and activists who oppose it. "Proof of climate change" might turn up different results for an environmental activist and an oil company executive. In polls, a huge majority of us assume search engines are unbiased. But that may be just because they're increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.
Google's announcement marked the turning point of an important but nearly invisible revolution in how we consume information. You could say that on December 4, 2009, the era of personalization began.
WHEN I WAS growing up in rural Maine in the 1990s, a new Wired arrived at our farmhouse every month, full of stories about AOL and Apple and how hackers and technologists were changing the world. To my preteen self, it seemed clear that the Internet was going to democratize the world, connecting us with better information and the power to act on it. The California futurists and techno-optimists in those pages spoke with a clear-eyed certainty: an inevitable, irresistible revolution was just around the corner, one that would flatten society, unseat the elites, and usher in a kind of freewheeling global utopia.
During college, I taught myself HTML and some rudimentary pieces of the languages PHP and SQL. I dabbled in building Web sites for friends and college projects. And when an e-mail referring people to a Web site I had started went viral after 9/11, I was suddenly put in touch with half a million people from 192 countries.
To a twenty-year-old, it was an extraordinary experience: in a matter of days, I had ended up at the center of a small movement. It was also overwhelming. So I joined forces with another small civic-minded startup from Berkeley called MoveOn.org. The cofounders, Wes Boyd and Joan Blades, had built a software company that brought the world the Flying Toasters screen saver. Our lead programmer was a twenty-something libertarian named Patrick Kane; his consulting service, We Also Walk Dogs, was named after a sci-fi story. Carrie Olson, a veteran of the Flying Toaster days, managed operations. We all worked out of our homes.
The work itself was mostly unglamorous: formatting and sending out e-mails, building Web pages. But it was exciting because we were sure the Internet had the potential to usher in a new era of transparency. The prospect that leaders could directly communicate, for free, with constituents could change everything. And the Internet gave constituents new power to aggregate their efforts and make their voices heard. When we looked at Washington, we saw a system clogged with gatekeepers and bureaucrats; the Internet had the potential to wash all of that away.
When I joined MoveOn in 2001, we had about five hundred thousand U.S. members. Today, there are 5 million members, making it one of the largest advocacy groups in America, significantly larger than the NRA. Together, our members have given over $120 million in small donations to support causes we've identified together: health care for everyone, a green economy, and a flourishing democratic process, to name a few.
For a time, it seemed that the Internet was going to entirely redemocratize society. Bloggers and citizen journalists would single-handedly rebuild the public media. Politicians would be able to run only with a broad base of support from small, everyday donors. Local governments would become more transparent and accountable to their citizens. And yet the era of civic connection I dreamed about hasn't come. Democracy requires citizens to see things from one another's point of view, but instead we're more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we're being offered parallel but separate universes.
My sense of unease crystallized when I noticed that my conservative friends had disappeared from my Facebook page. Politically, I lean to the left, but I like to hear what conservatives are thinking, and I've gone out of my way to befriend a few and add them as Facebook connections. I wanted to see what links they'd post, read their comments, and learn a bit from them.
But their links never turned up in my Top News feed. Facebook was apparently doing the math and noticing that I was still clicking my progressive friends' links more than my conservative friends', and links to the latest Lady Gaga videos more than either. So no conservative links for me.
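As a rough illustration of that kind of filtering (this is not Facebook's actual News Feed algorithm; the friend names, threshold, and counting scheme are invented), a feed that quietly drops the friends you click least could be as simple as the sketch below.

```python
# Invented illustration: a click-frequency filter, not Facebook's real News Feed logic.
from collections import Counter

def filtered_feed(posts, click_history, min_clicks=3):
    """Keep only posts from friends the user has clicked at least min_clicks times."""
    clicks_per_friend = Counter(click_history)
    return [post for post in posts if clicks_per_friend[post["friend"]] >= min_clicks]

posts = [
    {"friend": "progressive_pal", "link": "news-article"},
    {"friend": "conservative_pal", "link": "opinion-piece"},
]
# This user has clicked the progressive friend's links often and the conservative friend's never.
click_history = ["progressive_pal"] * 5

print(filtered_feed(posts, click_history))
# Only progressive_pal's post survives; conservative_pal's disappears without notice.
```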
I started doing some research, trying to understand how Facebook was deciding what to show me and what to hide. As it turned out, Facebook wasn't alone.