The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think
a book by Eli Pariser
(our site's book review)
In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google's change in policy is symptomatic of the most significant shift to take place on the Web in recent years—the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society—and reveals what we can do about it.
Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook—the primary news source for an increasing number of Americans—prioritizes the links it believes will appeal to you so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like The Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.
Compare the amount of information available to the amount YOU actually get to see: do you really want Big Brother censors deciding what you should see and what should be hidden from you?
In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs—and because these filters are invisible, we won't know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas. Forget the Internet broadening your knowledge, your positions, your beliefs. Instead, it will cast them in cement. The Internet will feel like "been there, done that."
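The feedback loop described above (past clicks determine future exposure, which determines future clicks) can be illustrated with a toy simulation. This is an illustration only, not Google's or Facebook's actual algorithm; the catalog, the top-3 feed size, and the click-counting heuristic are all assumptions made up for the sketch:

```python
import random

CATALOG = ["politics", "sports", "science", "art", "travel", "tech"]

def personalize(history, k=3):
    """Show only the k topics the user has clicked most often.
    Ties keep catalog order, so early clicks dominate forever."""
    return sorted(CATALOG, key=lambda t: -history.count(t))[:k]

def simulate(rounds=50, seed=1):
    """The user can only click what the filter shows, and every
    click further narrows what the filter will show next."""
    random.seed(seed)
    history = []
    for _ in range(rounds):
        feed = personalize(history)
        history.append(random.choice(feed))
    return set(history)  # every topic the user ever saw and clicked
```

Run it and the user never escapes the first handful of topics the filter happened to show: three of the six catalog topics are simply never seen again, which is the "cast in cement" effect in miniature.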
While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend on the Internet and shows how we can—and must—change course. With vivid detail and remarkable scope, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think reveals how personalization undermines the Internet's original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.
Pariser's book feels much more timely and urgent given the results of the 2016 U.S. presidential election
Pariser's book feels much more timely and urgent given the results of the 2016 U.S. presidential election, in which the narrowing of our web experience plainly influenced ideas, opinions, fake news, and voting. The main thesis is that personalization of the Internet is far more all-encompassing than we would like to believe. Another theme is that personalization has vast consequences for how we organize, or petrify, our views of the world. The book is timely and very intriguing, but a bit scary.
Personalization undermines the Internet's original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world
There are some good corporate citizens, like Amazon, which openly provides "recommendations" and lets you decide whether you want personalized results. When the decision is made by someone else without your permission, that is when it becomes extremely dangerous and takes on something resembling Brave New World.
Behind the scenes a burgeoning industry of data companies is tracking your personal information to sell to advertisers; they do not respect your privacy, only the money your personal data is worth
Our technological tools (including Google) seek to identify and present only the content they know we will be interested in, reducing or eliminating our exposure to new information, new ideas, and other viewpoints. This means that the idiot Republicans who "doubt" climate change science in order to please Big Oil will encounter only climate-change-denier results in their searches. Not only will they fail to wise up, since they are never presented with good information that would show them the error of their ways; they will harden their positions, because it appears to them that the whole world is full of climate change deniers and only a few morons accept climate science, even though the exact opposite is the truth. So how are people supposed to learn new things if the Internet keeps rubbing their old things in their faces?
Due to personalization, the idiot Republicans who "doubt" climate change science in order to please Big Oil will encounter only climate-change-denier results in their searches
How can democracy function in an environment where, instead of informed discussion of issues, feedback loops echo back only support for our current position and denigration of the opposing one? The point of discussion is to get people to honestly consider both positions in a fair and balanced manner and then reach reasonable conclusions. This is democracy 101. But in our Brave New World, with greedy megacorporations exploiting our info for money, democratic concerns and privacy concerns are flushed down the toilet and economic concerns trump all else. See Democracy—an American Delusion.
Many types of censoring are affecting us, but censoring most of the Internet may be the worst yet—worse than censoring the news
Everything from your Google searches to the type of political content you are most likely to read on Facebook gets tracked. Websites tailor everything, including ads, to match your preferences without your knowledge. This is the era of personalization, and according to Eli Pariser, nothing is a secret anymore. Since the NSA surveils our emails, searches, and browsing, we can confidently conclude that privacy is dead.
By 2008, the idea of communications privacy in the United States had literally become a joke—our government watches your every move
Security cameras, surveillance of your financial transactions, radio frequency spy chips hidden in consumer products, tracking of your Internet searches, and eavesdropping on your e-mail and phone calls. Without your knowledge or consent, every aspect of your life is observed and recorded, and the NSA is working on spy cameras disguised as harmless little bugs. But who is watching the watchers?
Fly #353242252 reporting: Citizen #312,756,972 doesn't seem to be hiding a thing—my conclusion is that she's clean; but just to be sure I think I'll hang around a bit longer!
The future of consensus opinion, culture, and perception is going to be shaped not by what propaganda is spoon-fed to you, but by what isn't shared with you. Mass mind control through omission is where the world of search engines is headed. This book is a great primer on some of the ways this process is unfolding. And this is worse than censorship by an overreaching government (China, Russia, North Korea, Saudi Arabia); this is censorship by overreaching megacorporations who are doing it to get richer as we get more targeted ads and consume more.
J.P. Morgan and his rich buddies took control of U.S. mainstream media a century ago, and, for important news stories we've gotten propaganda rather than truth ever since, and we're guzzling the Kool-Aid
From 1917 on, 11 years before Propaganda was written, the U.S. democracy's freedom of the press became progressively compromised by censorship from the powers-that-be like J.P. Morgan and the corporatocracy, and later by the OSS (1942-1947) and then the CIA (1947-now), which gets its direction from the shadow government, which used propaganda to control the sheep-citizens. See Freedom of the Press—an American Delusion.
Society is being dumbed down, sold a false reality via propaganda, and told what to think, not how to think
The result of this dumbing down is that the citizens are walking around with their heads up their butts, knowing nothing, learning nothing, thinking nothing
The consequence of the rise of personalization is that we are being dumbed down: hyper-focus and bias displace general knowledge, context, contrast, discovery, serendipity, and ultimately innovation and creativity. We'll be sheep being Google-sheared.
The consequence of the rise of personalization is that we are being dumbed down—we'll be sheep being Google-sheared
We are being dumbed down, and the morons who believe the world is flat get their views reinforced by personalization, so they'll never wise up. See Modern flat Earth societies.
Ultimately, a small group of American companies may unilaterally dictate how billions of people work, play, communicate, and understand the world. Protecting the early vision of radical connectedness and user control should be an urgent priority for all of us. In The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, the focus is on how search engines and advertisers key in on our selections and tastes so they can sell more. The filter bubble ensures that our search results give us only a small sampling of the available info, and Google gets to decide what portion of that info it wishes to show you. Back when the old media controlled what we saw and learned, CIA filters dictated what was censored, and, guess what, they are still doing it. Now Google, Amazon, Facebook, and Apple censor as well. See Freedom of the Press—an American Delusion.
The consequence of the rise of personalization is that we are being dumbed down—we are witnessing the dumbing of America
Our government censors or covers up anything they don't want the public getting wise to, like their unreported imperialism—the U.S. press has a truth filter
As you can see, the Internet has a filter applied to it in the guise of helping us, but the real reason is to fill the bank accounts of Google and the data companies. The media has a filter on it as well, a truth filter, as you can see above. Trump puts a filter on the media too: if reporters say anything he disagrees with, it's "fake news," and he insults or threatens those reporters. And because there is so much fake news, the Internet itself becomes a filter that screens out truth whenever the fake news purveyors use SEO and SEM better than the truth reporters, so the lies upstage the truth. See Lies, Incorporated: The World of Post-Truth Politics; Weaponized Lies: How to Think Critically in the Post-Truth Era; and Shadow Elite: How the World's New Power Brokers Undermine Democracy, Government, and the Free Market.
"Ha: Why do you think people were so upset when you pointed out that their feeds and search results were being filtered?
Pariser: People still thought that everybody sees the same things through Google and everyone sees all of the posts on Facebook and the Facebook news feed. When you can demonstrate how inaccurate that is, it’s really surprising. It’s sort of like being told that your glasses edit out certain people as you’re walking down the street.
Ha: If you were Mark Zuckerberg would you change the terms and conditions, or the algorithm?
Pariser: I feel like the right answer is to change the terms and conditions to make it more transparent, but I’d be very tempted to mess with the algorithms, because it’s such a fascinating, huge thing. Facebook gets to decide how hundreds of millions or billions of people spend hours and hours every month, just through that algorithm, and media companies rise and fall by the sword of the Facebook algorithm. They’ve got this really incredible power to affect how the future of media looks, because when you control distribution, you can influence creation." (Source: Who rules the Internet? The answer might surprise you, Thu-Huong Ha, ideas.ted.com)
Due to personalization, each Google user sees the Internet and the world in a way custom-tailored for him or her, like the blind men who each explored a different part of the elephant
". . . when Google returns search results, even for the exact same query, no one sees the exact same list. Instead, Google uses algorithms and collected data to tailor search results to individual interests. In fact, search results, recommendations from websites like Amazon or Netflix, and even the advertisements at the edge of the screen are all personalized based on collected traceable data we create simply by using what the internet has to offer. . . . the more we consume simple, flashy stories, the fewer complex stories with important information for our duties as citizens are within our online eyesight for consumption. Further, we may become limited to the ideologies we most identify with, denying ourselves the opportunity to broaden our understanding by brushing up against different points of view. The filter bubble may give us too much of what we want and not enough of what we need . . . Pariser calls on individuals, companies, and the government to take steps to manage the effects of the filter bubble. He calls on individuals to recognize the existence of a filter bubble and to take that into account when using the Internet. Pariser asks individuals to widen their scope of interest to allow for serendipity and occasionally add data that causes algorithms to adjust what is most likely to come up first. We can do this by jumping away from our usual websites and routines to seek out information on subjects we don’t usually run across." (Source: Review: The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think, Bryn Neenos, Electronic Media & Politics)
". . . the filter bubble—being surrounded only by people you like and content that you agree with. And the danger is that it can polarise populations creating potentially harmful divisions in society.
Today, Eduardo Graells-Garrido at the Universitat Pompeu Fabra in Barcelona as well as Mounia Lalmas and Daniel Quercia, both at Yahoo Labs, say they’ve hit on a way to burst the filter bubble. Their idea is that although people may have opposing views on sensitive topics, they may also share interests in other areas. And they’ve built a recommendation engine that points these kinds of people towards each other based on their own preferences.
The result is that individuals are exposed to a much wider range of opinions, ideas and people than they would otherwise experience. And because this is done using their own interests, they end up being equally satisfied with the results (although not without a period of acclimatization). 'We nudge users to read content from people who may have opposite views, or high view gaps, in those issues, while still being relevant according to their preferences,' say Graells-Garrido and co." (Source: How to Burst the "Filter Bubble" that Protects Us from Opposing Views, Emerging Technology from the arXiv, Technology Review)
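The bridging idea in the excerpt above can be sketched in a few lines: recommend people who disagree on the sensitive topic but share enough other interests to be worth listening to. This is a hypothetical toy version, not the researchers' actual system; the field names, Jaccard similarity measure, and 0.3 threshold are illustrative assumptions:

```python
def jaccard(a, b):
    """Overlap between two interest sets (0 = nothing shared, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def bridge_recommendations(user, others, min_overlap=0.3):
    """Recommend accounts whose stance on the sensitive topic differs
    from the user's, ranked by overlap in unrelated interests."""
    candidates = [o for o in others if o["stance"] != user["stance"]]
    scored = sorted(
        ((jaccard(user["interests"], o["interests"]), o["name"])
         for o in candidates),
        reverse=True,
    )
    return [name for overlap, name in scored if overlap >= min_overlap]

alice = {"name": "alice", "stance": "pro",
         "interests": {"cycling", "jazz", "cooking"}}
others = [
    {"name": "bob",   "stance": "anti", "interests": {"cycling", "jazz", "hiking"}},
    {"name": "carol", "stance": "pro",  "interests": {"cycling", "jazz", "cooking"}},
    {"name": "dave",  "stance": "anti", "interests": {"chess"}},
]
# bob disagrees but shares cycling and jazz, so he is recommended;
# carol agrees (no bridge needed); dave disagrees but shares nothing,
# so there is no common ground to start a conversation from.
```

The design point is the one the quote makes: relevance to the user's own preferences is what makes the opposing view tolerable, so the filter for disagreement and the ranking by shared interest both have to be present.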
- See How (and why) To Turn Off Google's Personalized Search Results
- And see How do I disable Facebook's instant personalisation permanently?
- And see Opt out of Pinterest's personalization feature
- And see How To Prevent Amazon from Tracking Your Browsing and Recording Your Visits to 3rd Party Sites
- Manage Personalized Recommendations in iTunes Store, App Store, and iBooks Store for iPhone, iPad, or iPod touch
- Netflix: If you don’t want to be bothered by the movie preview ads, you can opt out from Netflix’s preview tests through your Account Settings. (Account -> Test Participation.) There is not much else you can change.
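For Google specifically, SEO guides have long reported a `pws=0` query parameter that turns off personalized web search for a single query. This sketch builds such a URL; whether Google still honors the parameter is an assumption that may change over time, so treat it as a convenience trick, not a guarantee:

```python
from urllib.parse import urlencode

def depersonalized_search_url(query):
    """Build a Google search URL with pws=0, the parameter widely
    reported to disable personalized results for that one query.
    (Assumption: Google may stop honoring it at any time.)"""
    return "https://www.google.com/search?" + urlencode({"q": query, "pws": "0"})

print(depersonalized_search_url("filter bubble"))
```

Searching in a private/incognito window while signed out accomplishes much of the same thing, since most personalization signals come from your account and cookies.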