Review of The Filter Bubble by Eli Pariser

I just finished reading The Filter Bubble: What the Internet Is Hiding from You by Eli Pariser. It offers a very interesting exploration of how we use various search tools and how we find the information that is central to our daily lives.

His main argument has to do with how “filter bubbles” emerge from the algorithms that supply the search results or news feeds of social media websites. Since 2009, Google, for example, has supplied search results that are tailored specifically to the user making the query. Gone are the days of obtaining “absolute” Google search results based on our terms (where everyone would see the same results). Now, the results we see are “relative” to our likes and features as seen by Google – our browser, our location, and about 50 other variables Google uses to identify us as individuals. So if two people type in the same keywords, they will see different results based on who they are. The Facebook “News Feed” works the same way, and Pariser has reason to believe that other websites do this as well.
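To make that concrete, here is a toy sketch of how the same query can be ranked differently once a couple of per-user signals are folded into the score. This is my own illustration, not Google’s actual algorithm: the fields, weights, and scoring are invented, and only the “BP” anecdote comes from the book (two friends search for “BP” and one gets investment information, the other gets oil-spill news).

```python
# Toy illustration (not Google's algorithm): the same query is re-ranked
# using a few assumed per-user signals. All weights here are made up.
def personalized_rank(results, user_signals):
    """Re-rank a base result list using hypothetical per-user signals."""
    def score(doc):
        s = doc["base_relevance"]
        if doc.get("region") == user_signals.get("location"):
            s += 0.3                      # boost results local to the user
        if doc.get("topic") in user_signals.get("clicked_topics", set()):
            s += 0.5                      # boost topics the user tends to click on
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "bp.com",       "base_relevance": 1.0, "topic": "company",     "region": "UK"},
    {"url": "bp-spill.org", "base_relevance": 0.9, "topic": "environment", "region": "US"},
]
investor = {"location": "UK", "clicked_topics": {"company"}}
activist = {"location": "US", "clicked_topics": {"environment"}}
print([d["url"] for d in personalized_rank(results, investor)])   # bp.com first
print([d["url"] for d in personalized_rank(results, activist)])   # bp-spill.org first
```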

Here are some reading notes and quotes I found really interesting:

The filter bubble introduces three dynamics (p. 9-10): “we are already in it”, “it is invisible”, and “you don’t choose to enter the bubble”. Combined with how much information we produce, this leads to what Steve Rubel calls the attention crash (p. 11).

On “our information diet”: “By definition, a world constructed from the familiar is a world in which there’s nothing to learn.” (p. 15) Drawing on Robert Putnam’s Bowling Alone, we are losing (p. 17) both “bonding capital” (ties among people who are alike) and “bridging capital” (being able to talk to people not like us, creating bridges).

Facebook’s EdgeRank uses three variables: affinity (how much time we spend interacting with someone); the relative weight of the content (relationship status updates vs. pokes); and recency (p. 38).
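As a rough illustration, here is a minimal sketch of an EdgeRank-style score combining those three variables. The content weights, the exponential half-life decay, and all of the numbers are my own assumptions, not Facebook’s actual formula:

```python
import time

# Assumed relative weights per content type (illustrative only).
CONTENT_WEIGHTS = {
    "relationship_update": 5.0,
    "photo": 3.0,
    "status": 2.0,
    "poke": 0.5,
}

def edge_score(affinity, content_type, created_at, now, half_life_hours=24.0):
    """Score one story as affinity x content weight x time decay (recency)."""
    age_hours = max(0.0, (now - created_at) / 3600.0)
    recency = 0.5 ** (age_hours / half_life_hours)   # assumed exponential half-life decay
    return affinity * CONTENT_WEIGHTS.get(content_type, 1.0) * recency

now = time.time()
# A close friend's day-old relationship update vs. an acquaintance's fresh poke.
print(edge_score(0.9, "relationship_update", now - 24 * 3600, now))   # ~2.25
print(edge_score(0.1, "poke", now, now))                              # 0.05
```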

“If trust in news agencies is falling, it is rising in the new realm of amateur and algorithmic curation” (p. 66)

The CIA book on information analysis by Heuer (p. 81): Psychology of Intelligence Analysis – a free version is available on the CIA website.

“Personalization can get in the way of creativity and innovation in three ways. First, the filter bubble artificially limits the size of our “solution horizon” – the mental space in which we search for solutions to problems. Second, the information environment inside the filter bubble will tend to lack some of the key traits that spur creativity. Creativity is a context-dependent trait: We’re more likely to come up with new ideas in some environments than others; the contexts that filtering creates aren’t the ones best suited to creative thinking. Finally, the filter bubble encourages a more passive approach to information, which is at odds with the kind of exploration that leads to discovery.” (p. 94) Mentions The Act of Creation by Arthur Koestler.

Creativity generally has two parts: generative thinking (reshuffling and recombining) and convergent thinking (surveying the options) (p. 103)

“If a self-fulfilling prophecy is a false definition of the world that through one’s actions becomes true, we’re now on the verge of self-fulfilling identities” (p. 112). […] “On sirens and children” by Yochai Benkler (p. 112): “Autonomy, Benkler points out, is a tricky concept: To be free, you have to be able not only to do what you want, but to know what’s possible to do.”

“fundamental attribution error. We tend to attribute people’s behavior to their inner traits and personality rather than to the situation they’re placed in,” (p. 116)

“In the future, we all want to be well-rounded, well-informed intellectual virtuosos, but right now we want to watch Jersey Shore. Behavioral economists call this present bias – the gap between your preferences for your future self and your preferences in the current moment.” (p. 117)

“Priming effect” (p. 124) – getting people to learn a sequence of words with a theme primes them to think in a particular way.

“With information as with food, we are what we consume. […] Your identity shapes your media, and your media then shapes what you believe and what you care about. […] You become trapped in a you loop” (p. 125)

“If identity loops aren’t counteracted through randomness and serendipity, you could end up stuck in the foothills of your identity” (p. 127) – adapted from Matt Cohler’s “local-maximum problem”: when trying to maximize something – say, climbing a mountain – you should always move upward, but you can end up stuck on a hill next to the mountain.
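Here is a small sketch of that local-maximum problem as a greedy hill climber: it only accepts uphill steps, so depending on where it starts it settles on the small hill and never reaches the taller peak. The landscape, starting points, and step size are invented for illustration:

```python
import math

def landscape(x):
    """Two peaks: a small hill near x = -1.5 and a taller mountain near x = 2."""
    return math.exp(-(x + 1.5) ** 2) + 3.0 * math.exp(-((x - 2.0) ** 2))

def hill_climb(x, step=0.1, iterations=200):
    """Greedy climber: only move if a neighbouring point is strictly higher."""
    for _ in range(iterations):
        best = max([x - step, x, x + step], key=landscape)
        if best == x:          # no uphill neighbour: stuck on a (possibly local) maximum
            break
        x = best
    return x

print(hill_climb(-2.0))   # settles near -1.5, the small hill
print(hill_climb(0.5))    # settles near 2.0, the actual mountain
```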

Overfitting: being stuck in a class that does not fit us – “a regression to the social norm” (p. 129). “But the overfitting problem gets to one of the central, irreducible problems of the filter bubble: Stereotyping and overfitting are synonyms” (p. 131). The distinction is between finding a pattern that is really in the data and finding a pattern that is not really there.
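A rough way to see “finding a pattern that is not really there”: fit both a straight line and a high-degree polynomial to a handful of noisy points drawn from a line. The high-degree fit typically matches the sample almost perfectly but generalizes worse. The data, degrees, and seed below are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 8)
y_train = 2 * x_train + rng.normal(0, 0.2, size=x_train.shape)   # truth is a line plus noise

x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test                                              # noise-free truth

for degree in (1, 7):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```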

David Hume and Karl Popper on the induction problem (p. 133): all the swans I see are white, therefore all swans are white.

“Fyodor Dostoyevsky, whose Notes from the Underground was a passionate critique of the utopian scientific rationalism of the day.” (p. 135) “But algorithmic induction can lead to a kind of information determinism” (p.135)

“China’s objective isn’t so much to blot out unsavory information as to alter the physics around it – to create friction for problematic information and to route public attention to progovernment forums. While it can’t block all of the people from all of the news all of the time, it doesn’t need to. «What the government cares about,» Atlantic journalist James Fallows writes, «is making the quest for information just enough of a nuisance that people generally won’t bother.» The strategy, says Xiao Qiang of the University of California at Berkeley, is «about social control, human surveillance, peer pressure, and self-censorship.»” (p. 139)

“James Mulvenon, the head of the Center for Intelligence Research and Analysis, puts it this way: «There’s a randomness to their enforcement, and that creates a sense that they’re looking at everything.»” (p. 140)

On how governments manipulate the truth: “Rather than simply banning certain words or opinions outright, it’ll increasingly revolve around second-order censorship – the manipulation of curation, context, and the flow of information and attention.” (p. 141)

Sir Francis Bacon: “Knowledge is power.” “If knowledge is power, then asymmetries in knowledge are asymmetries in power” (p. 147)

David Bohm, On Dialogue: “To communicate, Bohm wrote, literally means to make something common” (p. 162-3). Jürgen Habermas, “the dean of media theory for much of the twentieth century, had similar views.”

«Kranzberg’s first law: “Technology is neither good nor bad, nor is it neutral”» (p. 188)

“In this book, I’ve argued that the rise of pervasive, embedded filtering is changing the way we experience the Internet and ultimately the world. […] Technology designed to give us more control over our lives is actually taking control away.” (p. 218-9)

“Appointing an independent ombudsman and giving the world more insight into how the powerful filtering algorithms work would be an important first step.” (p. 231)

This content was last updated on 2012-07-24 at 3:48 pm.