I also question whether this will affect our ability as marketers to reach our audiences. Google's ever-evolving algorithm makes search engine optimization a challenge, but trying to infiltrate personal preferences (as interpreted by Google) is another task entirely.
Below I've posted Wired's article on Pariser's talk. Hopefully it will give you something to think about without keeping you up at night.
-----------
TED 2011: Junk Food Algorithms and the World They Feed Us
Wired.com | By Kim Zetter | March 4, 2011
LONG BEACH, California — With the birth of the internet came the promise — or so the myth goes — of broadened horizons and fantastic new opportunities to connect people and viewpoints in ways we could not have imagined.
But in reality, political activist Eli Pariser told the Technology, Entertainment and Design conference on Wednesday, the internet has quickly encased us in personalization bubbles where increasingly the only people and ideas we encounter are the ones we already know.
Pariser, a senior fellow at the Roosevelt Institute and former executive director of MoveOn.org, said the Facebooks and Googles of the internet are tweaking their algorithms to personalize the user experience, filtering content so that we see only what they think we want to see rather than everything we could, and should, see.
“It’s your own personal, unique universe of information that you live in online,” Pariser said. “What’s in it depends on who you are and what you do. But the thing is, you don’t decide what gets in, and you don’t see what gets edited out.”
Take his Facebook page, for example. Pariser used to receive comments and links from readers on both sides of the political spectrum. Then one day he noticed his conservative friends had disappeared; only links from his liberal friends remained. Facebook, without asking him, had seen that he clicked more often on links from left-leaning friends and simply edited out the rest. The site used an algorithm that hides from view the kinds of content it has determined, from your past activity, that you are less likely to interact with.
Facebook isn’t alone in doing this kind of personalization, of course. Two people in different regions with different interests will receive different Google results when typing in the same search terms. To test it, Pariser asked friends in different locations to search for the protests in Egypt and send him screenshots of their results. While his friend Scott’s results were full of links about the protests, Daniel’s results were not.
Google’s algorithm considers 57 different elements in catering its search results for you, and as a result, “there is no standard Google search anymore,” said Pariser, who is writing a book on the political and social effects of web personalization.
Being exposed to different viewpoints and information is good for us, because it speaks to our various selves and competing interests. But too much personalization threatens to make us one-dimensional.
He pointed to research on Netflix queues that examined how some films move quickly to the top of a user’s queue while others languish at the bottom of the list, never to be viewed. Iron Man, for example, zips right up to the top and out the Netflix door to the user’s doorstep, while a documentary like Waiting for Superman never makes it to the user’s mailbox.
The Netflix queue exposes an ethical struggle between our more impulsive selves and the better selves we strive to be, the research showed.
“We all want to be someone who has watched Rashomon,” Pariser said, “but right now we want to watch Ace Ventura for the fourth time.”
Filters that focus on what we click on first or most often, he said, cater to these impulses and end up serving us information junk food instead of a balanced information diet.
The internet was supposed to rid us of traditional media gatekeepers who controlled the flow of information. “But what we’re seeing is a passing of the torch from human gatekeepers to algorithmic ones,” he said.
If algorithms are going to curate the world and decide what we get to see, he said, then they should show us not just things we think we want to see but also things that are uncomfortable and challenging and that include other points of view.
He closed with an appeal to the Google and social media executives in the audience.
“We really need for you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility,” he said. “We need to make sure that they’re transparent so we can see what gets through and what doesn’t, and so we can decide what gets through and what doesn’t.
“We need the internet to connect us and introduce us to new ideas and people and different perspectives,” he continued, “and it’s not going to do that if it leaves us all isolated in a web of one.”
Photo: Eli Pariser (James Duncan Davidson/TED)