I have often marveled at how good Google–and Bing and Amazon–are at tailoring search results to my tastes and desires. Sometimes the uncanny accuracy can be a tad disconcerting, but on the whole, the personalization makes search far more useful. Until recently, I had no idea I was being cheated.
But in an op-ed in The New York Times, Eli Pariser, board president of MoveOn.org, argues that personalized filtering is not only bad for me, it actually threatens democracy. Pariser’s argument, set forth at greater length in his new book, The Filter Bubble: What the Internet Is Hiding from You, is that individuals and society as a whole are done a disservice when search tools show them what they actually want to see. “Democracy depends on the citizen’s ability to engage with multiple viewpoints,” he writes. “The internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.”
This is arrant nonsense. There’s a theory going around that the proliferation of specialized, sometimes ideologically narrow, sources on the net, coupled with search filtering, is leading to an increasingly polarized society. Our civic life is distressingly polarized, but this trend long predates the rise of the web, and I have yet to see anything resembling rigorous research supporting the notion that narrowly ideological web sites (or broadcasts) are a cause rather than a symptom.
More disturbing is the implicit call for the government to do something about this situation if Google and the rest don’t give us easier ways to dial back the filtering. Pariser writes: “Companies that make use of these algorithms must take this curative responsibility far more seriously than they have to date. They need to give us control over what we see — making it clear when they are personalizing, and allowing us to shape and adjust our own filters.”
There’s a very strong “eat your peas” element to all this. Pariser criticizes Facebook’s Mark Zuckerberg for suggesting that people may find a squirrel dying in their yard more relevant than people dying in Africa, a mindset that, Pariser argues, “leaves us staring at our front yard instead of reading about suffering, genocide, and revolution.” True. But I have to admit that even as a person deeply engaged with the world, right now I find the family of foxes that has taken up residence under my garden shed a whole lot more interesting than the latest news from Libya or Syria. And I don’t mind saying so.
I am not reflexively opposed to regulation, or even a little coercion to get companies to do the right thing, when the need is obvious, usually as evidenced by a market failure. But that is hardly the case here. To the extent that there are problems with search results, they tend to be the result of web site owners gaming the search algorithms, and this endless warfare between unstoppable forces and immovable objects is being dealt with by Google in the normal course of business. A change in filtering policy, voluntary or forced, could only be justified by evidence of customer dissatisfaction, and that evidence simply does not exist outside a small circle of internet theorists. Perhaps people should care more about human suffering, but if they don’t, that’s their business to change, not mine, or Pariser’s, or Google’s.
The fact is, there are times when I don’t want filtering, typically because I am looking for something outside my norm, and when that happens, I know how to turn it off. The easiest way, in most browsers, is to open a “private” or “anonymous” browsing session. The results will then neither be affected by past searches nor affect future ones. It’s easy enough to do, and I’ll bet most people find very little use for it (except for searches they want to keep out of their search history).