The Invisible Hand and the Daily Me

In his 1995 book Being Digital, Nicholas Negroponte coined the term “The Daily Me” to describe news and information tailored to the recipient’s interests and biases.  In his 2002 book Republic.com, Cass Sunstein described the Daily Me as a filter:

It is some time in the future. Technology has greatly increased people’s ability to “filter” what they want to read, see, and hear. General interest newspapers and magazines are largely a thing of the past. The same is true of broadcasters. The idea of choosing “channel 4” or instead “channel 7” seems positively quaint. With the aid of a television or computer screen, and the Internet, you are able to design your own newspapers and magazines. Having dispensed with broadcasters, you can choose your own video programming, with movies, game shows, sports, shopping, and news of your choice. You mix and match.  You need not come across topics and views that you have not sought out . . . The market for news, entertainment, and information has finally been perfected. Consumers are able to see exactly what they want. When the power to filter is unlimited, people can decide, in advance and with perfect accuracy, what they will and will not encounter. They can design something very much like a communications universe of their own choosing.

In an article discussing the book, Sunstein feared that “from the standpoint of democracy, filtering is a mixed blessing.”  He continued:

First, people should be exposed to materials that they would not have chosen in advance. Unanticipated encounters, involving topics and points of view that people have not sought out and perhaps find irritating, are central to democracy and even to freedom itself. Second, many or most citizens should have a range of common experiences. Without shared experiences, a heterogeneous society will have a more difficult time addressing social problems and understanding one another.

Sunstein’s provocative premise generated a fair amount of commentary.  My Google search of <“cass sunstein” “republic.com” “daily me”> produced 825 “relevant” results, and the concept of the Daily Me continues to resonate, but without great vigor: only 480 relevant Google hits.

Perhaps it should resonate more.  It came immediately to mind when I read this passage from Sue Halpern’s “Mind Control & the Internet,” The New York Review of Books, June 23, 2011:

The [Google] search process, in other words, has become “personalized,” which is to say that instead of being universal, it is idiosyncratic and oddly peremptory. “Most of us assume that when we google a term, we all see the same results—the ones that the company’s famous Page Rank algorithm suggests are the most authoritative based on other pages’ links,” Pariser observes. With personalized search, “now you get the result that Google’s algorithm suggests is best for you in particular—and someone else may see something entirely different. In other words, there is no standard Google anymore.” It’s as if we looked up the same topic in an encyclopedia and each found different entries—but of course we would not assume they were different since we’d be consulting what we thought to be a standard reference.

Among the many insidious consequences of this individualization is that by tailoring the information you receive to the algorithm’s perception of who you are . . . Google directs you to material that is most likely to reinforce your own worldview, ideology, and assumptions . . . In this way, the Internet, which isn’t the press, but often functions like the press by disseminating news and information, begins to cut us off from dissenting opinion and conflicting points of view, all the while seeming to be neutral and objective and unencumbered by the kind of bias inherent in, and embraced by, say, The Weekly Standard or The Nation.

The insidious difference, of course, is that we construct our own Daily Me through some degree of conscious choice, while personalized searches use our choices invisibly to define responses.  Reading The Wall Street Journal editorial page spikes your blood pressure, so you get news feeds from The Huffington Post.  HuffPo makes your brain hurt, but Fox News makes sense.  You care nothing about politics, but Lolcats get you through the day.  You make an affirmative decision about what to read, what to visit, what to ignore.  While I know that SEO games search results, I assumed that if Glenn Beck and I did the same Google search at the same moment we would obtain the same results.  I’m alarmed that that’s not necessarily the case.
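To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how a personalization layer could cause the same query to return different orderings for different users: a base relevance score plus a boost for topics a user’s history suggests they favor. Nothing here reflects Google’s actual ranking system; the names, weights, and scoring are invented for illustration only.

```python
# Toy sketch (NOT Google's algorithm): the same result set, re-ranked
# differently depending on a user's inferred interests.
from dataclasses import dataclass


@dataclass
class Result:
    url: str
    base_score: float      # hypothetical "universal" relevance score
    topics: set[str]       # topics the page is about


def personalized_rank(results: list[Result], interests: dict[str, float]) -> list[Result]:
    """Re-rank by adding a boost for topics the user's profile favors."""
    def score(r: Result) -> float:
        boost = sum(interests.get(topic, 0.0) for topic in r.topics)
        return r.base_score + boost
    return sorted(results, key=score, reverse=True)


results = [
    Result("example.com/tax-cuts-work", 0.70, {"conservative", "economics"}),
    Result("example.org/tax-cuts-fail", 0.72, {"progressive", "economics"}),
]

# Same query, two inferred profiles, two different top results.
print(personalized_rank(results, {"conservative": 0.2})[0].url)  # tax-cuts-work
print(personalized_rank(results, {"progressive": 0.2})[0].url)   # tax-cuts-fail
```

Even a small, invisible boost like this flips which page appears first, which is the point: the reader never chose the filter and never sees that it was applied.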

ISTTF Report

The Internet Safety Technical Task Force recently released its final report, titled “Enhancing Child Safety and Online Technologies.” The Task Force prepared the report at the request of 50 state Attorneys General “to determine the extent to which today’s technologies could help to address . . . online safety risks” including “the dangers of sexual solicitation, online harassment, and bullying, and exposure to problematic and illegal content.”  The Task Force concluded that “the risks minors face online are complex and multifaceted and are in most cases not significantly different than those they face offline, and that as they get older, minors themselves contribute to some of the problems.”  (Today’s class discussion about cyberbullying, involving students far closer to the problem than I am, echoed this conclusion.)  Cherry-picking from the 278-page report’s Executive Summary:

  • The Internet increases the availability of harmful, problematic and illegal content, but does not always increase minors’ exposure. Unwanted exposure to pornography does occur online, but those most likely to be exposed are those seeking it out, such as older male minors.
  • Social network sites are not the most common space for solicitation and unwanted exposure to problematic content, but are frequently used in peer-to-peer harassment, most likely because they are broadly adopted by minors and are used primarily to reinforce pre-existing social relations.
  • [Minors] who are most at risk often engage in risky behaviors and have difficulties in other parts of their lives. The psychosocial makeup of and family dynamics surrounding particular minors are better predictors of risk than the use of specific media or technologies.

The Task Force reviewed numerous technologies intended to protect child safety online “including age verification and identity authentication, filtering and auditing, text analysis, and biometrics,”  a review that produced “a state of cautious optimism.” The report did “note that almost all technologies submitted present privacy and security issues that should be weighed against any potential benefits. Additionally, because some technologies carry an economic cost and some require involvement by parents and teachers, relying on them may not protect society’s most vulnerable minors.”  The Task Force “cautions against overreliance on technology in isolation or on a single technological approach.”