Small Town Dirt, Served Fresh Daily

Small towns in rural America are putting their own spin on social networking: “they write and read startlingly negative posts, all cloaked in anonymity, about one another” on Topix-hosted sites such as the Mountain Grove Forum.  The New York Times reports in “In Small Towns, Gossip Turns to the Web, and Turns Vicious” that

Topix, a site lightly trafficked in cities, enjoys a dedicated and growing following across the Ozarks, Appalachia and much of the rural South, establishing an unexpected niche in communities of a few hundred or few thousand people — particularly in what Chris Tolles, Topix’s chief executive, calls “the feud states.” One of the most heavily trafficked forums, he noted, is Pikeville, Ky., once the staging ground for the Hatfield and McCoy rivalry.

Anonymity, websites’ immunity under federal law from liability for defamatory content created by third parties, and the long-standing social connections and population stasis of small towns combine to make online gossip popular, riveting, and divisive. And good business:

Topix said it received about 125,000 posts on any given day in forums for about 5,000 cities and towns. Unlike sites like Facebook, which requires users to give their real name, Topix users can pick different names for each post and are identified only by geography. About 9 percent are automatically screened out by software, based on offensive content like racial slurs; another 3 percent — mostly threats and “obvious libel,” Mr. Tolles said — are removed after people complain.
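
The screening the Times describes (an automated first pass by software, then removal of some posts after complaints) is, mechanically, a simple two-stage pipeline. Here is a minimal, purely hypothetical sketch of what such keyword-based pre-screening might look like; the blocklist, function names, and routing logic are my own illustration and are not Topix’s actual software:

```python
# Hypothetical sketch of two-stage moderation loosely modeled on the process
# the Times describes: an automatic keyword screen first, then human review
# of posts that draw complaints. Blocklist terms and names are invented.

BLOCKLIST = {"slur1", "slur2"}  # stand-ins for a curated list of banned terms


def auto_screen(post_text: str) -> bool:
    """Return True if the post trips the automatic keyword filter."""
    words = {w.strip(".,!?\"'").lower() for w in post_text.split()}
    return bool(words & BLOCKLIST)


def handle_post(post_text: str, complaint_filed: bool = False) -> str:
    """Route a post the way the article's percentages suggest."""
    if auto_screen(post_text):
        return "rejected"            # the ~9% screened out by software
    if complaint_filed:
        return "queued for review"   # the ~3% removed after complaints
    return "published"


if __name__ == "__main__":
    print(handle_post("Anyone know who keyed my truck on Main Street?"))
    # -> published
```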

“Astonishing” Criminal Liability for YouTube Video

In another European case I’ve blogged about before (here, here, and here), yesterday an Italian court convicted three Google executives of criminal privacy violations in a case arising out of a 2006 video of the bullying of an autistic boy, posted to YouTube by his abusers.  The court imposed suspended three- to six-month sentences on the three convicted executives, acquitting them of defamation along with a fourth executive who faced only the defamation charge.  Google, which said it plans to appeal, called the result “astonishing.”  One of the convicted defendants, Google’s global privacy counsel, said “[t]he judge has decided I’m primarily responsible for the actions of some teenagers who uploaded a reprehensible video to Google video.” Google’s senior vice president and chief legal officer and its chief financial officer were also convicted.  The Wall Street Journal article stated “[t]he trial could help define whether the Internet in Italy is an open, self-regulating platform or if content must be better monitored for abusive material.”

U.S. law, specifically Section 230 of the Communications Decency Act, would shield Google from liability because the actionable video was created and posted online by a third party. To put it in the language of Section 230, Google would not be liable because it was not the video’s information content provider; it was not “responsible, in whole or in part, for the creation or development of” the video.  U.S. law recognizes the impracticability–or impossibility–of screening tens of thousands of posts and other items created by Internet users.  This case and the French case discussed in the prior post show how far the Internet has come from the one described in John Perry Barlow’s Declaration of the Independence of Cyberspace:

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind . . .  You have no sovereignty where we gather . . . I declare the global social space we are building to be naturally independent of the tyrannies you seek to impose on us. You have no moral right to rule us nor do you possess any methods of enforcement we have true reason to fear.

Or not.

Google Trial Begins

In 2006 a video was posted on YouTube of four Italian teenagers taunting another teenager with Down syndrome.  Some months later Italian authorities notified Google of the video.  Google removed it, but its troubles did not end.  Civil and criminal charges were brought against four Google executives for violating Italian privacy law by not obtaining consent to show the video from all those who appear in it, and for failing to have adequate content filters in place to prevent the video’s posting.  After many delays the criminal trial started this week in Milan.  The defendants face imprisonment for up to three years.  Section 230 of the Communications Decency Act would shield Google and its executives from civil and criminal liability if the events had happened in the U.S., but Italian law provides no similar protection.

Internet Law Sampler

Here are a few of the things accumulating in my browser tabs:

  • Google:  Most takedown notices are illegitimate.  (This is over a month old.  Where have I been?)  “According to a story in PC World, Google says 57 percent of the takedown notices it has received under the Digital Millennium Copyright Act were sent by businesses trying to undermine a competitor.  About 37 percent of the notices weren’t valid copyright claims.”  Remember these figures when discussing whether CDA Section 230 should be amended to include a notice-and-takedown requirement.  A takedown provision’s incentive to lash back at critical but protected speech would do little to enhance the quality of online discussion and would only add more fodder for disputes.
  • Philip Markoff allegedly killed a woman in a Boston hotel room in a robbery gone awry, a liaison arranged through Craigslist’s erotic services section.  Markoff is also accused of robbing two other women in Craigslist-mediated hotel hook-ups.  The bulletin board website’s role has led, predictably, to calls for Craigslist to monitor its postings for prostitution and other similar activities.  An article on Law.com quotes Connecticut Attorney General Richard Blumenthal as saying “Craigslist has the means — and moral obligation — to stop the pimping and prostitution in plain sight.”  Blumenthal spearheaded a similar effort last year that led to a consent agreement between Craigslist and 40 state attorneys general in which Craigslist agreed to collect phone and credit card numbers from those who advertise erotic services.  Of this agreement Blumenthal said “Requiring phone numbers, credit cards and identifying details will provide a roadmap to prostitutes and sex traffickers – so we can track them down and lock them up.”  The accountability procedures had an immediate effect; erotic service ads declined by 80%–and in some cities by 90%–after Craigslist adopted them.  Still, after the Boston murder, Blumenthal believes Craigslist should be doing more.  What, or how much more, Craigslist should do is not clear to me.  Calling Markoff the “Craigslist Killer” is inevitable and more alliteratively appealing than “BU Medical School Killer,” but it is unfair to Craigslist.  The site is not responsible for this murder.  As the Supreme Court stated in Ashcroft v. Free Speech Coalition (in a different context, but the principle still works), “[t]here are many things innocent in themselves, however, such as cartoons, video games, and candy, that might be used for immoral purposes, yet we would not expect those to be prohibited because they can be misused.”  That something–a candy bar, an Internet bulletin board, file-sharing technology–can be used for unlawful ends should rarely be grounds for limiting its use.  I understand the human desire after bad things happen to “do something!” but most often what we choose to do fits poorly with what our objectives should be.
  • The National Security Agency wants to be your cyberspace supercop.  It is in the middle of a bureaucratic battle over which federal agency has responsibility for Internet security.  The same agency that conducted the Bush Administration’s warrantless wiretapping wants the power to access every network in the country for the ostensible purpose of security.  The same agency that can only peek at our purely domestic communications if it has judicial approval wants the unfettered right to see it all.  Seems like a bad idea.

Speech versus Privacy

The Boston Sunday Globe Ideas section carried a longer article by Drake Bennett, titled “Time for a Muzzle,” asking whether it is “time to rethink free speech” online.  Bennett focuses on privacy interests affected by “[t]he online world of lies and rumor [that] grows ever more vicious,” rooting his discussion in the famous Brandeis/Warren Harvard Law Review article “that served as the foundation for most of the state laws that today protect privacy.”  (Students in my upcoming privacy law seminar should take note.)  He cites one proposal, to give people greater control over their personal information; he also cites the concern that such control could “lead[] to an almost comical set of limits.”  The article should interest anyone following the issues I’ve raised here in recent posts.

CMLP on Juicy Campus’s Demise

Citizen Media Law Project says farewell to JuicyCampus.com, wistfully noting that the site’s passing means there will be no lawsuit to test the scope of Section 230.  As the CMLP post notes, the 9th Circuit’s decision in Fair Housing Council v. Roommates.com provides an argument that JuicyCampus’s encouragement of defamatory postings made it a co-developer or co-creator of actionable content.  We’ll never know.  Internet law students from a few semesters back will recall the thinly-disguised exam question posing this very issue.  Their consensus, by a narrow margin?  My hypothetical website would lose its immunity from liability, on facts somewhat more pro-plaintiff than the real site’s.

Limits on Limited Liability?

On the other hand . . . the Internet is filled with nasty stuff.  Juicy Campus is gone–it was “launched as a cesspool, and it died because it never evolved into anything else” (source)–but it was never more than a pimple on the Internet’s hate-spewing, vicious, inane, juvenile, lowest-common-denominator underbelly.  Comments on almost every site that allows them, and unmoderated discussion threads generally, devolve with numbing rapidity into the worst of human discourse.  Two laws, the First Amendment and Section 230 of the Communications Decency Act, enable this environment: the First Amendment shields speakers from liability for all but actionable speech (primarily defamation and obscenity), and Section 230 shields the websites hosting that speech from liability for all of it, actionable or not, so long as they are not responsible, in whole or in part, for creating or developing it.

Section 230 may be vulnerable.  Parents and school administrators concerned about cyber-bullying and persons victimized by defamatory or otherwise offensive anonymous speech that enters the queue of the Internet’s permanent playlist, among others, wring their hands and wail “can’t someone do something?”  Responding to this question invariably yields bad results. If we do “something” it is usually what is easiest, what is most popular, what produces the best feeling of short-term accomplishment.  We rarely attack the problem’s root or think clearly about whether no action is better than misplaced action.

Section 230 is neither perfect nor sacrosanct, but I am wary of any attempt to limit its scope, whether it comes in through the front door or the back.  The lawsuit by two former Yale Law School students against their anonymous defamers and attackers on AutoAdmit.com may, as this article from Portfolio.com states, “forc[e] internet intermediaries to bear greater responsibility for what they carry.” The lawsuit may also be “an all-expenses-paid elitist temper tantrum . . . [that fails] to differentiate between the really wicked and some of the tamer flamers.”  The linked article by David Margolick recounts the events leading up to the lawsuit and the results so far.  This suit suffers from a weakness common in defamation suits against anonymous online posters: the posts are disgusting, shocking, and offensive, but they are opinion and not defamatory.  Even putative statements of fact (such as allegations that one of the AutoAdmit plaintiffs had herpes) can be non-actionable if the context in which they are made is “juvenile and hyperbolic.”

It is the plaintiffs’ “meager catch” that threatens website immunity:  “[t]he fact that so few prey were netted could prompt calls to modify Section 230(c), if only to give victims of internet abuse more of a chance.”

Web Liability Without Section 230

In 2006 four Italian youths used a cellphone to video their taunting of a boy with Down syndrome, and then posted the video on Google-owned YouTube.  Four Google executives, including Google’s senior vice president and chief legal officer and its former CFO, now face criminal charges of defamation and privacy violation in Italy over the posting of the video.  (The trial was to start last week but was delayed.)  European Union data protection laws give the subject of a photograph or video rights over its use.  The executives had no direct role in posting the video, of course, but there is no equivalent to the First Amendment’s protection of speech in Italy or the EU, and no equivalent to Section 230 of the Communications Decency Act limiting the liability of websites for defamatory or unlawful content created by third parties.