Vanderwarker’s Pantheon

The Boston Globe raved today about my friend Peter Vanderwarker’s photography exhibit at the Boston Athenaeum. (Peter shot the image of the faux-Prada storefront on the postcard in my office window.) Titled “Vanderwarker’s Pantheon: Minds and Matter in Boston,” the exhibit consists of 35 large-format photographs of people and places that define Boston for Peter. Peter’s images are arresting, bold, and beautiful. If you like photography, want to learn more about Boston, want to visit the Athenaeum (a Boston Brahmin location in its own right, at 10 1/2 Beacon Street), or some combination of the above, then check out the show. It runs through May 2.

Music Industry Panel Debate

Next Tuesday, March 3, from 7:00-9:00 PM in the School of Management auditorium, I am moderating a panel discussion, titled “What Lies Ahead,” on the future of the music industry. Audience Q & A will follow the moderated discussion. For more information see the event website or the poster below. Register for the event by joining the Facebook group.

[Poster: “What Lies Ahead”]

Civility on the Upswing?

Since I’ve posted often about the poor quality of Internet discourse, readers might welcome a different point of view. “Civility comes to Net,” by Don Aucoin in the February 21 Boston Globe, says “[t]here is a quiet but growing movement to forge a truce in . . . [the] ‘arms race of name-calling’ on the Web.” He cites the development on social-networking sites like Facebook of “an informal code of conduct” in which trolls “are either ignored or told to get lost.” One possible cause of this budding civility is awareness that what one says online can affect prospects for employment and for college and graduate school admission. I have not noticed this trend, but I could be looking in the wrong places.

Regulating Intermediaries

Declan McCullagh wrote a few days ago about two bills proposed in Congress to enact a federal law that would require ISPs, wi-fi hot spots, and home users, among others, “to keep records about users for two years to aid police investigations.” The stated goal is, of course, to “keep[] our children safe.” The companion bills (one filed in the House, one in the Senate) are titled the “Internet Stopping Adults Facilitating the Exploitation of Today’s Youth Act”; in other words, the Internet Safety Act. Apparently these bills will allow Youth to continue to facilitate the exploitation of other youth.

The bills’ operative language: “A provider of an electronic communication service or remote computing service shall retain for a period of at least two years all records or other information pertaining to the identity of a user of a temporarily assigned network address the service assigns to that user.” This means I will need to keep records about house guests who go online through my wireless network, as will Starbucks, Boston University, airport wi-fi providers, and anyone else who assigns dynamic IP addresses with DHCP. I’m not set up to keep such records, and for large ISPs or wi-fi access point providers retaining the data required by these bills could be burdensome.
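To make the record-keeping burden concrete, here is a minimal sketch (my own illustration, not anything the bills prescribe) of a script that copies a wireless router’s DHCP lease assignments into a retention log. It assumes a dnsmasq-style lease file at /var/lib/misc/dnsmasq.leases and a log file named address_assignments.csv; both names are assumptions of mine, and the path and format vary from router to router.

```python
# Hypothetical sketch: append current DHCP lease assignments to a retention log.
# Assumes a dnsmasq-style lease file whose lines look like:
#   <expiry-epoch> <MAC address> <IP address> <hostname> <client-id>
# The file path, format, and log name are assumptions; real routers and ISP gear differ.
import csv
import datetime

LEASE_FILE = "/var/lib/misc/dnsmasq.leases"   # assumed location of the lease file
RETENTION_LOG = "address_assignments.csv"     # the two-year log the bills contemplate

def record_current_leases():
    now = datetime.datetime.now().isoformat(timespec="seconds")
    with open(LEASE_FILE) as leases, open(RETENTION_LOG, "a", newline="") as log:
        writer = csv.writer(log)
        for line in leases:
            fields = line.split()
            if len(fields) < 3:
                continue  # skip malformed lines
            expiry, mac, ip = fields[0], fields[1], fields[2]
            hostname = fields[3] if len(fields) > 3 else "unknown"
            writer.writerow([now, ip, mac, hostname, expiry])

if __name__ == "__main__":
    record_current_leases()  # would need to run periodically, e.g. from cron
```

Even a toy like this shows the problem: two years of rows tie a timestamp and an IP address to a device, not to the identity of the house guest who happened to be using it.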

Another brick in the we-gotta-do-something wall.

CMLP on Doninger v Niehoff

The Citizen Media Law Project is a wonderful resource. I assign CMLP materials in the Internet law course and turn to it often for news and analysis of Internet speech, copyright, and privacy issues. A post on January 29 addressed the Avery Doninger case, which I’ve followed for some time. (I learned of the case through Andy Thibault, author of Law & Justice in Everyday Life, creator of the Cool Justice column and blog, and a friend from college. I donated money to Doninger’s legal defense fund and received a heartfelt, appreciative thank-you note from her. Good manners go a long way.) As a high-school junior in Connecticut, Doninger expressed frustration in a blog post over a decision by her school’s principal and called the principal a “douchbag.” The school punished Doninger by preventing her candidacy for senior class secretary. She sued in federal court to be permitted to run for class secretary, resulting in the first of a string of related decisions by the Connecticut U.S.D.C. and the Second Circuit. Anyone interested in First Amendment speech issues generally, and speech issues in public schools in particular, should read the CMLP post and the linked decisions and documents.

About-Face

Facebook handled its latest privacy kerfuffle more adroitly than prior dust-ups. (Posts here, here, here, and here.) It started when consumer-rights blog Consumerist publicized a recent change in Facebook’s terms of service. Facebook eliminated the right of users to remove their content (the profiles they created, pictures they posted, etc.) and added a provision giving Facebook the right to retain a user’s content even after the user’s account was terminated. As Consumerist characterized the changes, “anything you upload to Facebook can be used by Facebook in any way they deem fit, forever, no matter what you do later.” That’s a scary thought, and it rightly stirred up Facebook users. At first Facebook couched the changes in terms less threatening than they were perceived to be, explaining, for instance, that after a user terminated his account the comments he had posted on another user’s wall would remain on the site, not that Facebook wanted to use pictures of students doing jelly shots until they were old enough for AARP. Then Facebook caved, reverting to the terms of use in effect before these changes. The site’s chief privacy officer “characterized the event as a misunderstanding, stemming from a clumsy attempt by the company to simplify its contract with users . . .”

This controversy goes to the heart of the latent ambiguity in Web 2.0 applications: I create the framework, you add the content, I manipulate/mine/exploit the content for my financial gain. It is a seductive trap. Users go to the site and see their profiles, their walls, their pictures, their friends, their lives online. Sites like Facebook are structures on which users hang whatever interests them, and if enough users hang interesting stuff then more users will come. They can be brilliant examples of the profound, transformative power of the Internet, the network of networks manifested as a community of communities. The users provide the material from which it is all woven together, but once that material is on the site’s servers its ownership can be murky. It’s a sure bet that most Web 2.0 terms of use give the sites rights in user-created content that do not correspond with the users’ expectations. Facebook has trampled users’ expectations before and will do so again. It’s an inevitable result of its business plan.

Pirate Bay on Trial

Torrent clearing-house Pirate Bay is on trial in Sweden for copyright theft. If convicted, its owners could be imprisoned for up to two years and fined $145,000. The plaintiffs, producers of movies, music, and video games, are also seeking over 10 million euros in damages. Pirate Bay’s defense is that it does not host any copyright-protected content on its servers and thus cannot have committed copyright theft. Echoing the U.S. Supreme Court’s holding in the Sony decision, Pirate Bay’s counsel said, “It is legal to offer a service that can be used in both a legal and illegal way.”
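Pirate Bay’s technical point is easier to see if you look at what a .torrent file actually contains. Here is a minimal sketch, assuming a torrent file saved locally under the hypothetical name example.torrent, that decodes the file’s bencoded metadata: it finds a tracker URL, a name, and SHA-1 piece hashes, but none of the underlying content.

```python
# Minimal bencode decoder: a sketch showing that a .torrent file holds only
# metadata (tracker URL, name, piece hashes), not the work it points to.
# "example.torrent" is a hypothetical file name.

def bdecode(data, i=0):
    """Decode the bencoded value starting at index i; return (value, next_index)."""
    if data[i:i+1] == b"i":                    # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i+1:end]), end + 1
    if data[i:i+1] == b"l":                    # list: l<items>e
        i, items = i + 1, []
        while data[i:i+1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if data[i:i+1] == b"d":                    # dictionary: d<key><value>...e
        i, d = i + 1, {}
        while data[i:i+1] != b"e":
            key, i = bdecode(data, i)
            d[key], i = bdecode(data, i)
        return d, i + 1
    colon = data.index(b":", i)                # byte string: <length>:<bytes>
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

with open("example.torrent", "rb") as f:
    meta, _ = bdecode(f.read())

print("tracker:", meta.get(b"announce"))
info = meta[b"info"]
print("name:", info.get(b"name"))
print("piece hashes:", len(info[b"pieces"]) // 20, "SHA-1 digests")
# The copyrighted bits themselves never appear; they move between users' machines.
```

The metadata names the work and fingerprints its pieces; whether hosting that much amounts to “assisting” infringement is, of course, the legal question the Swedish court has to answer.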

Pirate Bay has gotten the best of the case so far. On the second day of trial prosecutors dropped “all charges relating to ‘assisting copyright infringement,’” which were the most serious charges in the case. Remaining are charges of “assisting making available copyrighted content.” The infringement charges were dropped, according to the linked article in The Guardian, because prosecutors were “unable to prove in court that illegally distributed files had used The Pirate Bay site.” Huh? In other words, the prosecution brought its copyright infringement case without evidence of copyright infringement?

That’s embarrassing.  A co-defendant said the prosecutor didn’t understand the Pirate Bay technology.  Not a good day at the office.

Dealing with the evisceration of their case, the copyright owners are putting on a brave face. The article reports that the prosecution claimed “that dropping the charges . . . would simplify the case against The Pirate Bay.” That doesn’t even pass the straight-face test. Music company legal counsel said “[i]t’s a largely technical issue. It changes nothing in terms of our compensation claims and has no bearing whatsoever on the main case against The Pirate Bay. In fact it simplifies the prosecutor’s case by allowing him to focus on the main issue, which is the making available of copyrighted works.”

I know nothing about Swedish copyright law. Under U.S. law the “making available” theory is weak without proof of actual distribution of a copyrighted work, as posts here have noted in connection with the RIAA’s case against Jammie Thomas and other cases decided in the past year. We’ll see how it fares in Sweden.

ISTTF Report

The Internet Safety Technical Task Force recently released its final report, titled “Enhancing Child Safety and Online Technologies.” The Task Force prepared the report at the request of 50 state Attorneys General “to determine the extent to which today’s technologies could help to address . . . online safety risks,” including “the dangers of sexual solicitation, online harassment, and bullying, and exposure to problematic and illegal content.” The Task Force concluded that “the risks minors face online are complex and multifaceted and are in most cases not significantly different than those they face offline, and that as they get older, minors themselves contribute to some of the problems.” (Today’s class discussion about cyberbullying, involving students far closer to the problem than I am, echoed this conclusion.) Cherry-picking from the 278-page report’s Executive Summary:

  • The Internet increases the availability of harmful, problematic and illegal content, but does not always increase minors’ exposure. Unwanted exposure to pornography does occur online, but those most likely to be exposed are those seeking it out, such as older male minors.
  • Social network sites are not the most common space for solicitation and unwanted exposure to problematic content, but are frequently used in peer-to-peer harassment, most likely because they are broadly adopted by minors and are used primarily to reinforce pre-existing social relations.
  • [Minors] who are most at risk often engage in risky behaviors and have difficulties in other parts of their lives. The psychosocial makeup of and family dynamics surrounding particular minors are better predictors of risk than the use of specific media or technologies.

The Task Force reviewed numerous technologies intended to protect child safety online “including age verification and identity authentication, filtering and auditing, text analysis, and biometrics,”  a review that produced “a state of cautious optimism.” The report did “note that almost all technologies submitted present privacy and security issues that should be weighed against any potential benefits. Additionally, because some technologies carry an economic cost and some require involvement by parents and teachers, relying on them may not protect society’s most vulnerable minors.”  The Task Force “cautions against overreliance on technology in isolation or on a single technological approach.”