Schneier on Airport Security Profiling

It’s been a long time since I cited security expert Bruce Schneier, who brings rational thought and common sense to discussions dominated by fear and gut reactions. The Trouble With Airport Profiling asks, “Why do otherwise rational people think it’s a good idea to profile people at airports?” Responding to a proposal that the TSA direct its airport security efforts at “Muslims, or anyone who looks like he or she could conceivably be Muslim,” Schneier argues that such profiling would put air travelers at greater risk:

  • It is not accurate.

Post 9/11, we’ve had 2 Muslim terrorists on U.S. airplanes: the shoe bomber and the underwear bomber. If you assume 0.8% (one estimate of the percentage of Americans who are Muslim) of the 630 million annual airplane fliers are Muslim and triple it to account for others who look Semitic, then the chance that any profiled flier is a Muslim terrorist is 1 in 80 million. Add the 19 9/11 terrorists (arguably a singular event) and that number drops to 1 in 8 million. Either way, because the number of actual terrorists is so low, almost everyone selected by the profile will be innocent.

  • It is under-inclusive.

[T]o assume that only Arab-appearing people are terrorists is dangerously naive. Muslims are black, white, Asian, and everything else — most Muslims are not Arab. Recent terrorists have been European, Asian, African, Hispanic, and Middle Eastern; male and female; young and old.

  • It is too easy to avoid.

A wolf in sheep’s clothing is just a story, but humans are smart and adaptable enough to put the concept into practice.

  • It carries significant social and political costs.
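The back-of-envelope arithmetic behind the first point can be checked in a few lines. This is a sketch, not Schneier’s own calculation; in particular, the ten-year window (needed to make the 1-in-80-million figure come out) is my assumption and is not stated in the excerpt above:

```python
# Back-of-envelope check of the profiling odds quoted above.
# Assumption (not in the excerpt): the two terrorists are spread over
# roughly a decade of post-9/11 travel at 630 million fliers per year.

annual_fliers = 630_000_000
years = 10                                # assumed window
muslim_fraction = 0.008                   # 0.8% estimate of Muslim Americans
profile_fraction = muslim_fraction * 3    # tripled for "looks Semitic"

# Number of fliers the profile would flag over the window.
profiled = annual_fliers * years * profile_fraction

odds_two = profiled / 2    # shoe bomber + underwear bomber
odds_21 = profiled / 21    # adding the 19 9/11 hijackers

print(f"1 in {odds_two:,.0f}")   # about 1 in 76 million; Schneier rounds to 80
print(f"1 in {odds_21:,.0f}")    # about 1 in 7 million; rounds to 8
```

Whatever the exact window, the conclusion is robust: the denominator is tens of millions, so nearly every flier the profile selects is innocent.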

Schneier on Security in 2020

Security in 2020 is a fascinating, provocative post from security expert Bruce Schneier’s latest newsletter.  He briefly surveys the current focus of IT security (each concept is captured in what he acknowledges is an invented, “ugly” word): deperimeterization, the “dissolution of the strict boundaries between the internal and external network”; consumerization, “consumers get the cool new gadgets first, and demand to do their work on them”; and decentralization, cloud computing.  Then he projects developing trends: deconcentration, the “general-purpose computer is dying and being replaced by special-purpose devices”; decustomerization, “we get more of our IT functionality without any business relationship”; and depersonization, “computing that removes the user, either partially or entirely.”  Get past the IT-professional jargon; each term nails a distinct trend.

Discussing the delivery of IT services without fee-based relationships, he says:

We’re not Google’s customers; we’re Google’s product that they sell to their customers. It’s a three-way relationship: us, the IT service provider, and the advertiser or data buyer. And as these noncustomer IT relationships proliferate, we’ll see more IT companies treating us as products. If I buy a Dell computer, then I’m obviously a Dell customer; but if I get a Dell computer for free in exchange for access to my life, it’s much less obvious whom I’m entering a business relationship with. Facebook’s continual ratcheting down of user privacy in order to satisfy its actual customers — the advertisers — and enhance its revenue is just a hint of what’s to come.

With respect to “computing that removes the user” (things talking to things), he says:

The “Internet of things” won’t need you to communicate. The smart appliances in your smart home will talk directly to the power company. Your smart car will talk to road sensors and, eventually, other cars . . . The ramifications of this are hard to imagine . . . But certainly smart objects will be talking about you, and you probably won’t have much control over what they’re saying.

Schneier’s summation:

One old trend: deperimeterization. Two current trends: consumerization and decentralization. Three future trends: deconcentration, decustomerization, and depersonization. That’s IT in 2020 — it’s not under your control, it’s doing things without your knowledge and consent, and it’s not necessarily acting in your best interests.

Worth reading for anyone interested in how technology shapes our lives, especially Internet law students.

Toxic Data

Tomorrow I start teaching a half-semester seminar on privacy law in the honors program.  Here’s Bruce Schneier with a timely piece about data, “the natural by-product of every computer-mediated interaction.  It stays around forever, unless it’s disposed of.  It is valuable when reused, but it must be done carefully.  Otherwise, its after-effects are toxic.”  Schneier warns that future generations will look back on our heedless treatment of data the way we look back with dismay on our forebears’ cavalier treatment of industrial pollution.

Schneier on Impersonation

Bruce Schneier wrote a terrific piece about impersonation and identity authentication. He uses the various “physical tokens” we carry in our wallets to argue that “[d]ecentralized authentication systems work better than centralized ones”: loss or compromise of your credit card or health-club membership card does not compromise your driver’s license or library card. He concludes that “[t]his is one of the reasons that centralized systems like REAL-ID make us less secure.”

Security Choice

Bruce Schneier wrote recently about airport security after a screener seized a 6-oz jar of pasta sauce from his luggage:  “the official confiscated it, because allowing it on the airplane with me would have been too dangerous. And to demonstrate how dangerous he really thought that jar was, he blithely tossed it in a nearby bin of similar liquid bottles and sent me on my way.”  He goes on to discuss “the two classes of contraband at airport security checkpoints: the class that will get you in trouble if you try to bring it on an airplane, and the class that will cheerily be taken away from you if you try to bring it on an airplane.”  Airport security need not catch all of the former as long as the risk and consequences of detection are enough to deter one from attempting to bring them aboard.  That’s not true of the latter type of contraband:  “[b]ecause there are no consequences to trying and failing, the screeners have to be 100 percent effective. Even if they slip up one in a hundred times, the plot can succeed.”  He concludes that airport security should choose:  “[i]f something is dangerous, treat it as dangerous and treat anyone who tries to bring it on as potentially dangerous. If it’s not dangerous, then stop trying to keep it off airplanes.”
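The asymmetry Schneier describes can be made concrete: when failed attempts carry no consequences, a would-be smuggler can simply keep trying, and even a near-perfect checkpoint is eventually beaten. A minimal sketch of that reasoning (the retry framing and the attempt counts are illustrative assumptions; only the 1-in-100 slip rate comes from Schneier’s example):

```python
# If there is no penalty for a failed attempt, the smuggler's odds of
# eventually getting through grow with every retry.
slip_rate = 0.01  # screeners miss 1 in 100 attempts (Schneier's figure)

def success_prob(attempts: int, p: float = slip_rate) -> float:
    """Chance that at least one of `attempts` independent tries slips through."""
    return 1 - (1 - p) ** attempts

print(success_prob(1))    # about 0.01
print(success_prob(100))  # about 0.63
print(success_prob(500))  # about 0.99
```

This is why, for contraband whose attempted smuggling carries no consequence, screening must be essentially 100 percent effective, while for the prosecutable class a far lower catch rate still deters.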


Here’s a companion piece to the Schneier article from The Atlantic: The Things He Carried.

Don’t Enjoin the Messenger

Two weeks ago three students from MIT appeared at DEFCON in Las Vegas to present their successful hack of the Massachusetts Bay Transportation Authority’s electronic fare system–the “CharlieCard.” The MBTA went to federal court to enjoin publication of the students’ presentation, claiming it would violate the Computer Fraud and Abuse Act. The court granted the injunction on August 9, only to lift it yesterday, ruling that the MBTA was not likely to succeed on its CFAA claim. Follow the story’s arc here, here, here, and here–and then read Bruce Schneier’s timely (8/7) essay from The Guardian. Schneier’s piece discusses the successful hack of the London subway’s Oyster smartcard by students from the Netherlands. The Oyster card’s maker, NXP Semiconductors, sued to prevent publication of the hack; it lost. The Oyster card uses the same chip–the “Mifare Classic”–used by Boston and other transit systems. Schneier writes “[t]he security of Mifare Classic is terrible . . . it’s kindergarten cryptography. Anyone with any security experience would be embarrassed to put his name to the design. NXP attempted to deal with this embarrassment by keeping the design secret.” In ruling against NXP the Dutch court said “[d]amage to NXP is not the result of the publication of the article but of the production and sale of a chip that appears to have shortcomings.” (Emphasis supplied)

These two cases follow a familiar pattern: Company A does a crap job designing or delivering a good or service to Company B; someone blows the whistle on Company A’s mis- or malfeasance; Company B blames the whistleblower for leaking news of the flaw instead of blaming Company A for its lousy performance. Here the Dutch court got it right, and the U.S. court is heading in the right direction.

Privacy and Security

A story in yesterday’s Wall Street Journal titled NSA’s Domestic Spying Grows as Agency Sweeps Up Data (subscription required) reports that:

According to current and former intelligence officials, the spy agency now monitors huge volumes of records of domestic emails and Internet searches as well as bank transfers, credit-card transactions, travel and telephone records. The NSA receives this so-called “transactional” data from other agencies or private companies, and its sophisticated software programs analyze the various transactions for suspicious patterns. Then they spit out leads to be explored by counterterrorism programs across the U.S. government, such as the NSA’s own Terrorist Surveillance Program, formed to intercept phone calls and emails between the U.S. and overseas without a judge’s approval when a link to al Qaeda is suspected.

The NSA’s enterprise involves a cluster of powerful intelligence-gathering programs, all of which sparked civil-liberties complaints when they came to light. They include a Federal Bureau of Investigation program to track telecommunications data once known as Carnivore, now called the Digital Collection System, and a U.S. arrangement with the world’s main international banking clearinghouse to track money movements.

The effort also ties into data from an ad-hoc collection of so-called “black programs” whose existence is undisclosed, the current and former officials say. Many of the programs in various agencies began years before the 9/11 attacks but have since been given greater reach. Among them, current and former intelligence officials say, is a longstanding Treasury Department program to collect individual financial data including wire transfers and credit-card transactions.

An NSA spokeswoman stated that the Agency “strictly follows laws and regulations designed to preserve every American’s privacy rights under the Fourth Amendment to the U.S. Constitution.” If you find comfort in that statement, consider this description of how the Agency uses its expanded domestic surveillance authority to pursue leads:

If a person suspected of terrorist connections is believed to be in a U.S. city — for instance, Detroit, a community with a high concentration of Muslim Americans –the government’s spy systems may be directed to collect and analyze all electronic communications into and out of the city. The haul can include records of phone calls, email headers and destinations, data on financial transactions and records of Internet browsing. The system also would collect information about other people, including those in the U.S., who communicated with people in Detroit.

The information collected “doesn’t generally include the contents of conversations or emails.” Generally. That’s a word we lawyers use to say “most of the time we don’t, unless we do.” Even without such content the NSA can identify the parties to phone calls and emails, their locations, and their cell phone numbers. The telecoms enable the NSA’s efforts either by copying all data through their switches to share with the NSA, or by ceding control of the switches to the NSA. The White House is pushing a bill that would immunize the telecoms from liability for privacy claims arising from this data collection. The NSA domestic surveillance program includes elements of and technology from the Pentagon’s Total Information Awareness initiative, which Congress defunded in 2003 following criticism of TIA’s potential for civil rights abuses. Before it was killed the Pentagon renamed TIA “Terrorism Information Awareness” to make it seem less creepy. Now the NSA is implementing TIA through its “black budget,” beyond effective non-NSA scrutiny.

The Journal story reminded me of a recent Wired column by the always-prescient Bruce Schneier: What Our Top Spy Doesn’t Get: Security and Privacy Aren’t Opposites. Schneier’s column focuses on a proposal from National Intelligence Director Michael McConnell to monitor all (“that’s right, all”) Internet communications:

In order for cyberspace to be policed, internet activity will have to be closely monitored. Ed Giorgio, who is working with McConnell on the plan, said that would mean giving the government the authority to examine the content of any e-mail, file transfer or Web search. “Google has records that could help in a cyber-investigation,” he said. Giorgio warned me, “We have a saying in this business: ‘Privacy and security are a zero-sum game.'”

This states it as baldly as one can. This administration’s top intelligence personnel consider every increase in security to require a corresponding decrease in privacy. As Schneier states, “I’m sure they have that saying in their business. And it’s precisely why, when people in their business are in charge of government, it becomes a police state.” Schneier says privacy versus security is a false dichotomy; the true dichotomy is between liberty and control, and “liberty requires both security and privacy.”

Schneier on Irrational Responses

Bruce Schneier’s sensible observations on security are always worth reading. Sometimes his observations resonate more deeply, such as this commentary in Wired: Virginia Tech Lesson: Rare Risks Breed Irrational Responses. After the Virginia Tech shootings I wrote in Sense and Senselessness about the urge to “do something” after horrific events and how pro- and anti-gun-control advocates both seized on these shootings to promote their respective agendas. Schneier makes the same points in a pithy and clear-eyed overview of this phenomenon, coining this formula: “Novelty plus dread equals overreaction.”