Privacy and Security Are Not a Zero Sum Game

Many in the security and public-policy arena argue, implicitly and often explicitly, that greater security requires that we give up some, or a lot, of our privacy. And yes, there are tradeoffs we accept: it is arguably necessary, and even a good thing, that we have to take off our shoes to board an airplane.

In Ars Technica, Jon Stokes argues that this line of thinking is patently untrue. Having the government build massive databases that gather and centralize tons of information about citizens is a terrible idea, because such a database becomes a single, high-value point of compromise. Remember, the first rule of security is that you worry most about the people inside the tent; a massive centralization effort of this kind increases the damage turncoats can do.

From “Analysis: Metcalfe’s Law + Real ID = more crime, less safety” in Ars Technica:

“We have a saying in this business: ‘Privacy and security are a zero-sum game.'” Thus spake security consultant Ed Giorgio in a widely-quoted New Yorker article on the US intelligence community’s plans to vacuum up and sift through everything that flies across the wires. But Giorgio is wrong—catastrophically wrong. The story of Fidencio Estrada, a drug runner who bribed Florida Customs agent Rafael Pacheco to (among other things) access multiple federal law enforcement databases on his behalf, suggests that when it comes to the government collecting data on innocent civilians for law enforcement purposes, privacy and security are essentially the same thing.

The factual background in the 11th Circuit Court of Appeals’ recent decision to uphold a lower court’s conviction of Estrada details how in early 2000, Pacheco accessed DHS’s billion-record Treasury Enforcement Communications System (TECS) database looking for any information that the feds had on Estrada. (Hat tip to CNET’s Declan McCullagh, whose blog post brought this story to my attention.) Pacheco also went into the FBI’s National Crime Information Center (NCIC) database in order to dig up information on the warrants that were out for Estrada’s arrest. Pacheco then fed the info back to Estrada, who was better able to elude law enforcement as he plied his narcotics trade.

Estrada and Pacheco were eventually busted, sentenced, and are currently doing time for their crimes, but their story shows exactly why the United States’ headlong rush to build government databases full of data on noncriminals (i.e., mere suspects, as with OneDOJ, and the completely innocent, as with Real ID) is such a spectacularly awful idea. All it takes is one bad apple with the right level of access, and the entire database is compromised.

With great (network) power comes great vulnerability
Here’s an ugly prediction that you can take to the bank: as the amount of data that the feds collect on innocent civilians grows, so will the number of people who are victims of crimes that were made possible by unauthorized access to a government database. I’m not just talking about identity theft, though that is a huge danger with Real ID, but violent crimes as well. As I explained in the OneDOJ post linked above, this prediction is just Metcalfe’s Law at work:

This is, of course, a fundamental problem inherent in the very nature of any massive, centralized government data-sharing plan that spans multiple agencies and connects untold numbers of state and federal law enforcement officers: the usefulness of such a system to any one individual (a white hat or a black hat) grows roughly with the square of the number of participants who are using it to share data (Metcalfe’s law). So the more white hats that any of these programs manage to connect to each other, the more useful the network as a whole will be to the small handful of black hats who gain access to it at any point.

That such databases will be “useful” to black hats means any number of things—useful for identity thieves, and useful for terrorists who seek to impersonate lawful citizens.
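The Metcalfe’s-law argument above can be made concrete with a toy calculation. This is purely an illustration of the n² scaling the article invokes; the function name and the participant counts are my own, not from the article.

```python
# Toy illustration of Metcalfe's law applied to a shared database:
# the value of the network to any single participant -- white hat or
# black hat -- grows roughly with the square of the number of
# participants sharing data on it.

def network_value(n_participants: int) -> int:
    """Metcalfe's law: relative value grows with the square of participants."""
    return n_participants ** 2

# Tenfold growth in legitimate users yields a hundredfold growth in the
# payoff available to one compromised insider.
for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} participants -> relative value {network_value(n):>12,}")
```

The point of the sketch: connecting ten times as many white hats makes the network not ten but a hundred times more valuable to the one black hat who gets in.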

While I’m citing laws and trends from the world of computing that will shortly have a direct impact on our collective ability to carry out our lives in relative safety, let me bring up two more trends worth factoring into our deteriorating privacy/security equation: the rapidly falling cost per bit of mass storage and the increasing bandwidth available on networks both public and private.

So the government wants to collect tons of detailed data on citizens in these large databases; meanwhile, the speed at which an attacker could siphon off that data is increasing, as is the frightening but real possibility that ever-larger swaths of that database can fit onto a single lost or purloined hard drive.
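A back-of-envelope sketch shows why the bandwidth trend matters here. The record size and link speed below are hypothetical round numbers I chose for illustration; neither figure comes from the article.

```python
# Rough estimate: how long would it take an attacker to siphon a large
# record database over a network link of a given speed?
# Assumed numbers (hypothetical): 1 KB per record, 100 Mbit/s link.

def exfiltration_hours(n_records: int, bytes_per_record: int,
                       link_mbps: float) -> float:
    """Hours needed to copy n_records over a link of link_mbps megabits/s."""
    total_bits = n_records * bytes_per_record * 8
    seconds = total_bits / (link_mbps * 1_000_000)
    return seconds / 3600

# A billion 1 KB records over a 100 Mbit/s link:
print(f"{exfiltration_hours(1_000_000_000, 1_000, 100):.1f} hours")  # ~22.2 hours
```

Under these assumed numbers, a billion-record database walks out the door in about a day, and every improvement in link speed shrinks that window further.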

But perhaps all this talk of government databases squeezed onto hard drives that then fall into the wrong hands is just fear-mongering, and that’s probably best left to professionals.


One comment

  1. sa

    Since we’re quoting wikipedia definitions of emerging concepts today, here’s “security through obscurity”.

    http://en.wikipedia.org/wiki/Security_through_obscurity

    “Security through obscurity” is a concept used in computers (mainly for securing large systems). The concept states that the first step towards security is obscurity (which is similar to privacy).

    Basically the principle says: “You can’t mess with something if you don’t know anything about it or if it’s hidden from you.”

    Most programmers agree that this is only a FIRST principle in computer security, and further measures are usually necessary. First, hide the doorway. Then, also lock it, in case someone DOES stumble across it.

    Applicable to what Yves is discussing.
