Lawrence Lessig, in his book Code and Other Laws of Cyberspace, makes a powerful argument. He states:
The code of cyberspace is becoming just another tool of state regulation. Indirectly, by regulating code writing, the government can achieve regulatory ends, often without suffering the political consequences that the same ends, pursued directly, would yield.
We should worry about this. We should worry about a regime that makes invisible regulation easier; we should worry about a regime that makes it easier to regulate. We should worry about the first because invisibility makes it hard to resist bad regulation; we should worry [about] the second because we don’t yet […] have a sense of the values put at risk by the increasing scope of efficient regulation.
Lessig specifically targets state regulation as a point of concern, but the reality is that it is the private sector that presents some of the biggest reasons to worry about data protection and privacy. Regulation, as Lessig also notes, consists not only of the law, but also of “social norms, the market, and architecture.” In this case, he refers to “architecture” as the actual way that something is built and designed. He states that “Changes in any one will affect the regulation of the whole. Some constraints will support others; some may undermine others. A complete view, however, should consider them together.”
Thus the private sector, which more closely relates to and influences social norms, the market, and architecture, plays an even larger role in privacy and data protection than the laws that states create to regulate it. And today, in our highly mobile and data-hungry world, information directly translates into profit. Should we be concerned about all of the information that various services, such as Facebook and email providers, possess and share? To a large degree, the answer is yes; the extent to which our personal information and private lives are intertwined with and broadcast over the internet is a theme that this course constantly circles back to. In most cases, however, some information is much more important than other information, and absolute anonymity may not actually be the most desirable outcome.
Herbert Burkert notes that mobilization and involvement build bonds within a community; in such cases, complete anonymity would be more detrimental to one’s life than a privacy breach. His main concern with privacy-enhancing technologies (PETs) is that they need more modes, so that users can pick and choose when to be anonymous and when to share; he cites the example of political privacy, which differs between casting ballots and signing petitions.
For those cases when you do want absolute security, encryption is the primary method turned to for data protection, though even this method is not foolproof. “Encryption methods generally consist of an algorithm and a key. The algorithm is a one-way function that, when invoked with the original text (or plaintext) and a key used as an input, produces the ciphertext.” Encryption comes in a variety of different methods, each changing the difficulty of sharing keys and the computing power necessary to decipher codes. The more laborious the encryption, the less vulnerable data is to brute force attacks – attacks “in which every possible decryption key is tried until one is found that works.” Naturally, most governments, particularly the United States government, do not want strong encryption tools to fall into the wrong hands, and regulations are often put in place to limit who has access to extremely powerful encryption capabilities. Indeed, encryption services would not be available to the public if not for the private sector. Richard Barth and Clint Smith write that “Only when the international financial services industry became more automated in the 1970s did it begin to incorporate strong encryption to secure payment and clearing systems.”
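The algorithm-plus-key model and the brute force attack described above can be sketched in a few lines of code. The sketch below uses a deliberately weak, hypothetical single-byte XOR cipher purely for illustration; it is not a real encryption scheme, and real ciphers such as AES have keyspaces far too large to enumerate this way, which is exactly why key length matters.

```python
def xor_cipher(data: bytes, key: int) -> bytes:
    """Toy cipher: XOR every byte with a one-byte key.
    XOR is its own inverse, so the same function encrypts and decrypts."""
    return bytes(b ^ key for b in data)

plaintext = b"attack at dawn"
secret_key = 0x5A  # one byte of key material -> only 256 possible keys
ciphertext = xor_cipher(plaintext, secret_key)

# Brute force attack: try every possible decryption key until one works.
# (Here we cheat by comparing against the known plaintext; a real attacker
# would instead check whether the output looks like readable text.)
for candidate in range(256):
    if xor_cipher(ciphertext, candidate) == plaintext:
        print(f"key recovered: {candidate:#x}")
        break
```

With a 256-key keyspace the loop finishes instantly; each extra byte of key multiplies the attacker’s work by 256, which is the intuition behind “the more laborious the encryption, the less vulnerable data is.”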
However, many of these arguments take place behind the scenes. The most obvious consumer problems arise when people are not given a choice between using a service and protecting their information. The main way that services get around this stumbling block is the “consent approach,” in which “rather than put effort into avoiding the use of personal information or into adjusting the system to specific legal requirements, [services] simply seek to get the subject’s consent for whatever they wish to do with personal information.” These terms of service are generally lengthy, obtusely written, and liable to change without notice. Most often, major uproars occur when something obvious changes in the way people use a specific service. For example, YouTube’s recent changes forcing commenters to connect their accounts to a Google+ account have incited a good deal of public ire… and stick-figure crusades.
People are often already aware of what information they would like to make public and what information they would like to keep private. Unfortunately, what most people don’t realize is just how much they’ve already shared with the world.
Lawrence Lessig, Code and Other Laws of Cyberspace (1999), 99.
Ibid., 87.
Herbert Burkert, “Privacy-enhancing Technologies: Typology, Critique, Vision,” 136.
David Phillips, “Cryptography, Secrets, and the Structuring of Trust,” 249.
Ibid., 251.
Richard Barth and Clint Smith, “International Regulation of Encryption: Technology Will Drive Policy,” in Borders in Cyberspace, ed. Brian Kahin and Charles Nesson, 283.
Burkert, “Privacy-enhancing Technologies: Typology, Critique, Vision,” 128.