Why Privacy and Freedom Can Sometimes Be Opposed

I was recently listening to an interview with Ann Cavoukian on Singularity 1 on 1, in which she began by claiming that privacy and freedom are fundamentally aligned. This may have been true historically. But looking forward, I suspect privacy and freedom are actually opposed. I know that may seem counterintuitive, so let me explain.

First of all, when talking about privacy, we can’t just focus on government vs. the individual. This is the old paradigm and it is changing. The tools of surveillance are rapidly being democratized. This might seem to be a strange point of view given the massive mountains of data currently controlled by just a few gatekeepers such as Google, Facebook, and yes, the US government. But any description of right now is inherently fleeting given the rapid pace of technological change. And data is multiplying so rapidly that we will all soon be sitting on massive mountains of data. We will all be sensing, recording, and storing everything we come into contact with. For this reason, we need to focus on the implications of individuals spying on other individuals. Ideally we should strive to have one privacy policy that applies to everyone equally, whether that person is a member of the government or not. And we should expect (and hope) that individuals will aggressively spy on their own government officials. After all, government secrecy is just the flip-side of individual privacy, and both are threatened by new technologies.

Second, privacy as an abstract concept is best represented by the image of a wall. Privacy is boundaries, borders, and lines of demarcation that say you can’t look here, listen here, or go here. Privacy tells us what we can’t do. Privacy is in many ways the opposite of freedom. As the tools of surveillance get democratized, one response we might have is to institute what Ann Cavoukian calls “privacy by design.” This implies embedding privacy controls into the information infrastructure itself, which presumably means lots of rules about what individuals are not allowed to do. To me, such a program represents a potential threat to freedom. Because the question one has to ask is: who writes these rules and enforces them? Who therefore reserves the power to evade them? The likely answer to both questions is the large tech companies who build the privacy controls, and the governments that coerce those companies into cooperating. Thus “privacy by design” is the surest way to preserve the status quo. Today we already have a large asymmetry when it comes to surveillance technologies. If we want to further institutionalize this asymmetry, then by all means we should get to work on centralized privacy controls. However, if we want a maximally free and equal society, we may need to abandon the idea of privacy controls entirely and push for a “sousveillance” scenario where everyone has equal ability to surveil everyone else.

Let’s make this more concrete with an example. Consider your face. Your face is a dead giveaway as to who you are. You carry it with you everywhere you go. If you are a fan of privacy, you probably don’t want people to know all of the places you go. But if you go anywhere where there are other people, it is quite possible that those people will record your face. Now here’s the question. It’s your face. Do other people have a right to record it, copy it, and share it without your permission? By default, they certainly have that ability. But maybe they shouldn’t. Maybe we all should have special veto power over those who might record our faces. Maybe we all should be able to go into a special preferences window and set “facial privacy” to “on” and thereby automatically scramble any recordings taken of us by other people. Maybe this would be a nice example of “privacy by design.”

But how on earth would one enforce such a scheme? How does another person’s camera recognize the privacy settings you’ve chosen for your own face? We would need the other person’s camera and your face to somehow communicate with each other. Which means we need some kind of unified privacy standard. But that’s not good enough, because what if the other person doesn’t want to adopt that standard? Well then we have to make him adopt that standard. Essentially we’d have to mandate that all devices honor certain privacy features. And as a corollary, we’d have to make it illegal to alter your own device’s factory settings, since we can’t have people using hacks to get around the privacy controls. Protecting privacy rapidly introduces all the same thorny issues that we run into in the intellectual property debates. And at the end of it all, what have we accomplished? Sure, we’ve made it a bit easier for one person to hide his face. But what about the other person’s rights? What about the right to record what you see with your own eyes in an unscrambled fashion? And what about the fact that governments and hackers are just going to breeze right past these controls anyway? We haven’t really protected anyone’s privacy, so much as just made it a bit more bureaucratic and complicated to take a picture of someone else.
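To see why enforcement is so awkward, it helps to sketch what a "compliant" camera would actually have to do. The sketch below is entirely hypothetical: it assumes a shared opt-out registry of face "signatures" (computing a stable signature is itself a hard recognition problem) and a device that obediently scrambles matches, which is precisely the unified standard questioned above.

```python
import hashlib

# Hypothetical shared registry of people who set "facial privacy" to "on".
# Every camera in the world would need access to the same list.
OPT_OUT_REGISTRY = {"alice-face-signature"}

def scramble(frame_region: str) -> str:
    """Stand-in for blurring: replace the region with an opaque hash."""
    return hashlib.sha256(frame_region.encode()).hexdigest()[:12]

def process_frame(detected_faces: dict[str, str]) -> dict[str, str]:
    """Apply 'privacy by design' to one frame.

    detected_faces maps a face signature to its raw pixel region (here,
    just a placeholder string). Faces whose owners opted out get
    scrambled. A device with altered firmware could simply skip this
    step -- which is why the scheme ends in mandates against modifying
    your own hardware.
    """
    return {
        sig: scramble(region) if sig in OPT_OUT_REGISTRY else region
        for sig, region in detected_faces.items()
    }

frame = {
    "alice-face-signature": "<alice pixels>",
    "bob-face-signature": "<bob pixels>",
}
result = process_frame(frame)
```

Note that the privacy guarantee lives entirely in the goodwill of the device maker: nothing in the data itself prevents an unmodified copy from being kept before `process_frame` runs.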

Now I’m not saying we necessarily have to completely abandon all privacy. But we do have to realize that protecting privacy is a balancing act. Every privacy control we enact is a new wall we’ve built. And when it comes to the information infrastructure, we should build walls with great care.

How Government Surveillance is Like Piracy

Many civil libertarians are up in arms about the NSA snooping revelations. And there are serious issues with the secrecy and oversight elements, which I’m going to set aside here. But the fact that they are snooping doesn’t surprise me and, in itself, doesn’t bother me. I see privacy as a dead issue. Like my co-blogger Jon Perry and many other thinkers, I believe we must fight to allow citizen “sousveillance” and to protect due process, rather than chasing after technically infeasible privacy.

But there’s a way the NSA debate is like the piracy debate. The problem with a file sharer isn’t that he or she copied, but that the copy was done without permission. The NSA can be characterized as doing the same thing: copying data without permission. In both cases, a fundamental quality of digital technology — frictionless, nonrivalrous copying — enables the behavior. In both cases, the authority to grant permission is the key issue.

A pirate uploads a movie without authorization from the studio; the NSA downloads an email (OK, all the emails) without authorization from the user.

In both cases, the real-world analogues for which we have established law are not adequate. It is not quite correct to say that downloading a file is ‘stealing’ in the traditional sense of that word (whatever the moral equivalents might be, there is a physical difference between stealing something rivalrous and copying something nonrivalrous, and it is hardly trivial). It is not quite adequate to say that the Fourth Amendment protects us from unreasonable ‘search and seizure’ when one is talking about data. Data can be searched and copied without being seized or stolen in the physical sense of those terms. What protection are we afforded from seizure-less search? What about theft that robs someone only of their product’s artificial scarcity, not of any physical good?

If Everyone Has Something to Hide, Then It’s Not Surveillance that is the Problem

Alex Tabarrok at Marginal Revolution recently wrote a post called No One is Innocent:

“I broke the law yesterday and again today and I will probably break the law tomorrow. Don’t mistake me, I have done nothing wrong. I don’t even know what laws I have broken. Nevertheless, I am reasonably confident that I have broken some laws, rules, or regulations recently because it’s hard for anyone to live today without breaking the law. Doubt me? Have you ever thrown out some junk mail that came to your house but was addressed to someone else? That’s a violation of federal law punishable by up to 5 years in prison…

“One of the responses to the revelations about the mass spying on Americans by the NSA and other agencies is “I have nothing to hide. What me worry?” I tweeted in response “If you have nothing to hide, you live a boring life.” More fundamentally, the NSA spying machine has reduced the cost of evidence so that today our freedom–or our independence–is to a large extent at the discretion of those in control of the panopticon…”

All good points. Government surveillance now has the ability to find dirt on everyone. However, it is not necessarily surveillance that is the problem in this scenario. Rather, isn’t it bad laws that are at fault? If we are all by definition criminals, something is wrong with our legal structure. Surveillance just exposes what has always been a big problem. As we move into a world with less privacy, we are going to need fewer and more lenient laws, or else society will grind to a halt.

Imagine every person who used illegal drugs, broke a traffic rule, or violated copyright was immediately caught and punished. I’m guessing that in a matter of days at least half the American public would end up on the wrong side of the law. That’s because these laws are poorly designed. They always have been.

The same principle holds true when talking about cultural norms. If surveillance technologies are used to out a closeted homosexual against his will, then what is to blame? Is it the surveillance technologies? Or is it the screwed up culture that demonizes gays and forces them to hide in the first place?

I believe that more than anything else, a society with less privacy is going to have to become more relaxed. Most likely we’ll end up more tolerant of drug use, atypical sexual behavior, and minor rule infractions. And in many ways that might be a very good thing.

“Now With Enhanced Privacy!”

In a previous article, I mentioned how privacy as a commodity will only increase in value. This is because in a surveillance-heavy future, privacy will become more scarce. Therefore, we can expect new products to arise and fulfill this market need. Increasingly, products will advertise their privacy-enhancing features (whether or not these privacy enhancing features actually work). I see inklings of this trend already in mass market products like “Snapchat” which turn self-destructing data into a feature. Likewise, when Google+ first appeared on the scene, it attempted to distinguish itself from Facebook with its privacy-enhancing “circles.” And now that Facebook and Google appear to have been compromised by the NSA’s Prism program, the door is open for a new social network to step up that claims to better protect us from government eyes (again, whether or not it actually can). This principle applies offline as well. In the near future, we can expect bars and other businesses that institute “no-surveillance” policies as part of the way they attract clientele.

Ironically, Corporate Control Might Be the Only Way to Save Privacy

The new “Stop the Cyborgs” site fears “a future in which privacy is impossible and corporate control total.” But actually these two outcomes are probably mutually exclusive. Corporate control in the form of a single unified operating system would make enforcing privacy actually feasible. For example, if Google controlled the OS for all smart glasses, then it would have the ability to enact privacy controls that could automatically blur the faces of people who don’t want to be recorded. But if there is vibrant competition among operating systems, and lots of viable open source alternatives, then the notion that any such controls could ever be enforced goes out the window. Personally, if I had to choose between no privacy and corporate control, I would choose no privacy.

Yes, Privacy is Dying But That Doesn’t Have to Mean Corporate Control

I do like this cute sticker. And I think there will be a legitimate market in the future for "null spaces": bars and other businesses that are branded as surveillance free.

I’ve been expecting a backlash against the coming end of privacy, and now with the imminent arrival of Google Glass it may finally be manifesting. The site “Stop the Cyborgs”, in addition to making cute stickers reading “No Surveillance Devices” and “Google Glass is Banned on these Premises”, is arguing against what it calls “a future in which privacy is impossible and corporate control total.” I actually agree with the first part of their prediction. Privacy is on its way out, and we should get used to that fact. I’m not so sure that’s a bad thing. It might just keep us more honest, in a positive way.

As for the second part of their prediction, total corporate control… that is far from an inevitable consequence. It only becomes true if we are all locked into the same operating system, and if, furthermore, said operating system decides to behave in a controlling way toward its customers. To me this is an argument against monopoly and in favor of competition (especially from open source!), not against Glass or other surveillance technologies.

Can We Really Expect Privacy Controls in a Transparent Society?

Hank Pellissier of IEET recently posted an article entitled “100% Honesty, Transparency, Disclosure – is this the ‘naked future’ we want?” In the article, Pellissier describes the most extreme version of a transparent future: a world where you walk into a party and literally everyone knows everyone else’s thoughts.

It’s a fun article and an interesting thought experiment, but as you read through the text it becomes apparent that Pellissier is not really talking about a transparent future. In fact, his described utopia involves extremely robust privacy protections. The article makes several references to the idea of private and public mind files, implying that as an individual you still get to be the final arbiter of who does and doesn’t have access to your information. At one point he describes the level of sharing that would be necessary in choosing a marriage partner.

“Marriage partner? Private files that are usually off-limits are opened to peruse priorities like ‘long-term loyalty,’ ‘patience,’ ‘interest trends,’ and ‘annoying habits.’”

Now this is a speculative world, and I’m not sure how these “mind files” are supposed to work. But a key feature of files is that they are easy to copy. Open your files to someone once, and those files are now out of your control. People can potentially copy and reshare that data at will.

But more importantly, Pellissier ignores how much will be inferable about us from our external behavior. A computer does not need to read your mind to determine your personality traits. If we imagine a world rife with sensors and information sharing, then there will be a wealth of data available on all of us. And you can bet that data will be parseable in such a way that any “annoying habits” of mine can be inferred whether or not I voluntarily open up my private mind files.

At one point, Pellissier describes a discussion with his daughter:

“When I proposed my 100% transparency utopia to my family, my 12-year-old daughter rebelled. “We’d be robbed!” she exclaimed. “Bad guys would know our address and where we hide the key!” No, I explained. Mind-sharing would contain options, with public or private settings for different data, like Facebook. Everyone could be as secretive as they wished.  Shy, paranoid, and mystery-loving people could mingle together, laboriously extracting information from each other in old-fashioned Luddite ways.”

If this were my daughter, my response would have been different. First I would have explained that our address (and quite possibly the location of our key) would already be readily available to anyone interested in doing us harm. For this thought experiment to make any sense, we have to picture a sensor-rich, camera-heavy, highly networked world. In such a future how can you possibly expect to hide the location of the property that you return to every single night? You don’t think any cameras or GPS devices are going to capture you doing so? I’m pretty sure all it takes for a bad guy to find someone’s address now is a little bit of light googling. And that’s today.

Fortunately, I would explain to my daughter, if these bad guys do decide to rob us, their crime will be fully recorded and traceable to them, for the exact same reasons listed above. So most likely the bad guys won’t bother us, given that they face a near certainty of being caught.

It sounds nice to imagine a future with robust privacy settings, where we all can dictate what is and isn’t private. But deep down I really don’t see how that can ever be viable. To achieve this would require a locked-down future where we are all running the same operating system. You need a unified system or else you can’t enforce any of these supposed privacy controls. And once we have a unified system, we are at the mercy of the programmers and how they decide to handle the inevitable conflicts of interest that will arise.

We can learn a lot by looking at the modern day intellectual property debacle. This chapter from Free Culture describes a documentary filmmaker who accidentally captured a few seconds of The Simpsons playing on a TV in the background of a shot. Fox ended up demanding 10,000 dollars in payment for use of the copyrighted material. In this moment, a simple act of documenting the world somehow crossed over and became infringement.

Now imagine I am in a bar with friends. I glance across the room and happen to witness a gay couple talking and laughing. All I do is glance for a second, but that’s enough time for my glasses to record and store their presence. I am recording the whole night at the bar because it is a special night, my last evening in Los Angeles before a long trip. Later that evening, I upload part of the video so I can share a funny thing someone said to me.

It just so happens that contained within the clip I upload are the faces of this gay couple at the bar. Modern face recognition means their faces can be tied to their real identities. The result: I may have just unintentionally outed these people.

We can think of this video clip as a “mind file.” Who owns it? Do I own it? After all the experience happened to me. But at the same time, the clip contains potentially sensitive information belonging to someone else. Specifically data about their sexual orientation and activities on a given night.

In a truly transparent society, the answer is simple: Tough luck for the gay couple at the bar. They are in a public place; they should not be expecting privacy. Instead of trying to protect their secrecy, we should be evolving as a society to the point where sexual orientation is a non-issue.

But in a privacy-controlled, pseudo-transparent society, like the one Pellissier describes, the answer is not so clear. Can the gay couple send me a takedown notice? Are the faces of all the people in the background of my home videos automatically going to be blurred? There is not necessarily an elegant answer.

In this way, truly protecting your privacy may require giving you veto power over what other people choose to do with their own recorded memories. Such veto power sounds to me like the biggest privacy invasion of all.

Jamais Cascio: “Opacity is the New Oil”

Jamais Cascio writes:

“In other words, information and data aren’t scarce, they’re increasing rapidly and dramatically.

“But a related phenomenon is scarce, is declining in availability and increasing in value: opacity. Being hidden. Privacy.

“Information isn’t the new oil; opacity is the new oil. The ability to be opaque — the opposite of transparent — is increasingly rare, valuable, and in many cases worth fighting for. It’s also potentially quite dangerous, often dirty, and can be a catalyst for trouble. In short, it’s just like oil. (Which makes me wonder when we’ll have a new OPEC — Organization of Privacy Enabling Companies.)” (link)

As I’ve written before, our economic system is fundamentally based on the sale of scarce commodities. So if our current economic model is going to continue into the future, we are going to have to find new scarce resources to monetize. Privacy is just such a resource. Technological trends are poised to increase rather than decrease the scarcity of privacy. Therefore I will not be surprised if privacy is one of the major commodities people will be willing to pay for in the future.

Defending Against Future Spambots and the Erosion of Privacy

When the internet was first becoming widely adopted, spam seemed like it might become a big problem. Luckily spam filters got a lot better, and today I encounter very few issues.

But for a spam or bot filter to work, it has to be able to reliably tell the difference between a human and a non-human. And since we can expect that bots will get progressively more human-like with time, I wonder if certain filters are going to become over-taxed.

A recent article in Wired points out that we may finally be approaching a time when a chatterbot could pass a Turing Test. The article argues that we are going to have such a vast amount of data to draw upon that bots might be able to answer previously unanswerable questions.

“Suppose, for a moment, that all the words you have ever spoken, heard, written, or read, as well as all the visual scenes and all the sounds you have ever experienced, were recorded and accessible, along with similar data for hundreds of thousands, even millions, of other people. Ultimately, tactile and olfactory sensors could also be added to complete this record of sensory experience over time,” wrote French in Science, with a nod to MIT researcher Deb Roy’s recordings of 200,000 hours of his infant son’s waking development.

“He continued, “Assume also that the software exists to catalog, analyze, correlate, and cross-link everything in this sea of data. These data and the capacity to analyze them appropriately could allow a machine to answer heretofore computer-unanswerable questions” and even pass a Turing test.”

Keep in mind a spam bot does not need to pass a Turing Test wholesale to become a nuisance. It just has to be good enough to slip through filters.

The obvious solution is to employ more advanced filters. However, filters can become a nuisance themselves. I still have important messages routed to my spam folder with some frequency. And sometimes it takes me an embarrassing three tries to pass a CAPTCHA test.

More importantly, at a certain point there is a limit to what filters can do to weed out bots based on behavior alone. A functional equivalence between humans and bots means there are no salient differences for the filter to identify.

Again, there is a clear solution: applications will increasingly need to rely on proprietary or public “lists of trusted humans.” This is nothing new. On Facebook, for example, everyone has been vouched for as “real” by the system. But there are plenty of other, more anonymous places on the web where such verification systems are not in place, and that is a large part of their charm. I suspect a trend toward bot-human equivalence will further endanger such havens of anonymity. Better bots are likely to be one more reason why privacy as we know it will have to vanish.
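As a rough illustration (the registry, addresses, and messages below are all invented for the example), such a filter stops judging a message’s content, which a good enough bot can fake, and instead asks only whether the sender has been vouched for as a real person:

```python
# Hypothetical "list of trusted humans" filter. Instead of behavioral
# analysis, delivery depends entirely on whether the sender's identity
# appears on a verified-human registry.
VERIFIED_HUMANS = {"jon@example.com", "ted@example.com"}

def filter_messages(inbox: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split messages into (delivered, quarantined) by sender identity."""
    delivered, quarantined = [], []
    for msg in inbox:
        target = delivered if msg["sender"] in VERIFIED_HUMANS else quarantined
        target.append(msg)
    return delivered, quarantined

inbox = [
    {"sender": "jon@example.com", "body": "Lunch tomorrow?"},
    {"sender": "bot123@example.net", "body": "You have won a prize!"},
]
delivered, quarantined = filter_messages(inbox)
```

The cost is exactly the one noted above: under this scheme a sender is either identified and verified or treated as a bot, so anonymous participation disappears.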

Three Links on The End of Privacy

1. Five Ways Google Glasses Could Change Our World

“Out on a Saturday night acting drunk and a bit foolish? Or maybe stone cold sober on a Monday afternoon and accidentally (and embarrassingly) trip over in spectacular style? Someone’s AR glasses could record the moment, use facial recognition to identify you and publicly shame you by sharing it online, tagged to your Twitter account so that the whole world could laugh at your misfortune and know exactly who you were.”

Link via Mark Lewis.

2. Thomas Frey on Rewriting Our Social Norms

“We can understand this better if we look at the economics of perversion. If only one person has photos of a naked celebrity, it becomes a valuable commodity. If you can find the same photos on 10,000 different websites, the value of that photo approaches zero, and our thinking about perversion suddenly shifts along with the proliferation. Nakedness becomes the new norm.”

3. VR Integration Requires Total Transparency

“Think about how many cameras and sensors it’s going to take to make our environments “aware” of itself and us, so it can enable such VR interactivity.

“Then think about being able to walk onto an airplane without having to pass security, because no one with explosives would be able to get within ten miles of the airport. Think about being able to walk down the darkest alleyway in NYC in perfect safety, because there are no more muggers, because everyone knows that it’s impossible to escape arrest if you try…  Think about a million other uses for VR that we will demand, and the endless other potentials made possible by a self aware environment.

“Think about it, and maybe you’ll understand why I laugh at those who continue to believe that we will never become a ‘Transparent Society.’”