The new “Stop the Cyborgs” site fears “a future in which privacy is impossible and corporate control total.” But these two outcomes are probably mutually exclusive. Corporate control in the form of a single unified operating system is precisely what would make enforcing privacy feasible. If Google controlled the OS for all smart glasses, for example, it could enact privacy controls that automatically blur the faces of people who don’t want to be recorded. But if there is vibrant competition among operating systems, along with viable open source alternatives, then the notion that any such controls could ever be enforced goes out the window. Personally, if I had to choose between no privacy and corporate control, I would choose no privacy.
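To make the face-blurring idea concrete, here is a minimal sketch of the enforcement step only. It assumes some other component (a real face detector, which I omit) has already produced bounding boxes for people who opted out; the "image" is just a 2D grid of grayscale values, and all names here are my own invention, not any real API.

```python
# Sketch: given a frame and the bounding boxes of people who opted out
# of being recorded, flatten each box to its average intensity.
# Detection is assumed to happen elsewhere; the image is a plain 2D
# grid of grayscale ints, not a real video frame.

def blur_regions(image, boxes):
    """Return a copy of `image` with each (x, y, w, h) box averaged out."""
    result = [row[:] for row in image]
    for x, y, w, h in boxes:
        pixels = [image[r][c] for r in range(y, y + h) for c in range(x, x + w)]
        avg = sum(pixels) // len(pixels)
        for r in range(y, y + h):
            for c in range(x, x + w):
                result[r][c] = avg
    return result

# A toy 4x6 "frame" with one opted-out region at (1, 1), 2 pixels square.
frame = [[10 * r + c for c in range(6)] for r in range(4)]
blurred = blur_regions(frame, [(1, 1, 2, 2)])
```

The point of the sketch is the architectural one from the paragraph above: this only works if the OS layer everyone runs actually calls something like `blur_regions` before frames are stored or shared, which is exactly the kind of guarantee that evaporates without a single controlling platform.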
I’ve been expecting a backlash against the coming end of privacy, and with the imminent arrival of Google Glass it may finally be manifesting. The site “Stop the Cyborgs”, in addition to making cute stickers reading “No Surveillance Devices” and “Google Glass is Banned on these Premises”, argues against what it calls “a future in which privacy is impossible and corporate control total.” I actually agree with the first part of their prediction. Privacy is on its way out, and we should get used to that fact. I’m not so sure that’s a bad thing. It might just keep us more honest, in a positive way.
As for the second part of their prediction, total corporate control: that does not follow as a consequence. It only becomes true if we are all locked into the same operating system, and if that operating system furthermore decides to behave in a controlling way toward its customers. To me this is an argument against monopoly and in favor of competition (especially from open source!), not against Glass or other surveillance technologies.
Hank Pellissier of IEET recently posted an article entitled “100% Honesty, Transparency, Disclosure – is this the ‘naked future’ we want?” In the article, Pellissier describes the most extreme version of a transparent future: a world where you walk into a party and literally everyone knows everyone else’s thoughts.
It’s a fun article and an interesting thought experiment, but as you read through the text it becomes apparent that Pellissier is not really talking about a transparent future. In fact, his described utopia involves extremely robust privacy protections. The article makes several references to the idea of private and public mind files, implying that as an individual you still get to be the final arbiter of who does and doesn’t have access to your information. At one point he describes the level of sharing that would be necessary in choosing a marriage partner.
“Marriage partner? Private files that are usually off-limits are opened to peruse priorities like “long-term loyalty,” “patience,” “interest trends,” and “annoying habits.”
Now this is a speculative world, and I’m not sure how these “mind files” are supposed to work. But a key feature of files is that they are easy to copy. Open your files to someone once, and those files are now out of your control. People can potentially copy and reshare that data at will.
But more importantly, Pellissier ignores how much will be inferable about us from our external behavior. A computer does not need to read your mind to determine your personality traits. If we imagine a world rife with sensors and information sharing, then there will be a wealth of data available on all of us. And you can bet that data will be parseable in such a way that any “annoying habits” of mine can be determined whether or not I voluntarily open up my private mind files.
At one point, Pellissier describes a discussion with his daughter:
“When I proposed my 100% transparency utopia to my family, my 12-year-old daughter rebelled. “We’d be robbed!” she exclaimed. “Bad guys would know our address and where we hide the key!” No, I explained. Mind-sharing would contain options, with public or private settings for different data, like Facebook. Everyone could be as secretive as they wished. Shy, paranoid, and mystery-loving people could mingle together, laboriously extracting information from each other in old-fashioned Luddite ways.”
If this were my daughter, my response would have been different. First I would have explained that our address (and quite possibly the location of our key) would already be readily available to anyone interested in doing us harm. For this thought experiment to make any sense, we have to picture a sensor-rich, camera-heavy, highly networked world. In such a future how can you possibly expect to hide the location of the property that you return to every single night? You don’t think any cameras or GPS devices are going to capture you doing so? I’m pretty sure all it takes for a bad guy to find someone’s address now is a little bit of light googling. And that’s today.
Fortunately, I would explain to my daughter, if these bad guys do decide to rob us, their crime will be fully recorded and traceable to them, for the exact same reasons listed above. So most likely the bad guys won’t bother us, given that they face a near certainty of being caught.
It sounds nice to imagine a future with robust privacy settings, where we all can dictate what is and isn’t private. But deep down I really don’t see how that can ever be viable. To achieve this would require a locked-down future where we are all running the same operating system. You need a unified system or else you can’t enforce any of these supposed privacy controls. And once we have a unified system, we are at the mercy of the programmers and how they decide to handle the inevitable conflicts of interest that will arise.
We can learn a lot by looking at the modern-day intellectual property debacle. This chapter from Free Culture describes a documentary filmmaker who accidentally captured a few seconds of The Simpsons playing on a TV in the background of a shot. Fox ended up demanding $10,000 in payment for use of the copyrighted material. In that moment, a simple act of documenting the world crossed over and became infringement.
Now imagine I am in a bar with friends. I glance across the room and happen to witness a gay couple talking and laughing. All I do is glance for a second, but that’s enough time for my glasses to record and store their presence. I am recording the whole night at the bar because it is a special night, my last evening in Los Angeles before a long trip. Later that evening, I upload part of the video so I can share a funny thing someone said to me.
It just so happens that contained within the clip I upload are the faces of this gay couple at the bar. Modern face recognition means their faces can be tied to their real identities. The result: I may have just unintentionally outed these people.
We can think of this video clip as a “mind file.” Who owns it? Do I own it? After all the experience happened to me. But at the same time, the clip contains potentially sensitive information belonging to someone else. Specifically data about their sexual orientation and activities on a given night.
In a truly transparent society, the answer is simple: Tough luck for the gay couple at the bar. They are in a public place; they should not be expecting privacy. Instead of trying to protect their secrecy, we should be evolving as a society to the point where sexual orientation is a non-issue.
But in a privacy-controlled pseudo-transparent society, like the one Pellissier describes, the answer is not so clear. Can the gay couple send me a takedown notice? Are the faces of all the people in the background of my home videos automatically going to be blurred? There is no obviously elegant answer.
In this way, truly protecting your privacy may require giving you veto power over what other people choose to do with their own recorded memories. Such veto power sounds to me like the biggest privacy invasion of all.
Jamais Cascio writes:
“In other words, information and data aren’t scarce, they’re increasing rapidly and dramatically.
“But a related phenomenon is scarce, is declining in availability and increasing in value: opacity. Being hidden. Privacy.
“Information isn’t the new oil; opacity is the new oil. The ability to be opaque — the opposite of transparent — is increasingly rare, valuable, and in many cases worth fighting for. It’s also potentially quite dangerous, often dirty, and can be a catalyst for trouble. In short, it’s just like oil. (Which makes me wonder when we’ll have a new OPEC — Organization of Privacy Enabling Companies.)” (link)
As I’ve written before, our economic system is fundamentally based on the sale of scarce commodities. So if our current economic model is going to continue into the future, we are going to have to find new scarce resources to monetize. Privacy is just such a resource. Technological trends are poised to increase rather than decrease the scarcity of privacy. Therefore I will not be surprised if privacy is one of the major commodities people will be willing to pay for in the future.
When the internet was first becoming widely adopted, spam seemed like it might become a big problem. Luckily spam filters got a lot better, and today I encounter very few issues.
But for a spam or bot filter to work, it has to be able to reliably tell the difference between a human and a non-human. And since we can expect that bots will get progressively more human-like with time, I wonder if certain filters are going to become over-taxed.
A recent article in Wired points out that we may finally be approaching a time when a chatterbot could pass a Turing test. The article argues that we are going to have such a vast amount of data to draw upon that bots might be able to answer previously unanswerable questions.
“Suppose, for a moment, that all the words you have ever spoken, heard, written, or read, as well as all the visual scenes and all the sounds you have ever experienced, were recorded and accessible, along with similar data for hundreds of thousands, even millions, of other people. Ultimately, tactile, and olfactory sensors could also be added to complete this record of sensory experience over time,” wrote French in Science, with a nod to MIT researcher Deb Roy’s recordings of 200,000 hours of his infant son’s waking development.
“He continued, ‘Assume also that the software exists to catalog, analyze, correlate, and cross-link everything in this sea of data. These data and the capacity to analyze them appropriately could allow a machine to answer heretofore computer-unanswerable questions’ and even pass a Turing test.”
Keep in mind a spam bot does not need to pass a Turing Test wholesale to become a nuisance. It just has to be good enough to slip through filters.
The obvious solution is to employ more advanced filters. But filters can become a nuisance themselves. I still have important messages routed to my spam folder with some frequency, and sometimes it takes me an embarrassing three tries to pass a CAPTCHA.
More importantly, there is a limit to what filters can do to weed out bots based on behavior alone. If bots become functionally equivalent to humans, there are no salient differences left for a filter to identify.
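The behavior-based filtering that the last few paragraphs describe can be sketched as a toy word-frequency classifier. This is a deliberately simplified cousin of the naive Bayes approach real spam filters grew out of; the training messages are made up, and the point is only to show why such a filter fails once bot output is statistically indistinguishable from human output.

```python
# Toy behavior-based spam scorer: compare how often a message's words
# appear in known spam versus known ham. Real filters are far more
# sophisticated; the training corpora here are invented examples.
import math
from collections import Counter

def train(messages):
    """Count word occurrences across a corpus of example messages."""
    words = Counter()
    for msg in messages:
        words.update(msg.lower().split())
    return words

def spam_score(message, spam_counts, ham_counts):
    """Positive means the message looks more like spam than ham."""
    spam_total = sum(spam_counts.values())
    ham_total = sum(ham_counts.values())
    score = 0.0
    for word in message.lower().split():
        # Add-one smoothing so unseen words don't zero out the ratio.
        p_spam = (spam_counts[word] + 1) / (spam_total + 1)
        p_ham = (ham_counts[word] + 1) / (ham_total + 1)
        score += math.log(p_spam / p_ham)
    return score

spam_counts = train(["buy cheap pills now", "cheap pills cheap deals"])
ham_counts = train(["lunch at noon tomorrow", "see you at the meeting"])
```

A filter like this only works while spam and ham have different word statistics. A bot that mimics human messages word-for-word produces identical statistics, and the score collapses toward zero, which is the functional-equivalence problem described above.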
Again there is a clear solution: applications will increasingly need to rely on proprietary or public “lists of trusted humans.” This is nothing new. On Facebook, for example, everyone has been vouched for as “real” by the system. But there are plenty of more anonymous places on the web where such verification systems are not in place, and that is a large part of their charm. I suspect a trend toward bot-human equivalence will further endanger such havens of anonymity. Better bots are likely to be one more reason why privacy as we know it will have to vanish.
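One minimal way a “list of trusted humans” could work is an authority that signs an attestation for each vouched-for identity, which applications then verify. The sketch below uses a shared secret for simplicity (a real system would likely use asymmetric signatures so verifiers never hold the signing key); the authority, key, and user names are all invented for illustration.

```python
# Sketch of a "trusted humans" attestation: a verifying authority signs
# each vouched-for identity, and an application checks the signature
# instead of keeping its own list. Shared-secret HMAC is used here only
# to keep the example short; the key and names are hypothetical.
import hashlib
import hmac

SECRET = b"authority-signing-key"  # held by the hypothetical authority

def attest(user_id):
    """Authority side: issue a token vouching that user_id is a human."""
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def is_trusted_human(user_id, token):
    """Application side: verify the authority's attestation."""
    return hmac.compare_digest(attest(user_id), token)

token = attest("alice")
```

Note the trade-off this makes vivid: verification is exactly what erodes anonymity, since every service that checks the attestation learns a stable, vouched-for identity.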
“Out on a Saturday night acting drunk and a bit foolish? Or maybe stone cold sober on a Monday afternoon and accidentally (and embarrassingly) trip over in spectacular style? Someone’s AR glasses could record the moment, use facial recognition to identify you and publicly shame you by sharing it online, tagged to your Twitter account so that the whole world could laugh at your misfortune and know exactly who you were.”
Link via Mark Lewis.
“We can understand this better if we look at the economics of perversion. If only one person has photos of a naked celebrity, it becomes a valuable commodity. If you can find the same photos on 10,000 different websites, the value of that photo approaches zero, and our thinking about perversion suddenly shifts along with the proliferation. Nakedness becomes the new norm.”
“Think about how many cameras and sensors it’s going to take to make our environments “aware” of itself and us, so it can enable such VR interactivity.
“Then think about being able to walk onto an airplane without having to pass security, because no one with explosives would be able to get within ten miles of the airport. Think about being able to walk down the darkest alleyway in NYC in perfect safety, because there are no more muggers, because everyone knows that it’s impossible to escape arrest if you try… Think about a million other uses for VR that we will demand, and the endless other potentials made possible by a self aware environment.
“Think about it, and maybe you’ll understand why I laugh at those who continue to believe that we will never become a ‘Transparent Society.’”
The intersection of privacy and technology gets a lot of press. It seems at least once a week an article comes out along the lines of this “Girls Around Me” story.
The dialogue about technology and privacy seems to place people into three camps:
- The “victims.” These are people who are unaware of how their technology works. An example would be the “poor” girls in the above story, who apparently do not realize that their locations and Facebook profiles are easily searchable by would-be pickup artists.
- The “educated.” These are your reasonably tech savvy folks who know how to fiddle with their Facebook privacy settings and delete their Google search histories. These people make full use of available technologies, but take precautions to configure their preferences so that certain aspects of their lives remain protected. Like the author of the above article, these people tend to advocate privacy education as being the best solution.
- The “relinquishers.” These people simply opt out of potentially privacy-eroding technologies. (They definitely aren’t on Facebook, for example.) Interestingly this category unites both tech-fearing luddites and tech-loving nerds such as the “linux aficionado” mentioned in the above article.
If I had to place myself in one of the above categories, I’d choose number two. But if I’m true to my own beliefs, what I really think is this:
Privacy is going away. And no amount of fiddling with settings, educating yourself, or opting out is going to help.
Think of the following list of technology trends. Then imagine these technologies maturing and linking up with each other:
- better integration of global positioning systems
- improved and ubiquitous face recognition
- smaller and more pervasive cameras
- smaller and higher capacity hard drives for storing video and other recorded data
- more widespread cloud and network access
- better algorithms for search and data analysis
- improved 3D scanning and modeling
- phones embedded in glasses and contacts
I’m probably leaving some things out. But run the thought experiment and put all this together, and here’s the world you get in very short order:
- Everything you do in public will be recorded from multiple angles, put online, and made searchable by people armed with only a few fragments of data about you (a first name and a city, for example).
- Anything you do in private with other people present will probably also be recorded in some form, with a high chance of leaking out into the world, unless you take great pains to prevent it.
- Anything you do completely alone will potentially be spied upon unless you are extremely rigorous about protecting yourself. Moreover, your likely behavior during such “blackout periods” will often be inferable from the surrounding recorded periods in your life.
In this scenario, opting out of social networks and configuring privacy settings will not help you. Opting out will not prevent your face and location from being recorded by other people. And opting out will not prevent other people or impersonal algorithms from tagging this data with your name.
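The “few fragments of data” claim is worth making concrete. Once sightings are tagged and pooled, pulling one person’s trail out of everyone else’s is a trivial query. The records and field names below are invented for illustration; the point is how little input the search needs.

```python
# Sketch: a pooled store of tagged sightings, queried with nothing but
# a first name and a city. All records here are invented examples.
sightings = [
    {"name": "dana", "city": "portland", "place": "coffee shop", "time": "09:14"},
    {"name": "dana", "city": "portland", "place": "gym", "time": "18:02"},
    {"name": "dana", "city": "austin", "place": "airport", "time": "12:40"},
    {"name": "lee", "city": "portland", "place": "library", "time": "10:30"},
]

def find_person(records, **fragments):
    """Return every record matching all of the supplied partial identifiers."""
    return [r for r in records
            if all(r.get(key) == value for key, value in fragments.items())]

trail = find_person(sightings, name="dana", city="portland")
```

Two fields are enough to reconstruct a day’s movements, and nothing in this query depends on the subject having opted in to anything, which is why fiddling with your own settings doesn’t help.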
I anticipate a future where most crimes are impossible to get away with. A future where adulterers, liars, and gossipers get caught immediately. Where there is no longer a clear division between work, family, and social life. Where large numbers of people will have naked pictures, or at least body scans, available somewhere online. Where your entire personal history will be recallable at a moment’s notice.
Because I believe this, I have adopted the opposite strategy from what some people are recommending. I am not trying to protect my privacy by fiddling with settings. Instead I am readying myself for the end game and acclimating myself to a future with no privacy. I am actually trying to share more information, be more open, be less secretive, and be the same person, at all times, regardless of what company I am in. I am trying to construct a life for myself where I truly have nothing to hide. And I am doing this not because I necessarily want to, but because it seems like the wise transition to start making given the reality of these technologies.
One more thing that people won’t be able to do to obtain money in the future: steal it. In North Dakota, news broke today that unmanned aircraft are aiding police in civilian arrests, but that’s just one example of a massive technology-driven surveillance explosion that includes voluntary acts like carrying a cell phone or posting on social networks as well as ever more installed and automated sensors out there watching. How would you act differently in a world where the comings, goings, and doings of basically any person are by default available to law enforcement all the time?