
Our Phones Are Not Just Confessionals

They are portable, personal data stores, and it is as such that access to them is being sought.


Recently, both Apple CEO Tim Cook and FBI Director Jim Comey have said that the American people should weigh in on the questions raised by the Apple-FBI debate. That claim makes one thing clear: It’s essential for the general public to appreciate the issues at stake.

This controversy isn’t an argument about whether the government should be able to thoroughly surveil a criminal. Nor is it really an argument about any technology much more modern than writing. The core of the controversy is the right of law-abiding citizens to be secure in their own private thoughts.


Imagine a powerful ruler demanding that a priest divulge what parishioners said during confession. Our society has decided that the priest would not be expected to betray the people’s trust. But what if the ruler demanded to learn only what was said by a man who had clearly committed a horrific and evil crime (in the interests of possibly preventing other such crimes and ensuring the safety of the rest of the flock)? Would we compel the priest to tell? (And what if that ruler had previously placed microphones in confessionals, had been caught lying about it, and was now promising not to use this individual request as a legal or political wedge to get more such concessions in future?)

You might argue that our phones — or, more broadly, all our technological tools that can be safeguarded by encryption — are not confessionals. What are they, then?

Ask yourself whether you know anyone who copes with stress by unburdening themselves of their weightiest thoughts in a personal journal. Ask yourself whether you, or anyone you know, find order in a chaotic world through written meditations, reflections or creative writing of any form. Then ask yourself how free you would feel continuing such practices in an electronic world where you could have no reasonable assurance that your private writing was indeed private.

Those who would effectively prohibit widespread access to strong encryption would thereby deny peace of mind to any innocent people who might otherwise have sought such solace: It is the equivalent of claiming that such people have no right to keep a written secret. And that basic right is what’s truly at stake here, for despite the uninformed opinions of those who might believe otherwise, there is simply no way for any security workaround (whether direct, or via a seemingly more convoluted route like the one requested by the FBI) to be restricted to an individual case.


We are used to the idea that law enforcement should be able to surveil communication when it demonstrates a need to do so. But when it comes to phones as personal data stores, we are crossing a line: from surveillance of communication with others to demanding access to data that might never be communicated at all. Denying people access to strong encryption would mean just that: It is equivalent to claiming that nobody has a right to confidently keep a written secret. Such a claim poses clear risks of harm to many innocents, while posing little more than a dubious inconvenience for those who would employ encryption for evil ends. Anyone with the necessary skill and training could conceivably render their words private before their fingertips ever touched a keyboard; coded language, for example, long predates modern encryption. And what if the words are written in a language only the writer can decipher? Would it be right to compel someone to divulge the meaning of such private thoughts, simply because the thoughts were written down?
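The point that strong secrecy requires no special tools, and no cooperation from any company, is easy to demonstrate. The one-time pad, a cipher far older than computers, is unbreakable when its key is truly random, kept secret and never reused, and it can be written in a few lines of standard Python. This is a minimal sketch for illustration only; the function names are mine, not from any particular library:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: generate a fresh random key as long as the message
    # and XOR it with the plaintext, byte by byte. With a truly random,
    # never-reused key, the ciphertext reveals nothing about the message.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XORing with the same key a second time recovers the plaintext.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

entry = "A private journal entry.".encode()
key, sealed = otp_encrypt(entry)
assert otp_decrypt(key, sealed) == entry  # only the key-holder can read it
```

Anyone holding only the ciphertext learns nothing; no mandated bypass in a phone or an app would change that, because the encryption can happen before the words ever reach the device.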

This question lies at the heart of a distinction that is easily ignored. A phone is not just a communication device; it is also a data store. As such, it is increasingly an extension of our minds and memories, more like a personal journal than a mere gadget, and one ever more intimately linked to our bodies and daily lives. We are likely to have more such personal data stores accompanying more people through their daily routines. In fact, the privacy of such stores, whether they are borne on our person, worn on our clothes or otherwise integrated with our bodies and minds to any degree, is even more important than the privacy of a consciously written journal. These new personal journals are ever richer in intimate detail, and ever less written by our own hands.

Phones are more than communication devices. They are portable, personal data stores, and it is as such personal data repositories that access to them is being sought. More than journals, they are nearly ever-present witnesses to our daily movements and increasingly serve as extensions of our memories. Such devices are, and will continue to become, ever more intimate companions in whom people unwittingly confide. So whether one is aware of it or not, a phone can easily be seen as a portable confessional. This means that the line being crossed by legal ploys to gain access “just this once,” or by naive pseudo-technical arguments for a lawful bypass of strong encryption, pushes beyond the privacy of a person’s communication and toward a truly obscene intrusion upon an individual’s most private data, and even thoughts.

The efforts of the FBI and encryption-demonizing politicians may be rooted in the best of intentions, but they are based on a blatantly unsound technological premise, and the slope upon which they place us leads down a morally repugnant path.

But perhaps the confessional is the wrong analogy today. What is being sought by the FBI is supposedly a single workaround for strong security in a single case. Such a single workaround (whether it’s the software or the legal precedent) can never be guaranteed not to pose a risk for many innocent citizens, and cannot be uncreated. So perhaps a better analogy is that the FBI isn’t like the king seeking access to the confessional, but is instead just a well-meaning agent of justice asking a vaccine lab to create and use a biological weapon that they hope might help against a single evil person, and promising that the risk is worth it and that it’ll never be misused by the wrong people (unlike, say, TSA luggage-lock master keys, federal employee security-clearance records, health databases, retail consumer data, and so on).

It is indeed essential that more people understand the issues at stake here. Fundamentally, this isn’t about one truly horrific crime and the justice its victims so rightly deserve; it’s about whether that justice is served by subverting basic data security for everyone who uses a consumer device featuring strong encryption. It’s about an individual’s right to use strong encryption in a personal data store.

Should personal data storage become so ubiquitous and so intimately tied to our lives and actions that it serves as an extension of our memories, hampering popular access to strong encryption would destroy an individual’s ability to feel secure in using such digital memories. The threat of invasive access to such devices is no less obscene than the threat of invasive access to the memories we keep to ourselves, for all it does is mildly inconvenience criminals at the expense of denying everyone the peace of mind of being able to unburden their minds, and hindering their embrace of technologies that might just make someone’s world a little less overwhelming.



Ahmed Amer is an associate professor of computer engineering at Santa Clara University, studies data storage technologies and has also worked on developing low-cost wearable computing and augmented-reality devices. Reach him @aamer.

This article originally appeared on Recode.net.
