
SCOTUS Gets It: Encryption Is a Basic Security Feature, Not a Sign You Have Something to Hide

The nine in black delivered a straightforward message to the government: If you want our data, “get a warrant.”

Bruce Bortin/Flickr

If you kept all your private physical information in a locked trunk, the government would need a warrant to search it.

“Most people cannot lug around every piece of mail they have received for the past several months, every picture they have taken, or every book or article they have read,” Chief Justice John Roberts wrote in Riley v. California. If people did, “they would have to drag a trunk behind them.”

Searching a trunk should, and does, require a warrant. The Supreme Court’s ruling in Riley v. California says that modern “trunks” like phones, computers, and cloud services should be no different. In straightforward language, the Chief Justice demolishes the notions that have obscured a basic fact: the Constitution entitles us to “the right to be secure” in our “persons, houses, papers, and effects.”

This right does not depend on our retreating to the technology of another century. With this case, the nine in black have delivered a surprisingly straightforward message to the government. If you want our data, “get a warrant.”

Importantly for digital privacy and security companies, the Supreme Court had no time for the government’s complaints that common security features — like encryption — justified dispensing with Constitutional protections. It is clear that the justices saw encryption correctly, as an ordinary and prudent privacy protection, not as some exotic feature designed to thwart the needs of law enforcement.

Lower courts should take notice. On the same day as the Supreme Court’s decision, the Massachusetts Supreme Judicial Court ruled that a person could be forced to divulge keys to encrypted files without violating the Fifth Amendment right to self-incrimination. While the issue is still far from clear (other courts have gone the other way), the prevailing wind from the Supreme Court is plainly in the opposite direction.

Communications technology is no longer new, or exotic, and the price for using basic information services — or using common-sense security measures like encryption — should not be giving up your basic rights. Perhaps the most maddening notion out there is that privacy intrusions are our fault, for using entirely commonplace technology.

“People these days don’t care about privacy. Look at them with their devices, their e-mail, their messaging. Don’t they understand how much data they are creating? They are simply asking to be tracked and searched.”

The Supreme Court had no patience for such ideas. Digital is different, but the difference means there should be more, not less, privacy protection. Rules that strike the right balance for physical papers and analog telephones do not work when applied in the digital world. To say otherwise is to ignore the obvious, “like saying a ride on horseback is materially indistinguishable from a flight to the moon,” as the Chief Justice put it.

The Supreme Court gets it — unanimously. We need rules that protect privacy in today’s world. Searching a trunk requires a warrant; now, so does searching a phone or device. Securing a trunk requires a lock, and that’s what emerging digital-privacy companies and encrypted-email services provide. (Disclosure: I am an advisor to one such company, Virtru).

It’s just common sense.

Tim Edgar is a visiting fellow at Brown University. He served under President Obama as the first director of privacy and civil liberties for the White House National Security Staff, focusing on cyber security, open government, and data-privacy initiatives. He also advises companies on privacy issues, including Virtru, mentioned in this piece.

This article originally appeared on Recode.net.
