— by Polydamas
Forget what you think you know about the latest courtroom battle between the Federal Bureau of Investigation and Apple over the unlocking of the Islamic San Bernardino terrorists’ iPhone. Nearly everything you think you understand about the controversy is wrong.
This has nothing to do with terrorism. This has nothing to do with national security. This has everything to do with statism. Whenever liberals and conservatives agree on anything, it is a sure sign that statism and authoritarianism are on the march. Americans are about to lose yet another liberty to the Leviathan state forever.
Tim Cook, the Chief Executive Officer of Apple, Inc., wrote an eloquent public letter on February 16, 2016, titled “A Message to Our Customers” which is reproduced in full below (http://apple.co/20YEk9k). Mr. Cook is entirely correct that the matter “has implications far beyond the legal case at hand”. Put simply, what the federal government demands from Apple, by relying on an obscure legal writ dating back to the late 18th century, is that Apple bow its figurative head and swear servitude to the government.
The All Writs Act states that “The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law”. The Act dates back to 1789 and even predates the Bill of Rights, which was ratified on December 15, 1791. Its expansive “necessary or appropriate” language resembles the “necessary and proper” language of the United States Constitution, which statist judges have used very widely over the past two and a quarter centuries to increase the powers of the Leviathan government at the expense of the liberties of the American people.
Now, the federal government wants to force Apple to develop, at Apple’s own expense, a software tool with which Apple can circumvent the data security measures it embedded in its iPhone operating system in the aftermath of National Security Agency contractor Edward Snowden’s revelations of widespread government surveillance and the audacious thefts of private and confidential information previously discussed here. The federal government wants Apple to disable the operating system’s built-in time delays after the entry of incorrect passcodes and the automatic wipe of iPhone data after ten incorrect attempts. These modifications to the iPhone’s operating system would then allow the federal government to use brute-force software that rapidly enters every possible passcode until it finds the right one.
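The arithmetic behind those two safeguards is worth spelling out. A back-of-the-envelope sketch (the per-guess timing is an assumed illustrative figure, not Apple’s actual number) shows why, once the delays and the auto-wipe are removed, a four-digit passcode offers essentially no protection:

```python
# Back-of-the-envelope estimate (illustrative figures, not Apple's
# actual timings) of brute-forcing a 4-digit iPhone passcode.

SPACE = 10 ** 4          # 10,000 possible 4-digit passcodes
PER_GUESS_S = 0.080      # assumed hardware key-derivation time per attempt

# With the software safeguards stripped out, exhausting the whole
# space is a matter of minutes.
worst_case_min = SPACE * PER_GUESS_S / 60
print(f"No delays, no wipe: worst case {worst_case_min:.1f} minutes")

# With the stock safeguards, escalating delays slow each attempt and
# the auto-wipe ends the attack after ten wrong guesses.
MAX_ATTEMPTS = 10
print(f"Stock iOS: at most {MAX_ATTEMPTS} guesses out of {SPACE:,} codes")
```

Under these assumptions the entire passcode space falls in under a quarter of an hour, which is why the government’s demand targets exactly those two features and nothing else.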
The issue at hand is one of coercion. There is no fundamental difference between the federal government coercing Apple and government agents coercing a safe manufacturer who built the most intrusion-resistant safe to break into his own client’s safe. If the federal government is permitted to strong-arm Apple, what is to stop the same government from employing the same coercion to order a doctor to administer poison to her patient in violation of the Hippocratic Oath? What is to prevent it from ordering an attorney to divulge the secrets and confidences of his client and to testify against him? Or from forcing an architect to destroy the building that he lovingly designed?
Columnist Robert Robb makes the same excellent point in his Arizona Republic column titled “Apple is Right About Terrorist’s iPhone” which is reproduced in full below (http://bit.ly/1PM1iIp). He aptly states: “The government, in the course of a criminal investigation or intelligence gathering related to terrorism, can’t conscript private parties into its service. The government can require Apple to turn over anything relevant it possesses. It has no authority, in the course of an investigation, to require Apple to create something that doesn’t exist. And neither does a judge. That’s contrary to our system of law.”
The notion advanced by the federal government that it absolutely requires the assistance of Apple’s software engineers to unlock the Islamic San Bernardino terrorists’ iPhone is plainly preposterous. As revealed by Edward Snowden, the federal government employs an army of computer mavens, hackers, and snoops of extraordinary expertise. It can also hire knowledgeable private-sector specialists to break into the iPhone in question. By the same principle, the federal government employs its own assassins and can hire independent contractors to do its wet work; it does not need to draft ordinary people and corrupt their integrity. The same is true of safe manufacturers, Apple, doctors, attorneys, and other professions. Put simply, the integrity of every person and every business is at stake here. If the federal government can order any person or business to serve it rather than that person’s or business’s own chosen customers, then we live in a dictatorship.
If the federal government succeeds in coercing Apple to do its bidding under the rubric of public safety, there is nothing to stop the various states from passing their own statutory versions of the All Writs Act and conscripting Apple to open iPhones to assist in state criminal investigations and prosecutions. On the same grounds, cities and municipalities can follow suit and similarly conscript Apple to assist them. There will be an inevitable slippery slope extending the scope of Apple’s conscription into government service from the war against Islamic terrorism and national security to a host of other federal crimes. Before long, the envelope will be pushed to require Apple to assist state and local governments in finding runaway teenagers, missing children, and elderly people with mental infirmities who have wandered off. Further, the precedent of federal, state, and local governments managing to coerce Apple, one of the wealthiest and most powerful companies in the world, will allow the same governmental entities to do the same or worse to any smaller and less powerful company.
The Cato Institute’s Julian Sanchez wrote a marvelous column on February 19, 2016, titled “This Is the Real Reason Apple Is Fighting the FBI” (http://bit.ly/1L2fP3R). He makes the cogent point that “the high stakes of Apple’s resistance to the FBI’s order [are] not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices.” The federal government has long been hostile to the possession of public-key cryptography by private persons and persecuted Phil Zimmermann throughout the 1990s for developing his groundbreaking software Pretty Good Privacy, or PGP. It is no secret that the government wants, impossibly, to stuff the cryptography genie back into its bottle by requiring manufacturers to install backdoors into their products, with the keys handed to government spooks. The government wants to spy on its people, like Big Brother in George Orwell’s 1984, through their computers, smartphones, computer-integrated smart cars, televisions, and appliances. To achieve its objectives and erect a modern version of Jeremy Bentham’s Panopticon, a glass prison watched by all-seeing guards, it needs to conscript technology companies as its willing or unwilling accomplices.
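Why a mandated backdoor key is so dangerous can be shown with a toy sketch. The following is NOT real cryptography (a simple XOR stands in for a real cipher, and the scheme itself is a deliberately crude hypothetical), but it captures the structure of key escrow: if every ciphertext is also decryptable under one escrow key, whoever holds that key, lawfully or otherwise, reads everyone’s traffic.

```python
# Toy illustration (NOT real cryptography) of a key-escrow "backdoor":
# one escrow key decrypts every user's messages.

import os

def xor(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; a stand-in for a real cipher."""
    return bytes(b ^ k for b, k in zip(data, key))

ESCROW_KEY = os.urandom(32)   # the key handed to the government

def encrypt_with_escrow(plaintext: bytes, user_key: bytes) -> dict:
    # Real escrow schemes wrap a session key for both parties; here we
    # simply keep a copy of the message encrypted under each key.
    return {
        "for_user": xor(plaintext, user_key),
        "for_escrow": xor(plaintext, ESCROW_KEY),
    }

user_key = os.urandom(32)
msg = b"private message"
ct = encrypt_with_escrow(msg, user_key)

# The user decrypts with their own key...
assert xor(ct["for_user"], user_key) == msg
# ...but so does ANY holder of the escrow key, for every user at once.
assert xor(ct["for_escrow"], ESCROW_KEY) == msg
```

The point of the sketch is structural: the escrow key is a single master secret whose theft, leak, or abuse compromises all users simultaneously, which is exactly the “master key” objection Apple raises below.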
Nearly 240 years ago, the American people fought the Revolutionary War to throw off the yoke of England’s King George III and his tyrannical rule. We now have a new king with greater technological powers than any previous king in history ever wielded, and the government demands ever greater powers. Lord Acton was entirely correct that power tends to corrupt and absolute power corrupts absolutely, and that is the case here as well.
==============
February 16, 2016
A Message to Our Customers
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.
All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
The Threat to Data Security
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
A Dangerous Precedent
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.
While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
Tim Cook
================
Apple is right about terrorist’s iPhone
Robert Robb
The Arizona Republic
February 21, 2016
On Tuesday, a federal judge ordered Apple to help the FBI break into the iPhone of one of the killers in the San Bernardino, Calif., shootings.
The implication of what the federal government is demanding that Apple do regarding the smart phone of the San Bernardino terrorist is not being fully understood or properly framed.
The government isn’t demanding relevant information or material that Apple possesses. Such demands, backed by court orders if necessary, are ordinary and appropriate in criminal investigations. Apple has already provided the government what it has that fits the usual kind of document demands, including information the terrorist had stored in Apple’s cloud service.
However, the terrorist quit backing up his phone on the cloud several weeks before the attack. The FBI can’t figure out how to bypass the security features on the phone. Apple currently has no way of doing it either.
So, the government sought, and received, a court order requiring Apple to create software that could be loaded onto the phone that would enable the FBI to circumvent the security protections and access the phone’s content. Such software doesn’t currently exist.
Apple has vowed to fight the order, saying that the government is basically ordering it to develop a backdoor to its phones that would compromise the security of the data of all its customers.
Apple can’t turn over something that doesn’t exist
That, however, isn’t the real point. The government, in the course of a criminal investigation or intelligence gathering related to terrorism, can’t conscript private parties into its service. The government can require Apple to turn over anything relevant it possesses. It has no authority, in the course of an investigation, to require Apple to create something that doesn’t exist. And neither does a judge. That’s contrary to our system of law.
The question of whether there should be a general statute, enacted by Congress, requiring phone manufacturers to include a backdoor that would enable law enforcement in the course of an investigation to override security features is a different matter entirely. The manufacturers claim that would open up their phones to criminal hackers. Law enforcement officials assert differently.
I’m inclined to believe the manufacturers on such a technology question. Regardless, a general law hasn’t been passed by Congress. And until it is, the government shouldn’t be able to conscript Apple to create backdoor software that doesn’t currently exist through a court order in an individual criminal investigation, regardless of how important and high profile.
================
This Is the Real Reason Apple Is Fighting the FBI
Julian Sanchez
Cato Institute
February 19, 2016
The first thing to understand about Apple’s latest fight with the FBI—over a court order to help unlock the deceased San Bernardino shooter’s phone—is that it has very little to do with the San Bernardino shooter’s phone.
It’s not even, really, the latest round of the Crypto Wars—the long running debate about how law enforcement and intelligence agencies can adapt to the growing ubiquity of uncrackable encryption tools.
Rather, it’s a fight over the future of high-tech surveillance, the trust infrastructure undergirding the global software ecosystem, and how far technology companies and software developers can be conscripted as unwilling suppliers of hacking tools for governments. It’s also the public face of a conflict that will undoubtedly be continued in secret—and is likely already well underway.
First, the specifics of the case. The FBI wants Apple’s help unlocking the work iPhone used by Syed Farook, who authorities believe perpetrated last year’s mass killing at an office Christmas party before perishing in a shootout with police. They’ve already obtained plenty of information about Farook’s activities from Apple’s iCloud servers, where much of his data was backed up, and from other communications providers such as Facebook. It’s unclear whether they’ve been able to recover any data from two other mobile devices Farook physically destroyed before the attack, which seem most likely to have contained relevant information.
But the most recent data from Farook’s work-assigned iPhone 5c wasn’t backed up, and the device is locked with a simple numeric passcode that’s needed to decrypt the phone’s drive. Since they don’t have to contend with a longer, stronger alphanumeric passphrase, the FBI could easily “brute force” the passcode—churning through all the possible combinations—in a matter of hours, if only the phone weren’t configured to wipe its onboard encryption keys after too many wrong guesses, rendering its contents permanently inaccessible.
So the bureau wants Apple to develop a customized version of their iOS operating system that permits an unlimited number of rapid guesses at the passcode—and sign it with the company’s secret developer key so that it will be recognized by the device as a legitimate software update.
Considered in isolation, the request seems fairly benign: If it were merely a question of whether to unlock a single device—even one unlikely to contain much essential evidence—there would probably be little enough harm in complying. The reason Apple CEO Tim Cook has pledged to fight a court’s order to assist the bureau is that he understands the danger of the underlying legal precedent the FBI is seeking to establish.
Four important pieces of context are necessary to see the trouble with the Apple order.
1. This offers the government a way to make tech companies help with investigations. Law enforcement and intelligence agencies have for years wanted Congress to update the Communications Assistance for Law Enforcement Act of 1994, which spells out the obligations of telephone companies and Internet providers to assist government investigations, to deal with the growing prevalence of encryption—perhaps by requiring companies to build backdoors for the government into secure devices and messaging apps. In the face of strong opposition from tech companies, security experts and civil liberties groups, Congress has thus far refused to do so.
By falling back on an unprecedentedly broad reading of the 1789 All Writs Act to compel Apple to produce hacking tools, the government is seeking an entry point from the courts it hasn’t been able to obtain legislatively. Moreover, saddling companies with an obligation to help break their own security after the fact will raise the cost of resisting efforts to mandate vulnerabilities baked in by design.
2. This public fight could affect private orders from the government. Several provisions of the federal laws governing digital intelligence surveillance require companies to provide “technical assistance” to spy agencies. Everything we know suggests that government lawyers are likely to argue for an expansive reading of that obligation—and may already have done so. That fight, however, will unfold in secret, through classified arguments before the Foreign Intelligence Surveillance Court. The precedent set in the public fight may help determine how ambitious the government can be in seeking secret orders that would require companies to produce hacking or surveillance tools meant to compromise their devices and applications.
3. The consequences of a precedent permitting this sort of coding conscription are likely to be enormous in scope. This summer, Manhattan District Attorney Cyrus Vance wrote that his office alone had encountered 74 iPhones it had been unable to open over a six-month period. Once it has been established that Apple can be forced to build one skeleton key, the inevitable flood of similar requests—from governments at all levels, foreign and domestic—could effectively force Apple and its peers to develop internal departments dedicated to building spyware for governments, just as many already have full-time compliance teams dedicated to dealing with ordinary search warrants.
This would create an internal conflict of interest: The same company must work both to secure its products and to undermine that security—and the better it does at the first job, the larger the headaches it creates for itself in doing the second. It would also, as Apple’s Cook has argued, make it far more difficult to prevent those cracking tools from escaping into the wild or being replicated.
4. Most ominously, the effects of a win for the FBI in this case almost certainly won’t be limited to smartphones. Over the past year I worked with a group of experts at Harvard Law School on a report that predicted governments will respond to the challenges encryption poses by turning to the burgeoning “Internet of Things” to create a global network of surveillance devices. Armed with code blessed by the developer’s secret key, governments will be able to deliver spyware in the form of trusted updates to a host of sensor-enabled appliances. Don’t just think of the webcam and microphone on your laptop, but voice-control devices like Amazon’s Echo, smart televisions, network routers, wearable computing devices and even Hello Barbie.
The global market for both traditional computing devices and the new breed of networked appliances depends critically on an underlying ecosystem of trust—trust that critical security updates pushed out by developers and signed by their cryptographic keys will do what they say on the tin, functioning and interacting with other code in a predictable and uniform way. The developer keys that mark code as trusted are critical to that ecosystem, which will become ever more difficult to sustain if developers can be systematically forced to deploy those keys at the behest of governments. Users and consumers will reasonably be even more distrustful if the scope of governments’ ability to demand spyware disguised as authentic updates is determined not by a clear framework but by a hodgepodge of public and secret court decisions.
These, then, are the high stakes of Apple’s resistance to the FBI’s order: not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices. That’s a staggeringly high price to pay for any investigation.
Julian Sanchez is a senior fellow at the Cato Institute.