Privacy v. Security: The Debate

Published: February 22, 2016

By Jim Lichtman

Late Friday (Feb. 19), CNN Tech reported that the House Energy and Commerce Committee has invited Apple CEO Tim Cook and F.B.I. Director James Comey “to explain to Congress and the American people the issues at play and how they plan to move forward.” The invitation came after Apple refused the F.B.I.’s request to create a program to unlock a known terrorist’s iPhone. With the two sides at an impasse, a California federal judge issued a court order compelling Apple to cooperate. Cook says that the security of all iPhones would be compromised if such a program were created.

The Committee said that since Apple defied a court order, the issue has reached a “critical juncture.”

In an editorial (Feb. 18), The New York Times sided with Apple: “…writing new code would have an effect beyond unlocking one phone. If Apple is required to help the F.B.I. in this case, courts could require it to use this software in future investigations or order it to create new software to fit new needs. It is also theoretically possible that hackers could steal the software from the company’s servers.”

Last Friday, I pointed to a USA Today report, which said that “The FBI [cannot crack the terrorist’s iPhone itself] because to put new software on an Apple device, it must be digitally signed with a special key that only Apple knows. Without that digital key, the phone would reject the software.”

My question for Cook and the Times Editorial Board: If there’s already a special key that only Apple knows, what’s to prevent hackers from getting it?
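To make that signing requirement concrete, here is a rough sketch, in Python, of how signed software updates work in general. It is only an illustration of the principle, not a description of Apple’s actual system, and every name and value in it is hypothetical: the device carries the vendor’s public key and refuses any software whose signature was not produced with the matching private key, which never leaves the vendor.

```python
# Illustrative sketch only: not Apple's actual mechanism. It shows the general
# idea behind signed software updates. A device ships with the vendor's *public*
# key and will only install code whose signature was made with the matching
# *private* key, which stays with the vendor.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Vendor side: generate a key pair (the private half is the "special key").
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()  # this half is baked into every device

firmware = b"new operating system image"

# Vendor signs the update before shipping it.
signature = private_key.sign(
    firmware,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

def device_will_install(image: bytes, sig: bytes) -> bool:
    """The device's check: reject any image not signed by the vendor."""
    try:
        public_key.verify(
            sig,
            image,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False

print(device_will_install(firmware, signature))           # True: genuine update
print(device_will_install(b"tampered image", signature))  # False: rejected
```

The whole scheme stands or falls on keeping that private key secret, which is exactly the worry behind my question above.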

In the Business Section last Friday, The Times asked readers this question: Should the government be able to unlock your iPhone?

“So let me get this straight, [wrote Chad in Oregon]: ‘If Apple was the proprietor of a storage locker facility, and the government obtained a court-issued warrant to search the locker of a criminal suspect, Apple would not unlock the storage unit subject to the government’s warrant?’ ”

“Rich Comstock said on Facebook: ‘Switch out iPhone and replace it with the word “house.” If two suspected felons were wanted for murder and the F.B.I., or police felt there was compelling (and corroborating) evidence located at the house, they would place these facts before a judge to obtain a legal and appropriate search warrant — all coming under the Fourth Amendment and Miranda rights.’ ”

“Benjamin Casey replied, ‘Actually the F.B.I. is asking for the keys to everyone’s house and cameras inside to see what’s going on all the time.’ ”

“Fredd in Denver, who said he had over 30 years’ experience working in the software and database industry, wrote: ‘If you provide a back door, it’s only a matter of time before nefarious people exploit it. … All the government will end up doing is to weaken security for tens of millions to catch a few, and it won’t be long before those few figure out how to write their own encryption to circumvent the back door problem.’ ”

“Matthew Schenker, who called himself a ‘coder,’ suggested that there was a way to satisfy both the F.B.I. and the technology company. He wrote: ‘I know there is always a way to code something with ethical limitations. For example, Apple could create a temporary, self-destructing device ID that must be issued from Apple and is only accessible within a certain time frame, and with a certain algorithm. Apple could hold that temporary ID and grant access on a case-by-case basis, rendering the ‘back door’ useless without it. This might sound complicated, but from a coding perspective it’s not.’ ”
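Schenker’s suggestion is easier to picture with a sketch. What follows is one loose reading of it in Python; every name in it is hypothetical, and it does not describe any real Apple system. The idea is a grant that only the vendor can mint, bound to a single device and a short time window, so that whoever holds the unlocking tool gets nowhere without a fresh grant. A real design would use asymmetric signatures (as in the earlier sketch) so the device never holds the minting secret; a shared secret is used here only to keep the example short.

```python
# A loose sketch of the kind of scheme Matthew Schenker describes: a time-limited,
# per-device grant that only the vendor can mint. All names are hypothetical.
import hashlib
import hmac
import json
import time

VENDOR_SECRET = b"held only by the vendor"  # stand-in for the vendor's signing secret

def issue_grant(device_serial: str, valid_seconds: int = 3600) -> dict:
    """Vendor side: mint a grant bound to one device and one time window."""
    grant = {"device": device_serial, "expires": int(time.time()) + valid_seconds}
    payload = json.dumps(grant, sort_keys=True).encode()
    grant["tag"] = hmac.new(VENDOR_SECRET, payload, hashlib.sha256).hexdigest()
    return grant

def grant_is_valid(grant: dict, device_serial: str) -> bool:
    """Device side: accept only an authentic, unexpired grant issued for *this* device."""
    payload = json.dumps(
        {"device": grant["device"], "expires": grant["expires"]}, sort_keys=True
    ).encode()
    expected = hmac.new(VENDOR_SECRET, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, grant.get("tag", ""))
        and grant["device"] == device_serial
        and time.time() < grant["expires"]
    )

g = issue_grant("HYPOTHETICAL-SERIAL-001")
print(grant_is_valid(g, "HYPOTHETICAL-SERIAL-001"))  # True: right device, inside the window
print(grant_is_valid(g, "SOME-OTHER-PHONE"))         # False: the grant is useless elsewhere
```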

In fact, in a letter to the editor (Feb. 19), Frank Spencer-Molloy offered this solution:

“Maybe it’s time to try a Solomonic solution to the standoff between Apple and the Federal Bureau of Investigation over a phone used by one of the killers in the San Bernardino, Calif., terror attack.

“Apple maintains that the content on each of its customers’ iPhones is uniquely encrypted and that it possesses no master key to unlock the data. The F.B.I. offers a workaround. It apparently wants Apple to load a new operating system onto the disputed phone, minus the feature that allows owners to have all their information erased if someone makes 10 failed passcode attempts.

“The F.B.I. would then use a computer to achieve entry by trying combinations of letters and numbers. The task would be herculean: There are hundreds of millions of possible combinations in a 32-character passcode alone.

“It seems to me that using such a method would isolate the defeat of Apple’s security to this one phone.

“But to ensure that the tool has no chance of being used indiscriminately by law enforcement or slipping into the hands of criminals, the judge could modify her order.

“Tell Apple to delete the passcode protection on the phone in its own laboratories.

“The F.B.I. would be present but would have no access to the programming fix. Then let Apple conduct the ‘brute force’ maneuver to guess the passcode, transfer the phone data to a hard copy and then destroy the altered software or place it in a vault.

“Perhaps Apple would object to being forced to serve as an arm of law enforcement. But every day companies are pressed into complicity when they obey subpoenas to hand over phone or credit card records of a criminal suspect.

“Apple would be able to guarantee privacy to its law-abiding customers, while the government would not be crippled in its obligation to look for further potential accomplices to a horrible mass murder.”
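For a sense of scale on the “herculean” brute-force step Frank describes, here is some back-of-the-envelope Python. The guess rate is purely an assumed figure for illustration; on a real device it would depend on whatever delays the hardware enforces between attempts.

```python
# Back-of-the-envelope arithmetic for a passcode brute-force search.
# The guess rate is an assumption made up for illustration only.
ASSUMED_GUESSES_PER_SECOND = 12.5  # hypothetical: roughly one try every 80 ms

def search_space(alphabet_size: int, length: int) -> int:
    """Number of possible passcodes of a given length over a given alphabet."""
    return alphabet_size ** length

def worst_case_days(combinations: int) -> float:
    """Time to try every combination at the assumed guess rate."""
    return combinations / ASSUMED_GUESSES_PER_SECOND / 86_400

for label, alphabet, length in [
    ("4-digit numeric PIN", 10, 4),
    ("6-digit numeric PIN", 10, 6),
    ("8-character alphanumeric passcode", 62, 8),
]:
    n = search_space(alphabet, length)
    print(f"{label}: {n:,} combinations, about {worst_case_days(n):,.1f} days to exhaust")
```

The arithmetic shows why removing the 10-attempt erase limit matters so much: with the limit gone, a short numeric passcode becomes a question of patience rather than possibility, while a long mixed-character passcode remains effectively out of reach.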

Frank’s idea seems worth exploring. As I said last Friday, I am a strong supporter of personal privacy. It is simply fundamental in this country. However, it should not come at the cost of national security.

“No date has been set for the [Congressional] hearing,” CNN reports, “but the committee asked Cook and Comey to reply by Feb. 24.”

Comments

  1. I knew you would be writing about this topic, Jim. Sounds like the easiest answer to appease each side is Spencer-Molloy’s:
    “Tell Apple to delete the passcode protection on the phone in its own laboratories…and then destroy the altered software or place it in a vault.”

    Good ethical discussion!

  2. I apologize for commenting on two editorials in a row, but the overall situation is emblematic of how our nation has radically transformed since my childhood days in World War II, when everyone served, from soldiers to grade-school kids who collected newspapers, saved grease, scrounged for scrap metal, and knitted afghan squares for the wounded, and when gas, milk, butter, meat and eggs were rationed. My mom spent three days a week with the Red Cross, and dad was the neighborhood Air Raid Warden as well as superintendent of schools. We all knew there was a common enemy, and to win, everyone had to contribute. A common enemy every bit as dangerous exists today, and yet there are those who would shelter and even facilitate it in the name of “privacy”. I find it incredibly sad.
