Over the past week and more, we have seen an escalation between Apple and the FBI over unlocking the iPhone belonging to one of the shooters in the December terrorist attack in San Bernardino, Calif.
The latest on the fight is that both the FBI and Apple CEO Tim Cook have been invited to a hearing of the House Energy and Commerce Committee on privacy and national security “to explain to Congress and the American people the issues at play and how they plan to move forward.”
There are people with strong opinions on both sides of the fence as to what is right and what is wrong in this instance. For anyone who owns a smartphone, cares about privacy, or works on the Internet, this case is worth watching as it progresses. Be sure to stay tuned to Threatpost for all the breaking news on security and privacy.
In case you’re behind on the news, here is a quick overview to help you pick a side in the debate.
Back Story

Reportedly, Apple gave the FBI the data that the San Bernardino shooter’s phone had backed up to iCloud. The last backup was made on October 19, when the shooter apparently stopped backing up the phone. The FBI wants the newer data to fill in the gaps and has obtained a court order that spells out exactly how Apple should assist with the investigation.
Specifically, the order asks Apple to:
1) disable the feature that wipes the phone’s memory after 10 consecutive wrong passcode entries;
2) create software that can enter passcodes automatically;
3) remove the delays between passcode attempts.
In other words, the FBI wants to brute-force the passcode and is asking Apple to switch off every safeguard that stands in the way. If Apple agreed, cracking the code would be only a matter of time for the FBI; a four-digit PIN, for example, can be broken in several hours.
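To see why, here is a minimal back-of-the-envelope sketch. The one-second-per-attempt figure is our own assumption about the device’s hardware-bound passcode check, not a published Apple number; the point is just how small a four-digit keyspace is once the retry limits are gone.

```python
# Rough estimate of passcode brute-force time once the wipe and delay
# protections are disabled. SECONDS_PER_ATTEMPT is an assumed figure for
# the device's hardware-bound passcode check, not an official spec.
SECONDS_PER_ATTEMPT = 1.0

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    attempts = 10 ** digits          # e.g. 10,000 combinations for 4 digits
    return attempts * SECONDS_PER_ATTEMPT / 3600

print(f"4-digit PIN: up to {worst_case_hours(4):.1f} hours")  # a few hours
print(f"6-digit PIN: up to {worst_case_hours(6):.0f} hours")  # about 12 days
```

Even at a far more conservative attempt rate, a four-digit numeric passcode simply does not have enough combinations to survive an unthrottled search.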
Apple CEO Tim Cook published a message to customers saying that the company had already shared with law enforcement all the data it had. Cook rightly noted that the FBI was asking Apple to create a “master key”: “Now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
There are obviously a number of open questions that may need some clearing up, so we’ll do our best to answer them below.
What laws does the government rely on in this case?
It’s a good question. The government is relying on the All Writs Act, which was signed into law in 1789. Basically, the act helped establish the judiciary system in the then-young United States, giving federal courts the power to issue orders that do not fall under any pre-existing law. Gizmodo published a detailed review of the act and how the justice system uses it that is worth a read.
Obviously, Congress wasn’t considering iPhone security at the time it passed All Writs Act of 1789 https://t.co/dqAgGLN5WM #AWA #security
— Just Security (@just_security) February 22, 2016
Why does the FBI need Apple to hack the iPhone?
Apparently, FBI agents can’t do it themselves because Apple’s security measures work well.
But there is more to the story.
Recently, Apple held conference calls with journalists under strict ground rules: no quoting Apple experts verbatim and no disclosing their names. On those calls, the company revealed that the FBI had accidentally reset the San Bernardino shooter’s iCloud password. Had the bureau not done so, the phone would have synchronized with iCloud automatically and created a newer backup, which Apple would readily have shared with the agents. After the reset, that was no longer possible.
Where do the parties stand?
Apple made its position public on February 16: what the FBI was asking for amounted to developing a backdoor. Such a tool would endanger Apple’s customers, so the company refused to comply.
The FBI responded on February 19 with a court filing. According to the bureau, Apple could help but decided against it to protect its brand.
Can Apple do what the FBI requires?
Maybe. Tim Cook’s statement doesn’t answer this question directly: “too dangerous to create,” “something we simply do not have,” and so on. Of course, Apple develops both the iPhone’s software and hardware, so it can do a lot. Based on the off-the-record call with Apple employees, Gizmodo reports that it is technically possible for the company to create the requested software. But will Apple do it, and what would the consequences be?
According to the FBI, there is nothing to fear. The bureau claims that Apple can limit the hack to the terrorist’s iPhone alone. Moreover, the FBI says it is not even asking the company to hand the software over to its specialists.
From Apple’s point of view, if such a tool were created, cybercriminals would devote resources to recreating it and producing their own backdoor. iPhones would then lose their reputation as secure devices.
ICYMI: The FBI is scaring people into being worried about weird implausible theories of terrorism. https://t.co/ymatij7Qk8
— the grugq (@thegrugq) February 20, 2016
This story comes at a pivotal moment. The Apple vs. FBI fight is unfolding in the middle of a global debate over where to draw the line between privacy and national interests, including the investigation of crime and terrorist attacks.
So why does encryption matter?
You see, encryption is made of math, not magic. It’s impossible to weaken it for a charmed (er, select) circle only. Sooner or later, other people will find the soft spot, and nobody can guarantee that the bad guys won’t find it before the good ones do (as mentioned above).
Would a golden key actually solve encryption issues? https://t.co/2JUAypdDf3 #apple #FBiOS pic.twitter.com/O8btU4j7Xy
— Kaspersky Lab (@kaspersky) February 19, 2016
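To make that concrete, here is a toy sketch. The XOR construction and the three-digit “escrowed” key are our own illustrative assumptions, not any real escrow scheme; the point is that a key weak enough for one party to use on demand is weak enough for everyone else to find.

```python
# Toy illustration: a cipher whose key is deliberately drawn from a tiny
# space (a "backdoor-friendly" 3-digit key). Anyone who knows the keyspace
# is small can recover the key by brute force -- not just the party the
# weakness was built for. Purely illustrative, not real cryptography.
import hashlib
from itertools import cycle

def keystream(key: str, length: int) -> bytes:
    """Derive a repeating keystream from the key (toy construction)."""
    digest = hashlib.sha256(key.encode()).digest()
    return bytes(b for b, _ in zip(cycle(digest), range(length)))

def xor_cipher(data: bytes, key: str) -> bytes:
    """XOR data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = xor_cipher(b"attack at dawn", "042")  # "escrowed" 3-digit key

# An outsider exhausts the entire weakened keyspace in milliseconds.
for guess in (f"{n:03d}" for n in range(1000)):
    if xor_cipher(secret, guess) == b"attack at dawn":  # toy known-plaintext check
        recovered = guess
        break

print(recovered)  # the weakened key falls to anyone, not only the "good guys"
```

A real attacker would not have the plaintext to compare against, but would instead check each candidate decryption for valid structure; the search cost, which is what a deliberate weakness reduces, stays the same either way.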
In fact, any pressure on privacy puts encryption at risk as well, and with it the security of data and communications. The consequences could be dire.