What the FBI is attempting to do is use the 'All Writs Act' of 1789, which authorizes Federal courts to issue "all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." Are there limits to what a judge can order a person, or a company, to provide?
A warrant describes "the place to be searched, and the persons or things to be seized." I would not expect that a judge can draft a warrant for something which doesn't actually exist, and then force someone to create it.
This is not about providing physical access, or about producing documents which are in your possession. This is whether the government can usurp your workforce to make you create something that only you are capable of creating, against your will, not because there's actually a law which says you have to provide that capability, but simply because some investigator has probable cause that given such a tool they could use it to find evidence of a crime!
If 'All Writs' somehow does give the government the ability to enslave software developers into creating this particular backdoor, what is there to legally differentiate this request from, for example, one that would function over WiFi or LTE remotely?
There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6. For example, why not force Apple to provide remote access to a suspect's device over LTE while the device is unlocked / in use? While we're at it, the iPhone has perfectly good cameras and microphones, let's force Apple to provide real-time feeds.
Think about the sheer quantity of networked devices which exist (or will exist) in an average home which could be used in the course of an investigation. If they can force Apple to create a 5C backdoor, I can't see any reason they can't apply the same logic to WiFi cameras, Xbox Kinects, or even your car's OnStar. Heck, even TV remotes come with microphones and bluetooth now... And don't get me started on Amazon Echo!
Fundamentally, the question is: can you force a device manufacturer to implement backdoors into their products to be used against their own customers? Notably, service providers have already lost that battle: they are required to architect their systems to be able to spy on their users and provide that data to law enforcement, often through specially designed real-time dashboards. At least in that case it is based on duly enacted legislation with that specific intent.
But this is something really quite shocking -- can investigators, simply through obtaining a warrant, force companies to re-design the personal devices that we own and keep with us almost every moment of the day to spy on us? I truly hope not.
Well stated. That's the crux. The technical difficulty of any given hack is going to vary and is ultimately irrelevant. The idea that the government can commandeer a company's resources towards its ends, especially when those ends compromise the security of a larger community, is a dangerous one.
Once in possession, they'd be free to contract an independent entity to modify the firmware and re-sign it with Apple's own key. It can't be all that hard to NOP out the code that increments the unlock attempt counter and associated delay mechanisms.
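To make that concrete, here's a toy sketch (an assumption for illustration, definitely not Apple's actual firmware code) of the kind of retry-limiting logic being discussed: an attempt counter, escalating delays, and a wipe after ten failures. The requested firmware is effectively a build in which the three marked lines do nothing.

```python
import time

# Illustrative delay schedule in seconds, keyed by failed-attempt count.
ESCALATING_DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}

class PasscodeGuard:
    def __init__(self, wipe_after=10):
        self.failed_attempts = 0
        self.wipe_after = wipe_after
        self.wiped = False

    def try_passcode(self, guess, actual, sleep=time.sleep):
        if self.wiped:
            raise RuntimeError("device wiped")
        if guess == actual:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1                              # counter a patched build would skip
        sleep(ESCALATING_DELAYS.get(self.failed_attempts, 0))  # delay it would NOP out
        if self.failed_attempts >= self.wipe_after:
            self.wiped = True                                  # wipe it would remove
        return False
```

Patching out the marked lines is conceptually all the modified firmware would need to do; the sticking point, as above, is that installing such a patch requires it to be signed with Apple's key.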
People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries. See, for instance, any case involving a complicated medical issue.
Can you name some of them?
IANAL, but the Federal Rules of Evidence, Rule 706, explicitly state, "But the court may only appoint someone who consents to act."
The problem is different now than it was centuries ago when technology was oil, gas, and steam powered. Now it is electric, meaning information moves at the speed of light all around the world simultaneously.
Such technology is getting better, faster, and cheaper at exponential rates leading to ephemeralization. Meanwhile paper-based political processes have stalled and remain slow as ever.
Say a paper-based political system begins the process of banning a new technology. By the time they finally get around to completing their process, 5 better, faster, and cheaper technologies have already been invented, making the old one irrelevant. This is an increasingly important problem to deal with considering the existential nature of paper-based political processes and the rate of technological change.
Paper-based information systems and processes are simply too slow to keep up with the speed of the electric medium. It's like trying to race a lightning bolt.
In a way, this would resemble war-time confiscation of production capability. You have a car factory, for instance, and the government tells you that from now on, you'll be producing tanks, thank you very much.
I think this was not, strictly speaking, what happened in the US during the Second World War, because companies were willing enough to produce war material for the US government in exchange for considerable sums of money. But for instance, the Skoda factory in Czechoslovakia was simply confiscated and directed to making military vehicles.
This almost certainly ends up at the Supreme Court, but due to the national security implications of the case it's plausible a certiorari petition could be made by either Apple or the government. And yet we have a 4-4 court right now.
[1] Or really any small tweak. I can remember at least a couple of times being asked to provide someone with a special build of software I worked on that, e.g., logged something it didn't ordinarily log to help debug an issue a customer was having that we couldn't reproduce in-house. Can't say I ever thought of that as creating new software, even if I added some fprintf statements that weren't there before.
It is being asked to help exploit an existing backdoor in a device they produced; a device that was used by a person who slaughtered many people.
I really believe that there should be a way for law-enforcement to get access to specific devices in response to a court order as long as the solution doesn't involve weakening the encryption for everybody else.
I'm absolutely against backdoors, secret* keys, or similar crap. But physically accessing a single device in order to make brute-forcing it possible seems acceptable to me, as that won't affect any other device.
That would be similar to a court order allowing law enforcement to enter your premises and take out the safe in order to pry it open at some other location where specialised equipment is available.
If this is all law enforcement wants, then maybe it's time to hand this over before law enforcement wants even more which will doubtless pave the way for mass surveillance of devices.
* until they leak. Then everybody has access.
The FBI is already paving that way with this case. They don't really care about access to this particular iPhone. They're taking this case through the courts so that they can establish a precedent that allows them to force manufacturer cooperation to unlock any phone.
Edit: If they really cared about access to this individual phone, they wouldn't be going through the courts to get it; they'd be talking to the NSA TAO or another LEO with advanced forensic capability. As several people have pointed out, this iPhone 5C does not have a Secure Enclave and probably does not present a significant challenge to forensically analyze, for people who know what they're doing. They're going through the courts on this so they can get carte blanche to access iPhones 5S and above, which no LEO currently has the capability to inspect.
Further edit: This is Farook's work phone. His main, personal phone was found destroyed in a dumpster near the site of the attacks. I find it incredibly unlikely the FBI really cares much about the contents of this individual phone, they just want a high-profile test case to expand their surveillance capabilities.
This is an analysis, not an objective and demonstrable fact.
I could just as well argue that yes, the FBI really does care a lot about this particular iPhone, and that's why the asked-for update is to be keyed to this iPhone and only this iPhone.
At the same time, even assuming that is true, we're talking about the FBI going through a legal process, reviewed by a judge, to get the data off one phone at a time. If that's how it works every time, I don't see a problem; that is how the system is supposed to work. I am kind of baffled as to why we're cheerleading the fact that Apple is refusing to perform what appears to be a perfectly reasonable request that is being made in accordance with the law. If you are operating under the presumption that the government is always a bad-faith actor, then we have much, much bigger problems.
Also, apparently this 'precedent' has already been set; according to a link in the article, Apple had previously offered custom firmware images to law enforcement after a court order that bypassed the lock screen on earlier iPhones.
http://www.cnet.com/news/how-apple-and-google-help-police-by...
Apple itself chose to have total control over the device, signing, and ecosystem - now it's backfiring.
I would think it's more similar to the following: the government has gone to the safe manufacturer to help it open a safe it has a warrant for, has access to, and can move, but can't open.
The government is asking it to develop a method to modify the safe so that it can be opened. The safe manufacturer says that if it did so, the same method could be used on all of their safes, thereby making all of their products less secure.
I would imagine that a reasonable safe manufacturer would bring up the same objection.
If the safe manufacturer doesn't want to be put in this position, it should make it so there is no such modification possible. Which as far as I understand it is what Apple did with their safes^H^H^H^H^Hphones starting with the A7 CPU, but this phone is older.
This requirement is self-contradictory. The device has no way to determine whether the attacker trying to gain access is a good guy or a bad guy.
Which law enforcement? The FBI? Really? how about the DEA? TSA? How about the federal police in China? Venezuela? Saudi Arabia? Syria?
Were I in Apple's position, I would probably do what Apple is doing here... but it's a harder question than just "should we cooperate?" They have to ask, "what if we don't?"
Sure, they can try brute-forcing it, or even break open the chips and try to extract keys with an electron microscope. That's all within their domain. But why should anyone be forced to assist them?
A safe is a container filled with physical objects: property. Property is subject to search and seizure with appropriate warrants, levies, writs, orders, etc.
A phone is a container filled with information. The only physical property relevant to evidence consists of the electromagnetic state of the memory on the device. This would be no different from the bioelectric state of the neurons in the human brain, which coincidentally, also is a container filled with information. In both cases, there seems to be easy precedent to state that the information in those containers represents protected information, as it pertains to the possibly incriminating testimony of that information.
A safe can be physically removed and brought to a place where there are more specific tools available to access its physical contents. I stipulate to that.
A phone can also be physically removed and brought to a place where ... What? What tools exist to interrogate the electromagnetic state of the phone that aren't already accessible? Asking Apple to create some software allowing them to unlock and read the information is tantamount to asking a neuroscientist to create software allowing them to unlock and read your mind.
Not trolling, these are my sincere beliefs. Are they wrong?
Nobody can bypass the security on an iPhone but Apple. Requiring them to do so isn't requiring a company to assist in a search; it's requiring a company to use its unique position as the manufacturer to damage their product.
Unfortunately that's exactly what they're going to end up doing with this faux resistance. It seems like in this case there is a master key that only Apple has. If the device's security is broken in this manner, then this is a terrible place to make a stand, as Apple will have no choice but to eventually comply.
Next time, with an actually secure implementation, the stance will be "you protested last time and gave in, do that again". And when USG realizes Apple isn't bluffing that time, their bolstered entitlement will result in the inevitable law for Apple to go back to the backdoored nearly-just-as-secure scheme.
To the first order, USG doesn't care about the argument that foreign governments could also compel Apple, since that simply reduces to traditional physical jurisdiction. And governments seem to be more worried about protecting themselves from their own subjects than from other governments.
We can only hope that the resulting legal fallout is implemented in terms of the standard USG commercial proscriptions based on the power of default choices, leaving Free software to continue to be Free.
Apple could be forced to write software that removes the rate limiter and the FBI could still be stuck without access because it's possible the user used a password with too much entropy.
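For scale, a quick back-of-envelope calculation. Assuming roughly 80 ms per attempt for the hardware-bound key derivation (a figure Apple has cited for its devices; treat it as an assumption here), removing the software rate limits makes short PINs trivial but leaves a high-entropy password far out of reach:

```python
# Worst-case brute-force time once software rate limits are removed,
# bounded only by ~80 ms of hardware key derivation per attempt
# (assumed figure, for illustration).
def worst_case_seconds(alphabet_size, length, seconds_per_try=0.08):
    return (alphabet_size ** length) * seconds_per_try

print(worst_case_seconds(10, 4) / 60)                 # 4-digit PIN: ~13 minutes
print(worst_case_seconds(10, 6) / 3600)               # 6-digit PIN: ~22 hours
print(worst_case_seconds(62, 8) / (3600 * 24 * 365))  # 8-char alphanumeric: ~550,000 years
```

So the FBI's request only pays off if the user chose a short numeric passcode; against a decent passphrase the rate limiter was never the binding constraint.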
Do people think this is a game? Apple doesn't run things, the federal government does, and will, in the end, use it's full power to get what it desires.
EFF to Support Apple in Encryption Battle
https://www.eff.org/deeplinks/2016/02/eff-support-apple-encr...
China's behaviour made HSBC, the world's 5th-largest bank, move to London. It's not an unheard-of move.
Apple may have to comply with this order (after appeals), but this also helps muster the troops for the battle against universal backdoors.
Also, "its", without an apostrophe.
If true, precedent has already been set.
It probably wouldn't apply as precedent, since it previously had nothing to do with encryption.
Specifically (emphasis mine):
> ...the Self-Incrimination Clause ... may be asserted only to resist compelled explicit or implicit disclosures of incriminating information. Historically, the privilege was intended to prevent the use of legal compulsion to extract from the accused a sworn communication of facts which would incriminate him.
-and-
> ...the act of producing documents in response to a subpoena may have a compelled testimonial aspect. We have held that “the act of production” itself may implicitly communicate “statements of fact.” By “producing documents in compliance with a subpoena, the witness would admit that the papers existed, were in his possession or control, and were authentic.”
-and-
> Compelled testimony that communicates information that may “lead to incriminating evidence” is privileged even if the information itself is not inculpatory.
† https://www.law.cornell.edu/supct/html/99-166.ZO.html
EDIT: Wikipedia summary of the case here: https://en.wikipedia.org/wiki/United_States_v._Hubbell
As a side note, the author mentions that Apple has updated the Secure Enclave with increased delays in the past without wiping data, though they state that only Apple knows how it really works. I just want to put forth the theory that maybe the Secure Enclave allows its firmware to be updated if and only if the user's passcode is provided at the time the OS tells the Secure Enclave to prepare for a firmware update. That would be a reasonable way to ensure the Secure Enclave can't be subverted.
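That theory can be sketched as a tiny state machine. To be clear, this is pure speculation (as the author says, only Apple knows how the Secure Enclave really behaves): the model assumes an update installed without the passcode destroys the enclave's keys rather than being refused outright.

```python
# Speculative model (an assumption, not Apple's actual design): the
# Secure Enclave accepts a firmware update without destroying its keys
# only if the user's passcode was verified first.
class SecureEnclaveModel:
    def __init__(self, passcode):
        self._passcode = passcode
        self._unlocked = False
        self.keys_intact = True
        self.firmware = "v1"

    def verify_passcode(self, attempt):
        self._unlocked = (attempt == self._passcode)
        return self._unlocked

    def apply_update(self, new_firmware):
        if not self._unlocked:
            self.keys_intact = False   # unauthorized update: keys are lost
        self.firmware = new_firmware
```

Under this model a coerced update could still be installed, but installing it would destroy the very keys needed to decrypt the data, which is what makes a passcode-gated update path an attractive defense.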
There are Android full disk encryption schemes, and of course phones with signed bootloaders.
https://nerdland.net/unstumping-the-internet/pattern-unlock-...
http://www.extremetech.com/mobile/216560-android-6-0-marshma...
What Apple did that was so valuable is providing a very clear, almost abstract implementation, from scratch, hitting every point along the way (randomized device private keys, read and execute only Secure Enclave, signed loaders, proper AES(-XTS?) full disk encryption, probably also requiring a strong password, full lock after ~48 hours - sure, it'd be good if this could be customized to something lower).
I still remain opposed to any kind of circumvention that reduces security, which this definitely does. Just questioning whether it's something to be so shocked about since it's not exactly far removed from the kind of requests that they have conformed to in the past.
Presumably this could then be used for offline attacks against the image dumped from the phone's flash memory.
Update: In the meantime I was talking to my security engineer peers, and it is not feasible to carry out an attack this way. The user partition remains unmounted until the PIN is provided after boot.
It's encrypted data using the PIN (and a key embedded in the phone). There's not another way in.
All an attacker would have to do is clone the contents of the device's SSD and somehow read the secret key that is embedded somewhere else. I'm not sure how feasible the latter part is, but surely this shouldn't be beyond the capabilities of US three-letter agencies?
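The reason cloning the flash alone doesn't help can be shown in a few lines: the filesystem key is derived from both the PIN and a per-device UID key fused into the silicon, which by design can be used for crypto operations but never read out. A rough illustration (the real derivation uses hardware AES, not PBKDF2, and the constants here are made up):

```python
import hashlib, os

DEVICE_UID = os.urandom(32)  # stands in for the unreadable hardware key

def derive_fs_key(pin: str, uid: bytes) -> bytes:
    # Entangle the user's PIN with the device-unique key.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), uid, 100_000)

key = derive_fs_key("1234", DEVICE_UID)
# Same PIN, cloned flash, but a different (or unknown) UID: wrong key.
assert derive_fs_key("1234", os.urandom(32)) != key
```

So an offline attack on a dumped image still has to guess the UID's 256 bits, not just the PIN, which is why the brute force has to run on the device itself.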
So Pandora's box has been opened?
After too many incorrect pins, there's a time delay before another attempt can be made.
The FBI is also thinking about the next time. They want to be able to take a phone, plug it in and brute force the PIN to gain access.
I agree with the first post. We need to be creative and find a way to resist government surveillance, and the engineering problem that seems impossible is allowing an occasional breach of security for extreme circumstances.
What's extreme? Well first, physical possession of the device should be required. Second, it should take resources only a nation-state would be able to afford. Want to decrypt an iPhone? It's going to cost > $5million in processing power. Any criminal would move on.
It also seems pretty disingenuous/hypocritical for Apple to plead "customer privacy" when the ENTIRE BUSINESS MODEL of much of the smartphone and app industry (from which Apple directly benefits with a 30% commission) is predicated on abusing customer privacy.
Apple uses privacy as a major selling point. Apple has also proven very bad at abusing customer privacy for profit; they have even shuttered their own advertising service.
They should already have this information from their 'metadata' collection programs. If the FBI was doing their job at any point this wouldn't even be a subject of debate.