I have nothing but respect for Apple's stance with regard to cryptography, but Google has been more instrumental in getting strong crypto deployed on the Internet, and, just as importantly, in sweeping the minefield of crappy 90s crypto that defined most Internet crypto until recently.
TLS for email is still in pretty bad shape but it's getting better. (Funny, I just noticed that Google's page says "Safe Browsing" while only "Safer Email".) I know you're not a fan of DNSSEC, but something like Secure SMTP via DANE is probably needed for meaningful improvement: https://tools.ietf.org/html/draft-ietf-dane-smtp-01 (though it won't help with the chicken-and-egg problem of domain ownership validation by email)
I'm sure Cook believes what he's saying, but the real marketing strategy here isn't "crypto versus plaintext"; it's "consumer product company" versus "online service provider".
Seen through this lens, there's an argument that what Cook is doing is counterproductive. He's making an argument that Google can't sign on to, and using crypto as a wedge to drive the argument home. "Be a consumer product company, because then you can protect users with crypto".
Also: the kind of encryption Apple is really making a stand for? Apple does a better job of it than Android, but Android provides the same encryption. What scares the USG about Apple is that iPhones are locked by default, and when they're locked, they can't be imaged easily. That's true of Google's phones as well.
Meanwhile, Google is doing a much better job of securing browser crypto than Apple is; Apple is almost an obstacle to better browser crypto.
Did you finish reading the article?
> Facebook’s WhatsApp has brought end-to-end encryption to more people – over 800 million – than any other service; and Google’s engineering team has been a leader in securing much of the web in the post-Snowden era.
And then they go on saying:
> But this is much more than an engineering fight – it’s a political one where public opinion is crucial.
Do you think the average voter knows what ECC forward-secure TLS is? Heck, I'd like to think I kind of know a little about the subject but I know _nothing_ compared to you and a bunch of other HNers.
But unfortunately, we live in a society where people who can vote are really scared of terrorism and lack an understanding of how technology works. If a politician tells them we need to decrypt "all the things" for their safety they'll happily vote for them[0].
We need the celebrities of the tech world to reach out and explain why we need crypto in a way they can understand.
[0]: No link really, just watch any of Donald Trump's rallies and tell me if you think those people care about encryption.
As to the argument in the article, are there other examples of non-Tim Cook CEOs of big tech companies saying anything like this?
"But the reality is if you put a back door in, that back door's for everybody, for good guys and bad guys"
The closest I've found was a letter from many companies [1] which says "introducing intentional vulnerabilities into secure products for the government’s use will make those products less secure against other attackers." Google, Apple, Microsoft, Facebook and many others were signatories to that letter. So it certainly sounds like the companies might feel that way.
[1] https://static.newamerica.org/attachments/3138--113/Encrypti...
It seems that if you really want to guarantee privacy, you have to give the individual control over what they can install. Telling people to just "trust us" is not really good enough. And Cook is saying they are giving the user ultimate control by not having keys to their encryption, but in reality that's nonsense: they are still requiring people to trust them.
This is probably the most native looking one of the bunch: https://forecast.io/
Certificate pinning helps against the MITM problem, but code integrity for downloaded client-side code is pretty tricky. Browsers could add some form of signed code pinning for power users, but it'd be tricky to be able to distinguish between legitimate updates and nefarious activity.
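For static resources, browsers already offer something in this direction with Subresource Integrity, where the page pins a hash of the script it expects. A minimal sketch of the underlying check (an app-level illustration of the idea, not how a browser actually implements it):

```python
# SRI-style hash pinning: publish a hash of the expected script, and
# refuse to run downloaded code whose hash doesn't match. Illustrative
# only; real SRI is enforced by the browser via the integrity attribute.
import base64
import hashlib

def sri_digest(resource: bytes) -> str:
    """Compute an SRI-style integrity string, e.g. 'sha384-...'."""
    digest = hashlib.sha384(resource).digest()
    return "sha384-" + base64.b64encode(digest).decode()

def verify(resource: bytes, expected: str) -> bool:
    return sri_digest(resource) == expected

pinned = sri_digest(b"console.log('hello');")      # published with the page
assert verify(b"console.log('hello');", pinned)    # legitimate copy passes
assert not verify(b"exfiltrate(secrets);", pinned) # tampered copy fails
```

The hard part the comment points at remains: any legitimate update changes the hash, so someone still has to decide which new hashes to trust.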
Me, to CEO: Hey, think we should ever build a backdoor into any of our
products that employ encryption to help the US government
and law enforcement?
CEO, to me: No, that's a terrible idea.
Me, to CEO: Okay good, just making sure we're on the same page.
I don't think there are many honest and competent technology CEOs who would rally against encryption. At the end of the day, we care about our integrity more than we do dollars.

CEO, to me: Yes, because we are compelled by law backed by jail time or hefty fines.
There is no shortage of minds working to create backdoors, or to develop cryptographic methods that have backdoors; just look at Dual_EC_DRBG. It was a backdoor for the "good guys", but now it's a backdoor for everyone: eventually people will study the code and see that the backdoor exists.
The crux of the issue is that mathematics has no concept of good guys or bad guys, so as far as mathematics is concerned, a backdoor for anyone is a backdoor for everyone.
Also can't the role of the good guy be split up among a group? Similar to the two man rule to prevent rogue agents from launching missiles, can't we have some sort of process that requires agreement among a majority of a few parties including the end user, the company who owns the software, law enforcement, and the (public) judicial system. If all it takes to break down the door to my home are a judge and law enforcement to agree, why can't we accept similar when it comes to data?
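The multi-party idea described here maps onto threshold secret sharing: split the unlock key so that no single party holds it, and any K of N parties must cooperate to reconstruct it. A toy Shamir-style sketch (the parameters and the list of parties are illustrative assumptions, and a real deployment would use a vetted library):

```python
# Minimal Shamir secret sharing over a prime field: the key is split so
# that any k of n trustees can reconstruct it, but fewer learn nothing.
# Toy code for illustration; do not use for real secrets.
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def split(secret, n, k):
    """Return n shares of `secret`; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789
# e.g. n=5 trustees: user, vendor, judge, law enforcement, oversight body
shares = split(key, n=5, k=3)
assert reconstruct(shares[:3]) == key   # any 3 shares suffice
assert reconstruct(shares[2:]) == key
```

This answers the "two man rule" question mechanically; the political question of who the trustees are, and whether they can be coerced in bulk, is the part the scheme can't solve.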
Let's pause to consider this. Math works; that's why even the NSA can't break encryption. I wouldn't want to mislead you and say that it's impossible (it's not), but it would take something like ten billion years to crack. Needless to say, there's a reason why they need a backdoor, and that's because math works.
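For scale, here's the back-of-envelope arithmetic behind claims like this, with an assumed (and generous) guess rate; the trillion-guesses-per-second figure is an illustrative assumption, not a benchmark:

```python
# Time to brute-force a 128-bit key at an assumed rate of a trillion
# guesses per second. The rate is a generous illustrative assumption.
keyspace = 2 ** 128
guesses_per_second = 10 ** 12
seconds_per_year = 3600 * 24 * 365

years = keyspace // guesses_per_second // seconds_per_year
print(f"{years:.3e} years")  # on the order of 10**19 years

# Even "ten billion years" understates it at this rate; the universe's
# ~1.4e10-year age is a rounding error next to the keyspace.
assert years > 10 ** 10
```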
However, if a backdoor were put into all electronic products, the strength of the encryption would become meaningless, as any would-be attacker (government or otherwise) would just target the backdoor instead of trying to break the encryption. Why wait ten billion years for a computer to brute force the message when you could just find a flaw in something designed by the government?
"The problem with cryptographic backdoors isn't that they're the only way that an attacker can break into our cryptographic systems. It's merely that they're one of the best. They take care of the hard work, the laying of plumbing and electrical wiring, so attackers can simply walk in and change the drapes."
It's perfectly possible and trivial to put in a backdoor that only works for people who have access to a specific private key.
Obviously if that private key gets stolen anyone can then access the backdoor, but that's true for anything, and you can mitigate it by storing the key in self-destructing immovable hardware with access limitations, as well as periodically changing the keypair (with signed updates).
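The "backdoor that only works for a specific private key" idea is essentially key escrow: wrap the message key to an extra public key held by the escrow agent. A toy sketch of the mechanism (textbook RSA with tiny primes and an XOR stream cipher, deliberately insecure, all names mine):

```python
# Toy key escrow: the session key is encrypted both to the recipient and
# to an escrow public key, so only the escrow key holder can open the
# "backdoor". Textbook RSA with tiny primes; illustration only.
import hashlib

def keygen(p, q, e=65537):
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (n, e), (n, d)

def rsa_encrypt(pub, m):  n, e = pub;  return pow(m, e, n)
def rsa_decrypt(priv, c): n, d = priv; return pow(c, d, n)

def stream(key_int, data):       # toy XOR stream keyed by a hashed key
    pad = hashlib.sha256(str(key_int).encode()).digest()
    return bytes(b ^ pad[i % 32] for i, b in enumerate(data))

# Two independent keypairs: the recipient's and the escrow agent's.
recipient_pub, recipient_priv = keygen(1000003, 1000033)
escrow_pub, escrow_priv = keygen(1000037, 1000039)

session_key = 123456789          # would be random in practice
ciphertext = stream(session_key, b"meet at noon")
wrapped_for_recipient = rsa_encrypt(recipient_pub, session_key)
wrapped_for_escrow = rsa_encrypt(escrow_pub, session_key)  # the backdoor

# Either private key recovers the session key, hence the plaintext.
recovered = rsa_decrypt(escrow_priv, wrapped_for_escrow)
print(stream(recovered, ciphertext))  # b'meet at noon'
```

The mechanism is straightforward; as the rest of this comment says, everything hard about it is deciding who holds `escrow_priv` and what happens when it leaks.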
The real problem is that there is no single "good guy" to entrust with that private key: humans are inherently not fully trustworthy, and both individual consumers and foreign governments have no interest in using or allowing backdoored products.
That said, all Apple would have to do to fix this is to allow advanced users to see all keys listed as authorized for their account. I'm getting increasingly annoyed Apple hasn't done that.
> Apple could collaborate with law enforcement to provide a false key, thereby intercepting a specific user’s messages, and the user would be none the wiser.
Key word is "could". Apple "could" also use its signing keys to install any kind of software on your phone to do whatever it wants. For example, to read your keychain and pull your private keys.
Foreign govts? Rather, "against the constant threat of criminal governments and hackers."
Off the top of my head:
SSO ID Service / Cloud Photo Storage / Cloud Document Sync / Cloud Backup / Email / Instant Messaging / Music Store / Music Streaming Service / Cloud Music Service / Movie/TV Store / App Store / Push Notifications / Payments / Video Conferencing / Game Centre / eBook Store / Shared Calendaring / Notes / Large File Sharing / Personal Assistant / Maps
Weak?
It may be in Apple’s financial interest to do so, but it’s also the right thing to do. As you (partially) say, they care about the user experience. That experience includes taking steps to protect user information, and they’ve had a long track record of doing just that. They did this long before it probably had any noticeable effect on the bottom line.