The problem is that no one else has gone through the process of establishing a community/standard that's capable of replacing it. Each system has built its own walled garden that doesn't interoperate with anyone else's, and none of them has a user base large enough to make encryption easy and pervasive.
My own secret hope is that Apple is forced to open iMessage as a part of an anti-trust action and that acts as a catalyst for interoperability.
Do you want open source clients that can be altered to ignore all privacy criteria — or do you want closed-source clients that make a good faith effort to adhere to auto-deletion protocols?
Pick one. There is no middle ground.
If you ("you" here being the PGP team) knew going into the design that the use-case of ASCII-armored-binary (.asc) documents is specifically transmitting them in a MIME envelope... then, instead of making .asc into its own hierarchical container format, why didn't you just use MIME, which is already a hierarchical document container format?
I.e., if you're holding some plaintext and some ASCII-armored-binary ciphertext, why not just make those into the "parts" of a multipart MIME container, and send that as the email?
Then all the work of decoding—or encoding—this document hierarchy would be the job of the email client. The SMIME plugin would only have to know how to parse or generate the leaf-node documents that go into the MIME envelope (and require of the email client an API for retrieving MIME parts that the SMIME parts make reference to.)
And you'd also get the advantage of email clients showing "useful" default representations for PGPed messages, when the SMIME extension isn't installed.
• Message signature parts would just be dropped by clients that don't recognize them. (Which is fine; by not having SMIME installed, you're opting out of validating the message, so you don't need to know that it was signed.)
• Encrypted parts would also be dropped, enabling you to send an "explanation" part as plaintext of the same MIME type as the "inner" type of the encrypted part, explaining that the content of the message is encrypted.
I guess this wouldn't have worked with mailing lists, and other things completely ignorant of MIME itself? But it would have been fine for pretty much all regular use of SMIME.
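The idea above can be sketched with Python's stdlib email package. This is a hypothetical illustration, not what any PGP plugin actually emits — the addresses, armor contents, and the `application/pgp-encrypted` subtype for the data part are all assumptions for the sake of the example:

```python
from email.message import EmailMessage

# Sketch of the scheme described above: the ciphertext and a plaintext
# "explanation" part travel as sibling parts of one MIME container, so
# clients without the plugin still render something useful.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "hello"

# The "explanation" part, shown by clients that drop the encrypted part.
msg.set_content("This message is encrypted; install a PGP plugin to read it.")

# The ASCII-armored ciphertext as its own part (contents are placeholder).
armored = (
    "-----BEGIN PGP MESSAGE-----\n"
    "hQEMA...placeholder ASCII armor...\n"
    "-----END PGP MESSAGE-----\n"
)
msg.add_attachment(armored.encode("ascii"),
                   maintype="application", subtype="pgp-encrypted",
                   filename="message.asc")

assert msg.is_multipart()            # adding the attachment made it multipart/mixed
assert len(msg.get_payload()) == 2   # explanation part + ciphertext part
```

A plugin-less client would show the first part and list the second as an attachment, which is exactly the graceful-degradation behavior the bullets above describe.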
Using our library, you can generate PGP keys using any key derivation mechanism for a large variety of key types! Used right, this greatly improves how you can generate and back up your keys!
E.g., things distinguishing how email is used from how text-messaging is used:
1. Email is potentially long-form. I sit down and type it from my computer. Text-messaging is always short, although possibly it's a series of short messages. A series of short emails, by contrast, is an annoyance; it's something you try to avoid sending (even though you inevitably do when it turns out you got something wrong). Similarly, you don't typically hold rapid-fire conversations over email.
2. On that point, email says: you don't need to read this immediately. I expect a text message will probably be read in a few minutes, and probably replied to later that day (if there's no particular urgency). I expect an email will probably be read in a few hours, and probably replied to in a few days (if there's no particular urgency).
3. It's OK to cold-email people. To text someone you need their phone number; it's for people you know. By contrast, email addresses are things that people frequently make public specifically so that strangers can contact them.
So what am I supposed to do for secure messaging that replicates that? The best answer I've gotten for this so far -- other than PGP which is apparently bad -- is "install Signal on your computer in addition to your phone and just use it as if it's email". That's... not really a satisfactory answer. Like, I expect a multi-page Signal message talking about everything I've been up to for the past month to annoy its recipient, who is likely reading it on their phone, not their computer. And I can't send someone a Signal message about a paper they wrote that I have some comments on, not unless they're going to put their fricking phone number on their website.
So what do I do here? The secure email replacement just doesn't seem to be here yet.
We need a modern design for a successor protocol to email, and no one is working on it because they prefer instant messaging (or think other people do).
Everything would have to support SMTP as a fallback for the mass of people who just don't care, and thus couldn't actually improve on it.
One day, a while after they become usable and common, people will just realize they've been sharing documents E2EE in place of sending email, and they'll be using it for basically everything that matters.
It would be a proper restart and allow for significant improvements in usability and security and everything else.
Not OP but I can definitely say that's a yes from me after doing this repeatedly. I cold email people once a month or so, and if it is to do with anything sensitive I'll check to see if a public key is available for them (on their website is best, else I check a public key server and use that key as long as there is only one listed).
I get a better response rate from PGP/GPG users too, I can only recall one not responding to an encrypted message and I sent a follow-up message unencrypted which they responded to.
I think it's important to send PGP messages for ordinary communications whenever possible, because this normalizes it and may increase the workload for those trying to defeat it.
Honestly, I wouldn't focus on (3), because as I see it, if you can replicate the feel of email, things like (1)-(2), so that it can replace email in contexts without (3), then (3) will just come naturally as it slowly replaces email.
Edit: All this is assuming it isn't tied to a phone number or something similar, of course!
I just don't get the encrypted email obsession. It's impossible for an individual to withstand a targeted cyber attack, so it seems pointless to go above and beyond to ultra-encrypt every little thing.
Well, first of all, "breaking in" isn't the only way someone might get access to data on Google's servers. There are such things as subpoenas, not to mention that a Google employee might abuse access to the servers. And then I would be _very_ surprised if Google doesn't use the content of your emails for advertisement and tracking purposes.
Furthermore, unless both parties are using gmail, the email will be stored at least temporarily on other mail servers, which may be less secure (and you might not even know who controls them).
So the only people who can read email are you, your counterparty, your ESP, and your counterparty's ESP, assuming the email providers are following good practice.
It's actually more complicated than that.
If you're using a web mail, your connection to the mail provider most likely uses HTTPS. That is, HTTP over TLS. When the mail is sent, it depends whether the recipient uses the same provider or not. If it's the same provider, well, protocols are irrelevant. If not, it will usually be SMTP over TLS (minus any potential problems with STARTTLS).
The main problem with that is that the mail is not encrypted on the various servers it goes through. Only the server-to-server connections are encrypted. So your provider can access your email, and so can the recipient's. When that provider's business model is reading your emails so it can send you targeted ads, this is less than great. (Yes, Google reads your emails. They try to reassure you by telling you their employees don't read them, but the fact that the process is automated actually makes it worse.)
It is possible for a determined individual to withstand targeted attacks if he’s careful and willing to make the sacrifices that come with the territory.
In CFB mode, for the first block, you take an IV, encrypt it, XOR it with plaintext. Second block: you take the first ciphertext block, encrypt that, XOR with the second plaintext block, and so on. It feels sorta halfway between CBC and CTR.
Here's the process in OpenPGP, straight from the spec because I can't repeat this without being convinced I'm having a stroke:
1. The feedback register (FR) is set to the IV, which is all zeros.
2. FR is encrypted to produce FRE (FR Encrypted). This is the
encryption of an all-zero value.
3. FRE is xored with the first BS octets of random data prefixed to
the plaintext to produce C[1] through C[BS], the first BS octets
of ciphertext.
4. FR is loaded with C[1] through C[BS].
5. FR is encrypted to produce FRE, the encryption of the first BS
octets of ciphertext.
6. The left two octets of FRE get xored with the next two octets of
data that were prefixed to the plaintext. This produces C[BS+1]
and C[BS+2], the next two octets of ciphertext.
7. (The resynchronization step) FR is loaded with C[3] through
C[BS+2].
8. FR is encrypted to produce FRE.
9. FRE is xored with the first BS octets of the given plaintext,
now that we have finished encrypting the BS+2 octets of prefixed
data. This produces C[BS+3] through C[BS+(BS+2)], the next BS
octets of ciphertext.
10. FR is loaded with C[BS+3] to C[BS + (BS+2)] (which is C11-C18
for an 8-octet block).
11. FR is encrypted to produce FRE.
12. FRE is xored with the next BS octets of plaintext, to produce
the next BS octets of ciphertext. These are loaded into FR, and
the process is repeated until the plaintext is used up.
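For the morbidly curious, here's a toy transcription of those steps. Assumptions are flagged: the HMAC-based `E` is a stand-in for the real block cipher (OpenPGP uses AES, CAST5, etc.; CFB only needs the forward direction), and the decryptor's prefix-repetition check is included because that redundancy is what the spec's two extra octets provide — none of this is production crypto:

```python
import hmac
import hashlib
import os

BS = 16  # block size in octets

def E(key, block):
    # Stand-in for the block cipher's forward direction.
    return hmac.new(key, block, hashlib.sha256).digest()[:BS]

def openpgp_cfb_encrypt(key, plaintext):
    rand = os.urandom(BS)
    prefix = rand + rand[-2:]                 # BS random octets, last two repeated
    FR = bytes(BS)                            # 1. IV is all zeros
    FRE = E(key, FR)                          # 2.
    C = bytearray(p ^ k for p, k in zip(prefix[:BS], FRE))    # 3.
    FR = bytes(C)                             # 4.
    FRE = E(key, FR)                          # 5.
    C += bytes(prefix[BS + i] ^ FRE[i] for i in range(2))     # 6. two odd octets
    FR = bytes(C[2:BS + 2])                   # 7. the resynchronization step
    for i in range(0, len(plaintext), BS):
        FRE = E(key, FR)                      # 8./11.
        block = bytes(p ^ k for p, k in zip(plaintext[i:i + BS], FRE))  # 9./12.
        C += block
        if len(block) == BS:
            FR = block                        # 10.
    return bytes(C)

def openpgp_cfb_decrypt(key, ciphertext):
    FR = bytes(BS)
    FRE = E(key, FR)
    prefix = bytes(c ^ k for c, k in zip(ciphertext[:BS], FRE))
    FR = ciphertext[:BS]
    FRE = E(key, FR)
    prefix += bytes(ciphertext[BS + i] ^ FRE[i] for i in range(2))
    if prefix[BS:BS + 2] != prefix[BS - 2:BS]:
        raise ValueError("quick check failed")  # the "authenticator" the parent mocks
    FR = ciphertext[2:BS + 2]                 # mirror the resync step
    pt = bytearray()
    for i in range(BS + 2, len(ciphertext), BS):
        FRE = E(key, FR)
        block = ciphertext[i:i + BS]
        pt += bytes(c ^ k for c, k in zip(block, FRE))
        if len(block) == BS:
            FR = block
    return bytes(pt)
```

You can see the "off by two" in code: after step 7 the feedback register straddles two ciphertext blocks forever after.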
Yeah, so it's CFB except your IV isn't your IV, and you randomly do something with two bytes as... an... authenticator? And then everything after that is off by two? This isn't the only case where OpenPGP isn't just old, it's old and bizarre. I don't have a high opinion of PGP to begin with, but even my mental model is too charitable.

(Disclaimer: I'm a Latacora partner; I didn't write this blog post but did contribute indirectly to it.)
> Put a Signal number on your security page to receive bug bounty reports, not a PGP key.
We can reasonably assume in 2019 that this "security page" is from an HTTPS web site, so it's reasonably safe against tampering, but a "Signal number" is just a phone number, something bad guys can definitely intercept if it's worth money to them, whereas a PGP key is just a public key and so you can't "intercept" it at all.
Now, Signal doesn't pretend this can't happen. It isn't a vulnerability in Signal, it's just a mistaken use case, this is not what Signal is for, go ask Moxie, "Hey Moxie, should I be giving out Signal numbers to secure tip-offs from random people so that nobody can intercept them?".
[ Somebody might think "Aha, they meant a _Safety number_ not a Signal number, that fixes everything right?". Bzzt. Signal's Safety Numbers are per-conversation, you can upload one to a web page if you want, and I can even think of really marginal scenarios where that's useful, but it doesn't provide a way to replace PGP's public keys ]
Somebody _could_ build a tool like Signal that had a persistent global public identity you can publish like a PGP key, but that is not what Signal is today.
The Signal blog states that "we designed the safety number format to be a sorted concatenation of two 30-digit individual numeric fingerprints." [1]
The way I understand it, you could simply share your part of the number on your website, but Moxie recommends against it, since this fingerprint changes between reinstalls.
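The sorted-concatenation shape is easy to sketch. One loud assumption: Signal actually derives each 30-digit half by iterated hashing over the identity key and an identifier, so the single SHA-256 pass below is a stand-in that only illustrates the structure, not the real derivation:

```python
import hashlib

def numeric_fingerprint(identity_key: bytes, digits: int = 30) -> str:
    # Stand-in derivation: hash the key once and take leading decimal
    # digits. The real scheme iterates a hash many times over the key.
    n = int.from_bytes(hashlib.sha256(identity_key).digest(), "big")
    return str(n)[:digits].zfill(digits)

def safety_number(key_a: bytes, key_b: bytes) -> str:
    # Sorting makes the result order-independent, so both parties
    # compute the same 60-digit number for their conversation.
    fps = sorted([numeric_fingerprint(key_a), numeric_fingerprint(key_b)])
    return fps[0] + fps[1]
```

This also shows why publishing "your half" is conceivable: your 30-digit fingerprint depends only on your own identity key — until a reinstall rotates that key, which is exactly the caveat above.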
It's clunky, but less so than I feared. I can actually imagine a person doing this. I mean, they won't, but like PGP this is something a person _could_ do if they were motivated and competent.
Certificate Transparency could be reused/abused to host it. If, for example, you issued a cert for name <key>.contact.example.com and the tooling would check CT logs this could be a very powerful directory of contacts. Using CT monitors you could see if/when someone tampers with your domain name contact keys.
Mozilla is planning something similar for signing software: https://wiki.mozilla.org/Security/Binary_Transparency
Does anyone actually do this? Even Signal developers themselves don't! (see https://support.signal.org/hc/en-us/articles/360007320791-Ho...). Instead there is a plain old email address where you are supposed to send your Signal number so that you can chat.
You need to check their “safety number”, and now we’re back to the same idea as with PGP with web of trust and key sharing parties.
At some point you still need some kind of pub-key identity check if you don’t want to accidentally report your vulnerability to PRC instead.
WhatsApp already has a key extraction protocol built right in for its Web interface. Signal has a web (Electron) interface as well, and a shitty one at that, where the messages also get decrypted. For WhatsApp, this means you're one line of code away from Facebook extracting your private keys.
Signal is different, in that they're not a for-profit company. However, they've shown in the past that they are under no circumstances willing to support any unofficial client or federate with another server. In fact, they've taken steps against alternative clients in the past, making it clear that only their client is allowed to use the Signal system. The moment the Signal servers go down, Signal becomes unusable. This also leaves Signal in the same position as WhatsApp, where we are dependent on one party compiling the app and publishing it on whatever app store you prefer. If Signal has any Australian contributors and their code review fails badly enough, you're basically toast the moment the Australian government gets sufficiently annoyed at a particular Signal user.
Very few real alternatives to PGP exist. PGP is not just a message encryption format, it's a _federated_ message encryption format. There are very few actual federated message standards that come close to the features PGP supports. There's S/MIME, but that's only of any use after paying an expensive company, because it's validated like a normal TLS certificate and the free cert providers don't do S/MIME.
If all of these "real cryptographers" disagreeing with PGP's design would design a new system that can be used the same way PGP is used, I'm sure we'd see that getting some good usage figures quite quickly. But all alternatives seem to focus on either signing OR encrypting OR the latest secure messaging app instead of a PGP replacement.
I don't believe this is correct. WhatsApp (and Signal AFAIK) web works by decrypting the original message on your phone, re-encrypting it with a different key that is shared with your web interface (this is what is being shared via the QR code when connecting to WhatsApp Web), sending it to the web client, and having your web client use the second key to decrypt. This is why your phone must continue to be powered on/connected to the network for the web service to work. The original key is never "extracted", and AFAIK can't be extracted by normal means.
There are a few apps that attempt to exploit a few security vulnerabilities to recreate your key for you if you lose it and need to access backups, but that isn't the same as what you're describing.
Still, it would take just one decision by Facebook to completely disable e2e or add an actual key extraction method to WhatsApp and there's nothing you can do about it. While WhatsApp is the most secure of all conventional chat apps, it's certainly not a replacement for PGP in most use cases.
CAs require you to trust people that aren’t supposed to be party to the communication (trust both not to be hostile, and not to be insecure themselves).
All other forms of PKI offer entirely impractical authentication mechanisms. With Signal and the like, your options are
1) Verify keys by being in the same room as the other party before communication, and after every key rotation
2) Just hope that the keys are genuine...
The only thing that you can trust is that the party you’re communicating with is one of potentially many holders of the correct key.
You don't seem to have understood what's going on in Signal. Ordinary key rotations, which happen automatically, do not change the verified status. What can happen is that another participant changes phone or wipes it, and so obviously trust can't survive that change.
The problem isn't that somebody else may know the correct key, the Double Ratchet takes care of that. The problem is that a Man-in-the-middle is possible. Alice thinks Mallory is Bob, and Bob thinks Mallory is Alice. Mallory can pass messages back and forth seamlessly, reading everything. Only actually verifying can prevent this.
You don't verify the encryption keys, that's useless because those change constantly, the verification compares the long term identity value ("Safety Number") for the conversation between two parties, which will be distinct for every such conversation. Mallory can't fake this, so if Alice and Bob do an in person verification step Mallory can't be in their conversation.
The problem is the key servers are run by the same people who control the app. This helps if the key server specifically gets compromised and the target is verifying, but for many attacks people worry about it's actually not the key servers specifically that get popped, it's an employee laptop or the employee themselves via subpoena, policy change etc. And for those cases nothing stops the app itself being changed to show you a false safety number, possibly by Apple without the app vendor even knowing.
So we end up with a rather curious and fragile threat model that only really helps in the case of a classical buffer overflow or logic error that grants an adversary the ability to edit keys and not much else. It's very far from "you don't have to trust the providers of Signal" which is what people tend to think the threat model is.
And honestly, a technique that combats very specific kinds of infrastructure compromise is too low-level IMO to bother advertising to users. The big tech firms have all sorts of interesting security techniques in place to block very specific kinds of attacks on servers, but they generally don't advertise them as primary features. If you have to trust the service provider, and with both Signal and WhatsApp you do, then are you really getting much more than with bog-standard TLS? After all, forward secrecy achieves nothing if the provider routing the messages is diligently deleting them after forwarding them to the receiving device — the feature only has value if you assume the provider is recording all messages to disk and lying about it, in the hope of one day being able to break the encryption of ... their own app. Hmmm.
OP mentions exactly this point in the "The Answers" (https://latacora.micro.blog/2019/07/16/the-pgp-problem.html#...) section.
Furthermore, Signal and WhatsApp do e2e in group chats, where Telegram doesn't.
Don't get me wrong, I use Telegram daily (its desktop clients far outperform any of its competitors'), but it's not as secure as WhatsApp or Signal.
I'd classify Telegram as "maybe secure" but I wouldn't recommend it to people depending on the security of their messenger application.
https://www.actalis.it/products/certificates-for-secure-elec...
- I don't want to be manually comparing hashes in 2019
- it locks me into Signal; I won't be able to verify a git commit from that person, for example
Is there a system that solves this? Keybase is trying but also builds on PGP, we can use S/MIME which relies on CAs but is not better than PGP. Anything else?
The underlying cryptography is NaCl, which is referenced in the original post.
But even that's still not quite what I'm looking for. There's no straightforward way to link arbitrary protocol accounts / identities to it, outside of linking plain URLs.
We need something a bit smarter than keybase that would actually allow you to maintain a single personal identifier across multiple protocols.
People often think it must be the opposite, but this is essentially emotional reasoning: the Web of Trust feels decentralised, social, "webby", "un-corporate", free, etc. All things that appeal to hobbyist geeks with socialist or libertarian leanings, who see encryption primarily through the activist lens of fighting governments / existing social power structures.
But there's nothing secure about the WoT. As the post points out, the entire thing is theatre. Effectively the WoT converts every PGP user into a certificate authority, but they can't hope to even begin to match the competence of even not very competent WebTrust audited CAs. Basic things all CAs are required to do, like use hardware security modules, don't apply in the WoT, where users routinely do unsafe things like use their private key from laptops that run all kinds of random software pulled from the net, or carry their private keys through airports, or accept an email "From" header as a proof of identity.
I wrote about this a long time ago here:
https://blog.plan99.net/why-you-think-the-pki-sucks-b64cf591...
> There is no scope for difference between a "big corporate" CA and a "small politically active" CA because the work they do is so mechanical, auditable and predictable.
There is room for a politically-active CA like there is for anything else. In each market, there's players that get business for doing better things for privacy, being eco-friendly, being more inclusive, etc. Things that get business from vote with your wallet types. My idea, inspired by Praxis doing Mondex's CA, was a non-profit or public-benefit company that had built into its charter and legal agreements many protections for the customers in a country without secret laws/courts like U.S. Patriot Act. The CA would also be mandated to use high-security approaches for everything it did instead of just HSM's. They might also provide services like digital notary.
In short, I can imagine more trustworthy and innovative CA's being made. I'd easily pay one like that over the rest. I'm sure there's some number of people and businesses out there that think the same way. I wouldn't try it as main business, though, since market is too cut-throat. My idea was a company like Mozilla would try it to see what happens. Let's Encrypt confirmed the non-profit, public-benefit part being feasible.
I haven't read your blog, but this sentence unfairly paints WoT with PGP/GPG's problems.
It's completely reasonable to have a WoT that operates correctly when at least a single participant isn't completely incompetent. That's how git works.
I haven't looked closely but I'd be willing to speculate that PGP is to WoT what C++ is to fast compile times.
With domain validation it is likely better to use DANE in the context of email. The sender looks up the key and MX record and acts accordingly, and for Postfix there are plugins that already do it. Very few current users, however.
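In Postfix specifically (2.11 and later), outbound DANE is a built-in option rather than a plugin. A minimal sketch of the relevant main.cf settings, assuming a local validating DNSSEC resolver is available:

```ini
# main.cf — opportunistic DANE for outbound SMTP.
# Falls back to opportunistic TLS when the destination publishes no TLSA records.
smtp_dns_support_level = dnssec
smtp_tls_security_level = dane
```

With this in place, Postfix looks up the recipient domain's MX and TLSA records and enforces the published certificate constraints, which is exactly the "look up the key and MX record and act accordingly" flow described above.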
We need something more expressive than the current CA system, where you can make the choice to define your own trusted roots.
If you do, "export GOGC=20" can help a little, but it'll still use a lot of memory.
I suspect that it's probably hard to add that functionality because you can't do the deduplication without decrypting the prior backups (or at least an index). That would also explain the memory usage JoshTriplett mentions.
It seems to be fine for my mediocre backup needs
Every time I look at git's documentation, GPG seems very entrenched in there, to the point that for things that matter I'd use signify on the side.
Is there a better way?
I'm getting the strong sense (see also my toplevel comment, and maybe someone will correct me and/or put me in my place) that there's an enormous disconnect between the open source + unix + hobbyist + CLI development communities, and the crypto community. The former set has almost no idea what the state of art in crypto is, and the latter (somewhat justifiably) has bigger fish to fry, like trying to make it so that non-command-line-using journalists have functional encryption that they can use.
I think this is a sociological problem, not a technical "using command-line tools makes Doing Crypto Right impossible".
The article mentions Signify/Minisign [1] as a PGP alternative.

[1] https://jedisct1.github.io/minisign/
That's the problem I see. I have signingkey in .gitconfig, together with [commit] gpgsign = true. This way, set & forget, all my commits are signed (it's my employer's requirement, probably some "compliance" stuff). You can see it right away, nicely displayed as "Verified" on GitHub. I didn't know about GPG's supposedly weak security until now, but I've always considered it not very convenient to use.
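For reference, the set-and-forget setup described here is just two settings (the key ID below is a placeholder — `gpg --list-secret-keys --keyid-format long` shows yours):

```ini
# ~/.gitconfig
[user]
    signingkey = 3AA5C34371567BD2
[commit]
    gpgsign = true
```

After that, every `git commit` is signed without any extra flags, which is what makes the compliance requirement painless day to day.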
This is not really helpful. For all its shortcomings, PGP is pretty much all we have. If used in a straightforward way it actually can protect email from nation state level actors for a significant time. That's gotta count for something.
We need a modern email replacement that is decentralized, federated, et al. Something that keeps all the modern cryptographers happy, while facilitating the same kind of long form conversations and federated self-hostability that email provides.
I think Matrix is getting there, but even that is still focused on instant messaging.
I am reminded of BGP (Border Gateway Protocol). Anyone who has even glanced at the RFC of BGP could write an essay on the horrible mess of compatibility hacks, extensions, and non-standard design in BGP. It also lacks any security considerations. The problem is that it is the core infrastructure of the Internet.
Defining something as insecure, with the implied statement that we should treat it as insecure, is unhelpful advice in regard to critical infrastructure. People are going to use it, continue to use it for the foreseeable future, and continue to treat it as secure. Imperfect security tools will be applied on top, imperfectly, but it will see continued use as long as it is the best tool we have in the circumstances. Email and BGP and a lot of other core infrastructure that is hopelessly insecure will continue to be used with the assumption that they can be made secure, until an actual replacement is made and people start to transition over (like how IPv6 is replacing IPv4, and we are going to deprecate IPv4 if you take a very long-term view of it).
It would be great if we could replace the whole Internet with modern technology rather than relying on ancient systems like BGP and email.
For some of our thoughts on the recent certificate flooding problems, see https://sequoia-pgp.org/blog/2019/07/08/certificate-flooding...
- For offsite backups (disaster recovery), mirroring object stores and filesystems to cheap cloud storage.
- For encrypting secrets needed for maintaining IT systems (e.g., all those shared passwords we never seem to be able to get rid of)
- For encrypting sensitive documentation for transfer (email attachment, shared via filesystem, shared via HTTP, shared via pastebin even)
Despite the awful UI, GnuPG does all of that in a standard way. We have tested disaster recovery with no more instructions than 'the files are in this S3 bucket'.
And the same tool is also useful for other tasks too:

- public key distribution (needs care to do it securely, but functional)
- commit signing, signed tags
- package signing (per Debian)
We could use custom or multiple tools for all this, but a single tool to learn is a big advantage.
I think all use cases boil down to 'encrypt and/or sign a file' for one of the stages. In the article, 'talking to people', 'sending files', 'encrypting backups' are all really just 'encrypt/sign a file' followed by transmission. And some sort of keyring management is needed for usability. A tool that can pull keys from a repository and encrypt and/or sign a file to a standard format could be used to build all sorts of higher level tools. I imagine it would be quite possible to build this on top of libsodium, and if it gained mindshare, replace uses of GnuPG.
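A sketch of that "encrypt and/or sign a file" core — with loudly flagged assumptions: the counter-mode SHA-256 keystream and the function names `seal`/`open_sealed` are toy stand-ins invented for this comment; a real tool would call libsodium's authenticated primitives instead. The shape (derive subkeys, encrypt, then MAC, verify before decrypting) is the point:

```python
import hashlib
import hmac
import os

def _keystream(key, nonce, n):
    # Toy counter-mode keystream over SHA-256. A real tool would use
    # libsodium (e.g. its secretstream construction) rather than this.
    out, ctr = bytearray(), 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:n])

def _subkeys(key):
    # Separate encryption and MAC subkeys derived from one file key.
    return (hashlib.sha256(key + b"enc").digest(),
            hashlib.sha256(key + b"mac").digest())

def seal(key, data):
    ek, mk = _subkeys(key)
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(data, _keystream(ek, nonce, len(data))))
    mac = hmac.new(mk, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + mac          # one standard blob: nonce || ct || tag

def open_sealed(key, blob):
    ek, mk = _subkeys(key)
    nonce, ct, mac = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(mac, hmac.new(mk, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("authentication failed")   # verify before decrypting
    return bytes(a ^ b for a, b in zip(ct, _keystream(ek, nonce, len(ct))))
```

Wrap key lookup around `seal`/`open_sealed` and you have the kernel that backups, secret sharing, and document transfer could all build on.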
Signal is also vulnerable to server-side traffic analysis, and is strangely keen on both demanding a selector (a phone number) for identity and on trusting Intel's 'secure' enclave (I strongly suspect that it's suborned).
One thing I do like about PGP is that it has been around a while: I can still decrypt my old files & verify old signatures just fine, something I don't trust the flavour of the month to do.
I think that rather than a Swiss Army knife tool & protocol like PGP, we should have a suite of tools built around a common core, probably NaCl. That way we can have compatibility when we need it, but also aren't cramming square pegs into round holes.
Finally, the Web of Trust was a bright, shiny idea — and wrong. But Trust on First Use is also pretty bad, as is the CA ecosystem. We need something else, something decentralised and also secure — and I don't think that's impossible. I've had a few ideas, but they haven't panned out. Maybe someday.
Yeah, this whole part is some 90s cypherpunk way of modeling human relations, which has never mapped onto any real world relationships. As soon as people had real world digital identities outside of their gokukillerwolfninja666 logins, this didn't help.
CA ecosystem might be fundamentally flawed, but WoT was a complete failure. So PGP users end up trusting some key server which is probably sitting under someone's desk and has been owned by every serious intelligence service since forever.
One of my numerous hobby projects is exactly that, but … I simply don't have enough Round Tuits.
Here's how it could work (unless you tear it apart):
Alice and Bob share two passwords out of band: Pa and Pb
Alice and Bob generate two key pairs ka/KA, and kb/KB
Alice sends KA, Bob sends KB
Alice and Bob compute ss = HASH(DH(ka, KB)) = HASH(DH(kb, KA))
Alice responds to KB with Ha = HMAC(Pb, KB || ss)
Bob responds to KA with Hb = HMAC(Pa, KA || ss)
Alice verifies Hb
Bob verifies Ha
The session key is HASH(ss) or something
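A runnable transcription of the exchange, under toy assumptions: classic finite-field DH over a deliberately small prime (2^127 − 1 is prime but far too small for real use — a real implementation would use X25519 or a standard RFC 3526 group), and the `confirm` helper is just the HMAC from the outline:

```python
import hashlib
import hmac
import secrets

P = 2**127 - 1   # toy prime, demo arithmetic only
G = 3

def keypair():
    k = secrets.randbelow(P - 3) + 2
    return k, pow(G, k, P)

def confirm(password: bytes, their_pub: int, ss: bytes) -> bytes:
    # H = HMAC(P_x, K_y || ss), as in the outline above
    return hmac.new(password, their_pub.to_bytes(16, "big") + ss,
                    hashlib.sha256).digest()

# Alice and Bob share Pa and Pb out of band
Pa, Pb = b"alice-password", b"bob-password"
ka, KA = keypair()
kb, KB = keypair()

# Both sides derive ss = HASH(DH(...)); the two computations agree
ss_a = hashlib.sha256(pow(KB, ka, P).to_bytes(16, "big")).digest()
ss_b = hashlib.sha256(pow(KA, kb, P).to_bytes(16, "big")).digest()
assert ss_a == ss_b

Ha = confirm(Pb, KB, ss_a)   # Alice proves she knows Pb
Hb = confirm(Pa, KA, ss_b)   # Bob proves he knows Pa
assert hmac.compare_digest(Ha, confirm(Pb, KB, ss_b))  # Bob verifies Ha
assert hmac.compare_digest(Hb, confirm(Pa, KA, ss_a))  # Alice verifies Hb

session_key = hashlib.sha256(ss_a).digest()
```

Binding each confirmation tag to both the public key and ss is what ties the password check to this particular key exchange rather than letting a MitM replay it.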
The main disadvantage is that to achieve the security of a true PAKE, passwords here must be twice as long. A 4-digit PIN here would only have the security of two digits (1/100). You'd need 8 digits to get to 1/10,000 security. On the other hand, it's extremely simple, doesn't require point addition, and if there's any flaw you probably already have spotted it.

I really don't think it is, because it might be worthwhile for a particular sort of attacker, say one who runs the default rendezvous server: observe global activity, attempt to MitM every connexion for 30 seconds, then write up a spurious blog post about a 'network issue' or 'bug' or whatever which caused a brief outage. N:2^16 is okay against targeted attacks, mostly (hence my 'cargo-culting' comment), but with a large enough N …
The nice thing about 1:2^128 is that you just don't have to care.
(also 'pvg, who said i should write this, and it's been nagging at me ever since)
The closest advice to this in the article would be "use Signal" which has various issues of its own, unrelated to crypto: it has Signal Foundation as a SPOF and its ID mechanism is outright wonky, as phone numbers are IDs that are location bound, hard to manage multiple for a person, hard to manage multiple persons per ID, hard to roll over.
To me that seems to be a much bigger issue than "encrypting files for purposes that aren't {all regular purposes}".
0. (Only once) Generate a key pair id_rsa.pub.pem, id_rsa.pem

1. Generate a random key:

   openssl rand -base64 32 > key.bin

2. Encrypt the key:

   openssl rsautl -encrypt -inkey id_rsa.pub.pem -pubin -in key.bin -out key.bin.enc

3. Encrypt the file using the key:

   openssl enc -aes-256-cbc -salt -in SECRET_FILE -out SECRET_FILE.enc -pass file:./key.bin

-- other side --

4. Decrypt the key:

   openssl rsautl -decrypt -inkey id_rsa.pem -in key.bin.enc -out key.bin

5. Decrypt the file:

   openssl enc -d -aes-256-cbc -in SECRET_FILE.enc -out SECRET_FILE -pass file:./key.bin

> The enc program does not support authenticated encryption modes like CCM and GCM, and will not support such modes in the future.
> For bulk encryption of data, whether using authenticated encryption modes or other modes, cms(1) is recommended, as it provides a standard data format and performs the needed key/iv/nonce management.
So don't use `openssl enc` to encrypt data.
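To see what "authenticated" buys you over bare `openssl enc`, here is the MAC half of encrypt-then-MAC sketched with nothing but Python's standard library (illustration only; this is not a complete encryption scheme, and real tools handle it for you): the recipient recomputes an HMAC over the ciphertext and refuses to decrypt anything that fails the check.

```python
import hmac, hashlib, os

def seal(mac_key: bytes, ciphertext: bytes) -> bytes:
    # Encrypt-then-MAC: append an HMAC-SHA256 tag over the ciphertext.
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def open_sealed(mac_key: bytes, blob: bytes) -> bytes:
    ciphertext, tag = blob[:-32], blob[-32:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # constant-time comparison
        raise ValueError("ciphertext was tampered with")
    return ciphertext

key = os.urandom(32)
blob = seal(key, b"...AES ciphertext would go here...")

# Flip a single bit and verification fails: exactly the property
# that unauthenticated CBC lacks.
corrupted = bytes([blob[0] ^ 1]) + blob[1:]
try:
    open_sealed(key, corrupted)
except ValueError:
    print("tamper detected")
```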
`openssl cms` that is recommended above is S/MIME. Don't use S/MIME.
I can't wait for Filippo Valsorda's `age` to be done so I would have an answer to the question of "what should I use to encrypt a file?".
https://github.com/openssl/openssl/commit/c03ca3dd090c6eb739...
Any idea when Filippo's `age` will be done, or how to follow its development, other than the Google doc?
I'm giving Filippo a little bit of shit here, but one concern I have about talking `age` up is that at the same time I'm talking up the problem of encrypting a file more than it needs to be, so that people get the impression we'd have to wait, like, 5 years to finally see something do what operating systems should have been doing themselves all this time.
It's getting better, but not close to being business-ready imo.
Brings to mind the words of renowned Victorian lifehacker Jerome K. Jerome:
“I can't sit still and see another man slaving and working. I want to get up and superintend, and walk round with my hands in my pockets, and tell him what to do. It is my energetic nature. I can't help it.”
I use gnupg a lot and I'm certainly not very happy with it, but I guess it's the same as with democracy: the worst system except for all the others.
I think that a better approach is to bind identities from multiple purpose built cryptographic protocols.
Next you will hold tech responsible for social engineering and mandate that users should not know their own secrets, because that causes vulnerabilities in protocols :p
0) https://lists.gnupg.org/pipermail/gnupg-users/2019-July/0623...
E-mail is fundamentally a way to send a sequence of bytes somewhere (untrusted) so they can be picked up later by someone (trusted).
That’s also literally what Signal is built on so I think you’re overstating the difference.
Interestingly some protocols such as roughtime use the same tactic as OpenPGP: one long-term identity key that can be kept offline and rotation of online (short-term) keys signed by the long-term key. Details here: https://roughtime.googlesource.com/roughtime/+/HEAD/PROTOCOL...
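That offline-root/online-key split works out to a two-step verification, sketched below with HMAC-SHA256 as a toy stand-in for the Ed25519 signatures roughtime actually uses (hypothetical names; nothing here is the real wire format): the verifier checks the delegation under the long-term key, then checks the response under the delegated key.

```python
import hmac, hashlib

def sign(key: bytes, msg: bytes) -> bytes:
    # Toy stand-in for a real asymmetric signature.
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, msg), sig)

long_term_key = b"offline root key"     # kept offline, rarely used
online_key = b"short-lived online key"  # rotated frequently

# The long-term key signs a delegation naming the online key...
delegation = b"delegate:" + online_key
cert = sign(long_term_key, delegation)

# ...and the online key signs day-to-day responses.
response = b"midpoint=1561939200"
response_sig = sign(online_key, response)

# The verifier checks the chain from the root down.
assert verify(long_term_key, delegation, cert)
assert verify(online_key, response, response_sig)
```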
SSL was designed in 1994, but it has been properly maintained and today no-one argues that TLS should be replaced by noise/strobe etc. OpenPGP's problem no. 1 is that there are no parties using it on a wider scale who are interested in improving it.
That's what didn't happen with PGP.
Yes. And your software suggestions are excellent for 2019. I just wonder whether in 10 years it would be better to have a standard improved/developed, instead of a collection of one-vendor tools where the code is the specification. That said, I don't have high hopes for PGP given its maintenance problem.
Thanks for writing on the subject, even if the subject should already be clear to majority of technical people!
For example, the fact that there’s a grab bag of different ciphers, compression options, and other toggles makes properly picking settings an exercise in copy-pasting from a site you trust or guessing and then running an SSL Labs test until it comes back green. If you miss something, congrats, somebody can MITM and trick your users into downgrading.
Things like this are why the most notable features of TLS 1.3 are the things it removed, more so than what was added.
If you want to encrypt something and prove you are the author, PGP will still allow you to do that.
Does the author mean that PGP is bad for email specifically?
Excel has many of the mentioned properties, such as backwards compatibility and inefficiency, but it gets the job done and you bet it will pay your bills.
It feels to me like these posts are like the 80:20 problem, but rather with 99:1 and it's all about that 1%. I understand that software developers should use libsodium. But I'll sign the words "U R A >on" right now in GPG and wait for you to break my key and sign "U R 1 2" with my private key...
For my use case I don't have any concerns about using GPG. I encrypt files with it, and if anyone wants to put up any money that they can access my files, let me know what escrow service you want to use.
I don't really know what concrete advice the article gives for me personally. (The only thing I take away from this is to learn libsodium as well, rather than not using PGP.)
The fact that I have `fix-gpg` script to restart gpg-agent somewhere in $PATH that I run when for some reason it can't find my YubiKey tells me that it's not a viable solution for 99% of people.
PS. Actual command from GPG:
> help
...
sex change card holder's sex
...

Also, you may want to try using an actual OpenPGP Card (https://www.floss-shop.de/en/security-privacy/smartcards/13/...). (You can get a small one inside a USB token too.)
Getting it to work with my phone is slightly dumber. But still not super hard.
It seems that the state of package distribution for many distributions is poor, security-wise. (OpenBSD, to nobody's surprise, is an exception.) For instance, archlinux (I'm loyal) signs packages with PGP[1] and, for source-built packages, encourages integrity checks with MD5. My recollection is that, about 5 years ago, MD5 was supposed to be replaced with SHAxxx. Am I misinterpreting this? Is this actually Perfectly Okay for what a distro is trying to accomplish with package distribution?
(I'm particularly suspicious of the source-built package system, which consists of a bunch of files saying "download this tarball and compile it; the MD5 of the tarball should be xyz." I'm pretty confident that's not okay.)
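For what it's worth, the integrity check such a recipe performs reduces to hashing the downloaded tarball and comparing against a pinned digest, as in this standard-library sketch (illustrative; not archlinux's actual tooling). Swapping MD5 for SHA-256 is a one-line change, which makes staying on a collision-broken hash hard to defend; note also that either way the check only proves integrity against the pin, not who published the pin.

```python
import hashlib

def verify_download(data: bytes, pinned_digest: str, algo: str = "sha256") -> bool:
    # Hash what was actually downloaded and compare to the recipe's pin.
    return hashlib.new(algo, data).hexdigest() == pinned_digest

tarball = b"pretend this is a source tarball"
pin = hashlib.sha256(tarball).hexdigest()

assert verify_download(tarball, pin)                # untouched download passes
assert not verify_download(tarball + b"\x00", pin)  # any modification fails
```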
Okay, now moving from package distribution to messaging, and again looking at the state of my favorite system. How am I supposed to message securely? The best *nix messaging tools are all based around email. Even when I can get the PGP or S/MIME or whatever toolset to work (let's face it, that's at least 45 minutes down the drain), it's clear that I'm not in good shape security-wise.
I should use signal, apparently. Great. Just a few problems: (1) no archlinux signal package, (2) I'm guessing I can't use it from the terminal, and (3) most severely, it seems signal has incomplete desktop support. In particular, I need to first set up signal on my phone. Well, let's face facts: I have a cheap phone from a hard-to-trust hardware vendor, and I think there's a >5% chance it's running some sort of spyware. (The last phone I had genuinely did have malware: there were ads showing in the file manager, among other bizarre behaviors.) So in order to use signal on my desktop, I need to buy a new phone? That's even worse, usability-wise, than PGP.
Is... is it really this bad? I'm getting the sense that the desktop linux community has completely dropped the ball on this one. (And perhaps more generally desktop mac/windows... I wouldn't know.)
[1] Perhaps not so bad, since the keyring is distributed with the system -- but how was the original download verified? Options are: PGP, MD5, SHA1, with the choice left up to the user. That can't be right.
If you look at systems like Debian-derivatives or RPM-based distros (openSUSE/SLES, and Fedora/CentOS/RHEL) the cryptosystems are far better designed. In the particular case of openSUSE, the AUR-equivalent (home: projects on OBS) are all signed using per-user keys that are not managed by users -- eliminating almost all of the problems with that system. Yeah, you still have to trust the package maintainer to be sure what they're doing, but that should be expected. I believe the same is true for Fedora's COPR.
[Disclaimer: I work for SUSE and contribute to openSUSE.]
For the record AUR packages can use GPG keys, see e.g. GPGKEY variable: https://wiki.archlinux.org/index.php/Makepkg#Configuration
Arch also uses a Web of Trust to introduce Trusted Users: https://www.archlinux.org/master-keys/ so I wouldn't count "less than 10 years" as a disadvantage, but rather an advantage: they have seen the problems with alternative designs (e.g. Debian's curated keyring) and came up with something better.
Responsibility is assumed when using others' packages, since anyone can submit there. That said, there are signatures that can be verified, i.e. they are available.
Package signing is (hot take!) overrated and can be somewhat theater. It helps if your package manager connects to third party mirrors, but otherwise, the only threat it protects against is "the https server is compromised but the package build farm is not". I don't know why anyone would worry so much about that.
Looking at an /etc/apt/sources.list, it doesn't look like Ubuntu is using HTTPS for package distribution. Since you don't need both package signing and transport security, and I suspect that the list of packages you're downloading falls to fairly trivial length analysis anyway, I don't think the setup of signed packages over HTTP is meaningfully less secure than unsigned packages over HTTPS.
Signal's product focus has been at best un-encouraging to those who want to use it for anything else. Federation, I'm looking at you, but you've also got to take it in the sense that every single vendor that embraced the only realistic messaging federation standard of the 21st century went on to embrace and extinguish it in less than a decade.
This speaks to a few problems: 1. Messaging is hard. 2. Security is hard. 3. Reasoning about security is hard. 4. Historically, anyone paid enough to care about this space hasn't had any sort of public interest at heart.
Any application that combines any of these falls squarely into what the people at Latacora and the like would call a high-risk application. I might disagree with much of their analysis, but through the lens of risk control, they are perfectly correct.
If you're trying to figure out how we got here, you've also got to realize that there was an avalanche of government and commercial entities whose goals are not in alignment with say, those who think the optimal solution is a home rolled provable trustless security system.
For myself and many engineers I'd bet, I'd say that's where we thought things should go in the 90s and early aughts. Some things are better now, but most are much worse.
Society and encryption's implications have caught up with each other, and there's definitely something found wanting. There's definitely a market opportunity there, but there's also another big challenge, one I read in a recent discussion about package signing: "Nobody in this space gets paid enough to care."
That's what separates people like Signal, even if some of the engineering crowd doesn't like the way they delivered.
This is a bit of a ramble, so there's two afterwords:
1. Much of the morass about PGP is explicitly due to the environment, space, and time in which it was developed. This does not boil down merely to 'it wasn't known how to do it better.' There were decisions and compromises made. I think the writer at Latacora is not doing the history of the piece justice. That's OK though, because that's not the crux of their argument. It would be good, though, if they explained how things like the byzantine packet format came to be, even if that explanation were only a footnote and a reference. (Writing the history of how it got there is absolutely doable, but it would make for a dryly humorous history, at best.)
2. The open source and (linux/others?) distro community has tried hard, more than once, to make this work. The development, design, and implementation burden, though, is gargantuan. The overarching goal was basically to be compatible with every commercial system and maybe do one or two things better. What the article casts as purely a liability was the only way to get practical encryption over the internet well into the early '00s.
Regardless of all that though, PGP is still a technical nightmare. If you dismiss it though, even when we have better components, I worry that we'd only repeat these mistakes. If you work in any sort of crypto/encryption dependent enterprise, please find and study the history. Don't just take the (well considered) indictment of PGP at face value. There's important lessons to be learned there.
I don’t think it practically achieves this. Even now, Signal is an unreliable platform to communicate with. One can’t be sure whether a message will arrive in a timely manner (within a few seconds rather than several) or even arrive at all. The UX and feature set are also far behind something like Wire.
I’ll accept that Signal has a strong and reputed protocol, and has taken some strong measures on security and privacy. But everything else about it is truly meh, to put it mildly.
Please don’t dismiss these points about reliability saying it has never failed for you or someone you know. It routinely fails for people I know, and that’s all that matters when recommending a messenger platform to others.
Signal has made it seem like security is easy to focus on (though a lot of thought and work has gone into it), but has shown that UX is pretty hard and that running a platform is even harder (even at Signal’s scale, which I presume is a fraction of other platforms’).
VSS support on Windows is an open issue:
What are the biggest weaknesses of using PGP for local file encryption?
My use case is I have a 'vault' of secrets that I store in a gpg-encrypted org file, with a key binding in emacs to let me easily decrypt this 'vault'.
The encryption is done with gnupg's symmetric encryption, i.e., just me typing a passphrase, not using my private key. This encrypted file resides in a Filevault-encrypted drive, and is backed up to an encrypted TimeMachine backup. It never leaves my local drive or the external USB drive I use for backups (at least not that I know of).
What are my biggest risks here? I did not get much info from the article on this use case other than "perhaps wait for age". I understand I could be using an encrypted volume that I mount on demand, which I already do for other use cases, but that would get rid of the convenience of having this "vault" available to me just one emacs shortcut away. I'm willing to give that up if the risks are big enough but I'm kindly asking the knowledgeable HN audience for advice to see what I'm doing is really that bad.
But with email you communicate with one or more people about many topics.
To achieve the same structure in a messenger, you must create several discussions. So messengers are not the holy grail of communication; that is why people still use email.
We need an email user interface with open messenger protocols under the hood for secure communication and usability. None of the current messengers offer that.
I've been (ab)using ansible-vault for this. It's apparently[1] now using reasonable primitives (PBKDF2+AES-CTR+HMAC-SHA256), but not necessarily the latest and bestest ones: https://github.com/ansible/ansible/blob/v2.8.1/lib/ansible/p...
The implementation is sub-optimal though, with some silly issues that haven't been fixed, like https://github.com/ansible/ansible/issues/12121
I would dearly like a replacement for gpg in this context; `age` by Filippo Valsorda and Ben Cartwright-Cox looks like it will be nice, though I would like it to be easier to audit the recipients of encrypted files. Note that this kind of auditing relies on an information leak that cryptographers usually try to suppress...
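The PBKDF2 step in that stack is available directly in Python's standard library; here is a minimal sketch of deriving a vault key from a passphrase (parameter choices are illustrative, not ansible-vault's actual ones):

```python
import hashlib, os

def derive_key(passphrase: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # PBKDF2-HMAC-SHA256: a deliberately slow, salted derivation that
    # stretches a low-entropy passphrase into a 32-byte key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                               iterations, dklen=32)

salt = os.urandom(16)  # stored alongside the ciphertext
key = derive_key("correct horse battery staple", salt)

assert len(key) == 32
# Same passphrase + salt -> same key; a different salt -> a different key.
assert derive_key("correct horse battery staple", salt) == key
assert derive_key("correct horse battery staple", os.urandom(16)) != key
```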
Secure communication with most people is nigh impossible these days, because nobody wants to not use SMS.
There has to be a better solution, and despite email+PGP's flaws, it's the only system I've seen that's stood the test of time. There are pretty good plugins for Mail.app (macOS) and Outlook (Windows), and if you use Linux you probably can deal with gpg. We use it at work for email, and pretty much everyone can work with it at any level of technical expertise. Anyone else have any solutions that have worked in their circles, including non-technical family members?
Gajim is a usable client for desktops, but it's certainly picky about which users it is friendly with (that is, I can't recommend it to non-technical folks).
On the other hand "it uses IDs like email" is a concept my friends could understand, and the promise of end-to-end encryption over a server I control (as compared to some faceless organization somewhere else) was appealing to them.
And so I'm first level support for my peers and report issues upstream in the hope of improving the ecosystem.
And while it's closer to an email replacement (in that it avoids the ID issues of the WhatsApp clones and provides multiple clients for different purposes), it's still only a complement to email due to the ephemeral nature of messages, not a replacement.
The main criticism (and I'm preemptively responding to tptacek here) is that they haven't yet made E2EE the default (though this should happen in a few weeks now that cross-signing appears to be done). I also think the key backup system should be much better explained -- there is a usability bug open for it and I've posted some suggestions.
Unfortunately, its implementations are difficult to use, it was never directly supported by operating systems or major applications, and its crypto agility makes it impossible to write minimal implementations.
I wrote and use Piknik for small file transfers (especially with the Visual Studio extension), Encpipe for file encryption, and Minisign to sign software. What they have in common is that they are very simple to use, because they only do one thing, with very few parameters to choose from.
The main reason for having written these in the first place was that PGP is too complicated to use, not that it couldn't have done the job.
Comparing all these tools to GnuPG is valid, and it clearly shows not only implementation problems in gpg but design ones as well.
What I fear is a future riddled with all these incompatible tools. Even if they're written by brilliant engineers and cryptographers, they are not standards (e.g. IETF standards). Why is that important? For example, rewriting libsignal from scratch (say, to publish it under permissive licenses) can be problematic [0].
Until there is, whoever already uses or knows how to use GPG is still better off using GPG than not using anything.
Signal is especially not a good replacement for anybody who doesn't want to have "secure" communication connected to his (or any) phone number and to the central server.
The major issue with GPG is that in some (many?) cases a user could believe to be more protected by using it than the user is. But the same can be said for other technologies.
And the article rightly notes that the GPG defaults are often bad.
The right question to ask first is always:
https://www.usenix.org/system/files/1401_08-12_mickens.pdf
Are you "dealing with Mossad or not-Mossad"? (Also worth noting is that somehow Edward Snowden managed to "deal with" NSA and still remain out of prison. At least he knew from the start the answer to the question.)
Still, anybody who wants to replace PGP in some PGP use-case should provide a really good alternative for that use-case.
As of right now, there isn't any better for even such a simple need as encrypting a file. Let's discuss it once we have that one, at least. And even then, we'll need to be able to open our old files, meaning we'll still need GPG for that.
- you need team members to read/write the secrets
- you need the deployment service to read the secrets
Without using an online system managing the secrets via ACLs + auth, I don't know how to replace PGP here.
(KMS is not the only option! I'm just trying to eke out why you think that's valuable. For example, I think age, mentioned in the blog post, is a direct replacement?)
Some kid recently tried to have OpenPGP support deprecated from Golang's x/crypto package. And that's fine, but do not pull stunts like this without offering a concrete, working, and widely adopted replacement. Otherwise they are just that: publicity stunts with a lot of sound and fury but no solution. That's not helpful to anyone.
A more mature thing to do would be to suggest deprecation in 3 to 5 years and offer a plan of how to get there with other specific tools (some of which do not exist today).
But first: hold up. "Some kid"? You mean Filippo Valsorda? Google's Go Crypto person? The same person who is writing the replacement for that one use case?
- your key won’t be private forever but future compromise does mean past disclosure
- on long enough timescales if said TLAs can mount offline attacks against it.
So, maybe? But it’s definitely the safest way to do it; most of GPG’s problems are unforced interaction errors.
I'm being attacked! :P
Using our library you can generate PGP keys using any key derivation mechanism for a large variety of key types! When using it right, this will greatly improve how you can generate and back up your keys!
Battle-tested by ransomware already (sigh).
The author is overly dramatic about it in order to make a point, to hopefully get people looking for alternatives, so that a good one might take it from pgp in the future (and continues to suggest whatsapp and signal, like, really? That's your replacement for pgp?).
I am perfectly fine with saying that GnuPG uses old algorithms, or that different applications should use different keys, algorithms or techniques. But, please, when designing such systems keep in mind that you want to check where your keys come from and which identity they are attached to. And TTBOMK only OpenPGP is currently able to do that. To me it would be great if all crypto applications done in the right way would have a way to tie their keys to the OpenPGP web of trust, in the same way Monkeysphere tried to do for SSL and SSH keys.
Seriously, I'm a developer. I have never once felt motivated to verify a PGP signature. Of anything.
Lack of a proper email infrastructure.
And by "email infrastructure", I mean proper support by most email clients and contact apps.
The most basic use case seem to have never been addressed. You receive an email with a public key attached. Clicking on it should automatically open your contact app and ask if you want to add it to your contact's details. It should also be synchronizable with cardDAV or any other protocol of your choosing.
The alternatives people talk up either use PGP or are a botnet. (Signal uses PGP with a CA type of thing, so much for "stop using pgp") and WhatsApp is owned by Facebook.
tarsnap as I understand requires that I lock in to a service to secure my backups. F- that. Someone already talked about wormhole.
And how is essentially forking pgp with 'age' really going to solve things? Wow thanks another forked app! :^)
It would be easier to just make a wrapper for gnupg that sets the settings for everything the author is talking about (well, most of them).
Wouldn't it be easier to just inform maintainers of the package to change the default standards of packages like gnupg? Has the author even attempted to change some of these things?
Don't get me wrong, I get where the author is coming from. Unix philosophy should be followed... but certain systems can't be compartmentalized; they unfortunately have to interact with one another.
If you for example encrypt data without signing it, how would you know that someone isn't trying to poison ciphertext to extract data leakage or worse yet, they already found a way to decrypt your data and manipulating sensitive data?
An encryption program by DESIGN should also have a method to sign data.
I legitimately have no idea what this means.
> And how is essentially forking pgp with 'age' really going to solve things? Wow thanks another forked app! :^)
A big part of the criticism we've gotten when we tell people "PGP bad" is that we're not providing alternatives. age is one of those alternatives, for one of those use cases.
> It would be easier to just make a wrapper for gnupg that sets the settings for everything the author is talking about. (well most of the things the user is talking about)

> Wouldn't it be easier to just inform maintainers of the package to change the default standards of packages like gnupg? Has the author even attempted to change some of these things?
As we mentioned repeatedly in the blog post: no, the PGP format is fundamentally broken, it is not a matter of "just fixing it".
> If you for example encrypt data without signing it, how would you know that someone isn't trying to poison ciphertext to extract data leakage or worse yet, they already found a way to decrypt your data and manipulating sensitive data?
I think you're making an argument against unauthenticated encryption here. That's true! You should not have unauthenticated encryption, the way PGP makes it easy for you to have. age is not unauthenticated encryption, so the criticism does not apply.
Well, currently extricating myself from the Signal mess, I really am expecting my smoke alarm to go off any minute.
On installation a couple of years back, the Signal app desperately wanted to curate my SMS traffic, but forgot to inform me it would be holding my message history hostage. Forgot to inform me it would kill my instance, should I ever have the audacity to try setting up on a second phone unit. Forgot to inform me that yes, there is a desktop client, but it is useless Electron crap. Forgot to inform me it would be livestreaming my usage to everyone on my contact list: when I installed, when I reinstalled or changed devices, or when I mistook an ambiguous list-feature for a personal book-keeping thing. The last item didn't happen to me, but to an acquaintance who thus had a somewhat embarrassing list of contacts spilled out into the open. Not to mention the whole phone-number-based ID disaster, and the lack of any web interface. Sorry, but I'm out.
Fully aware of serious concerns about the crypto and the privacy, but for UI, consistently well designed client apps, data export, and lack of nasty surprises, I have seen nothing to rival Telegram. Which may help explain why that seems to be where everyone is heading.