It is easier to provide the list of things that are worth worrying about than it is to list the things that are safe. There are a lot of as-yet unbroken ciphers and constructions. So, here are the things to avoid:
* Block ciphers in the default mode ("ECB").
* The Dual_EC random number generator, which virtually nobody uses anyways. You weren't going to accidentally end up using it. Or, for that matter, any other PKRNG (random numbers produced by public key algorithms).
* RSA with 1024 bit moduli (or below); RSA-2048 is your starting point. Conventional DH at similar key sizes will be an issue too, but there's a "means/motive/opportunity" issue for RSA-1024 given its prevalence.
* MD4, MD5, and SHA1 aren't backdoored, but are broken or weak. But: all three are survivable in HMAC (don't use them, though). SHA2 is your best all-around hashing bet right now.
* The NIST P-curves. There's no evidence to suggest they're backdoored, but (a) the rationale behind their generation is questionable and (b) they have other annoying properties.
So far as I can tell, you are now fully briefed on the "distrusted" crypto.
Don't build your own crypto. Use PGP for data at rest, TLS for data in motion, and NaCl for the rare in-between cases.
You can plainly see the problem ECB causes in this example image: http://legacy.kingston.com/secure/image_files/Figure2_ECB.jp...
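The leak in that image is easy to reproduce in a few lines. Below is a toy 16-byte "block cipher" (just XOR with a key-derived pad, not remotely secure, purely illustrative), but the failure it demonstrates is exactly ECB's: each block is encrypted independently and deterministically, so identical plaintext blocks produce identical ciphertext blocks and large-scale patterns survive encryption.

```python
import hashlib

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Toy 16-byte "block cipher": XOR with a key-derived pad.
    # NOT a real cipher; it only exists to show ECB's structural leak.
    pad = hashlib.sha256(key).digest()[:16]
    return bytes(b ^ p for b, p in zip(block, pad))

def toy_ecb_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # ECB mode: every block is encrypted independently, with no chaining
    # and no per-block randomness.
    assert len(plaintext) % 16 == 0
    return b"".join(
        toy_block_encrypt(plaintext[i:i + 16], key)
        for i in range(0, len(plaintext), 16)
    )

key = b"example key"
# Two identical plaintext blocks...
ct = toy_ecb_encrypt(b"A" * 16 + b"A" * 16, key)
# ...yield two identical ciphertext blocks, which is why the penguin
# outline is still visible in the linked image.
assert ct[:16] == ct[16:]
```

Any real ECB-mode cipher has the same property; a proper mode (CTR, GCM, etc.) randomizes per block so this comparison would fail.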
Perhaps 'pbsd will be around in a bit to resolve whether the index calculus will push the size of p or a first; my understanding is that it's bounded by the size of the modulus, and that most of the work it does is independent of the specific element of the group you're attacking.
I am definitely a lot fuzzier on DH key sizes than on RSA; we're getting into cryptanalytic attacks that don't have a lot of relevance to the kind of work I do.
It's believed that any elliptic curve algorithm that doesn't have a transparent process for choosing the curve parameters may have been backdoored by the NSA choosing curves they already knew how to break. If you use those curves, then you're revealing your secrets to the NSA but not to anyone else, because for everyone else the discrete log problem is still (mostly) just as hard as it ever was.
Specifically, the elliptic curve random number generator in NIST SP 800-90A is believed to have been backdoored by the NSA. For obvious reasons no one has any hard proof, just very strong circumstantial evidence.
You can continue to use SSH2-RSA with decent size (2048 bit as a minimum) keys & AES. Those are not believed to be breakable at the current time, although as ever you can never have absolute certainty in these matters!
Everything beyond that is the precautionary principle.
It's also really important to understand the difference between Dual_EC (the random number generator) and the NIST curves. There is much more circumstantial evidence against Dual_EC. Importantly, the potential backdoor in Dual_EC isn't really related to elliptic curves; you can describe a functionally similar backdoored RNG using other public key algorithms.
...until some worker or contractor takes the "secret" values for themselves, or sells them, or publishes them on the internet. Producing public standards with built-in master keys increases the possibility of overnight global breakage.
See: https://en.wikipedia.org/wiki/Nothing_up_my_sleeve_number
(Or, of course, you could just not publish RNG standards based on public-key crypto ;-)
[1] Schneier: http://www.theguardian.com/world/2013/sep/05/nsa-how-to-rema...
[2] Snowden: http://www.theregister.co.uk/2014/03/10/snowden_a_few_good_d...
[3] http://www.theguardian.com/world/interactive/2013/oct/04/tor...
You should avoid at all costs anything that has been standardized by NIST without going through years of review by international cryptographers. Dual_EC_DRBG is a clear example of a crypto construction that falls into this category.
This is my general rule of thumb.
However, knowing which ciphers to use is not enough! You absolutely need to know HOW to use them. A basic example is AES in ECB mode, which is semantically secure only if a given key encrypts one and only one block. Another is knowing after how many encrypted blocks a key should be rotated, which depends on the underlying cipher.
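The key-rotation point has a concrete rule of thumb: ciphertext-block collisions become likely after about 2^(n/2) blocks for an n-bit block cipher (the birthday bound), so you want to rotate keys well before that. A quick sketch of the arithmetic:

```python
def birthday_bound_bytes(block_bits: int) -> int:
    # Collisions among ciphertext blocks become likely after roughly
    # 2**(block_bits / 2) blocks (the birthday bound); each block is
    # block_bits / 8 bytes.
    blocks = 2 ** (block_bits // 2)
    return blocks * (block_bits // 8)

# 64-bit block ciphers (3DES, Blowfish): the bound is only 32 GiB of
# data under one key, which is practically reachable (cf. Sweet32).
assert birthday_bound_bytes(64) == 2 ** 35

# 128-bit block ciphers (AES): ~2**64 blocks, far beyond any realistic
# amount of traffic, which is one reason the block size matters.
assert birthday_bound_bytes(128) == 2 ** 68
```

This is why "how many blocks per key" has a very different answer for 3DES than for AES.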
Once you have learnt how to use the basic building blocks of crypto, you are then NOT supposed to write your own implementation but to use existing ones. There is a small problem with this: they are either broken or they don't implement all the crypto constructions you need. OpenSSL is an example of a broken crypto implementation, while NaCl does not implement TLS.
So this is a short summary, and my personal opinion, of why crypto is hard. On top of all this, there are not enough experts out there with the time to review crypto implementations or new and old constructions, and we are living in a historical period where we desperately need crypto to protect our privacy.
So my final suggestion is to take some of your spare time and go through Dan Boneh's Crypto I at Coursera: https://www.coursera.org/course/crypto
It is worth every single minute.
Once you have done that, I would also suggest taking the Matasano crypto challenges: http://www.matasano.com/articles/crypto-challenges/
Finally, I want to thank everyone who has taken the time to create and maintain both the Crypto I course and the Matasano challenges.
The judges who chose AES and SHA-3 as the "winners" of the global competitions are the NSA.
> You should avoid at all costs anything that has been standardized by NIST...
That would include AES and SHA-3.
Sure, but this process creates alternatives: if the crypto community thinks the winner is backdoored, I am pretty sure we will know, and we will have a valid alternative ready to be implemented. Additionally, if the NSA/NIST modifies the specs of a crypto construction, there is still the possibility of implementing the original one. See SHA-3, for instance: it was about to be weakened, but the crypto community could still implement the original spec.
> That would include AES and SHA-3.
You cut the rest of the sentence and thereby completely changed its meaning. My original sentence included: "...without going through years of reviews by international cryptographers." Take a look at this video by D.J.B.: https://www.youtube.com/watch?v=G-TM9ubxKIg He gives a great example with Dual_EC_DRBG, where many cryptographers told NIST that there could be a backdoor. NIST's answer was basically: sorry, too late, it has already been implemented!
So in other words, in the case of Dual_EC_DRBG the standardization process ran in reverse: first NIST standardized it, and only then did the crypto community start to review it and find problems.
Barring some major advance in breaking crypto (which is entirely possible) it will probably stand for a long time to come.
Here are more modern alternatives to each of Colin's suggestions:
* Message encryption: AES-CTR+HMAC -> A fast native stream cipher (Salsa20) + polynomial MAC (Poly1305, VMAC).
* Standalone integrity checking: HMAC -> HMAC or SHA3.
* Hash: BLAKE2 or SHA3.
* Passwords: scrypt or, if not available, bcrypt.
* Public key encryption: ECDH + whatever you're using for message encryption, over Curve25519.
* Public key signatures: Deterministic ECDSA, EdDSA.
* Ephemeral key agreement: ECDH over Curve25519.
* Online backups: use Tarsnap.
I absolutely agree. I am 100% in favour of being conservative when choosing cryptographic primitives.
All the alternatives you've mentioned have arguments in their favour. But unless you need to have signatures which are 32 bytes instead of 256 bytes, or you need to perform 10,000 private key operations per second instead of 1,000, or you need to build an ASIC which uses a few thousand fewer transistors, my recommendation is to be conservative.
Is it only classical cryptanalysis on the cryptographic algorithm? Or do you take into account the programming mistakes (not necessarily related to crypto) of specific implementations? Or do you allow side-channel or fault-injection attacks, which will be able to break most algorithms, if they are not implemented with specific countermeasures?
In any case, it is a very difficult question that doesn't have a single definitive answer.
Obligatory XKCD:
http://www.javamex.com/tutorials/cryptography/rsa_key_length...
Ciphers to prefer ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256
A pretty good source/guide:
https://hynek.me/articles/hardening-your-web-servers-ssl-cip...
You'll need apache 2.4+[I think], or nginx. And possibly fresh certs to use DHE/EC.
A quick rundown of a fairly secure setup:
Cipher Priority list:
ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:!ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:!RSA+3DES:!aNULL:!MD5:!DSS:!SHA:AEAD
==========================================================
Generate the cert and private key:
openssl req -x509 -sha256 -nodes -days 3650 -newkey rsa:4096 -keyout serverkey.pem -out servercert.pem
==========================================================
Generate the DH parameters:
openssl dhparam -out dh2048.pem -outform PEM -2 2048
==========================================================
How to List Elliptic Curves:
openssl ecparam -list_curves
===========================================================
Note: Generating DH parameters is going to take a while. If you are implementing this on a slowish machine like a Raspberry Pi, you might want to do the DH step on a faster machine, then copy the file over.
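To tie the pieces above together, here's a hypothetical nginx server block (paths are placeholders for wherever you put the files generated above; the cipher string is the priority list from earlier):

```nginx
server {
    listen 443 ssl;

    # Cert, key, and DH parameters from the openssl commands above.
    ssl_certificate     /etc/nginx/ssl/servercert.pem;
    ssl_certificate_key /etc/nginx/ssl/serverkey.pem;
    ssl_dhparam         /etc/nginx/ssl/dh2048.pem;

    # The cipher priority list from above; let the server pick.
    ssl_ciphers 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:!ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:!RSA+3DES:!aNULL:!MD5:!DSS:!SHA:AEAD';
    ssl_prefer_server_ciphers on;
}
```

`ssl_prefer_server_ciphers on;` is what makes the ordering actually matter, since otherwise the client's preference wins.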
It should also be mentioned how you came up with your ordering of TLS cipher suites, in declining priority:
1. Forward security is preferred (ECDH|DH > RSA)
2. AESGCM > AES256 > AES|AES128 > 3DES
3. ECDH > DH
For instance, a hashing algorithm can be used to securely store passwords, and must therefore be slow, or to find duplicate files, a task that greatly benefits from speed. If you use a fast hashing algorithm to "securely" store passwords, you might as well use a compromised algorithm, since the security is nonexistent in both cases.
I think the same applies to crypto algorithms: it doesn't matter if the building blocks are individually secure if you don't know how to put them together in a secure fashion.
Also, you don't use hashes to store passwords, you use KDFs.
Well, technically a hash can be seen as a particular key derivation function (KDF). Not a proper one for storing passwords, I agree, but most KDFs are built from a salt plus many iterations of a hash function, to my knowledge at least (which I admit is not very deep on the subject).
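That "salt plus many iterations of a hash function" construction is essentially PBKDF2, which the Python stdlib ships; a quick sketch (the password and iteration count are just illustrative):

```python
import hashlib
import os

# PBKDF2 is literally "salt + many iterations of a hash" (HMAC-SHA256 here).
salt = os.urandom(16)        # random per-password salt
iterations = 100_000         # tuned so hashing is deliberately slow

dk = hashlib.pbkdf2_hmac(
    "sha256", b"correct horse battery staple", salt, iterations
)

# Same password + same salt -> same derived key, so it can be verified later;
# the salt and slowness are what make bulk cracking expensive.
assert dk == hashlib.pbkdf2_hmac(
    "sha256", b"correct horse battery staple", salt, iterations
)
assert len(dk) == 32
```

scrypt and bcrypt improve on this by also being memory-hard, which is why they were recommended above, but the salt-plus-iteration skeleton is the same idea.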
ECDSA, ECDH, and ECIES (which we don't see a lot of) all require a curve. Saying "ECDH is good" isn't helpful if you can't safely choose a curve to run them over.
ECDSA has another problem: it has a hard randomness requirement. If you repeat the per-message nonce, leak bits of the message nonce, or even fail to fill the modulus for the message nonce, you set up a condition where attackers can recover your private key. DJB is trying to push ECDSA into disfavor, replacing it with deterministic signatures.
One-time pads are awful and should be avoided at all costs. Virtually every computer program developed by generalist programmers that claimed to be a "one-time pad" was instead a crappy stream cipher.
Shamir splitting is fine, although that's a strange thing to have in your "regular use" bag.
Right. But I don't see a reason why you would ever want either of those when SHA256 works just fine. Maybe because it's a bit more efficient per byte as a PRNG, but there are better specialized tools for that.
> ECDSA has another problem: it has a hard randomness requirement.
Not if you use RFC6979. In the Bitcoin space that's been standard since about three days after the Java.SecureRandom bug. As for curves, secp256k1 and curve25519 seem to be the most popular as far as I can tell.
> One-time pads are awful and should be avoided at all costs. Virtually every computer program developed by generalist programmers that claimed to be a "one-time pad" was instead a crappy stream cipher.
Well, yes, an OTP generated by a PRNG is a stream cipher by definition. You do need true randomness for them to work. I think they're very useful if (1) you're scared that NSA has a constructive proof of P=NP deep inside their lairs, or (2) you want ultimate deniable encryption.
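The mechanics are trivial, which is part of the appeal; a minimal sketch using the OS CSPRNG as a stand-in for a true random source:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # One-time pad encryption and decryption are the same XOR.
    return bytes(x ^ y for x, y in zip(a, b))

message = b"meet me at the usual place"
# The pad must be truly random, at least as long as the message,
# and NEVER reused; reuse it once and XOR of two ciphertexts leaks
# the XOR of the plaintexts.
pad = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, pad)
assert xor_bytes(ciphertext, pad) == message
```

The hard part is never the code: it's generating, distributing, and destroying a pad as large as all the traffic you'll ever send, which is why OTPs stay impractical outside niche uses.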
All encryption is breakable. You aren't choosing an unpickable lock, you are picking how good of a thief it will take to rob you.
A 4096-bit key might make it really expensive to attack you, but those old numbers about "it would take a computer 40,000 years to crack" don't matter much in a world where that just means you spin up 160k instances in the cloud for 3 months.
That's a dollar amount that makes cracking YOUR bank account not worth doing. But if it were the nuclear launch codes for Russia's arsenal, it would not be out of reach.
To brute-force AES-128, if you assume:
- Every person on the planet owns 10 computers.
- There are 7 billion people on the planet.
- Each of these computers can test 1 billion key combinations per second.
- On average, you can crack the key after testing 50% of the possibilities.
Then the earth's population can crack one key in 77,000,000,000,000,000,000,000,000 years.
Source: Seagate, http://dator8.info/pdf/AES/3.pdf
http://www.eetimes.com/document.asp?doc_id=1279619
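Plugging the stated assumptions in directly is a quick sanity check; note that it yields roughly 77 *billion* years, which is already astronomical but far smaller than the figure quoted above:

```python
# Sanity check of the stated brute-force assumptions for AES-128.
keys_total = 2 ** 128
rate = 7_000_000_000 * 10 * 1_000_000_000  # people * computers * keys/sec

seconds = (keys_total / 2) / rate          # test 50% of keys on average
years = seconds / (365.25 * 24 * 3600)

# ~7.7e10, i.e. about 77 billion years under these exact assumptions.
assert 7.6e10 < years < 7.8e10
```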
But they are both still wrong.
A: the rate of keys per second in both is way too low, and B: you don't have to test every combination; certain combinations will tell you that whole chunks of possibilities can be ruled out.
In truth, most of the time you can narrow the potentials down to 1% of the total pretty quickly to determine a range for the right answer.
Granted, even if it were as slow as 77 billion years, .77 billion years (1% of that) is still a long time. But no, these numbers are orders of orders of magnitude wrong.
If you don't have one, it is billions of times harder. How large is your prime table? How large is mine? How large is the NSA's?