Wikipedia's definition: "the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system."
Yubico isn't saying that the security of the device is increased by keeping the source code secret.
They say they are increasing the security by things like this: disabling user-loading of new firmware (which could be a bad actor loading bad firmware), using hardware with built-in side-channel countermeasures, and disabling JTAG ports (which could be used for key extraction).
This isn't obscurity. These are some good engineering arguments. Engineering is always full of trade-offs.
"Yubico isn't saying that the security of the device is increased by keeping the source code secret."
Yeah, they're not really saying anything other than trying to provide an excuse for why they won't release it. "You can't use it anyway" isn't much of a response (I actually find it rather patronizing and dismissive).
Not to pile on, but regarding "Engineering is always full of trade-offs": what exactly is the supposed trade-off here? (Maybe they're using licensed code that they can't redistribute?)
Are all of those listed features only possible with secret code? And if so, once someone uncovers the code or methods, they'll be able to defeat the security. Isn't that the exact definition of 'security through obscurity'?
They clearly changed stance to ensure users cannot play with the hardware and competitors cannot copy the code. Which is fine. But it's always weird when security is offered as the argument instead of the genuine reason.
You can copy the key by removing the plastic of the YubiKey 4. You don't need a JTAG port; you just connect to the pins. And guess what: it's no big deal. You can't do that remotely, and it's not a device for 007 spies.
As per the statement (and earlier statements), you can't change the firmware unless you have a YubiKey NEO Developer Edition, which was only sold during 2012 and 2013. The change here is that the YubiKey 4 doesn't run open source code (for the PGP part) as a result of changing platforms. The best way to show that you support open source is to buy the YubiKey NEO instead of the YubiKey 4.
Any more information available? Googling for "yubikey 4 takeapart" got me nowhere.
Well, sort of.
In the linked article Jakob Ehrensvard (Yubico CTO) wrote:
>> (…) One could say it actually works the other way. In fact, the attacker’s job becomes much easier as the code to attack is fully known and the attacker owns the hardware freely. (…)
While the rest of the article makes good points, this particular sentence hints at "security through obscurity".
The principle with open source is that you trade that obscurity away in favour of the "many eyes" on your code and the ability to actually audit it. That trade-off is definitely worth it, but that doesn't mean the obscurity doesn't help security.
Could this be a sly attempt to close up the source (and hardware) before they have a Tangibot[1] situation?
That scenario played out poorly for MakerBot, and perhaps YubiCo learned the wrong lessons from the entire ordeal.
[1] http://www.cnet.com/news/pulling-back-from-open-source-hardw...
Am I understanding correctly that these devices can never have their firmware updated? That there is no update mechanism seems insane. They could prevent bad firmware updates by wiping keys on upgrade. The risk now is that some firmware version is discovered to have flaws, and that device is vulnerable forever.
This happened last year and they offered free replacements for affected users[1].
[1]: https://www.yubico.com/2015/04/yubikey-neo-openpgp-security-...
This does not close the attack vector of someone intercepting the device before you get it and surreptitiously installing firmware with a backdoor.
The most important thing any security company needs to realize is that their primary product is their reputation, not the physical or digital goods that they produce. "We, as a product company" is totally the wrong attitude. There's really no question about it: every ounce of closed source software/hardware in a security offering is something the customer should be concerned about.
From a product perspective it totally makes sense to be worried about open sourcing the entire design. "Our competition will make clones!" And that may be true of every other kind of product. But would you buy a cheap knockoff Yubikey? I certainly wouldn't. Again, reputation is the key here. That's what a security company sells to their customers. Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information.
A company can build up a reputation in the security industry, produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. That premium is completely derived from the trust that they've garnered. It's insane for Yubico to squander theirs under some false sense of IP security.
EDIT: And all that said, I totally understand where they're coming from on some of their points. They have to depend on chip manufacturers, and chip manufacturers are just the absolute worst when it comes to open source and security. Sometimes there are hard constraints and compromises have to be made. Most of cryptography is a trade-off. So don't take my comment to mean that designs absolutely have to be 100% open source. That's infeasible most of the time for hardware. But Yubico should be striving for it and pressuring the market.
Hmm. I think there are considerable limits on how true this is. I would argue the YubiKey's current security is more than good enough for almost everyone.
As mentioned in your edit, there's not a lot Yubico can do about the hardware restrictions. Given these restrictions, a common way companies in this industry assure users of the security of their device is FIPS 140-2 certifications, which range from levels 1 to 4.
Level 4-certified devices are extremely expensive, and the market for them is tiny, which seems to indicate that there's a definite limit on the amount people and organisations are prepared to pay to ensure security.
That's semi-true. They're both important. The belief that the product is worth buying, and the effort put into selling it, are of primary importance. Getting hacked or sued in public diminishes sales. So, the most important aspect of security for these kinds of companies is, perversely, minimizing the potential for their image to be hit by hackers, even if the products have no security. That's not an accusation at Yubico, but it's a common strategy in this market. So, they just have to present a good impression to the target market.
"...every ounce of closed source software/hardware in a security offering is something the customer should be concerned about."
Not really. It might surprise you but many companies have run for decades on proprietary platforms. They generated ridiculous sums of money in the process. All kinds of people got jobs, made money, and retired in this time. Nothing to worry about apparently most of the time. The reasons to worry are there but smaller than you think. One must balance many needs in a business. For most, this kind of thing is a checklist item about reducing liability. They're fine if it looks good on paper.
" But would you buy a cheap knockoff Yubikey? I certainly wouldn't. "
Most would. They want something as an obstacle to hackers while minimizing cost. They don't know if the YubiKey has any real quality underneath, given how businesses often do things. So, it's a real YubiKey vs a cheaper one. Many, not all, will choose the cheaper one. See Cisco and mobile manufacturers vs Huawei to see how big a market share that can lead to.
"Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information."
There's a market for that. I used to try to serve it. It's tiny and fickle. Yet, I question what confidence people have in those engineers to begin with as they've never assessed their capabilities in INFOSEC and strong attacks rarely are publicized. It's not like Googling rate of car crashes.
" produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. "
Many tried. The market rejected almost all of it. Still does. They want security-defeating feature X, protocol Y, and fall-back Z. They want it to run as fast as the competition despite security or safety checks, on insecure, potentially-backdoored hardware, to get COTS HW benefits. They also hardly want to pay anything extra for it, despite whole teams of extra people being put into every component for rigor, plus the price of external evaluations. The market for high-assurance guards is so small that vendors have to charge over $100,000 per unit to make the money back. Hell, Signal is free and Threema charges $1-2, but they're barely a fraction of 1% of WhatsApp or Facebook in market share. The demand side is the problem.
So, Yubico is doing what's good for business. All of them are and should until market shows it's willing to make the compromises necessary for strong security. They won't. So, wasting money on it is foolish outside defense sector, academia, and a few niches (eg smartcards) where one can keep a job doing it.
Implying that Huawei is the "cheap knockoff" and Cisco/Apple/Samsung/etc. are the noble, high-quality products fighting the good fight...
My Huawei Nexus 6P has been the best phone I have ever owned, far exceeding the quality and usability of every Motorola, Samsung, and other phone I have owned.
As to Cisco, after their fiasco with the NSA I would not trust them at all for security.
Other companies have managed secret distribution for secure devices just fine - randomise the card manager key and bundle a tamper proof packet containing the key along with the product. Provide instructions on how to verify the integrity of the packet, and confirm a digitally signed affirmation of the key against Yubico's public key online.
That's more than RSA offers for SecurID seed verification and more than my business bank offers for two factor device PIN integrity checking.
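A minimal sketch of that integrity check, assuming the tamper-proof packet prints a SHA-256 fingerprint of the card manager key (the hash scheme and function names here are illustrative; a real deployment would additionally verify a vendor signature over the fingerprint against Yubico's published public key):

```python
import hashlib
import hmac


def key_fingerprint(card_manager_key: bytes) -> str:
    """SHA-256 fingerprint of the per-device card manager key."""
    return hashlib.sha256(card_manager_key).hexdigest()


def packet_matches_device(printed_fingerprint: str, device_key: bytes) -> bool:
    """Compare the fingerprint printed inside the tamper-evident packet
    against one recomputed from the key read off the device itself.
    Constant-time comparison avoids leaking where they differ."""
    return hmac.compare_digest(printed_fingerprint, key_fingerprint(device_key))
```

If the packet shows signs of tampering, or the fingerprints don't match, you treat the device as compromised before ever trusting it with key material.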
I'm not sure who they use for their Secure Element (NXP?) but it also sounds like Yubico has gone along with their request (and NDA) to keep implementation details secret. We've seen a similar situation in SE implementations in mobile phones (for contactless payment, primarily).
Again, enterprise customers don't care (mid-sized ones have insurance that will cover the loss if their Common Criteria EAL5+ vendor's hardware is compromised; big enterprises can pay for auditing). Governments don't care (they'll pay for auditing or negotiate it into any significantly high-volume contract).
End users and the tech community are the only groups who'll really lose out here.
1. Hardware costs money to develop, has to make it back, and is easy to clone. They'll keep hardware secret by default for this reason, like everyone does. It also lowers the odds of patent suits. All kinds of people demand open, secure hardware, but almost nobody will buy it. Just like software. It's the number one problem in the INFOSEC industry.
2. There are, IIRC, three companies building the kinds of secure ICs they need. They put the material critical to understanding them under NDA. Plus, the implementations are secret, with tamper-resistance mechanisms. It's pointless to rely on the open-source model to understand or evaluate such a thing. There would be some marginal benefits, but the major risks would still be there. Meanwhile, open-sourcing the material adds risk in terms of issues with the suppliers. So, no OSS is an acceptable choice here.
3. Restricting some of the firmware/software is a tradeoff of the protection methods they're using. Again, this reduces the value of open-sourcing it, as you'd have to dump it off the chip to verify it anyway. The kind of people who can do that don't need Yubico's help.
4. Yubico might not know how to build secure HW/SW combos. It's a rare skill whose techniques are a mix of published methods and trade secrets. Plus, attackers are always coming up with new stuff. So, obfuscation... not security by obscurity, but obfuscation of aspects of the design to increase the attacker's work between product releases... is both justified and a proven method. If no other measures existed, then it would be the garbage known as security by obscurity. This seems to be the better practice: proven mechanisms plus obfuscation, which can hamper even nation-state hackers. Who knows how good their mechanisms are going to be, but there's potential.
So, it seems like a combination of sustaining their business by stopping clones and lawsuits, with improved branding from the effect of obfuscation and hardened ICs on the low-skilled attacks that dominate the press. Two very good reasons to make a decision in this market. It's just economics in action. :)
People that spend considerable effort turning a good idea into hardware that sells tell me otherwise. ;)
"because it makes it easier to reverse engineer and clone the chips themselves."
You first said it's easy to reverse engineer and not valuable. Then, said they want closed designs to reduce reverse engineering and cloning. Which is it?
"For YubiKey themselves it's mainly the firmware"
That may be true. I can't speak to that.
I also don't see anything that would really prevent them from just releasing the source they're using, even if we can't realistically do anything useful with it. The whole point of those systems is that it's secure via algorithms and hardware silos - releasing their sources shouldn't change anything.
But in practice it doesn't really matter that much - as long as they use standard interfaces and replace your key for free if someone finds a vulnerability, I'm (cautiously) fine with their new position. I think a big part of the issue is that they did something better before, but if they started with the current design, people wouldn't really complain about it that much.
Separated program and data memory with only one executable. USB host would get (in hardware) an outright memory dump of the program memory on connection, so it could hash it/compare it to known-good firmware. If you flashed the firmware the data memory should get wiped, and if you flashed it with anything the driver didn't know as a good build, unless you manually whitelisted it, you'd be warned.
That seems like a better approach to me. (It turns out I really suck at designing hardware, let alone secure hardware.)
Doing the same kind of general thing with, say, a RISC-V microcontroller and trying to secure the RAM seems like a fruitful possible course of action? Let's see how lowRISC turns out.
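The host-side check in that scheme could look roughly like this sketch (the whitelist contents and function names are made up; the point is just "hash the dump the device hands over on connection, look it up, warn on a miss"):

```python
import hashlib

# Hypothetical registry of vendor-released firmware builds the driver
# knows about, keyed by SHA-256 digest of the program-memory image.
KNOWN_GOOD_BUILDS = {}


def register_build(image: bytes, name: str) -> None:
    """Record a known-good firmware image (e.g. shipped with the driver)."""
    KNOWN_GOOD_BUILDS[hashlib.sha256(image).hexdigest()] = name


def check_firmware_dump(dump: bytes, whitelist=KNOWN_GOOD_BUILDS):
    """Hash the program-memory dump the device emits on connection and
    look it up.  Returns the build name, or None for unrecognized
    firmware, in which case the driver should warn the user unless the
    build was manually whitelisted."""
    return whitelist.get(hashlib.sha256(dump).hexdigest())
```

This only works if the dump is produced in hardware, outside the firmware's control; otherwise a backdoored image can simply replay a clean dump.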
The problem is, it's not a tradeoff Yubico have to make. They can allow users to achieve the same goals by distributing the device un-flashed, with the source code to the firmware. Upon flashing, the firmware would disable further flashing. If the user doesn't like this tradeoff, the user can choose to change the code. As a courtesy to more trusting users they could provide the service of optionally flashing devices for you. And qualified users can verify the security of the firmware before loading it.
But by flashing the devices themselves, Yubico has chosen the worst of both worlds. Now an outside actor can once again add malicious firmware: Yubico is an outside actor. AND nobody can verify the security of the firmware. This isn't even a tradeoff, it's just a loss.
There is the possibility of the device being intercepted before it reaches you. Or before you have gotten around to locking it down. Or when you plug it into your (compromised) system to lock it down.
Since all communication is done over the USB port, the problem is that the firmware can be replaced with a backdoored firmware that appears to be normal/unflashed. One that still appears flashable (by basically running the flashed image inside a virtual machine/emulator), and appears to get locked down when you go through any lockdown process (since you just end up locking down the VM), but still has the backdoor in place.
Firmware aside, people can modify the hardware too. Unless you crack open the device and inspect the internals (which many devices are designed to prevent). And even then a really sophisticated attack could replace the chips with identical looking ones. If you are using off the shelf ones then it wouldn't be that hard. They can also add an extra chip before the real one that intercepts the communication. Or maybe compromise the 'insecure' USB chip (if it's programmable).
With locked down hardware the manufacturer can bake private keys onto the chips and ensure that the official stuff checks the hardware by asking it to digitally sign something with a private key. But if the attacker has added their own chip between the USB and the legit chip, they can pass through the requests to the official chip.
A TPM will do something like keep a running hash of all the instructions that are sent to the hardware and use the resulting hash as part of the digital signature verification, but if you mirror the requests that doesn't help.
The next stage is to use the keys on the chip to encrypt all communication between the 'secure' chip. So any 'pirate' chip won't get anything useful.
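A toy version of that challenge-response idea, using HMAC with a shared baked-in key in place of the asymmetric signing a real secure element would do (all names and the key-distribution model here are illustrative):

```python
import hashlib
import hmac
import os

# Stands in for the secret baked into the chip at manufacture; the
# verifying host would hold the matching verification material.
DEVICE_KEY = os.urandom(32)


def device_respond(challenge: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """The 'secure' chip proves possession of the baked-in key without
    revealing it, by MACing the host's fresh random challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()


def host_verify(challenge: bytes, response: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Host recomputes the expected response and compares in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)
```

Note the caveat from the comment above: this alone doesn't stop an interposed chip from relaying the challenge to the genuine chip and passing the answer back, which is exactly why encrypting the whole channel under the baked-in keys is the next stage.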
Users could be allowed to 'bake' their own keys in, but that leaves us with the intercepted-hardware problem. The attacker gets the hardware and installs fake firmware that appears to accept your custom key and performs the encryption.
Personally, I think worrying about security to that level is overkill even if you're dealing with quite a bit of money. It would have to be quite an organised attack. They would have to gain physical access to the device, compromise it, return it unnoticed, and then gain physical access again later. That requires both physical and digital security skills.
That's much more work than just stealing it or applying rubber-hose cryptanalysis. Attackers can also compromise the system being used to access whatever it protects.
I have had my own business, and the one thing I would say to the critics of Yubico: If you have a way, given existing hardware and software tools and suppliers, to do a better job, step up and do it. AFAIK, Apple didn't opensource their hardware related to crypto, or their software.
I think you will find it takes more than wishful thinking; more like, put your money ( = or your time) where your mouth is. Engineers, and I don't just mean CI engineers here, know it is a long way from a math equation or set of equations to a real world working object. I would love to see, and I would contribute money to an opensource solution. I just don't think it is as cookie-cutter simple as the majority of comments seem to intimate on this forum.
All that he says is summarized in "it was too hard to think of a solution, so we didn't do it."
With regards to the applet manager, that seems to be an issue of customer friction more than of being too hard. While "crypto nerds" would be fine with it, business applications could be affected.
No hardware is 100% secure and for Yubico to say this issue is about "Secure Hardware vs. Open Source" seems like a red herring. Perhaps they are just trying to protect their business model? After all, there isn't anything particularly unique about the hardware.
So, it's not so simple. Otherwise, all buildings containing valuables protected by locks and such would be compromised because enemies have the potential of physical access. They aren't. That's telling you something.
This is a fundamental concept in FOSS, and for anyone to try to rationalise their way out of it, be it out of some corrupted sense of trying to do the right thing, is absurd.
Fortunately, I feel that the very people who would be interested in this device will be aware of this; I hope the folks at Yubico reverse this decision.
My reasoning: I don't need physical tamper-resistance for my threat scenario - if it is stolen by a random thief, a coworker, a "friend", etc.
But if I was attacked by a nation-state-like actor, I cannot trust any security measure of the device. How do I know the NSA does not have a copy of every "random" card-manager key? How do I know that generated keys are not subtly biased so that they can be guessed easily? Or that there is not a secret function to extract them? Even if Yubico is 100% honest and their device is clean, I must assume that if e.g. the NSA were after me, they have the technology to extract the keys from the device, no matter what protection it has.
I hope the response from consumers will be: we understand your position. Unfortunately that is unacceptable and we'll look for another vendor. It is mine. I own a Neo, not getting any of their future products.
Also, as a strategic guideline... maybe, if you're in the business of security, don't use hardware that requires NDAs. Yes, it'll make some things impossible and other things more expensive, but I'd say there's really no option to compromise.
Their current industrial design very clearly says "hey, I am an important security key", which is exactly the wrong thing to do.
It should instead look like a cheap flash drive. And when the thief plugs it in, he sees exactly that, a low capacity USB flash drive, unencrypted, with some random documents on it.
Is the thief at this point going to perform some sophisticated hardware hacking? No, it will just get thrown away.
Regarding DES: the smartcards and HSMs were originally developed for use by both government and the financial industry. They originally standardized on DES, then used 3DES to reuse their HW and SW. It was one of the few tradeoffs that made long-term sense, given that a three-key version of a 1975 algorithm is still secure in 2016. That's 41 years of security through variants of that algorithm. Unheard of in our industry. That you call 3DES, itself going strong almost 20 years, something that should be repellent shows the difference between the security-critical sector and the mainstream. The former prefers what's proven longest, while the latter prefers what's popular and good in theory. Both AES and 3DES are valid choices given peer review. That their money-makers came from 3DES customers made the best choice obvious.
Regarding NDAs: a HW guru who taught me what I initially knew on the subject mentioned patent suits. He said his company refuses to do business in the U.S. since those companies get sued into the ground. There are so many patents on HW, especially microarchitecture, that it's impossible to avoid all of them. So, he said, keeping things as trade secrets is a common strategy of smaller firms to reduce legal risks and ensure profits. It also reduces copying and attacks by hobbyists. And of course they didn't say "hides infringement" in the datasheet. :P
I stopped there since I think these should address your concerns. At the least, it should start to make sense what those companies are doing whether we like it on our end or not. Personally, I'm more a fan of Caernarvon OS for smartcards as one of the inventors of INFOSEC (Paul Karger, grandmaster of high-security) made it. Look it up for interesting lessons on what smartcard OS's deal with in terms of development and certification difficulties.
Personally I only tried IsoApplet, but openpgp applet should work too.
With the older YubiKey NEO devices, the applet source was available and I could freely upload an applet. This was great for a few reasons. I could modify or upgrade the app (of course, doing so would cause me to lose existing keys, which makes sense from a security PoV). (I actually did this on my old YubiKey.) I could also, in principle, audit the app. And, if I trusted Yubico to get their security right, I would trust that my freshly-arrived-in-the-mail device was secure. Moreover, if I trusted Yubico not to act maliciously, then the applet on the device I got in the mail would match the firmware on github, and I could trust that it did what I thought it did.
There were, of course, problems. The GlobalPlatform platform is awkward to use, the toolchain is terrible, and the key management is awkward at best.
I could not trust that a key I installed in the OpenPGP applet while my computer was compromised was secure.
With the new locked-down NEO devices, I can't change out the applets, and the bad guys would also have trouble doing so. As before, if I trusted Yubico not to act maliciously, then the applet on the device I got in the mail would match the firmware on github, and I could trust that it did what I thought it did. Also, as before, I could not trust that a key I installed in the OpenPGP applet while my computer was compromised was secure (because an attacker would simply export it before uploading rather than swapping out the whole applet).
Enter the YubiKey 4. If I use one, I am completely at the mercy of Yubico and their third-party audits. I cannot audit the code myself. Even if I trust Yubico not to act maliciously, I have to take them entirely at their word that they didn't accidentally mess up. And, of course, I cannot trust that a key I installed in the OpenPGP applet while my computer was compromised is secure.
In other words, there's a big difference between source-available and source-not-available, even if I can't personally verify that the source I think I'm running is the source I'm running.
As an aside:
> There is an inverse relationship between making a chip open and achieving security certifications, such as Common Criteria. In order to achieve these higher levels of certifications, certain requirements are put on the final products and their use and available modes.
This may well be true, but, if so, it's a sad statement about Common Criteria and their misguided rules. Publicly disclosing the source code of an EAL5+ device should not reduce its supposed security level.
With SGX, Intel had the chance to offer a widely available security token (built in to every new CPU!) that anyone could freely program and use for their own security purposes. They blew it when they created their "launch control" policy, which essentially says that developers who don't sign lots of contracts (which you can't even read without an NDA AFAICT) can write an applet but can't run it. The Linux community, at least, is pushing back hard, and this just might change in the next generation of CPUs or maybe even sooner. Fingers crossed.
This inspires a challenge to Yubico: give me a hardware token that runs applets. Let the token attest to the hash of a running applet, but let it run any applet whatsoever. If I want to verify that I'm running the bona fide Yubico OpenPGP applet, I can check the hash myself. If I want to replace it, I can, but then the hash will change. It'll be hard: you'll have to figure out a real isolated execution environment. It's definitely doable, though.
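That verification flow could be as simple as this sketch (the applet bytes, hash scheme, and function names are placeholders; a real token would sign the hash it reports so you can trust it came from the isolated environment):

```python
import hashlib


def attested_hash(applet_image: bytes) -> str:
    """What the token would compute (and sign) for its running applet."""
    return hashlib.sha256(applet_image).hexdigest()


def is_bona_fide(reported_hash: str, local_build: bytes) -> bool:
    """Compare the token's attestation against the hash of a build you
    compiled yourself from the published applet source.  Any replaced
    applet necessarily reports a different hash."""
    return reported_hash == attested_hash(local_build)
```

The hard part isn't this comparison; it's the isolated execution environment that guarantees the reported hash actually corresponds to the code that's running.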
Now rescinded.
The YubiKey NEO was always "unsecure"; now, with the YubiKey 4, it's only possibly "unsecure".
"If you have to pick only one, is it more important to have the source code available for review or to have a product that includes serious countermeasures for attacks against the integrity of your keys?"
Even worse, you won't know when the device becomes obsolete. So you might be buying an insecure solution from the start.