This isn't the first extremely serious and dumb High Sierra password bug this year [1] [2], and unless Apple is hurt badly enough by it to be forced to change, it won't be the last. High Sierra is full of bugs, and seemingly not just annoying bugs, but security bugs too.
Let's hope Apple gets sued for the damage they'll cause by shipping this bug in High Sierra, so they make sure the next release of macOS won't be another bug-filled mess.
[1] https://arstechnica.com/information-technology/2017/09/passw...
[2] https://www.macrumors.com/2017/10/05/macos-high-sierra-disk-...
Encouraging irresponsible disclosure because one wants to see Apple hurt is a reckless and selfish attitude: it puts millions of Apple customers at risk in the process.
I don't want to see Apple hurt (I'm an Apple guy myself, using Macs, iPhone, iPad and Apple Watch), I want to see them improve. I doubt they'll start caring about QA unless they're forced to.
One absurdly serious and stupid password bug like this can be an honest mistake, but three (that we know of, that were full disclosures) in a few months is negligence that should be criminal if it isn't.
Now, if everyone started disclosing vulnerabilities via Twitter without giving the company turnaround time to resolve the issue, simply because they were dissatisfied with Apple by standards they came up with personally, I don't think that would be nice or fair.
A root password solves this issue. It takes seconds to implement and helps right now, not "later" as closed disclosure does.
I'd rather know about every error and critical bug. I can bring it up with our team and decide right now whether to `sudo service * stop` or continue.
Closed disclosure keeps the fact that I'm vulnerable away from me, along with any pathways I might have to fix it.
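To make the "takes seconds to implement" point concrete: one stopgap for this particular bug is setting a non-blank root password yourself. A sketch using Apple's `dsenableroot` command-line tool, run from an admin account (the non-interactive placeholder values below are hypothetical, not from the thread):

```shell
# Enable the root account with an explicit, non-blank password.
# Run interactively: dsenableroot prompts for your admin password
# and the new root password.
dsenableroot

# Non-interactive form (placeholder values -- substitute your own):
# dsenableroot -u adminuser -p adminpassword -r newrootpassword
```

Once root has a real password, the "type root with a blank password" trick no longer authenticates.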
I mean, this bug has been reported already: by every cheesy hacking movie ever, by every beginner's book on social engineering, and so forth. Heck, it was "reported" by Richard Feynman talking about cracking safes during the Manhattan Project.
IMO, this behaviour is part of the problem, the reason why tech companies take security seriously only at a superficial level.
Don't kill the messenger.
EDIT: putting users at _additional_ risk
edit: Typo.
> it puts millions of Apple customers at risk in the process.
Nah, it's Apple that put millions of customers at risk, not the person who disclosed the vulnerability. Let's not shift the blame away from the guilty party here.
Apple, one of the richest companies in the world, is obviously just cutting corners on QA here. This is unacceptable.
It seems some people here are more concerned about negative publicity than user security. This is a pattern that has been seen countless times at big tech corporations (such as Yahoo): not disclosing hacks that put their users and their data at risk. This is unacceptable for a company that claims to be all about its users.
Yes, it's Apple's fault for poor QA that this was released, but this guy also put users at risk by telling the entire world about it without giving Apple a chance to fix it.
You're right, it's about user security before publicity. So make sure users are safe first.
Nowadays, you're "irresponsible" if you don't follow some vendor's own made up procedures.
Disclosing a 0-day vulnerability via Twitter for the sake of self-promotion is bad. Especially when you advertise yourself as a software developer.
It's not a bug; it's a bad design decision. How to initialize the root password on a new machine is a hard problem in a consumer environment. Some people will set it, lose it, and then want support to fix it. One would expect some clever Apple solution, such as initializing the password to random letters and providing the buyer with that info on a scratch-off card. That way, the buyer can be sure no one has seen the password before they use the scratch-off card.
Setting it to null? That means nobody thought about the problem.
Apple put millions of their customers at risk by skimping on QA. As an Apple user I'm OK with this getting out if it motivates Apple to improve their approach in the future.
Edit: as usual, downvotes but no response. I miss when this place was decent.
The very comment you are replying to lists a reason why disclosing huge vulnerabilities without giving upstream time to patch is irresponsible: "because it puts millions of Apple customers at risk in the process."
Your comment doesn't refute the reasoning the comment you are replying to provides, and it also doesn't tell us anything about why you think "There is nothing irresponsible about disclosing huge vulnerabilities in software by any means necessary." You state your position, but offer no rationale, no reason for it; why should I accept your position as the correct or ethical thing to do?
I'm a die-hard Apple user myself, but I agree that the long list of severe bugs in High Sierra is absurd, and a big public backlash might be enough to kick them into gear. On the other hand, I, a university student with next to no understanding of computer security, can simply walk onto campus, sit down at a Mac, and within seconds have complete access to the computer. It's ridiculous, it's horrendous that it shipped like this, but it's not something that needed to get out, especially something so easy to utilize.
We geeks have been complaining about the horrible QA in macOS for years, yet nothing has been done. The fact that this is so simple to do will probably/hopefully get ordinary people to start talking about it too ("Hey, have you heard that you can hack Macs without a password? Very insecure"), which would force Apple to improve.
I think you have to be very careful about that line of argument. It's a single vulnerability researcher making a unilateral decision about the short term and long term security of an entire user base, based entirely on personal judgement. I personally think the researcher should make the decision that best protects users from that specific vulnerability. Making long-term changes to a company's QA should come second.
I find it odd that you're putting the responsibility of making decisions about how to protect Apple's users on an unaffiliated third party.
Apple has a multi-hundred-billion dollar war chest and, if they wanted to, could afford to make macOS the most secure operating system on the market. The fact that they don't is their own choice and a reflection of their priorities, not some act of God or a natural disaster. Putting the onus for cleaning up the mess in the most "responsible" way possible on third parties with a fraction of Apple's resources is being too kind to Apple.
Unfortunately, I don't believe those will happen.
I don't have any experience with enterprise-grade IT, but it seems like shared computers should be thin clients or at least use UEFI to securely boot an image over the network and not keep anything sensitive locally.
If you give someone physical access to a box, they will be able to own it.
It's educational for the end user. You cannot trust Apple. It's a good reminder that there are other OSes available out there.
One would think that something as simple as a login would be deterministic.
Edit:
See https://stackoverflow.com/a/33272796 for a bit more information on what I mean.
How would you feel if someone discovered a 0-day at a company that exposes credit card and identity info, published it, and hackers then stole all that info (including yours)? I'm sure "creating a thunderstorm of negative publicity" would be the last thing you would want.
You mean, in addition to bad QA and complete disregard for their users' security? And being the richest and most profitable company ever, cutting corners and evading taxes?
Their response on Twitter was amazing: "PM us so we can discuss this privately", not "thank you, we're looking into it NOW".
Apple is a Rorschach test writ large. What people see in it reflects more on the observer than the company in many cases.
If so, why? How do you identify companies like Apple that get one set of rules while other companies get another?
Yes, Apple shouldn't be having this issue, but disclosing a 0-day issue can possibly hurt users far worse than it hurts Apple. Apple may lose a tiny bit of money, but users could lose far, far more, especially if someone develops a good way to remotely deploy / take advantage of this defect.
Ignoring responsible disclosure also limits the ability to sue them for any damage resulting from it (or so I'm told by one of my lawyer friends who thinks this disclosure may make it almost impossible to successfully sue them over it unless it simply takes them too long to fix).
How can that happen in any case? Isn't a waiver of liability pretty much the first line in every license? Unless you have some special contract with Apple that overrides the standard boxes you ticked, how would anyone sue?
It's about protecting systems RIGHT NOW from immediately causing harm to people's lives.
https://www.eff.org/deeplinks/2017/10/drms-dead-canary-how-w...
Blame the DMCA. This guy is in Turkey - does GP really think he can expect fair treatment and equal compensation as a "western world" security researcher?