As someone whose starry-eyed Mac obsession predated Windows 95: Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio, and it was buggy under Jobs. I remember getting plenty of sad Macs under System 6 and 7, and early versions of OS X weren't any better.
We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.
The comparison with Microsoft is instructive. Microsoft's software was even buggier than Apple's during its period of greatest dominance. Win95/Win98/WinME crashed all the time and were an open barn door for security. Early versions of IE were pieces of shit. Even later versions of IE (6-9) were pieces of shit. Microsoft finally got a handle on security and software quality just as the world ceased to care about them.
Apple's been driving change in the computer industry since the iPhone was introduced in 2007. New products are always buggy: the amount of work involved in building up a product category from scratch is massive, and you don't know how the market will receive them, so frantic changes and dirty hacks are needed to adapt on the fly, and those often invalidate whole architectural assumptions. It's just that most of the time, this work goes on when nobody's paying attention, so by the time people notice you, you've had a chance to iron out a lot of the kinks. Apple is in the unenviable position of trying to introduce new product categories while the whole world is looking.
The Apple Watch is buggy as hell, but I still find it useful, and pretty cool.
This made total sense in some of Apple's biggest products: OS X and the iPhone. When OS X first came out it couldn't even burn CDs, but we all "understood" the magnitude of the project and thus gave it some slack. Similarly, the iPhone lacked a lot and was slow, but it was such a revolution that we let it slide -- in fact we let the rest of the products slide.
The problem today, I think, is that these decisions are being made for reasons that users don't deem "worthy". Introducing some new music service is not a good enough reason to break my iTunes. The fact that a new watch was released is not important enough to let other platforms languish. We "get" why less attention is being paid to other products, but unlike with the phone, it's not deemed a good trade-off.
In other words, I don't think Jobs was distracting us with promises, but with actual shiny things that made the bugginess worthwhile.
I forgive most faults because 99% of the time, everything is awesome. On Windows, that same forgiveness manifests as me not using my Windows machines as much as my Macs. I still love to use my PCs, but not for anything I need to rely on the majority of the time.
Now, though, Apple is making changes (iPhoto/Aperture were a really great example) where it seems the point is just to bring some sort of parity to OS X and iOS rather than to introduce new features. iPhoto was buggy as hell when they added Faces and Places, but I totally forgave that because 99% of the time it was making my life way easier than before by detecting faces properly. If it crashed every now and then, it at least saved the data, so I was still better off than before the update. I still like Final Cut X (I know, I know... I'm an outlier). But convincing me that a switch like iPhoto/Aperture -> Photos is worthwhile is much harder: there's nothing to distract me from the issues, and I've somehow managed to actively lose features that they convinced me were necessities in the past.
I hope this is not an indicator of things to come. One thing that gives me some hope is that they've gone back to alternating between feature updates and stability updates. Leopard was cool, but Snow Leopard was incredible to me. If that pace comes back, I'll be happy again. Until then, Apple needs to get their software game back in line with the rest of the company.
Right - which is why we have all of the Snow Leopard nostalgia: because none of the newer releases have given us anything substantive that we really needed to justify the hassle and the bugs.
I am trying to think of something - anything - that compels me to upgrade from SL on my Mac Pro, and all I can think of is that nifty take-a-picture-of-your-signature feature in the Preview app that lets you insert your signature into PDF documents.
Ok, and maybe USB3 ?
That's all I can think of.
I'm a Safari user (better battery usage for the # of tabs I have open) and it too has improved with El Capitan though that's irrelevant for Chrome/FF users.
Also, SL maimed Exposé (that weird non-proportional grid view), which was reverted to the Leopard style in Mission Control (of which Mavericks/Yosemite had the best implementation, and they've now broken its utility in El Capitan by hiding thumbnails by default. FFS.)
But apart from that... I think I preferred the Apple apps back in 2009-or-so.
To be honest, I think the latest Apple release cycles have been more about "remove a feature so that we can add it in again and sell it to our users again". Think multi-monitor support, something that worked perfectly in SL and earlier, and then broke fantastically with the full screen apps in .. Lion? ML? One of the two.
Apple always makes up for bugs with newer devices whose faster CPUs and GPUs make the OS code run acceptably. That means buying a new Apple device to get better performance. The older devices are eventually left out of updates, and if they do get a newer OS version, it runs slower.
Apple is driven by an upgrade model: buy a new Apple device every three years or so. In the PC world, Windows 7 can still run on old Pentium 4 systems, and if I'm not mistaken some of them can even upgrade to the 32-bit version of Windows 10. For example, I used to have a MacBook that only ran up to 10.7; 10.8 needed a newer Intel CPU to install. Anyone with an iPhone 4 is going to find the latest iOS slow as well.
It is in Apple's business model to sell customers a new device every few years or so and phase out old Apple devices.
Apple doesn't care if their software isn't the best quality as long as it is easy to use and will keep people buying new Apple devices to run things faster.
I myself like GNU/Linux better than OS X because it runs on older PC systems, runs quite fast, and is of good quality. GNU/Linux is virtually unknown to the average consumer, and when people get tired of Microsoft they usually just buy an Apple device. Apple devices are easier to maintain and use. You've even got toddlers using iPads; that is how easy they are to learn.
Apple has saved up billions just in case they have problems. Apple has done well financially in an uncertain economy where other companies are struggling.
Only Alphabet, Google's parent company, seems to be doing better for some reason. Google's Android needs better quality as well, and since Oracle sued them over the Java APIs, they have to change the way the OS works. The web services seem to earn a lot of money, and Google's AI is very advanced.
I disagree.
My wife's iMac is 6 1/2 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iMac soon.
My iPad 2 is almost 5 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iPad soon.
It is precisely because our older Apple hardware is still working well, and Apple still supports us with the latest updates to that older hardware, that my family is not only sticking with Apple, but we've recently invested in new iPhones.
Apple has earned our trust.
Compare this to today's Apple, where upgrades add "hundreds of features" but feel mostly the same (except everything runs a bit more slowly). There's no coherent vision of what the future of the software should be like.
Apple has always been a fairly closed system, and that didn't bother me more than not having the features I wanted. In El Capitan, it was different. Things didn't work well, and Apple took over my whole system. With SIP (System Integrity Protection) I had no control. It would seem to turn protection back on after being disabled, and it takes a nontrivial amount of time to turn it off: you have to reboot the entire system into recovery mode, wait for it to connect to the internet and download a bunch of Apple shit, select a language preference, type a command into bash, and reboot.
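For reference, a sketch of the SIP dance described above, assuming the stock macOS `csrutil` tool (the disable step only works from the Recovery partition's Terminal, so this isn't something you can script from a normal boot):

```shell
# Check the current SIP state (works from a normal boot):
csrutil status

# The disable step must be run from Recovery:
# reboot holding Cmd-R, then open Utilities > Terminal and run:
csrutil disable      # refused with an error if run outside Recovery

# ...then reboot back into the regular system for it to take effect.
```

This is exactly the round trip the parent is complaining about: two full reboots just to flip one protection flag.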
Deleting apps is difficult, changing settings is difficult, having Siri take up 10% of my iPhone is annoying, removing apps destabilizes the system, and restoring my system from Time Machine reinstalls their system and settings and overrides mine.
I disabled most of Apple's applications and processes, and the system is fairly stable, although I think I went too far with disabling Notification Center. But your point is correct.
tl;dr users are willing to accept a lot for revolutionary changes. Evolutionary changes with only marginal improvements are not going to make me forget that they unpredictably disallow me from using sudo and are fucking up all my devices doing things I don't want them doing in the first place.
Some time, try this yourself:
    sudo opensnoop

I think you forget how crazy slow feature phones were. Opening a GPS app and finding your location could take 5-10 minutes in 2007 on a feature phone.
Classic Mac OS was buggy by design - it didn't have multitasking and memory protection, it was single user... Windows had that same problem before 2000 (well, NT 4, but not many people used that).
I use OS X daily and I use Windows 7 daily. I have far fewer issues with OS X, for whatever reason. My computers don't magically reboot or bluescreen nearly as much. It might happen every 6 months at most, whereas with Windows it probably happens every 2 months.
Apple making OS X Unix-based cut into sales of other Unix companies like SGI, and GNU/Linux cut into SGI's sales (and others') as well.
But making OS X Unix-based solved a lot of problems that Classic Mac OS had that they couldn't otherwise solve.
My pc is my beater car, and it needs repair--regularly.
My Mac is the classic car in the garage, that only gets used for work, or safe places.
Windows, not so much. The only stability issues I've had with windows have been related to poor drivers, almost exclusively from nVidia or ATI/AMD. The equivalent hardware for Apple machines either didn't exist at the time, or was running much less ambitious drivers.
I probably have more issues with my Macbook Air (relating to sleep, hibernate, and wake-up) than I do with my Windows machines these days.
Compare that with the _desktop_ Windows 7 machine. It first crashed intermittently (memory failures), but after I changed the motherboard, it has not crashed at all. But then again, I am not using, for example, the most cutting-edge graphic drivers.
I remember quite some crashes during the Windows XP times, but I've since taken a more conservative approach to hardware and drivers.
You try fitting all that plus a GUI into those constraints.
What's amazing is that it had the features it had and that it worked at all.
Both provide GUIs and rudimentary Web browsers. QNX was full POSIX, too, although the demo disk didn't include a terminal.
Pity that nobody remembers Windows NT4; it was miles better than OS 7. I stopped using the Mac altogether after I started using it.
Not quite. They got a handle on security when Linux lit a fire under their ass.
Competition, true honest to market competition, spurs improvement.
The thing about Apple is that they may have competition on hardware, but they have no competition on software.
If you buy a Mac or an iPhone, you have already thrown money at Apple. But you can easily assemble a PC without Windows and then install Linux on it.
Keep in mind that the latest US warship is not running Windows but RHEL. That is a very big wake-up call for Microsoft; before, we saw the likes of Win2k (a US ship) and XP (a UK submarine) used around the world.
I have my doubts that all 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS influenced Microsoft to do anything but if you have data to show otherwise I'd be interested in it.
But to manage all those desktops you need a server, and with MS the billing is per active user, etc.
Some data says Linux desktop/laptop share is 1.5% (not counting chromebooks)
https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...
Please note also that Android is Linux and iOS is Darwin, a BSD-derived Unix.
Linux- and Unix-based kernels run on more units than Windows does.
Wikipedia: 1.5% [1] NetMarketShare: 1.71% [2] W3Schools: 5.6% [3]
[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste... [2] https://www.netmarketshare.com/operating-system-market-share... [3] http://www.w3schools.com/browsers/browsers_os.asp
But you can install Linux on it for sure :)
When people talk about OS X having issues, they often mean some new feature is a little flaky. Classic Mac OS lacked basic stability and security features like preemptive multitasking and memory protection. Classic Mac OS was just like the pre-NT Windows: crash prone.
OS X is also still the best development platform despite its flaws.
Microsoft is making over four times what it made in its glory days, growing year by year, across a wide range of products and services. Windows and Office account for only half of that, making them a diversified company with plenty of potential for revenue growth. Windows 10 is by far the most successful Windows release ever, with more active installs than OS X (any version). Basically the only place Microsoft is truly failing is phone.
Apple, by contrast, gets two thirds of its revenue from the iPhone. They have nothing else that even comes close, and nothing that could replace it if iPhone sales start dropping. Mac sales are down, iPad sales are down, and the Apple Watch is a dud. Since 1990, Apple has had basically two hits: the iPod and the iPhone. I did not mention the iPad because it is just another iPhone model, which you can tell by its sales slumping as iPhone screen sizes moved up. Success for Apple is rare, and most of what they do isn't all that amazing. The Apple TV isn't going anywhere, even after the refresh. The Apple Watch distinguishes itself from other smartwatches only by its price. Basically the only place Apple is truly succeeding is phone.
Perception is everything. How you choose to look at the facts determines which facts you see. Apple is perceived as strong and Microsoft as weak, but the facts give you the option of going either way.
Regardless, Apple has few excuses for any quality issues. They have the resources, and they have had enough time (given that, aside from the Watch, everything else is half a decade old or more). Personally, my Mac and iPad anno 2015 have the same amount of glitches as my Mac and iPod anno 2005. For me, Apple doesn't seem to be getting worse, but they don't seem to be getting any better either.
I tend to characterize the "reality distortion field" as a magician-like talent for focusing an audience's attention on a particular subject.
As the villain in the Incredibles said:
"When everyone's a superhero, no one will be."
I didn't know that story. Who was the professor? What was the technique?
> This means it can be learned.
Oh definitely. Magicians learn all their tricks, and they are very useful for anyone performing in front of a crowd.
> The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.
Which reader is that?
If you treated 16- and 32-bit Windows nicely -- typically running one program for long periods -- they were quite stable on the plant floor.
So how do we measure this in some valid manner?
When System 7 or 8 crashed, it crashed hard: complete system lockup. And it crashed rather often. There was no recoverable, progressive crash like on Windows.