- M1 [0]
- A14 in the iPhone 12 Pro (first 5-nanometer chip with 11.8 billion transistors) [1]
Previously, they introduced the first 64-bit CPU in a mobile device, which stunned competitors at the time [2].
Not to mention their excellence in hi-DPI displays. In 2012, Apple launched a MacBook Pro with a "Retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent). However, the MacBook Pro 16" remains superior when it comes to sound quality, form factor (i.e. weight), trackpad precision and latency, thermal performance, etc., despite significant investments from Dell in those areas to catch up. [3]
[0]: https://www.apple.com/mac/m1/
[1]: https://www.apple.com/iphone-12-pro/
[2]: https://www.forbes.com/sites/ewanspence/2015/01/21/iphone-5s...
With everything on that list, they made a better version of something that already existed, especially with displays - it's not like they were the ones developing and manufacturing the displays. Even the ARM architecture and instruction set was not created by them.
I think it's important not to get carried away.
Ironically this is now somewhere they could stand to improve.
The MacBook displays are excellent, particularly when it comes to colour reproduction, but for the past several years they default to a scaled display mode. For anyone not familiar, the frame buffer is rendered at a higher resolution than the panel, then scaled down for the display, trading sharpness for screen space.
Evidently the drop in sharpness is imperceptible to most people, but I can certainly tell, to the point where I forego the extra space and drop it back to the native resolution.
For a company that generally prides itself on its displays, I think the right option would be to just ship higher res panels matching the default resolution.
They have also done this with certain iPhone displays over the years, but at 400+ppi it’s well within the imperceptible territory for most people. For the 200-something ppi display on the MacBooks, not so.
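Those ppi figures are easy to sanity-check with a quick calculation. The panel specs below are published numbers as I understand them, and the rounded diagonals make the results approximate:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Approximate published panel specs; rounded diagonals make these rough.
print(round(ppi(3456, 2234, 16.2)))  # MacBook Pro 16": ~254 ppi
print(round(ppi(2532, 1170, 6.1)))   # iPhone 12 Pro: ~457 ppi
```

So the MacBooks do sit in the "200-something" range, while the recent iPhones comfortably clear 400 ppi.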
My understanding of how scaled resolutions in macOS work is that graphics are always rendered at the display's native resolution. The scaling factor only decides the sizing of the rendered elements. Can you point to some documentation that supports your view? I'd like to learn if I'm wrong and understand all the details.
It used to be that this non-native scaling was only an option, and by default the MacBooks ran at the exact native panel resolution. But at some point that changed, so the default is now one “notch” along the “more space” slider. I presume most people preferred it that way, as you don’t get a lot of text on the screen at the native “Retina” resolution. But the sharpness is worse than when running unscaled.
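The scaled-mode behaviour described above comes down to simple arithmetic. This is a hypothetical sketch, assuming the older 15" MacBook Pro's 2880×1800 panel, where the exact 2x mode "looks like" 1440×900 and the scaled default "looks like" 1680×1050:

```python
def framebuffer_size(looks_like, backing_scale=2):
    """HiDPI modes render into a framebuffer at backing_scale times the
    'looks like' resolution, which is then resampled to the panel."""
    w, h = looks_like
    return (w * backing_scale, h * backing_scale)

panel = (2880, 1800)  # assumed 15" Retina MacBook Pro panel

native_2x = framebuffer_size((1440, 900))   # (2880, 1800): 1:1 pixel mapping
scaled    = framebuffer_size((1680, 1050))  # (3360, 2100): must be resampled

print(native_2x == panel)  # True: unscaled, maximum sharpness
print(scaled == panel)     # False: 3360x2100 -> 2880x1800 costs some sharpness
```

In the unscaled mode every framebuffer pixel maps to exactly one panel pixel; in the scaled mode the resampling step is where the slight softness comes from.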
Uhh, both Sony and Dell had 1080p, 1920×1200, and then QHD laptops in form factors down to 13" before Apple. I owned both before I moved to Apple myself.
You can read people here talking about how their laptops have had high res displays when Apple announced "Retina" back in 2012: https://news.ycombinator.com/item?id=4099789
But it’s not just the resolution, it’s that Apple made such a high resolution usable via 2x rendering, and did so immediately for the entire system and all applications.
You can also get a 4K UHD Dell at 13".
> In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+
> But it’s not just the resolution
It was, above. Now it's the resolution and the ecosystem. "Apple did it first". "No they didn't." "Well, they were the first to do it right" ( for varying definitions of "right").
I have no particular horse in this race. In fact, my entire home ecosystem, from Mac Pro to MBP to iPad, iPhone, and Watch, would lean me in one particular direction, but ...
It's amazing how far we've come
There are some technological improvements that are so transformative (wifi, flash storage, high-resolution/"retina" display, LTE data, all-day battery life) that once you try them you never want to go back.
Then there are the changes that make you go "hmm..." (butterfly keyboard, touchbar without a hardware escape key, giant trackpad with broken palm rejection...)
- Phones (original iPhone way ahead of competitors)
- MP3 players (original iPods)
- Tablets (pretty much the only serious tablet as far as I can see)
- Smart watches (Apple Watch still defines the category)
- Ultrabooks (first MacBook Air)
- All-in-one desktops (iMac)
- Mobile CPUs (Apple silicon has been way ahead for years)
- Laptop CPUs (M1)
This doesn't just all happen by making existing things more "User Friendly". This takes real innovation to pull off.
I'm ex-Apple and an Apple fan as much as anyone, but I also have the benefit of being old. Not to take anything away from Apple's collective accomplishments, but in many of these categories I'd say they "redefined" more than "defined".
There were many smartphones before the iPhone (the Palm Treos were great), many MP3 players before the iPod, many tablets before the iPad (the Microsoft Tablet PC came out about a decade before the first iPad), all-in-one PCs go back 40 years now, etc.
Pretty much, and they did a pretty good job too.
It's kind of the difference between invention and innovation.
Apple certainly does invent things, but they are a superlative innovator.
Xerox never made the Macintosh because they missed key innovations such as regions, the Finder, consistent UI guidelines, and the ability to put a usable GUI OS in an affordable package.