More than likely designers are making up work to justify their jobs. Not good for your career if you admit the desktop interface was perfected in ~1995.
Apple looked at innovations in hardware form factor and, rather than trying to out-innovate in that sphere, said, instead: how do we make something in software that nobody would ever try to imitate, and thus position ourselves as the innovators once again?
And the monkey's paw curled and said: Liquid Glass is a set of translucency interactions that are technically near-impossible to imitate, sure, but the real reason nobody will try to imitate is because they are user-hostile to a brand-breaking extent.
And Apple had nobody willing to see this trainwreck happening and press the brakes.
It also contributes to obsolescing older hardware.
iOS 7 relied heavily on blur effects, which was a flex at the time: Apple's graphics pipeline was far more efficient than Android's. This came on the heels of Samsung Xeroxing their designs, and Apple wanted a design that would be too expensive for competitors to emulate without a costly battery hit. Liquid Glass follows in this tradition.
And similarly to iOS 7, the move to flat design was predicated on the introduction of new form factors and screens. Flat design lent itself well to the incoming range of screen sizes and ratios; previously there were really only one or two sizes, making it easy to target pixel-perfect designs. As Apple moves to foldables and beyond, this design flexibility is once again required.
As for no one trying to emulate it, I'm not so sure: OriginOS 6 ripped it off back in October.
Whereas what we really needed was a stable release version (now a year late from the originally promised date) so we can build out UI components for the content editors to use that don't require constant design tweaks.
You know the designers are:
a) Just fucking around having fun
b) Making busy work to drag it out as long as possible
As it's now 4 years since they began working on the "design system", there's a good chance it will get canned because some more modern design will come along that they'll want to use instead.
This has been solved with a button that switches the layout between the two designs; when I'm making changes, it is sometimes necessary to flip back and forth between the two mid-change.
Even if Apple is right, why shoehorn the future into the present on devices unsuitable for its new paradigms? The iOSification has also only worsened the macOS UX. It's one of the reasons I moved to Linux with KDE, which I can configure as I like.
If they want to make the AR OS of the future, then build it on the Vision Pro, where it belongs.
Wearable products, outside of headphones, have a decade-long dismal sales record and an even more abysmal user-retention story. No board is going to approve significant investment in the space unless there's a viable story. 4x resolution and battery life alone are not enough to resuscitate VR/AR for mass adoption.
In recent weeks, I’ve been getting push notifications about VP.
They hired Alex Lindsay for a position in Developer Relations.
And there’s the M5 update.
Just remember, it's a lot cheaper than the original Mac (inflation adjusted). Give it 40 years; hell, given the speed of change in tech these days, it won't even take 10.
Not to mention the fact that first, you have to get to a point where AR wearables are commercially viable, and we don't seem to have hit that point yet.
(Apologies to @cyberge99 if my tone comes off intense, this is not to come at you but rather is just me venting my frustrations with Apple. I think you are correct in your assessment of the idea here.)
Everyone I know describes this use case first: "It will be awesome when it replaces my 2x34" screens." I described it to the salesman when he asked why I wanted to try it. He never showed it. I gave him 0/5; he complained, and I explained that this was specifically what I had asked for. You can emulate one screen on the Vision Pro, but it's absolutely obnoxious about making everything about apps and iPhoto's 3D whatever. Users want this. Apple is hell-bent on not addressing that use case, and on addressing family use cases first.
Imagine they find a proper UI to visualize an infinite TypeScript file. Something like: flinch and you find yourself in a method; look elsewhere and you immediately see another method. Make it viral by making it possible to write files in a way that isn't practical for normal-screen users, like the old "max 1 screen height" limit. See your team in the corners of your vision. THE tool for remote working.
Workplaces would look futuristic. Your experience at the workplace would be renewed thanks to Apple.
And then, reuse the desktop’s UI on VisionPro instead of the desktop using VP’s concepts.
But no, Apple prefers killing off the Vision Pro and imposing Liquid Glass on everyone. (Now awaiting my threat letter from Steve Jobs for daring to suggest ideas.)
No, this is the fault of a company and industry with way too much money and not knowing what to do with it.
So they hired a bunch of artists who would otherwise be carving wood in a decrepit loft somewhere after taking half a mushroom cap. These people now decide how computers should operate.
I remember watching a documentary from the 80s where Susan Kare explained how every pixel in the Macintosh UI was deliberately put there to help the user in some way. One lady did the whole thing, the whole OS.
Now we have entire teams of people trying to turn your computer into an avant-garde art piece.
…brother, you’ve just described the history of the personal computer and the Internet. It’s not the hippie artists causing this problem, I promise you that.
It seems much more likely that the driver here was to produce a UI that is resource-intensive and hard to replicate unless you control the processors that go into your devices as well as the entire graphics stack that sits above them. It seems created to flaunt a "go ahead and try to copy this" at Google and Microsoft.
VisionPro was meant to literally overlay its interface over your field of vision. That's a very different context and interaction paradigm. Trying to shoehorn the adaptations they made for it into their other, far more popular interfaces for the sake of consistency? It's absurd.
Things like “human interface guidelines” get written by nerds who dive deep into user studies to make graphs about how target size correlates to error rate when clicking an item on screen.
Things like Liquid Glass get designed by people who salivate over a button rendering a shader backed gradient while muttering to themselves “did I cook bro???”
They’re just two very orthogonal cultures. The latter is what passes for interface design in software these days.
Edit: On Linux, you have desktop environments like LXQt for this. Unfortunately, last time I checked, Wayland was not supported.
Acrobat reader still performs like a lead balloon though, even a miracle can't fix that one.
Contrast this with the "os" of my LG oled monitor. It seriously takes 5 seconds to open the settings menu.
I'm not sure what they use these days, but 10-15 years ago the MCU in a monitor was likely to be a ~10MHz 8051.
- Arguably the dock, though it's probably contentious
- Ubiquitous instant search (e.g. Spotlight)
- Gesture-based automatic tiling of windows to left/right side of the screen, tiling presets
- Smooth scrolling, either via scroll wheel or trackpad
- Gesture-based multitasking, etc.
- Virtual desktops/multiple workspaces
- Autosave
- Folder stacks, grouping of items in file lists
- Tabbed windows
- Full-screen mode
- Separate system-wide light and dark modes
- Enhanced IME input for non-Latin languages
- App stores, automatic updating
- Automatic backup, file versioning
- Compositing window managers (Quartz, Compiz, DWM, modern Wayland compositors...)
- The "sources bar" UI pattern
- Centralized notification centers
- Stack-view-controller-style navigation for settings (back/forward buttons)
- Multi-device clipboard synchronization
- Other handoff features
- Many accessibility features
- The many iterations of widgets
- Installable web apps
- Virtual printers ("print to PDF")
- Autocomplete/autocorrect
- PIP video playback
- Tags/labels
- File proxies/"representations"
- Built-in clipboard management
- Wiggle the mouse to find the pointer
None of these can be said to be in their final/"perfect" form today, and there are hundreds if not thousands of papercuts and refinements that could be made.
The real issue is probably management misunderstanding designers' jobs and allocating them incorrectly. The focus should be more on interactions and behaviors than on visuals.
The Dock came from NeXTSTEP circa 1989. It had square edges and no Happy Mac. (So did Mail.app, TextEdit, some of the OS X Finder, and a whole bunch of other things.)
To the untrained eye it looks like an Apple innovation because most people couldn't afford NeXT computers unless they worked in a university or research lab.
But since then, each new version of Windows has made me more and more grateful for not having to deal with that dumpster fire on my personal devices.
The saddest part to me is my strong impression that it wouldn't take that much work to turn Windows into a much better system. But for whatever reason, Microsoft is not interested in making that happen. Maybe they are incapable of doing so. But the system itself is not the reason.
Though if we could get the newer settings panel of macOS a few versions back, before they inexplicably ruined the best OS GUI settings interface I’ve ever used, that’d be great.
I don't need or want art, eye candy, or animations. I need to get work done and the rest of the OS to stay tf out of my way.
User interfaces are not art.
Do UI designers think that way?
I imagine some see it as engineering - make things work efficiently for the users. Others see it as art. The outcome will depend on which group gains the upper hand.
1. "Picasso, that's the wrong way to depict a human nose."
2. "Picasso, that's the wrong material, that vibrant paint is poisonous and will turn to black flakes within the year and the frame will become warped."
I interpret parent-poster's "interfaces are not art" as meaning they're mostly craftsmanship.
It may not be quantifiable enough to be labeled "engineering", but it's still much less-subjective and more goal-oriented than the "pure art" portion. All these interfaces need to be useful to tasks in the long term. (Or at least long enough for an unscrupulous vendor to take the money and run.)
> No project manager ever got promoted for saying "let's keep things the same".
Designers at Balenciaga don't have to justify their jobs when they make oversized t-shirts, neither do the ones at Apple.
In actual tools, the form and function are strongly connected. Tools of competing brands look pretty much the same, except for color accents, because they can't look any different without sacrificing functionality, performance and safety characteristics.
You don't see power tool vendors trying to differentiate their impact drivers by replacing rubber handles with flat glass because it's more "modern", because it would compromise safety and make the tool unsuitable for most jobs its competitors fulfill. This happens in software mostly because the tools aren't doing much of anything substantial - they're not powerful enough for design to actually matter.
Maybe stakeholders were calling the shots and everyone was like, "Fine. If you want us to reuse the same icon for different purposes, you're the boss. We are done trying to explain why this is a bad idea."
The anti-design bias in this forum is genuinely unhinged. I see some saying the entire destruction of the natural world stems from design lol.
Things got pretty bad. More than 95% of all employees (and I'm guessing 99% of designers) were using iPhones at the time. There would be rough edges all over the Android app, but as one of our designers said "people with taste don't use Android".
Imagine knowing that most of your new users were getting a subpar experience, and that not being enough motivation to expense a flagship Android and drive it daily.
But the new users kept coming, and despite mostly being Android users, they still used the product. Turns out that legacy taxis are themselves an ugly interface, and ugliness is relative.
Probably the vast majority of profitable Uber users were still on iOS, though, like most apps?
> but as one of our designers said "people with taste don't use Android".
Based lol
Probably true at the time.
I don't think anyone seriously believes Uber, Airbnb and Robinhood won because of "beautiful apps".
RH made a lot of investment tools accessible to people whose attitude is "I just want to buy stock in some company". I used Tastytrade for a while; their mobile app has all the functionality, but realistically you'll just use it to get an overview of your portfolio.
Unfortunately, most of the SW industry isn't even aware of the difference:
For beauty you hire a graphic designer
For usability you hire a PhD in cognitive psychology
"Behind every great fortune is an equally great crime."
https://www.britannica.com/biography/Honore-de-Balzac/La-Com...
It is 2026 and UIs are still abysmally slow in many cases. How is that even remotely possible? Now, with that in mind, consider (just for a moment) why people might think that UX people don't know what they're doing.
Because UI/X teams were separated from engineering. (Same thing happened with modern building architecture)
It's fundamentally impossible to optimize if you're unaware of physical constraints.
We need to get rid of the "It's okay to be a UI/UX designer who doesn't code" cult. (Looking at you, Adobe and Figma...)
Yes. Yes, it has. I'm currently in the midst of a building project that's ten months behind schedule (and I do not know how many millions of dollars over budget), and I'd blame every one of the problems on that. I - the IT guy - was involved in the design stage, and now in construction (as in, actually doing physical labor on-site), and I'm the only person who straddles the divide.
It's utterly bizarre, because everyone gets things wrong - architects and engineers don't appreciate physical constraints; construction crews don't understand functional or design considerations - so the only way to get things right is for someone to understand both, but (apart from me, in my area - which is why I make sure to participate at both stages) literally no one on the project does.
Seen from a perspective of incentives I guess I can understand how we got here: the architects and engineers don't have to leave their offices, and are more "productive" in that they can work on more projects per year, and the construction crews can keep on cashing their sweet overtime checks. Holy shit, though, is it dispiriting to watch from a somewhat detached perspective.
We have convinced ourselves as an industry that this is not true, but it is true.
I don't think designers who don't code are really a problem. They just need to dogfood, and be led by someone who cares (and dogfoods as well).
It's slow, bloated, buggy and ugly. Probably one of the worst apps running on my phone.
But there was a time when their app was native and was actually quite good.
In my opinion, this article had very clear and direct criticisms; they were hardly "anti-design bias". The increase in visual clutter is, for sure, a net loss for macOS Tahoe.