Even if you momentarily ignore the reasons why someone thought this could be a good idea, why not do it in one of the pre-releases or betas??? Doesn’t look like the kind of thing you’d want to do in a last minute change.
The real problem in my opinion is the fact that you cannot go back after a macOS upgrade. So if something like this happens, you literally have no option but to wait for Apple to release a fix, if they want to do it at all.
code review: self-reviewed
test plan: this change is so obvious no tests are needed

The good kernel engineers are working on iPhone or Vision Pro, not on macOS
I do not have very kind words for Apple's dev teams today. Charitably I am trying to think that screwups happen, but this is bad and it is very hard to see how anyone thought merging it into an rc was okay.
Windows was a virus laden mess and was not useful for running Linux apps.
And besides the flaws of the other OS’es, OS X had some of the nicest window management features (Expose from the Snow Leopard is still my favorite window switcher), was a UNIX and had a thriving indie development scene (which was basically killed by iOS…).
Since then OSX has completely languished as a developer platform. It’s not clear what you can do today as a developer to make your life easier that you could not a decade ago on OSX. And in fact, the destruction of the indie dev scene, combined with the many heavy handed security restrictions of dubious benefit have made it a far worse dev environment than a decade and a half ago.
Further, Linux DEs have greatly improved and Windows now supports Linux development.
The Mac ecosystem has seen a complete turnaround where you now buy a Mac for the hardware, not the software.
1. Terminal is very usable, compared to Windows cmd. Modern Gnome Terminal is good, though.
2. Cmnd+C for copy, Ctrl+C for SIGINT.
3. Touch ID instead of root password, which works with a Bluetooth keyboard as well, and that's with absolutely minimal configuration: uncommenting a single line.
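For context, the single line in question lives in sudo's PAM configuration. A hedged sketch only: the exact file and entry ordering vary by macOS version (newer releases offer /etc/pam.d/sudo_local precisely so the change survives OS updates), so treat the path and surrounding lines as assumptions:

```
# /etc/pam.d/sudo_local (or /etc/pam.d/sudo on older macOS)
# Adding/uncommenting the pam_tid.so line enables Touch ID for sudo:
auth       sufficient     pam_tid.so
```

After saving, a new terminal session should prompt for Touch ID instead of a password when running sudo.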
2. Ctrl+C works for copy when text is selected, otherwise it sends SIGINT.
It is subpar, however, when compared to Windows Terminal.
Can't compete with the streamlined ease of highlight-to-copy. I never use a keyboard shortcut to copy text from a terminal (except for yanking in vim / evil-mode)
2. I remember the XFCE4 Terminal using Ctrl-Shift-C and Ctrl-Shift-V for copy/paste and liking it, no more SIGINT by mistake. But IMO a minor gripe, you can remap keys for copy/paste in most self-respecting terminal emulators anyway.
3. I agree on that but passwordless sudo saved my sanity and I don't care anymore. If I install a virus then I had all the troubles coming and I'll take responsibility. ¯\_(ツ)_/¯
However, at home I have always been a Windows/Amiga/UNIX head, with Linux being the cheaper path to that UNIX experience. Had Microsoft not messed up the POSIX layer, I probably would never have bothered.
For some time I even tried to acquire one of those nice Toshiba laptops running Solaris that Sun used to sell.
I'm always confused by such statements, because what KDE offers on Linux easily dwarfs every window management concept in every major OS. I always need to install additional third-party apps (e.g. Rectangle on macOS) to get a poor-man's equivalent of KDE-style window management functionality.
Nothing came close to OSX’s Expose 15 years ago.
OSX has gone backwards in terms of windows management since then.
Linux is far superior. Even Windows is slightly better because at least windows snap to edges.
I often have to re-center the balance, it's driving me nuts.
Your explanation makes a lot more sense as x86 is probably the only time I’m pushing the cpu usage high enough.
The popping is darn annoying
The hardware is not even that good. I presume people like it because it looks slick and serves as a status symbol.
The list of specific annoyances and bugs is likely in the 3 digits by now, and I've only used it for half a year.
The worst of all was getting the M2 soft-bricked by an update, because I had changed the display refresh rate to 60Hz. I did that because the tween duration when moving between desktops was for some reason tied to the refresh rate: about a 2 second tween on 120 Hz until input control returns, and one second on 60 Hz. Impressive for such a thing not to be picked up by QA.
I use the command line almost exclusively for file management and avoid the GUI file manager like the plague.
How to lose your work using Undo Copy in Windows
The answer to most "it's a bit dumb that MacOS doesn't let you / forces you to" is "install app X, Y, Z".
- Don't like that apple's "Music" app pops up when you connect a Bluetooth headset? => Install an app.
- Want to be able to "alt tab" through windows of the same program, or in general not be uselessly flawed? => Install an app
- Want to be able to move and resize windows without aiming at the exact edge pixels of the window? => Install an app.
- Want to move & resize windows to very common places and sizes on a screen? => Install an app.
- Want global hotkeys for whatever? => Install an app
- Want a software package management system a'la apt? => Install an app.
- Want to rebind keys or make things like Home/End not be dead keys... because apple keyboards don't have that, and they cannot be bothered with it. "you should be using "⌘ + →" anyways... or, I suppose it depends on the window"? => Install an app.
- etc....
You don't get any of these annoyances with Linux / Gnome. "Why not use that if you hate MacOS so much?" I pretend to hear you say. First of all, because of anti-competitive reasons by Apple, I sort of have to. Secondly... something something angry old man yells at clouds.
They could have gone down the path of translating the Windows UI APIs, though I think it's better that they left it as is. The bigger issue, however, is that there are different systems depending on what it is you want to configure, and it's all duct-taped into an ancient registry that I'm just amazed only breaks as often as it does.
Not to mention that the thing Windows was always supposed to be better at was driver support. On Windows, you have to manually source drivers and try your best to avoid all the bloatware that comes with them. Windows itself might also decide to replace a driver with an older one (version, release date...). WiFi drivers didn't work the last time I upgraded the mobo either.
As for Linux. Completely agree. Gnome is consistent, and gets out of the way often enough. There are some annoyances there too. I have my 90 y/o grandma use Linux/Gnome, because that's what Just Works these days.
It's not unusable if you at least have one of those medium density 4K monitors, but it feels like a step backwards if you're used to Windows which still (mostly) supports subpixel font rendering for crisp text at 100% scale, and can render natively at 125/150/175% scales.
As you know, subpixel rendering only works on a very specific kind of display (i.e. LCD), since it takes advantage of, and hence relies on, the precise characteristics of how the pixels are physically laid out in the display.
This means that subpixel rendering fails on displays with different layouts. The most recent example has been newer OLED displays (I think QD-OLED), which have a different pixel arrangement, and then you ironically had Windows users complaining that the text looked jagged. Although you can change the subpixel rendering algorithm to match QD-OLED, the unfortunate problem is that this doesn't work for all applications, since it depends on which UI engine each one uses; Windows is a giant mess here.
Long story short, I can see why macOS removed subpixel rendering: it was basically a workaround from a time when displays were less dense, all LCD, and all shared the same physical layout. Nowadays high-pixel-density displays are a lot more common, and with those you don't need subpixel rendering at all (and that works with all of the different physical pixel layouts).
At least Dell supports their hardware: have you tried updating the monitor firmware or submitting a report?
I gotta disagree hard here. Macs have by far the most obnoxious and temperamental WiFi stack I've ever experienced. Constant disconnects, have to turn it off and on to get it to bother looking for APs again. All of them constantly trigger bad experience scores in UniFi.
Absolutely subpar compared to any of my Linux devices, even the raspberry pi jammed inside a metal box.
My opinion and experience is the exact opposite. In fact, I switched to Macs BECAUSE of how good macOS was for development and just general work and daily life.
Around 10-12 years ago I got an iPad, my first ever Apple purchase, as a gift for my aunt. I loved how simple and clean iOS was and found the apps and games interesting, so I thought I'd dabble in iOS dev. I was on Windows 8 at the time (and already sick of Microsoft's bs) so I downloaded a VMWare image for Mac OS X Lion.
As the days went by I found myself spending more time in macOS than in Windows, and enjoying it! A month later I bought my first ever MacBook and never looked back.
Well, sometimes I do look back at Windows, in a VM on macOS, just to try some games, and man, it's still a sad joke in 2024.
It's been a long time since I ran a Macbook, but this was my biggest problem. The weird uncanny valley where its almost the same but then not.
WSL has problems but there's a very clear line in the sand between Linux and Windows and you know what you're getting.
Tried updating to the latest Xcode and learned that my Mac's storage is almost full. Why? iOS simulator images were taking a whopping 40 GB of space, even though I didn't target those iOS versions or test on those simulator devices. I uninstalled all the images except the one I build for. Then I tried updating Xcode again; the issue with creating Objective-C files was fixed. But then it forced me to download iOS 17.2 again, along with tvOS and a bunch of other extras. Now my space is close to full again. Why, Apple? Why do I need iOS 17.2 when I build for 15.4?
I have a $2000 AUD LG monitor that Mac OS just occasionally decides to overdrive (or something) and cause instant but temporary burn in. I'm not the only one - you can find others on Reddit.
While my work Windows laptop might be faster, it's certainly not the one I'm going to pick in a pinch or when I want to travel with just one laptop.
The best mobile configuration I know right now is a Macbook Pro + Parallels. Even with all of its deficiencies.
Are there any good Linux laptops with similar experience as Macbooks when it comes to power management and time from lid opening to usable state?
Macs are finicky with hardware (but HDMI sucks by definition; they assume you bought the cable and monitor to pirate movies, not to do some work).
However the GUI actually works and if you spend a week on windows 10+ you'll remember why people buy Mac OS.
Personally I have a Mac for stuff that requires a GUI and a headless linux box that I ssh into. And I switched to Macs from ... Linux on the desktop.
Edit: docker is shit because they just install a Linux VM and run their Linux stuff in there. Same on Windows I guess.
macOS is just so clunky. It tried to be so smooth all the time but just ends up being annoying.
So far I haven't managed to turn off the firewall scare popups, I did manage to remove that crap in the task bar that pops up the weather and selected news covering half the screen if you hover in the wrong place, I may or may not have turned off the OneDrive upsell, and I also got a full screen message to upgrade to 11 for free when booting once.
Great user experience overall. And don't tell me I can spend another week to turn those off, is Microsoft paying for that wasted time?
Is the problematic corporate Mac an M-series Mac or Intel?
M-series have been great in my experience. I did used to get random full system crashes on Intel Macs which haven't happened in a few years on M1/M2.
https://stackoverflow.com/questions/66408996/python-not-foun...
> using a third party second display
Has always worked fine for me
> Docker sucks
This is Apple’s fault how? It also sucks on Windows.
> posix compatibility is technically there but isn’t really useful
What does this mean exactly? Can you find an example where it’s not useful? In my experience most of the command line applications I would want on Linux are easily installable via Brew and I can choose all the same shell environments as Linux/Unix.
> The thing randomly loses network and only rebooting fixes it.
On your machine. Not my experience with any Mac I’ve owned. That isn’t expected or common behavior.
> I reboot my corporate Mac more often
I’m going to guess this is because your IT department sucks. I never reboot except for OS updates.
I'm finding this with software everywhere. Products keep doing the same old stupid shit they did when they were first released. "Refinements" are poorly-designed cruft.
Is there anyone in charge of the OS X experience? There seems to be a lot of résumé-driven development (features that can be illustrated with smiling people in a video but don't really work all that well) and not so much interest in the core UX.
I still find it better than Windows, but the gap between what it could be and what it is keeps growing.
1. There's a button on my M3 Mac keyboard that says 'delete'. It deletes stuff everywhere else, but welcome to Finder, this simple button doesn't delete a file or a folder. They thought giving it a two/three keys combination was a better idea.
2. Similarly, they thought you rename file/folders more often in a day than you open them. Why else would they make you press two keys to open one, and the most common single button in the world to open files (Enter/return) to rename one instead?
3. No 'Cut' (I know the alternatives). One might find it surprising but there are fans that defend even this move - they say it's because this is more "intuitive". You only copy everything first and only at the time of pasting you decide whether you want to move it or copy it. I say, if that's really the case, why does every other app and Editor (including the ones made by Apple) have a Cut option? Why don't we always follow this more intuitive method of "copying" first and then pressing the Option button while pasting. Let's remove Cut from everything and see how intuitive people find it.
4. By default, the Finder doesn't even tell you where you are. That's a basic requirement from a File Manager. Sure, fiddle with the settings and at some place you'll find an option to kind of enable that.
5. No option to quickly create a text/other file in a given folder. If you've struggled enough and enabled the view where you're able to see where you are at the moment, there's a _chance_ you'd also see that from that view you can actually go to Terminal directly in that folder. Go there, and type `touch <filename>` to create a file in that folder.
6. You got a full path to go to somewhere on the disk. You quickly open Finder. Oh, the default view doesn't even have a place to paste it and hit Enter. Who could have thought to hide it? Same problem with the native 'File Open' dialog that's used by all the other apps on the system. Even if you have the full file path, unless you go to settings you won't find a way to go to that file directly.
7. No easy (if at all) way to persistently map a network drive that automatically remaps when the network drive is available. You have to keep connecting to the SMB server again and again.
8. Side bar folder shortcuts get removed when the folder is deleted and recreated for any reason. You have to recreate them. Not sure who made all these decisions or if they were even thought about.
9. No straight way to even 'Refresh' the files in a folder. Try going out and in, closing and reopening Finder and just 'hope' that it will update and show the newly created files or changed file properties outside. Many times it just doesn't.
10. 'Get Info' allows you to also 'Set' (a lot of) Info. This is UX 101. They could have just named it `Properties` instead.
11. Hell, you can't even maximize this app window by double clicking on the Title bar, unlike for example another Apple made app 'App Store'. No consistency.
12. In List view there's no padding, I can't even find a place where I can right click and paste a previously copied file in the 'current folder', without it hitting a subfolder and pasting the files into that instead (assuming the folder has many folders inside). I'm surprised no one found it in internal user testing.
These are just off the top of my head; I'm sure I can find more if I spend some time. There might be involved workarounds for some of these, but there's no way we can call this an 'intuitive' interface. And this is just one application in the whole Operating System.
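For the network-drive complaint (item 7), one workaround I know of is the built-in autofs automounter, which remounts the share on demand. A hedged sketch: the map name, mount point, server and user here are all placeholders, and embedding a password in the map file is a bad idea (a keychain-backed URL without credentials is safer):

```
# /etc/auto_master: add one line pointing at a direct map
/-    auto_smb    -nosuid

# /etc/auto_smb: mount the share on first access (placeholder names)
/Users/me/mnt/projects    -fstype=smbfs    ://user@server/projects

# then reload the automounter:
#   sudo automount -vc
```

Any access to the mount point then triggers a (re)mount, which approximates Windows-style persistent drive mapping.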
The hardware definitely keeps getting better and yet the software keeps getting worse. sigh.
I mean they have even screwed up a nice app like iBooks. I used to use it for reading ePubs all the time, but now I dread opening up one. Lags like crazy. And so many crashes and reboots needed. Keep submitting crash reports but fairly certain that no-one ever reads them.
Yes, remarkably, the Windows desktop needs fewer reboots than macOS today. I can anecdotally confirm this with 2 Windows PCs, 3 Windows laptops and 3 MacBooks in the family.
It has been a pretty frustrating experience at times. Most of the time it's _fine_, but the problems after updates, the Docker bugs, the certain libraries that we cannot install...
On the other hand, it was never perfect with Linux either. But that was expected. And I can say that macOS does not deserve the reputation it has.
Overall, kind of a mixed bag. There are some very nice aspects to both the hardware and software, but some that are jarring and make me think "this is not really meant for professional users". Like the atrocious window management (which admittedly can be fixed with a couple of free applications).
One that says don't update macOS, to avoid breaking Java. Another that essentially says upgrade macOS to the latest version within x days or else the issue will be escalated.
It is going to be quite a hassle for IT teams across companies to deal with this problem.
As Gale and Evelle bang in through the door. Evelle holds a
shotgun; Gale holds a shotgun in one hand and Nathan Jr. in
his car seat in the other.
GALE
All right you hayseeds, it's a stick-
up! Everbody freeze! Everbody down
on the ground!
Everyone freezes, staring at Gale and Evelle. An Old Hayseed
with his hands in the air speaks up:
HAYSEED
Well which is it young fella? You
want I should freeze or get down on
the ground? Mean to say, iffen I
freeze, I can't rightly drop. And
iffen I drop, I'm a gonna be in
motion. Ya see -
GALE
SHUTUP!
Promptly:
HAYSEED
Yessir.
GALE
Everone down on the ground!
EVELLE
Y'all can just forget that part about
freezin'.
GALE
That is until they get down there.
EVELLE
Y'all hear that?

Haha, this article is quite something :D
The Java Applet was removed from the safari browser. That is unrelated to java apps running on the desktop.
> With macOS 14.4, when a thread is operating in the write mode, if a memory access to a protected memory region is attempted, macOS will send the signal SIGKILL instead.
What is bizarre to me is that Oracle relied on receiving SIGSEGV as normal mode of operation. That should have been a hint where things are going, no?
It's useful for other things as well. I've used SIGSEGV to emulate hardware interrupts. Normal execution wouldn't trap and there's no need for tests + branches (= normally no slowdown), but when an interrupt occurs a specific often accessed page is marked unreadable.
> Write attempts to memory that was mapped without write access, or any access to memory mapped PROT_NONE, shall result in a SIGSEGV signal.
>
> References to unmapped addresses shall result in a SIGSEGV signal.
Handling a SIGSEGV so that the program can continue execution normally needs some OS-specific code. On Linux there's also userfaultfd to suit this need better.
A JVM's use of SIGSEGV might include platform-dependent details for recovery. But for simple application usages (e.g. eliding inlined bounds checks in a performance critical loop operating on an array) longjmp can suffice for recovery. POSIX very carefully defines async-safety and longjmp to permit jumping out of a signal handler and resuming normal execution, provided certain constraints are met, such as that the signal did not interrupt a non-async-signal-safe function.
So you have to disable signals prior to doing anything "non-async-signal-safe" and re-enable them thereafter? That's a pretty big "but"...
Not bizarre at all, this is how the runtime has always operated, as anyone who's ever attached a debugger to a Java process knows. The SIGSEGV handler is also responsible for handling NullPointerExceptions IIRC.
> ... the JVM can intercept the resulting SIGSEGV ("Signal: Segmentation Fault"), look at the return address for that signal, and figure out where that access was made in the generated code. Once it figures that bit out, it can then know where to dispatch the control to handle this case — in most cases, throwing NullPointerException or branching somewhere.
https://shipilev.net/jvm/anatomy-quarks/25-implicit-null-che...
This means a workaround is running java with -Djava.compiler=NONE, no?
1. There is very little you can safely do in a signal handler. For a threaded application, that pretty much boils entirely down to setting a bit and leaving it at that. If they did anything more, the behavior is undefined.
2. The memory state of a program receiving a SIGSEGV is often undefined/garbage, and attempting to execute further at this point is at best unsafe, at worst tramples on state further, continuing execution in a broken state and destroying all evidence that would be useful for debugging, whereas a coredump preserves the state at the time the issue occurs.
There are cases where you need to catch SIGBUS, such as if an anonymous file has been truncated after you mmap'ed it.
The code in question takes into account that the value read might be garbage. See the big comment here: https://github.com/openjdk/jdk/commit/29397d29baac3b29083b1b...
On current CPUs and operating systems, this is not an optimization, so the code was removed earlier this year: https://bugs.openjdk.org/browse/JDK-8320317
This is opposed to calling `sigwait` or similar to actively suspend and wait for a signal, which is not possible to do here.
Granted, it may be that the stars align and their implementation works in practice, but that does not make it any less bizarre.
You can actually do pretty much anything you want, it's just the C library that uses a lot of global state and internal memory allocations, which messes things up. The core syscall API and any reentrant code you write yourself are not affected.
>The memory state that a program receiving a SIGSEGV in is often undefined/garbage
That may be true for arbitrary segfaults caused by bugs, but the JIT has 100% control over what instructions to emit, it is not restricted by ABIs or platform-specific issues, so there is no problem to use SEGV as a signaling mechanism.
It's mostly fine, though. The crashes are rare, and since everything auto-saves, you're not really losing anything. It's just an "oh, okay." moment.
Obviously it'll be good when it's fixed, but on my personal list of impactful bugs, this doesn't crack the top 10.
News like this is the major reason why I only apply updates after a long waiting period, to see if anything blows up for others. Why do companies use their user base as testers?
But then you are accepting that you are running an exploitable OS, since you are lacking the latest security fixes. Not sure if that's an acceptable tradeoff.
As other posters said: macOS might have had an edge over Windows and Linux before but that's no longer the case for a few years now. I'll definitely be looking for ways to use 5K display with my Linux laptop and will likely make a full transition to Linux in the next year or two.
Macs have amazing displays. So I'll use mine as thin clients I suppose. My eyes are happier with an Apple display so I'll use them for that alone.
Apple can still turn this around, but their bogus security claims, which serve mostly to annoy devs, are them shooting themselves in the foot and making a very uncomfortable bed to sleep in just a few short years from now. I hope somebody at HQ understands that and is able to see the problem before too many people leave.
I suggest the Oracle blog as an alternative.
I thought it was clear, but I have replaced the "this" in my comment anyway.
In that case, I 100% agree with you, the Oracle article seems much better than the Apple Insider article.
This is why most enterprise workplace tech teams don’t roll out any OS level updates immediately. Regardless of whether they are on windows or macOS. Also a good idea to disable automatic updates on all devices that you use daily.
This is misleading. What was deprecated was the browser Java plug-in distributed by Apple. That’s very different from “deprecating Java”.
They basically bamboozled us with fancy wallpapers and gave us this immensely substandard software.
macOS Sonoma 14.4 introduces new emoji as well as other features, bug fixes and security updates for your Mac.
Emoji
• New mushroom, phoenix, lime, broken chain and shaking heads emoji are now available in the emoji keyboard
• 18 people and body emoji support facing the opposite direction
This update also includes the following improvements and bug fixes:
• Podcasts: episode text can be read in full, searched for a word or phrase, clicked to play from a specific point, and used with accessibility features such as Text Size, Increase Contrast and VoiceOver
• Safari: the Favourites Bar adds an option to show only icons for websites