I love you Arnd. More seriously, this will become an issue when someone starts the process of integrating Rust code into a core subsystem. I wonder whether this will lead to the kernel dropping support for some architectures, or to Rust doing the necessary work. Probably a bit of both.
The long-term solution is for either of those to mature to the point where there is Rust support everywhere that GCC supports.
I'm curious though, if someone has an ancient/niche architecture, what's the benefit of wanting newer kernels to the point where it'd be a concern for development?
I presume that outside of devices and drivers, there's little to no new developments in those architectures. In which case, why don't the users/maintainers of those archs use a pre-6.1 kernel (IIRC when Rust was introduced) and backport what they need?
There’s an asymmetry between what the retro computing enthusiasts are asking for and the amount of effort they’re willing to put in. This niche hobby benefits from the free labour of open source maintainers keeping support for their old architectures alive. When the maintainers propose dropping support because of the cost of maintenance, the hobbyists rarely step up. Instead they make it seem like the maintainers are the bad guys doing a reprehensible thing.
You propose they get their hands dirty and cherry pick changes from newer kernels. But they don’t want to put in effort like that. And they might just feel happier that they’re using the “real” latest kernel.
Wanting bug fixes (including security fixes, because old machines can still be networked) and feature improvements, just like anyone else?
> I presume that outside of devices and drivers, there's little to no new developments in those architectures.
There's also core/shared features. I could very easily imagine somebody wanting eg. ebpf features to get more performance out of ancient hardware.
> In which case, why don't the users/maintainers of those archs use a pre-6.1 kernel (IIRC when Rust was introduced) and backport what they need?
Because backporting bits and pieces is both hard and especially hard to do reliably without creating more problems.
> To me the more salient questions are how long before (a) we get Rust in a core subsystem (thus making Rust truly _required_ instead of "optional unless you have hardware foo"), and (b) requiring Rust for _all_ new code.
Previously, the position was that C developers would not be forced to learn Rust.
And a few days ago a security vulnerability was found in the Rust Linux kernel code.
Was it a security vulnerability? I'm pretty sure it was "just" a crash, though maybe someone smarter than me could have turned it into something more.
I have no dog in this race. I really like the idea of Rust drivers, but I can very much understand reticence about having Rust handle more core parts of the kernel, just because Rust's value seems to pay off much more in higher-level code where you have invariants to maintain across large code paths (meanwhile, writing a bunch of doubly-linked lists in unsafe Rust seems a bit like busywork, modulo the niceties Rust itself can give you).
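To illustrate the doubly-linked-list point: each node is referenced by two neighbours, which the borrow checker cannot express with plain `&mut` references, so a straightforward implementation drops down to raw pointers and `unsafe`. A minimal standalone sketch (not kernel code, just an illustration):

```rust
use std::ptr;

// Each node is aliased by its predecessor and successor, so we use raw
// pointers instead of borrow-checked references.
struct Node<T> {
    value: T,
    prev: *mut Node<T>,
    next: *mut Node<T>,
}

struct List<T> {
    head: *mut Node<T>,
    tail: *mut Node<T>,
}

impl<T> List<T> {
    fn new() -> Self {
        List { head: ptr::null_mut(), tail: ptr::null_mut() }
    }

    fn push_back(&mut self, value: T) {
        // Box::into_raw transfers ownership to us as a raw pointer;
        // we are now responsible for freeing it (see Drop below).
        let node = Box::into_raw(Box::new(Node {
            value,
            prev: self.tail,
            next: ptr::null_mut(),
        }));
        unsafe {
            if self.tail.is_null() {
                self.head = node;
            } else {
                (*self.tail).next = node;
            }
        }
        self.tail = node;
    }

    fn pop_front(&mut self) -> Option<T> {
        if self.head.is_null() {
            return None;
        }
        unsafe {
            // Reclaim the Box so the node is freed when it goes out of scope.
            let node = Box::from_raw(self.head);
            self.head = node.next;
            if self.head.is_null() {
                self.tail = ptr::null_mut();
            } else {
                (*self.head).prev = ptr::null_mut();
            }
            Some(node.value)
        }
    }
}

impl<T> Drop for List<T> {
    fn drop(&mut self) {
        while self.pop_front().is_some() {}
    }
}

fn main() {
    let mut list = List::new();
    list.push_back(1);
    list.push_back(2);
    list.push_back(3);
    assert_eq!(list.pop_front(), Some(1));
    assert_eq!(list.pop_front(), None.or(Some(2)));
    assert_eq!(list.pop_front(), Some(3));
    assert_eq!(list.pop_front(), None);
}
```

Every invariant here (no dangling `prev`/`next`, each node freed exactly once) is maintained by hand, which is exactly the kind of "unsafe busywork" the comment is describing; the compiler no longer proves it for you.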
It's a race condition resulting in memory corruption.[1][2] That corruption is shown to result in a crash. I don't think the implication is that it can result only in crashes, but this is not mentioned in the CVE.
Whether an attacker being able to crash a system counts as a vulnerability depends on your security model, I guess. In general it is not expected to happen, it stops other software from running, and it can be triggered by entities or software that should not have that level of control, so it's considered a vulnerability.
[1] https://www.cve.org/CVERecord/?id=CVE-2025-68260 [2] https://lore.kernel.org/linux-cve-announce/2025121614-CVE-20...
Having to learn the language seems to be the only legitimate issue people have. But they avoid mentioning it because it sounds intellectually lazy.
"You don't need to learn it or use it, we just want to do our own separate things with it over here"
.. some time later ..
"Oh yeah it's working good for us, we think it'd be useful to use it in these additional places, think about the benefits!"
.. some time later ..
"Now it's going to be core and required, either deal with it or get out"
They know they could never jump straight to the last step without revolt, so they shove their foot in the door with fake promises and smiles and then slowly over time force the door all the way open until they eventually get what they wanted from the beginning.
Cherry-picking this one Rust vulnerability against the ~150 C vulnerabilities is such a weird take that I can't help but think some people have a weird hatred of Rust.
Your post is curious, because the post I quoted basically argued for just that eventuality for all new code, even as the new language introduces undefined-behavior vulnerabilities.
The promises as stated previously and the goal as stated by that lwn.net post now are starkly different. And the poster did not even wait until the new language had proven its worth. And then a UB CVE shows up in code written in the new language.
What Linus wrote in the past:
https://www.phoronix.com/news/Torvalds-On-Rust-Maintainers
> So when you change the C interfaces, the Rust people will have to deal with the fallout, and will have to fix the Rust bindings. That's kind of the promise here: there's that "wall of protection" around C developers that don't want to deal with Rust issues in the promise that they don't have to deal with Rust.
That both you and that lwn.net poster write these things is extraordinarily strange.
There was a lot of interesting discussion on the previous post [0], but one thing I didn't see was much discussion about this bit:
> The DRM (graphics) subsystem has been an early adopter of the Rust language. It was still perhaps surprising, though, when Airlie (the DRM maintainer) said that the subsystem is only "about a year away" from disallowing new drivers written in C and requiring the use of Rust.
I was a bit surprised when I first read this. Is this meant to be read as just a description of the state of the Rust bindings (e.g., the DRM subsystem is about a year away from being *able* to require the use of Rust, but isn't actually planning on doing so), or is it describing actual plans (e.g., the DRM subsystem is about a year away from actually requiring the use of Rust)? I was originally more inclined toward the former interpretation, but this other bit:
> With regard to adding core-kernel dependencies on Rust code, Airlie said that it shouldn't happen for another year or two.
Makes me think that perhaps the devs are actually considering the latter. Is anyone more in-the-know able to comment on this?
Now the question is: if we live in a world where magnesium fires are common, can we afford not to at least try building with the fireproof bricks?
I know this topic stokes emotions, but if you haven't tried Rust as someone with C/C++ experience, give it a go. You will come out wiser on the other side, even if you never use the language for anything.
There are dozens of Rust osdev projects. It's an open question whether any will become as relevant as Linux.
The Absolute State of the Kernel Rust Experiment Right Now
And every comment has its confidence/aggressiveness taken up to 11 (though still within site rules).