I'm aware of the NASA/JPL rules for developing safety-critical software[2] but I'm not sure if any car manufacturers follow anything similar.
Does anyone here have any knowledge of the software development practices of any automakers and what they do to ensure safety and reliability? And is there anything else I can do to mitigate this risk (short of buying a very old car, which would have other safety downsides)?
[1] https://en.wikipedia.org/wiki/Sudden_unintended_acceleration [2] http://spinroot.com/gerard/pdf/P10.pdf
Either way, if you've had a fuel-injected car you were still exposed to these issues. You would have to go buy a carbureted engine from the 80s or before to get away from these "unintended acceleration" issues, as in the end a car with EFI probably has a computer actually controlling the injection. I'd be way more wary of daily driving an 80s or older car from a general safety standpoint than over a software issue. You're way more likely to be t-boned at an intersection than to have a software glitch cause an accident; having a much more modern car will do more for you from a crash-safety standpoint than having a carburetor.
There's a ton of things that can go wrong in a car which can cause an accident. The software stack is surely one of those things, but even a 100% mechanical car can have a lot of failures as well. Ever have vacuum hoses fail on an old car? Carburetors get stuck or clogged? Personally, I'd prefer a computer controlling components directly instead of tons of vacuum lines and springs trying to keep things tuned right. On top of that I'll also get much better efficiency and reduce harmful emissions which hurt my family and my neighbors.
And modern cars are much better at handling these types of scenarios. For example, in my late model car, if you apply the accelerator and brake at the same time, the vehicle will ignore the accelerator input. This solves two potential problems from the past: someone accidentally stomping on both pedals when they meant to hit the brake, and a foreign object wedging the accelerator pedal down.
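The brake-override behavior described above is simple to express in code. A minimal sketch of the idea (hypothetical names and logic, not any OEM's actual implementation):

```c
#include <stdbool.h>

/* Hypothetical brake-override: if the brake is pressed, the throttle
 * request sent onward is forced to zero, regardless of the accelerator
 * pedal position (0-100%). This covers both the "stomped on both
 * pedals" case and the "object wedging the accelerator down" case. */
static int throttle_request(int accel_pedal_pct, bool brake_pressed)
{
    if (brake_pressed)
        return 0;               /* brake wins over accelerator */
    return accel_pedal_pct;     /* normal pass-through */
}
```

The real implementations are of course far more involved (debouncing, fault codes, hysteresis), but the core priority rule is this simple.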
This happens to me (sometimes) in parking lots where pedestrians walk between cars. Mine (2019 Mazda Miata) does not do this; instead I get an engine rev while standing half on the brakes, half on the accelerator, full on the clutch. I end up feeling embarrassed as I tend to get glared at by the pedestrian (no, I did not intentionally rev it to scare you).
I don't think unintended acceleration with cable operated throttles was ever much of an issue. The simpler EFI systems of the 80s and 90s were very robust with predictable failure modes. We've certainly bought a lot in terms of safety and efficiency with the newer designs, but their complexity also means problems can be more obscure and more likely to sneak their way to market.
Anecdote: my '01 Volvo had weird/dangerous intermittent acceleration, and it had a fully computerized throttle. The software got confused by a failing throttle position sensor. The best fix is to replace it with a hall-effect sensor that doesn't wear out.
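A failing throttle position sensor typically shows up as implausible or jumpy readings, and ECUs commonly guard against this with a redundant dual-track sensor plus a plausibility check. A hedged sketch of the idea (illustrative threshold, not Volvo's actual logic):

```c
#include <stdbool.h>
#include <stdlib.h>

/* Many throttle position sensors provide two redundant tracks whose
 * readings must agree. If they diverge beyond a tolerance, the ECU
 * treats the sensor as faulty and falls back to a limp-home mode
 * rather than trusting a possibly wild value. Values in percent. */
static bool tps_plausible(int track_a_pct, int track_b_pct)
{
    const int tolerance_pct = 5;    /* illustrative threshold */
    return abs(track_a_pct - track_b_pct) <= tolerance_pct;
}
```

When the check fails, a well-designed ECU limits throttle rather than accelerating, which is exactly the failure containment the Volvo apparently lacked.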
> reduce harmful emissions which hurt my family and my neighbors.
If there was a true problem it could be solved by having fewer kids. Why do you end your sentence with this dipshit way of arguing? Nobody falls for that. Of course we all know the game here is for someone to call you out for being a passive-aggressive dipshit so you can play the victim once that happens.
And pretty much none of them ever do if the driver doesn't react exceptionally poorly. Even the spectacular stuff that the internet absolutely loves to hand-wring about, like a wheel falling off for whatever reason, almost always results in the car coming to a controlled stop on the side of the road. The conversion rate between "failures" and "meaningful harm to anyone or anything" is abysmal.
Even with EFI, if the throttle is mechanical and the EFI continues to ask for more fuel for whatever reason (or a fuel injector gets stuck open), all that will happen is the engine will stall due to the excessively rich mixture.
> Ever have vacuum hoses fail on an old car? Carburetors get stuck or clogged?
The normal failure mode of a carburetor leads to an engine that doesn't run, and not the opposite. Before complete failure, you will notice a performance decline.
Personally, I prefer no computer control.
> On top of that I'll also get much better efficiency and reduce harmful emissions which hurt my family and my neighbors
You can get a lot better efficiency from a carbureted engine than most people think.
As for safety, I'd rather have freedom.
If the throttle is mechanical. So yeah, I guess there's a window of time there where EFI became the norm but before throttles were also electronic, so late 80s to early 2000s. I imagine the majority of cars on the road today in the US are fully electronic.
> The normal failure mode of a carburetor leads to an engine that doesn't run, and not the opposite.
I've personally experienced carburetors getting stuck open, usually on abused/unmaintained lawn equipment. I do agree the usual failure is that it gets gummed up and inefficient in its atomization, but a stuck-open carb isn't impossible. Either way, a carb failure that suddenly cuts power can also cause problems when unexpected.
> You can get a lot better efficiency from a carbureted engine than most people think.
Yeah, a well-tuned and well-maintained carburetor isn't absolutely horrific in efficiency. But it'll still pale in comparison to the combustion efficiency that can be had in a GDI engine.
> As for safety, I'd rather have freedom.
Cool, and feel free to drive that freedom car in your freedom yard. Please keep your freedom emissions in your freedom air though instead of polluting your neighbors. When you're driving on the public streets there's more than just you out there.
To answer your last question first, buy a car that hasn't been launched within the last 12 to 18 months. That's not software-specific; that's general vehicle safety across the board, as they will be working through the initial warranty issues. So if you are looking at second hand and you know model ABC was launched in 2016, don't buy one made in the 2016/2017 period.
ISO 26262 rates every system with a criticality rating; systems with an ASIL rating of C or D have multiple backup systems in place. This falls under functional safety, which is a newer (5 years or so) area addressing the fact that cars are now highly complex interconnected systems linked by software. The idea is that you target specific subsystems to make sure their function isn't totally taken out by some failure or error in the wider system.
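One common pattern behind those "multiple backup systems" is redundant computation with a comparison before actuation. A toy sketch of the idea (simplified illustration, not an actual ISO 26262-certified design):

```c
#include <stdbool.h>

/* Toy illustration of one functional-safety pattern used for high-ASIL
 * functions: two independently computed channels must agree before a
 * command is acted on; on disagreement the system falls back to a
 * defined safe state instead of trusting either value. */
typedef struct {
    int  value;
    bool valid;
} checked_cmd_t;

static checked_cmd_t two_channel_vote(int channel_a, int channel_b,
                                      int safe_state)
{
    checked_cmd_t out;
    if (channel_a == channel_b) {
        out.value = channel_a;
        out.valid = true;
    } else {
        out.value = safe_state;   /* degrade to the safe state */
        out.valid = false;        /* and flag the fault for diagnostics */
    }
    return out;
}
```

Real designs add diverse hardware, timing monitors, and graceful degradation paths on top, but the "agree or go safe" principle is the core.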
Cyber security wise there is an EU reg coming in from 2024 making sure that OTA updates are safe, reducing hacking attack vectors and the like. This is being introduced to new cars and designs as a result of the issues cited above.
As far as people hacking in via the infotainment to access the car control systems: there are firewalls between infotainment and primary car control to mitigate that issue. There are multiple networks in a single vehicle to isolate systems, so that no single unimportant system (infotainment, for example) can take out the whole vehicle.
Software in cars at this level is new, it's evolving, and it takes 7 or so years to create a new platform. This means there is a lag in the system, especially during this transitional period.
However car makers take this stuff incredibly seriously and their software teams are absolutely not run in the same way as a lower consequence dev situation. Lives are on the line and the type of devs who work in this field know that.
Nothing is perfect but the safety downsides of an old car are widely considered to be far greater than the threat of hacking or bad code in a new car.
The one thing that could cause a lot of problems for cars and software is Agile/Scrum.
The projects being run in this new-for-the-industry way are always late, and people hate working on them.
CEOs and other C-suite people see the massively shorter lead times that software can offer and are getting greedy. They saved a year or more of time on a feature thanks to code and over-the-air updates, and then they decide they want it made in 4 weeks, when 3 months would be prudent.
There’s something about the intangibility of software that makes traditional automotive people’s brains break.
Thankfully many rank and file engineers and PMs in OEMs are pushing back against Scrum etc so a more pragmatic layer of management will come up in the coming years. Sadly Agile/Scrum will cause some preventable issues in the meantime.
Unlikely to be safety critical stuff due to the rounds of QA and safety council sign offs and gateways they need to go through. But less safety critical stuff may slip through.
Actually, I see this break most managers' brains. In my experience it's been constant pressure to reduce scope, such that the plans of the incompetent tend to be selected over the plans of those who know how to build great software with all the non-functional requirements in place (security, reliability, operability, modularity/flexibility, etc.).
Nobody in the industry is doing Agile for safety critical systems. The development standards are getting such that writing automotive software is not fun any more, but that is the correct way to go.
Want to electronically open the frunk on an EV? That piece of hardware and software has a surprising level of safety concern. Because inadvertently opening the latch can kill someone.
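The frunk example above usually comes down to an interlock: the actuator only fires when the vehicle state makes opening safe. A hedged sketch of what such a gate might look like (hypothetical names and threshold, not any vendor's real code):

```c
#include <stdbool.h>

/* Hypothetical interlock for an electric frunk release: the latch may
 * only open when the vehicle is stationary (or nearly so) AND the
 * request comes from an authorized source. An inadvertent release at
 * speed could be lethal, hence the conservative gating. Speed in km/h. */
static bool frunk_release_allowed(int speed_kmh, bool request_authorized)
{
    const int max_release_speed_kmh = 3;   /* illustrative limit */
    return request_authorized && speed_kmh <= max_release_speed_kmh;
}
```

Even a "convenience" feature like this ends up with safety requirements, fault injection tests, and sign-offs, which is the poster's point.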
You are correct to be concerned, but the industry is very much on top of things.
ISO 21434 came out a few years later. https://www.iso.org/standard/70918.html
This was all kicked off after the Jeep Hack. https://www.wired.com/2015/07/hackers-remotely-kill-jeep-hig...
Overall the people in the field working on security these days seem to be excellent to me. They have crypto experts, kernel experts, and pretty good standards.
Before the Jeep Hack, they still took it seriously, but it was a lot of roll your own crypto types, and they didn't really know what they were doing.
Since then all the automotive companies hired and purchased companies from the traditional Cyber area and have trained up hybrid automotive and cybersecurity experts.
They still aren't perfect (nobody really is), but cars these days have pretty cool tech in them.
If you are worried I'd recommend trying to hack your own car. You can learn a lot from it, and there are a lot of cool things you can do. In my experience, nothing alleviates fear better than a deep dive into a subject.
comma.ai for example have built an open source self-driving platform from hacking on the internals of vehicles. https://comma.ai/
The industry has also recently seen the introduction of ISO 21434, a cybersecurity engineering standard for road vehicles.
"widely considered" by the same industry who would love to sell you a new car...
That wasn't enough to prevent the Uconnect disaster of a bug that only existed because they sold out on two occasions: when ECUs were invented (green and performance marketing), when smart crap was bundled into cars (smart being a word that universally means ostensibly convenient but in practice even layman consumers hate it).
The reality is that this is the current state of affairs. Most of the people doing software for cars don't have the foggiest idea what software is really about.
All the software I read is just impossible to understand. And no standard helps in many cases.
Some examples I've seen in code:

- Use of a kind of Hungarian notation, to the point that a loop variable was named something like "uibe32bb_i_lns"
- Comments in human languages other than English
- Use of recursion
- Calls like name1::name2::name3::name4::name5::name6::name7::name8::name9::name10::name11::name12::name13::name14, where the names were themselves some kind of Hungarian notation; those calls were everywhere in the code
- Lines more than 1000 characters wide, as a rule
- Files north of 100 kB of code

I can go on and on and on...
Some examples of exchanges with people:
1) Software architect of an ECU: one programmer asks for the memory and CPU budget for a function. The reply was "I'm the architect, I've no idea what you are talking about."
2) System chief architect for a very important project of a big automaker: one engineer says something about software errors. The architect interrupts and explains that the software never makes an error, because a computer only does what it is told to do. That is terrible enough, for example ignoring the possibility of an SEU (single-event upset), but he goes further, saying that any kind of test is unnecessary because SW, as stated, makes no errors.
Some general points:

- 99% of people in "SW" do not know what gdb is; they debug by "cout <<"
- I found nobody who knows what tail recursion is
- 90% are only able to program, to some extent, in one of C++ or Python, but no other language
- Mentioning Ada, Lisp, or Forth will trigger a waterfall of insults saying those are old and should never be used
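For anyone reading along, tail recursion just means the recursive call is the very last thing the function does, so an optimizing compiler can reuse the stack frame and turn the recursion into a loop. A minimal example:

```c
/* Non-tail-recursive: the multiply happens AFTER the recursive call
 * returns, so each pending call needs its own stack frame. */
static unsigned long fact(unsigned n)
{
    if (n <= 1) return 1;
    return n * fact(n - 1);
}

/* Tail-recursive: the recursive call is the final action, with the
 * running product carried in an accumulator; a compiler applying
 * tail-call optimization can execute this in constant stack space. */
static unsigned long fact_tail(unsigned n, unsigned long acc)
{
    if (n <= 1) return acc;
    return fact_tail(n - 1, n * acc);
}
```

In embedded code, where stack is tightly budgeted, the distinction matters more than on a desktop.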
I keep buying the most basic cars. I'm genuinely terrified of anything automatic in my car.
I meant, they do not know what a debugger is. As stated, they use 'cout << "At line 26, after call to xx"' as a debugging tool. For gdb there are plenty of Python extensions and GUIs, even web front-ends, that are not bad... but it may be difficult for some people, I understand that.
> Ada is useful for automotive. Lisp and Forth, not so much (especially since Lisp isn't usually used in hard real-time applications). This isn't the 1980s, MCUs aren't that memory constrained.
Well, first, they don't have the foggiest idea what Ada is. That is my problem. Once somebody suggested we should look into it for L4 autonomous driving. He was laughed at, and people said "it is a dead language from the '60s, like Cobol or Fortran, nobody has used it in 50 years, there are no compilers for it!!!". I've seen Forth still being used on some 8-bit MCUs in the automotive industry. Now it's 99% gone, but it was very much used. Lisp can be used in hard real-time. BTW, another thing always hanging around is "hard real-time" for automotive. It is interesting, because other than airbags, ignition, and injection, you are talking about 100 ms response times, which can be achieved very easily.
> Knowledge of obscure programming languages doesn't necessarily make you a better software engineer.
I'm not talking proficiency in the languages, I'm talking about knowing they exist, having an idea of what is possible. I mean, I know no good C programmer who is not at least aware of the existence of Rust. And no, 90% of the programmers writing safety-critical SW have no idea that a language called Rust is available.
> I want my automotive embedded engineer to have a solid grasp of computer architecture, real-time safety protocols, and defensive programming.
Well, again, another example, with a chief software architect on an ECU, so embedded: I ask "do we have some kind of stack monitoring?" Reply: "What?! We have no stack, a stack is a data structure, ..." and he goes on with a long explanation of what stacks, queues, and trees are, and when they are used... My personal opinion is: if you search for people that "know" only C++ and have no idea of asm, that is what you get. That is my experience at least.
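For context, the stack monitoring being asked about is a standard embedded technique: paint the stack region with a known pattern at startup, then periodically measure how much of the pattern has been overwritten to find the worst-case usage. A simplified sketch, with an ordinary array standing in for the real stack region:

```c
#include <stdint.h>
#include <stddef.h>

#define STACK_FILL 0xA5u   /* the "paint" pattern */

/* Fill the stack region with the known pattern at startup. */
static void stack_paint(uint8_t *stack, size_t size)
{
    for (size_t i = 0; i < size; i++)
        stack[i] = STACK_FILL;
}

/* Return the stack "high-water mark": bytes ever used, assuming the
 * stack grows downward from stack[size-1] toward stack[0]. We scan
 * from the far end counting bytes still holding the paint pattern;
 * everything beyond them has been touched at some point. */
static size_t stack_high_water(const uint8_t *stack, size_t size)
{
    size_t untouched = 0;
    while (untouched < size && stack[untouched] == STACK_FILL)
        untouched++;
    return size - untouched;
}
```

The usual caveat applies: a stack value that happens to equal the fill pattern can underestimate usage slightly, which is why real monitors often check a multi-byte canary at the stack limit as well.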
> I don't see a problem with a Korean/Japanese car manufacturer having their documentation in a non-English language. As long as they do everything in-house and don't outsource to India like Boeing I have no problem with it.
I'm talking about in-code comments, not documentation. But anyway, honestly, thinking you can do everything in-house today, and that you will be able to keep it that way for the next 10 or 20 years when you have to maintain the code, sounds optimistic to me, at least. But again, I'm talking about code I had to read and maintain... so... yes... I'm talking about a case where it was BAD to have non-English comments.
You are not reading it correctly. It is not code as everyone knows it. It's like an electrical circuit with variable names attached to each conductor, and the code propagates information like electricity would.
There are tools dedicated to this, able to draw pictures of such code circuits (e.g. Simulink, ASCET). And such pictures can be automatically translated into C code that looks even worse than anything translated manually.
In the end, of course the tests prove that the code works like the picture of the circuit shows, and therefore the car must work correctly! This avoids the need for anyone working on only the code to understand a car.
In reality, things usually work in the end only because of how simple everything is and the high number of iterations.
I’m talking about human written code, meant to be read, maintained, debugged and tested by other humans.
I trusted Volkswagen because of their reputation. Then the news broke about them systematically lying and breaking the law with respect to engine emissions. Shortly after this came to light, other "reputable car companies" turned out to have been not trustworthy at all.
Yes there are good standards in place and some companies claim to adhere to them but no company should be trusted on their word or reputation alone. The better question is what kind of regulatory oversight is in place to make sure those claiming to adhere to certain standards are actually doing so? Also, how much power do the regulatory organizations have in addressing violators?
Ford, as one I can speak about with knowledge, took seriously the cost of recalls versus catching issues in testing. It's massively cheaper to spend money up front on a full process and catch every bug you can than to cover recall costs later, not even considering liabilities if anything does go pop.
Mistakes of course happen. But they're also rarely working from scratch.
It makes working in modern ways horrific seeing the shoddy shit tossed out to meet consumer gadget deadlines.
Then a few years later they got hit again via one of their suppliers: Takata's deadly airbags.
There's a separate standard (ISO 21448) trying to address issues with safety of intent, i.e. maintaining safety when there's no actual fault in the system. (Like the misclassification example.) This one's newer, much less effort has been spent developing it, and even less has been spent trying to follow it. Frankly it doesn't have as much to say. (And how could it? Nobody knows how to solve general classification problems, and especially not with something running on some 20 W max control unit.) This part of the problem space is basically the wild west. Some auto makers do a good effort trying to create safe solutions. Others not so much.
In summary, some of the electronics solutions in the car can probably be trusted to do what they're meant to (e.g. airbags). Others (e.g. lane keeping assist, emergency braking) are still mostly safe but certainly warrant keeping your hands on the wheel. Anything approaching fully self-driving is at best quite dubious at this point though.
Both are acceptable standards, but ISO 26262 is a behemoth of a standard that most people have never read. Many companies don't even make the full standard available to their development teams, let alone educate people to employ it effectively. Similarly, MISRA is fine in theory, but the practical usage often ends with running code through an automatic checker that can only detect half the rules.
The API itself is decent but the configuration and the ecosystem are a nightmare.
https://www.motorauthority.com/news/1121372_why-mazda-is-pur...
You can make a modern electric vehicle with actual buttons and dials. There is nothing about a car not having a gas motor that requires every tiny bit of functionality being controlled by a touch tablet. If anything it just seems like laziness in car design.
I'm with you and hope all the idiot touch screen crap is ditched.
That means you can get OTA upgrades that 99% of the time will work flawlessly, but one day might not: the day you are in a rush in the early morning.
Since most connected cars are de facto owned by their vendor, a potential breach or deliberate sabotage might brick ALL of them at once, across the globe or in specific areas/countries.
...
A modern car is a car co-piloted by a human and a computer. A local airgapped computer might have bugs, a connected one might have vulnerabilities. Be more scared about them.
In purely local safety terms, I can say most cars I know are still partially mechanical. That means, for instance, your steering wheel can auto-steer BUT with (more than) a bit of force you can steer it mechanically even if the automation completely fails. Similarly, the brake pedal has servo assistance but still works partially mechanically, so it might become very hard to push but can still brake a bit.
The most dangerous common design I know are:
- impossibility of turning off certain ADAS that might act really badly in certain weather conditions, like the classic ABS on icy roads;
- automatic door locking when the car moves, with NO DAMN WAY to unlock them while the car is still moving;
- the manual parking brake has disappeared, so a kind of emergency braking ALSO usable by a passenger (for instance if the driver suddenly falls ill) is ABSENT, with no electronic replacement either, since the electronic one, if present, refuses to engage while the car is moving;
- cockpit designs that make it very hard/slow for a passenger to push the driver's foot off the accelerator etc. if he/she suddenly falls ill.
I consider the above as a sign of VERY BAD design, so I doubt those who made it can be trusted for anything else in safety terms...
2. The code, in many cases, is probably an unmaintainable mess. Embedded programming is not always modern programming, for good and bad.
3. Today, the computers in cars are doing more, and the systems are more complex. It's reasonable to expect more serious problems as a result.
4. Companies do safety testing, of course, but there's no such thing as "100%" test coverage for complex physical machines running outside of a lab.
5. The best way to judge the safety of cars is the best way to judge safety for airplanes: let other people test them out for a while and then check whether or not they report problems.
Now the companies are migrating to real programming in C++, and it is a terrible mess. There are just not enough people with software competence to drive it.
I've seen people trying to do L4 automated systems with these blocks. Pages and pages and pages of boxes (which can only be basic logical functions and the 4 basic arithmetic operations!!!). Of course the project didn't go anywhere!
Optimize this problem by buying a car with the best safety rating. This is something that can be objectively measured, both in crash testing/labs and from reviews of real-world crash results. Expect that a crash could be inevitable as it is totally out of your control. Optimize for the best odds of surviving a crash without issues.
[1] https://asrg.io/
The difference is that Tesla has bugs throughout its software, including critical systems.
The other OEMs are extremely slow and methodical about updating things like their fuel-injection software, and that stuff goes through incredible amounts of QA. The same is not true of Tesla. They're a company that, at Musk's direction, do not prioritize QA or human life.
https://en.wikipedia.org/wiki/Firestone_and_Ford_tire_contro...
https://en.wikipedia.org/wiki/General_Motors_ignition_switch...
https://medium.com/the-snail/the-exploding-ford-pinto-of-197...
https://abcnews.go.com/Blotter/toyota-pay-12b-hiding-deadly-...
Such care, wow.
But as long as we're talking about results, "FSD" is an intentional homicide engine and Tesla has had far more recalls per vehicle sold than any other company. They're at the bottom of quality rankings by a wide margin.
Plus OEMs have a vast parts and software supply chain that can be compromised.
I suspect that in a couple of years' timeframe we could see a massive incident, like ransomware, that disables the entire fleet of a single OEM globally. Imagine all the Mercedes around the world just stopping: those are the kinds of incidents I mean.
This stuff falls completely within any infosec person's expectations. Privacy leaks are expected, as are interference from remote signals.
Once you've committed to never driving after having had a drink (and surely never more than 1 drink), never driving while tired or on medication, have completed several advanced driving courses/car control clinics, chosen the top cars based on safety and crash testing, only then might it make sense to use software development methods as a tie-breaker to pick a car.
I am never going to put my life in the hands of some software doing image analysis using machine learning.
Well...on your car, at least. I'm not sure how comforting that approach is when you're surrounded on the interstate by Tesla "FSD"s.
Back when I spent a full year commuting to work daily on a motorcycle, Teslas with "FSD" were the least of my worries. Meanwhile, people who sharply switched into my lane in front of me without using their turn signals almost killed me a couple of times.
> https://illmatics.com/carhacking.html
is a good starting point. But there are a bunch of buses on a modern car, some of them are critical, some less so. Some are firewalled off, others are open.
As you know, you can get access to a lot of the car's inner workings by plugging into the OBD-II port. It's perfectly possible to brick some cars by fuzzing the OBD-II port.
In principle, most things in cars _should_ fail safe, even if they are electric or talking over a bus of some sort.
As a hacker like any other who realized that all supposedly ultra safe American quality (TM) software in mission critical applications is in fact less secure on average than random amateur projects, I have been worried about software in vehicles for 20 years. I correctly predicted that it will lead to remote control vulnerabilities such as the uConnect vulnerability disclosed a decade later. There are obviously more of such vulnerabilities out there, just nobody is researching this. I also suggest people start looking at HVAC.
In 2015, some security researchers found a vulnerability in the Chrysler Uconnect software which allowed them to connect to the car's IP address (yes, each car had an IP address, which you can't get rid of), and control the vehicle (as in actually control it). There were 1.5 million vehicles IIRC that were vulnerable to this. So if a bad guy found it first he could have controlled all those vehicles at once from the comfort of his home, probably causing 10% of them to crash and kill people (given that 1/10 of your average modern driver would probably panic (or not panic but still fuck up) from the slightest surprise on the road).
I also am of the opinion that people regularly die from software faults in vehicles, but we just haven't figured this out yet.
What are the NASA/JPL rules? Some more MISRA C crap where it's just making the code more "readable"? Most "software engineers" have extremely wide gaps in their understanding of basic things, from programming to math to physics. The problem has much more to do with this than cute little best-practice recommendations.
Now they are trying to write C++.
However, most of the safety systems software is held to a very high standard, and much happens in embedded systems where the surface level for software foot-guns (such as state) are minimal. I wouldn't worry about buying a new car for these reasons. Though I would try to find one with as many physical buttons as possible.
In my personal experience, the automotive industry has a problem with aggression and dishonesty, both of which seem to go hand-in-hand.
Both of these cultural traits tend to have a negative impact on quality and safety.
Due to regulations you will not be able to find a non-veteran vehicle without those systems, nor would you want to, but BMW, Mercedes, Subaru and Lexus still have models which are well balanced and don't rely on those to such a heavy degree. This would be my advice as well.
Disclaimer: I am not against (almost) perfectly deterministic safety systems such as ABS. On the contrary - I consider them to be a massive advantage or almost mandatory.
And how about the additional failure mode for, say, BMW: its modern controllers have all the software for all the features in them, just disabled unless you pay more. So a theoretical sophisticated attack could throw all sorts of crap into operation.
Generally speaking, most sportier cars have better handling and better weight distribution and are in fact regularly driven with DSC disabled (on racing track etc)
When will auto companies wake up and realize that physical controls are better in every way?
Any car running CANBUS is vulnerable to a potentially fatal attack. They have not resolved this. However, you also generally cannot avoid it. Even the base model Honda civic is vulnerable to attacks on the drive-by-wire system. In a less morbid sense, most modern cars cannot even be serviced at home without going to the dealer for a reset of whatever subsystem. ABS comes to mind.
I would not detract from an old car. A car 25 years old has 99% of the safety features of a modern car and, in good working order, will protect you just the same. Or maybe I just don't worry about it because the probability of anything greater than a minor fender bender killing you is pretty high even with modern tech.
DEFCON has a lot of great security demos, but don't mistake any of those demos as representative of the real-world landscape of issues.
So can my brake or fuel lines, if you're needing physical access. Get this: the door locks aren't even 100% secure; there's this whole thing that sidesteps them called "Windows".
I do lament not being able to fully flush my brakes at home and do wish the programming harness would be freely available to override the system and have the ABS clear the lines. However, I wouldn't for a second choose to not have ABS on any vehicle I own, including my motorcycle.
No, it isn't. CANBUS is a non-safe protocol ("black channel" in safety parlance) and if anything safety-relevant is sent over it, there is a safety protocol on top.
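The "safety protocol on top" typically means end-to-end protection in the spirit of AUTOSAR E2E, where each payload carries an alive counter and a checksum so corruption and stale frames are detected. A deliberately simplified sketch of the concept (toy checksum, NOT the real E2E profiles):

```c
#include <stdint.h>
#include <stdbool.h>

/* Sender adds an alive counter and a checksum to each payload; the
 * receiver rejects frames whose checksum is wrong or whose counter
 * didn't advance, so a corrupted or repeated CAN frame is detected
 * rather than trusted by the safety function. */
typedef struct {
    uint8_t counter;   /* increments each transmission, wraps at 255 */
    uint8_t checksum;  /* toy checksum over payload and counter */
    uint8_t payload;
} safe_frame_t;

static uint8_t toy_checksum(uint8_t payload, uint8_t counter)
{
    return (uint8_t)(payload ^ counter ^ 0x5Au);  /* illustrative only */
}

static safe_frame_t protect(uint8_t payload, uint8_t prev_counter)
{
    safe_frame_t f;
    f.counter  = (uint8_t)(prev_counter + 1);
    f.payload  = payload;
    f.checksum = toy_checksum(payload, f.counter);
    return f;
}

/* True only if the frame is intact AND fresher than the last one seen. */
static bool check(const safe_frame_t *f, uint8_t last_counter)
{
    if (f->checksum != toy_checksum(f->payload, f->counter))
        return false;                                  /* corruption */
    return f->counter == (uint8_t)(last_counter + 1);  /* freshness  */
}
```

The real profiles use proper CRCs and data IDs, but the principle is the same: the unsafe bus is treated as a black channel, and integrity is verified at the endpoints.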
And I definitely agree; I'm way more likely to be harmed because of a drunk driver or someone running a red because they're just too busy to bother stopping at this light today rather than some hacker remoting into my car to change the car from drive to neutral or remotely disable ABS or something like that.
This statement shows a fundamental lack of understanding of how automotive computer networks operate.
The CAN bus is just a network. It's an industrial control protocol that's been adopted by the automotive world. It doesn't offer security by design, it's intended for use in limited environments where all hardware on the network is known and trusted. CAN provides methods for prioritization of devices, that's it. Any security is left to higher layers of the stack.
There is no such thing as "breaking" CAN, you just physically connect to the network and you're able to talk to whatever controllers are on that network (most modern cars have multiple CAN buses connected to different subsets of the vehicle systems). At that point it's about the security features implemented by the devices on the network.
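The prioritization mentioned above works through arbitration on the frame identifier: when two nodes transmit at once, the numerically lower ID wins the bus and the other node backs off and retries. A toy illustration of that rule (not a real CAN stack):

```c
#include <stdint.h>

/* CAN arbitration in one line: the frame with the numerically lower
 * identifier has higher priority and is transmitted first. Note that
 * nothing here authenticates the sender -- any node physically on the
 * wire can transmit any ID, which is exactly why security has to live
 * in higher layers. */
static uint32_t can_arbitrate(uint32_t id_a, uint32_t id_b)
{
    return (id_a < id_b) ? id_a : id_b;  /* lower ID = higher priority */
}
```

This is also why ID assignment matters: safety-critical messages get low IDs so they always win arbitration against chattier, less important traffic.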
> Any car running CANBUS is vulnerable to a potentially fatal attack. They have not resolved this.
There is nothing to resolve at the network level. To put it another way, almost every computer that's ever been hacked over the internet was running Ethernet but that's just as irrelevant as CAN in cars.
If you are able to physically connect to the network, you can talk to and potentially spoof devices on the network.
> A car 25 years old has 99% of the safety features of a modern car and, in good working order, will protect you just the same.
You couldn't possibly be more wrong. Pick your favorite vehicle from 1997 and look up the crash test videos, then compare against a similar recent model.
Here's the most popular vehicle sold in the US, the Ford F-150, from 1997 (https://www.youtube.com/watch?v=_i5EmJBaGeQ) versus one from 2016 (https://www.youtube.com/watch?v=Cou88zi4pMY). You tell me which one you'd rather be in.
You might say, correctly, that the 1997 F-150 is particularly bad, but here you can see a 1997 Volvo V70 versus a 2009 Volvo V70 (https://www.youtube.com/watch?v=msnJK0ce-VM). Volvo has a reputation for building some of the safest vehicles on the road, and even those twelve years show substantial gains in crash performance where the older car's passenger compartment is clearly compromised while the newer one's crumple zones work as intended.
> Or maybe I just don't worry about it because the probability of anything greater than a minor fender bender killing you is pretty high even with modern tech.
Again, absolutely wrong. I say this as someone who's flipped a truck off the road at highway speed and walked away with minor abrasions and bruising from the seatbelt and a few cuts from broken glass as the rest of the truck got ruined but the cab stayed intact. My anecdote is of course statistically meaningless, but the data agrees. Crash fatality rates have consistently trended downward from the '80s until 2020. The main reason modern vehicles have gained so much exterior size without gaining nearly as much interior size is all the space taken up by modern safety equipment, crumple zones, etc.