This is reality. Decisions aren’t kill/not kill.
It’s much more ambiguous and murky, especially when dealing with low-probability events. That’s why it takes expertise, but that doesn’t mean there shouldn’t be accountability for those “expert” decisions. Other engineering domains already have this; there isn’t anything that makes these decisions inherently different.
As I’ve said elsewhere, good organizations implement a distinct chain of command for those decisions so they can be made more impartially. Even then, it’s not without career risk, but IMO that’s part of the gig and why it takes a certain amount of professional integrity. As someone else said, if someone isn’t up to that task, maybe developing safety-critical software isn’t the right gig for them.