No, at the very least tort laws still apply even if the driver is a corporation. Do you really need someone sitting in jail to satisfy your justice boner?
By corporatizing social harms, basically nobody is ever held accountable - except for the little guy.
Again, this is false. At the very least there are financial penalties, which the shareholders are on the hook for. Moreover, the corporate malfeasance that does happen doesn't map nicely onto human crimes. If you kill a guy, you get sent to jail for decades. But what if you're a company that makes a machine with sloppy code[1] that unintentionally kills someone? What do you do? Jail the programmer who wrote the code? Jail the manager who did the code review? Jail the CEO who had no knowledge of it but "the buck stops with him" and we hate CEOs? How would the death penalty work? If you think it through, it's basically a fine equivalent to the company's market cap. If Boeing does a bad that kills one person, does that mean the US government just repossesses the entire company?
However, there are cases where it's completely proven that someone high up knew there was a systemic safety issue (they had a broad view and could see everything going on across different areas), knew exactly what was causing it, and did nothing because they wanted to keep the profits flowing. The fact that those people don't go to jail just tells me that corporations have way too much leeway.
For example: https://www.owhsp.qld.gov.au/court-report/fines-imposed-fail...
Is there any indication this requirement was breached in this case? I'm all for jailing executives of companies who specifically failed to enact safety measures, or even didn't care enough about safety, but this is simply an edge case they didn't test. It's not for lack of trying, either. Apparently they have their own AI model to generate test data, so they can train/test what happens if a hurricane hits, for instance.
https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-f...
In this case it just sounds like the thought process was
> waymo did a bad
> someone doing the same would be arrested (?)
> therefore somebody needs to be arrested
If I poison someone, I go to jail. If DuPont poisons thousands, "there's financial penalties".
Literally, while intentionally avoiding any attempt to examine the implications? No, probably not.
But reasonable punishment discourages bad behavior, and software engineers have a habit of ignoring the implications of a defective design. I think apocalyptic fines applied to the companies creating the systems for automated cars would also create the correct incentives, but I find that less likely than imprisonment.
What I want is for software and systems to not suck ass. I don't want to deal with defective... everything because it was faster to deliver. That's especially true when it contributes to the death or injury of a person who didn't do anything wrong.
I don't care what works, but people being afraid of going to jail for hurting someone absolutely does work. And 'administrative fines' don't work.
This just feels like the "we should make the justice system harsher to deter crime" argument but applied to software engineering. If it works, why stop at criminal cases? Maybe we should dock the pay of SWEs next time they cause a prod issue?
Why not software engineers too? Why are we so special that we can never be held accountable for the damage our lack of standards causes?
Ignore that feeling; it's wrong, because it's not what I'm arguing for. "Reasonable" is a load-bearing qualifier.
It doesn't feel like the people making the decisions that meaningfully contribute to harming other people ever have to deal with the fallout or repercussions of their unfortunate choices. Disincentivizing that behavior is my goal. And I'll unfortunately take iterative or suboptimal options at this point. I don't like it, but I do want to try to be realistic.
We don't send everyone to jail either. You can run someone over and get away scot-free if it was an honest mistake and you weren't being negligent.
Do they?
Twitter is creating CSAM, Meta & OpenAI pirated millions of books, and Nvidia is playing some sort of shell game to pump its stock price.
If a regular person committed any of those offenses even once, they would be lucky to just be sued, but because of "AI" nothing happens to these companies.
>Twitter is creating CSAM
It's unclear whether generated CSAM is illegal; see: https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn.... Moreover, x/x.ai wasn't intentionally generating the images. Yes, someone intentionally set up grok to generate images, but nobody at x/x.ai said "yes, let's generate some CSAM". That adds an additional layer of obfuscation that makes it harder to compare to a "regular person".
>Meta & OpenAI pirate millions of books
Give me a break. People on /r/datahoarders pirate millions of books all the time. Use a VPN and basically nobody bothers going after you. If anything, Meta/OpenAI are getting harsher treatment than the average person because they're juicier defendants.
>Nvidia is playing some sort of shell game to pump their stock price
That's not even something that's illegal.
Do you have some examples?