> “These technologies assist drivers, but do not replace them,” said the statement from the federal institution responsible for transportation policies and programs ...
> Tesla’s website states, “All new Tesla cars come standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future.”
... full self-driving capabilities in the FUTURE ...
So it is a DISHONEST marketing issue that creates a false perception in consumers, and that's what the government should address.
Create regulations that do not allow such vehicles to be marketed as "self-driving". Create some kind of certification to earn the right to advertise a vehicle as "self-driving". And define and standardize better labels for the current technologies, and publicize them well.
German courts share this opinion and Tesla is prohibited from naming their product Autopilot or referencing any FSD.
The person in this story wasn't confused and didn't even attempt a defense that Tesla had misled them.
If something gets good enough you will rely on it more and more. It will work, until it doesn't.
Rhetoric like this works until it won't anymore. And that rhetoric breaks when innocent bystanders start getting killed in numbers. Line up enough grieving families in front of congress and congress will act, and then the new rhetoric will be "The car should be your mother."
There are two things to separate out: the lack of a feature on a vehicle doesn't particularly free the driver from responsibility for what they do, and then there's the question of what sort of idiotic vehicles we as a society are willing to share the road with.
"A car is not your mother" doesn't really answer the second part.
People need to understand THEY are responsible for what they are doing, not the companies for not preventing them doing stupid things.
"A car is not your mother"
I am definitely going to use it:)
Given this and the man's other past incidents, wasn't the Tesla only a minor contributing factor in the general stupidity and danger this guy inflicts on the world daily?
Edit: Btw, what would be the end accident scenario here? Presuming that the car wouldn't hit any other car on the road, would it lose control on a curve that was too tight for the speed, or fail at the end of the freeway or something? Would it alarm and then come to a gradual stop? Not a Tesla owner here.
I think the biggest problem with all of these self-driving systems in cars is that they require you to be fully aware of your surroundings and able to take over in a split second. People compare them to an airplane's autopilot, but I don't think that's quite right: in a plane, you have at least a minute or two before you hit the ground, even if the plane takes a straight nosedive. In a car, you go from being fine to heading into a head-on collision in no time at all.
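For scale, a quick back-of-the-envelope calculation (all figures here are my own illustrative assumptions, not from any report):

```python
# Rough illustration: how little time a driver has before a head-on
# collision at highway speeds. Numbers are assumed for illustration.
closing_speed_kmh = 200                     # two cars at ~100 km/h each, head-on
closing_speed_ms = closing_speed_kmh / 3.6  # ~55.6 m/s
gap_m = 100                                 # assume the hazard appears 100 m ahead

time_to_impact = gap_m / closing_speed_ms
print(f"Time to impact: {time_to_impact:.1f} s")  # -> 1.8 s
```

That's under two seconds to notice the situation, take over, and react, versus minutes of altitude in the aviation case.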
[1] https://www.reuters.com/article/us-tesla-crash/tesla-driver-...
It also beeps when it thinks I’m going to hit a stopped car and other things that demonstrate that this claim is false.
Alarmed engineers took the exact same route and successfully reproduced the bug.
Right now, in Canada, we have to meet certain criteria and pass multiple tests to get our driver's license, without which we cannot legally drive on public roads.
Until there is some kind of similar certification for self driving systems I don't understand why there would be any legal difference between doing this in a Tesla Model S and doing it in a '94 Miata.
Two people in the car with both front seats reclined, “appearing to be asleep.”
The asleep part is almost certainly sensational drivel. This is a driver with a history of reckless driving who was screwing around with his friend.
But because he was screwing around in a Tesla it is therefore newsworthy.
The seats were reclined, but there is no evidence that the driver and passenger were both asleep. They “appeared asleep” because the seats were reclined.
This was undoubtedly a stupid prank. I'll be glad for the day when kids can't pull stupid pranks in their cars. The trade-off is that we're unlikely to truly "own" our cars by then; they will operate essentially as private taxis.
12-car accident triggered by manual mode
Defense attorneys argued it was a poor decision due to fatigue, while prosecutors presented evidence that the driver had taken a 25-minute power nap, woken up and performed well in a class she was taking, and then overridden the autonomous vehicle system in a manner that showed specific and malicious intent to cause the accident.
The Law Commission instead suggests that responsibility should fall on the developer or manufacturer of the hardware that enables self-driving functions on the vehicle.
see https://thenextweb.com/shift/2020/12/18/autonomous-vehicle-m...
https://s3-eu-west-2.amazonaws.com/lawcom-prod-storage-11jsx...
Empirically, you’re absolutely right, except that it’s not wrong.
I’ve come to terms with it. How long are we going to bleat about morals while the world ignores them?
I’m asking genuinely, for what it’s worth. It’s one of the central questions I’ve faced. What are your morals worth? Why cling to them? It feels good to call out Tesla as immoral, but both legally and practically this seems to be mistaken.
Everyone has to use public roads, not just you. You are endangering all other diligent road users with your behavior, not just yourself.