Driving a car in public traffic is very much a social situation. A wink here, a small hand or head gesture there. Even looking away and avoiding communication altogether is part of social behaviour and says something. And the same gesture in a different context says something else.
Parking a car, driving in congested traffic and just keeping a lane is all doable by technology. Beyond that it needs a different and social way of thinking.
It really feels like it's driving just like a human would in so many situations, and understanding the intention of pedestrians and cars driven by humans.
Seems to me that they may be well ahead of Tesla, they just don't do much publicity or push it out to end users before it's actually ready.
We need to get rid of the social aspect of driving, and make it a pure, reliable system that doesn't depend on the mood or the alertness of billions of humans to not kill someone.
This is a social norm of driving today that makes little sense. It’s actually quite dangerous to stop in the middle of traffic and wave at someone. And the person waving usually acts in an impatient and unpredictable way. They will give you about 5 seconds to lurch into the road before they get irritated and drive on. Or they could easily commit insurance fraud by luring you into an accident that appears to be your fault.
If all followed the rules of the road it’d be a lot safer and also slightly easier to automate
Often it can be safer to establish some form of consensus instead of assuming the cars around you have both interpreted events the same as you and are about to behave in the way you'd predict.
I'm also not sure if this was what you intended in your example, but there are roads in my town where unless someone waits and yields to let someone on a side street turn onto or cross the main road, the person on the side street is literally going to be stuck until rush hour is over. You won't find it in the DMV handbook, but yielding in this situation really is a kindness.
[I mean: when both the main road and the secondary road that has to yield are clogged, everyone on the main road lets one car from the secondary pass.]
I've certainly applied it and benefited from it countless times.
And it'd be your fault for not yielding to the idiot stopping in the middle of the road, not knowing what "right of way" means.
These laws were made exactly to avoid the uncertainty of who goes first.
(much like maritime rights of way exist, but some navies prefer the simpler rule: "if it's grey, stay away")
[1] https://en.wikipedia.org/wiki/Venezuelan_patrol_boat_Naiguat... [2] https://en.wikipedia.org/wiki/Melbourne%E2%80%93Evans_collis... [3] https://en.wikipedia.org/wiki/Ehime_Maru_and_USS_Greeneville...
Couple of counter-arguments:
1. This is completely optional. I can get from point A to point B without ever looking at another human face (and that is the way I like it, aha aha).
2. In real life, you may see some middle finger as well, especially if the traffic is bad.
That tech probably wouldn't be economical to use in a warehouse though.
Extending these limited trials to every road in the world isn’t going to be fast, but a few cities with zero human taxi drivers could happen surprisingly quickly.
I know Tesla has made big claims like that, but I'm not aware of any other car companies having done so?
Back in the day, no one even considered these changes to be bad. Today, if a politician came up with the idea that we need to remove pedestrians (and bikes) from the streets so that self-driving cars could be safe, there would be an uproar.
Rightly so, telling people they can't enter their own neighbourhood unless they have a $50k+ car is beyond awful. I think it would be better to remove cars from towns and cities.
When a "self driving" car kills someone, guess who shares legal responsibility?
> When a "self driving" car kills someone, guess who shares legal responsibility?
When a rollercoaster crashes who takes legal responsibility?
(Answer in UK law: The rollercoaster's owners are responsible for negligence if they do not maintain the rollercoaster adequately or follow safety rules. The rollercoaster's builders are responsible for negligence if there was a design fault or if the rollercoaster was fundamentally unsafe. The rollercoaster's operators can be responsible if they fail to follow operating procedures that they could reasonably be expected to follow and have been trained in.)
Similar ideas have already been established for self-driving cars, with Mercedes taking legal responsibility in Germany if its cars crash while in Level 3 driving mode (assuming operators are following the safety rules, for instance making sure that they are sober and able to take over driving if required).
Typically not the manufacturer, because a roller coaster operates in a fixed and controlled environment within which its design can usually be shown to be inherently safe.
The same cannot be said for a "self driving" auto.
Given the current state of technology, one could make a convincing legal argument in many jurisdictions that just marketing an auto as "self driving" is itself a deceptive and inherently negligent act.
I don't think people get how utterly behind and unsafe the legacy carmakers products are.
I've rented current-year Cadillacs, Audis, BMWs, and it's all a terrible gimmick. They can't keep the vehicle in the center of the lane, "adaptive cruise control" repeatedly tries to kill you by accelerating into semi-trucks and trailers, the autonomous systems repeatedly disengage, and they certainly can't navigate around on city streets, can't change lanes, basically need 100% of your attention to operate in a remotely safe way.
Tesla may have their QC issues and their Elon issues, but the technology is a decade ahead. You can punch in an address and FSD Beta literally navigates you through the city with pedestrians, traffic, and so on to your destination in a fairly predictable manner. The adaptive cruise control, lane changing, and highway functions are predictable and reliable enough to handle a regular commute. It's easy to stand on the sidelines and be a critic but if you go out into the real world and actually drive all these vehicles it's obvious who is innovating and who is desperately playing catch-up.
Tesla's offerings are a real letdown. And they don't work as advertised either, which is why they're not allowed to advertise them in the same way anymore. It's missing way, way too many things that the car should notice. It's kind of like Siri. When it works, it's amazing. It often doesn't so you can't really rely on it. We also had to replace a rim on the right hand side because the car simply didn't turn the wheel back. Girlfriend won't use the feature anymore now because she's kind of scared.
The Summon feature doesn't work reliably either, even when conditions are perfect. The car just stops, and you stand there not knowing why it's not doing anything at all.
I can't speak for BMW or Cadillac but calling FSD a decade ahead is ridiculous. It fails at so many simple things and all videos online confirm this and they don't seem to be able to really fix this. Every new Beta shows the exact same problems again and again.
No. It's not. Legacy automakers are decades ahead of Tesla in measuring and controlling quality and reliability as part of their long-term cost structure. Tesla is a VC hack that wasn't even founded by Musk.
E.g. should a self-driving car leave the road to avoid sudden accident? What if it then hits a pedestrian?
One advantage to software-driven decision making is, we can make a rule and they will all follow it. I.e. the law can decide no self-driving car should leave the road.
Rigid adherence to a law like that is a pretty major disadvantage to saving lives though...
They say this is why it's common for Tesla's driving system to disengage 1-2 seconds before it knows the vehicle is going to crash.
https://www.motortrend.com/news/nhtsa-tesla-autopilot-invest...
The fly in the self-driving ointment is that accidents are NOT 100% avoidable for humans assisted by any technology.
People can jump in front of cars or intentionally interfere with said tech, for example.
There will be accidents, and the AI systems can predict with great certainty that a crash will occur before the human at the wheel does. In Tesla's case, it disengages before the time of impact, seemingly to mitigate Tesla's culpability.
It never made any sense that manufacturers of cars thought they could make self-driving systems. Experience with supply chains, manufacturing, and marketing has zero translation to the problem space of self-driving: sensing, interpreting, and decision making.
Surround the paying customers with gasoline powered opulence to enable them to enjoy the AutoBahn at 250 km/h.
A video of a 2021 S-Klasse on the AutoBahn:
Where the headline comes from, I don't know ... The original is roughly: VW annoys Argo AI ...
Compare that with autonomous vehicles. Several billion has been spent on development since 2010, but test fleets are already deployed in multiple cities globally. They also don't have to deal with the excruciating approvals process for further deployment that new public infrastructure has to, so there's a reasonable chance they'll actually exist before the heat death of the universe.
Would fixing the entire political system be great? Sure, but I'm tired of getting screamed at in public meetings.
Add the legal liabilities, and it's a fantasy project with hardly any takers like the metaverse.
I seriously question that as a general statement.
I don't mind driving for the most part and there are situations where I even enjoy it. But the congested drive into the 1 hr+ away city for an evening event or long boring highway drives? (And certainly a commute if I did one?) If there were an AI chauffeur I could engage for a price not that much elevated over driving myself? Sign me up. Not that I expect to see that anytime soon.
Give me more trains and then self driving cars only for the last few kilometres of the route.
Have you ever had to drive somewhere for more than a few hours?
Everyone I know would rather have a chauffeur then, if the chauffeur is trustworthy.
There was a time when I was regularly doing 6 to 12 hour drives every other month or so.
Unfortunately. I guess it will take 50-100 years more for necessary changes in society values/worldview to realize that it's really not a good idea. Just like it happened with smoking.
I, for one, would love a self-driving RV. I can see it now: first taking a shower, then sitting back on the sofa, drinking a cup of coffee while silently gliding to the office in the morning.
Then on Sunday night you simply go to bed, wake up, have the shower and coffee, and boom, you're back at the office. Or if you work remote, you could potentially work while the van drives you places.
Why we act like a death toll of roughly 3,500 A DAY worldwide is an acceptable price for "the fun of driving" I can't comprehend.
The only reason we actually (as a society) care about people dying is when they're dying from something novel.
Now going on a long weekend and having fun on a winding mountain road is another matter...
At the moment it's in the same place as IRC was back in the 90s, before social media took off.
But the big difference between the 1990s and now is that everybody already has perfectly good text chat. Plus free voice calls and video calls. Plus regular social media for non-synchronous interaction.
I can believe VR Chat is better for some niche of people. But I don't see much reason to think it's going to be enough better than all the other options that people will don special gear for it.