I own two classic cars (well, one of them will be soon enough). By virtue of their age, they are obviously not self-driving and never will be. But I hope to have them in my possession 30-40 years from now, because I like them very much.
I believe I like driving, because I only do it for fun. I either take the train or bike to work, depending on the weather, so I am not beholden to my cars. That's why I don't own a modern car.
I recognise that my position is unusual, but I would be sad to lose the ability to drive my cars on the road (and my collection may grow in the future). I understand people's desire for self-driving cars, and I'm glad if they get them, as long as I can still drive my old-timey cars.
Indeed, once there are more electric cars than petrol powered cars on the roads, getting petrol will be harder. But that's OK, as long as it's still possible.
My position is: I may not personally need self-driving cars, but I respect others' desire for them. Articles like this, though, make me cringe; I hope I don't need to explain why.
The way I see it, once self-driving cars are safe and viable, any time an accident happens involving a human driver and a self-driving car, the human is automatically assigned fault. It doesn't restrict your freedom to drive the car, it just makes you think twice, since liability falls on you should an accident happen.
This, of course, assumes that self-driving cars are capable of driving perfectly within existing traffic laws. If that is true, there's no way a self-driving car could legally be at fault for an accident.
This is the middle ground, at least between "human control should be disallowed" and "automation should be disallowed".
I'm a huge fan of the Iron Man automation method (augment the humans, don't replace them), since it takes away the boring and error-prone parts and uses humans for the edge cases. There's a lot of potential for huge wins at low cost. And, absolutes aside, that's in line with this article.
The alternative Ultron method (take the humans out entirely) is simply too expensive and error-prone. It's absurdly common to watch an entire automated assembly line grind to a halt because of corner cases that have to be fixed by humans. We're seeing the same thing with automated vehicles: when they hit corner cases, they pull over and signal for human intervention.
So, I guess I agree with the author, though in a less stylish way. Please make me a better driver; don't remove me from the seat entirely. Just imagine that fancy VR simulation they showed from the Waymo cars made available as HUD data for a regular driver. How much would that alone help with safety on the road?
Which, by the way, I'm all for. Keyboards, steering wheels, touch screens... they are all just too limited for proper human-computer interaction.
I'm fine with reserving the right to manually operate a vehicle; too many situations in which an autonomous vehicle will just not understand what needs to be done.
I'm rather more concerned that, freed from the need to operate the vehicle, and with cars redesigned to be more conducive to social activities, riding in a vehicle will become an even more attractive leisure activity, clogging traffic while people netflix (& chill) in their pleasure barges.
I also expect that once autonomous vehicles make up the majority of cars, we'll see stricter enforcement of speeding laws. When everyone is speeding because that's the norm, you don't stand out. When every other car is self-driving at the limit and you're going 20 over in your Corvette, you stick out like a sore thumb. The UK already uses average speed cameras; I suspect the only reason they aren't widespread elsewhere is that so many people speed.
No, we'll see speed limits that reflect the speeds people actually travel, and probably dynamic speed limits based on the 85th percentile rule (or whatever best-practice rule we come up with to supersede it), instead of our current system, in which speed limits are whatever maximum the various stakeholders will OK.
AITaxiCo and AIDeliveryCo are going to lobby like hell for reasonable limits (where "reasonable" is not a circular definition involving the current speed limit), because they don't want following the letter of the law to put them at a competitive disadvantage.
This makes the whole thing sound like an NRA-style clickbait article and nothing more.
That said, I am not exactly a fan of how the autonomous vehicle scene is unfolding, with so much of the investment coming from ride-sharing companies. These companies have signed the "Shared Mobility Principles for Livable Cities" [0], which states:
WE SUPPORT THAT AUTONOMOUS VEHICLES (AVS) IN DENSE URBAN AREAS SHOULD BE OPERATED ONLY IN SHARED FLEETS. Due to the transformational potential of autonomous vehicle technology, it is critical that all AVs are part of shared fleets, well-regulated, and zero emission. Shared fleets can provide more affordable access to all, maximize public safety and emissions benefits, ensure that maintenance and software upgrades are managed by professionals, and actualize the promise of reductions in vehicles, parking, and congestion, in line with broader policy trends to reduce the use of personal cars in dense urban areas.
Oh, wait, at the bottom of the article, now we're saying that we need an NRA-style lobby for human driving? As in: an ideologically blinkered death cult beholden to narrow corporate interests (in this case, rather than gun manufacturers, car manufacturers dependent on personal ownership of vehicles for their business model) with zero scruples about what sort of body count it facilitates? Is this a parody then?
To generalize: your lack of interest in an activity is not a good reason to take that activity away from everybody. I'm sure you have a lot of freedoms that you don't take advantage of; should they be removed as well?
> Despite a storm of clickbait media reports, there is still little evidence that self-driving cars are safer than humans.
This is the central thesis, but the author made up the idea that we are supposed to already have these cars that are safer than humans. Nobody thinks this. Nobody. These cars will hopefully exist and hopefully be a lot safer than humans but they certainly don't exist yet.
The author thinks that killing 30,000 people per year is a-okay and worth it for the 'freedom'. I disagree. It's not even freedom you get from driving yourself. You still have to stay on the roads and use a seatbelt and use your turn signals 100% of the time, etc. I would place a large bet that when the author drives, he breaks 10-20 safety laws each drive. Everyone does.
We should aim for 0 deaths per year and do everything in our power to get there. Taking epically dangerous machines away from people is not a loss of freedom. Self-driving cars do not take away your freedom, just as not owning an AR-15 does not take away your freedom.
The goal of self-driving cars is 0 deaths per year. If that is not also your safety goal without self-driving cars, then you don't care about safety, period. The selfishness of "I don't care if more people die on average, I want to drive and you can't take it away from me" is insane.
> If our safety was the experts' first principle, the billions invested in self-driving cars would have gone to subsidizing free professional driving school, raising licensing standards, and making critical safety technologies like seat belts, airbags, ABS and automatic emergency braking (AEB) standard as soon as they were invented
No that is not what they would have done. They would be building self driving cars if they cared about safety.
People are not going to become better drivers if you give them more classes. They will still get drunk and kill 5 innocent people on their way home from the bar.
Even if fully autonomous cars remain impractical for real-world commercial use because they can't match the human capability to handle edge cases, the investment in such a programme will pay off in terms of driver aids. And the fun of hands on a wheel can always be had on a racetrack.
But the goal of autonomous driving isn't "zero deaths", it's "sell cars", and convenience gets mentioned at least as often as safety as a selling point by its promoters. Since the errors autonomous vehicles make are different from those made by humans (and there's no reason to believe they will ever be zero), there's a strong argument that accidents can be further reduced by retaining an alert, responsible human driver, even if that driver is slightly less alert and responsible than they otherwise would be. That's been the case for Waymo's programmes so far (there's certainly no safety argument for making their backup drivers remote at this stage; that's all about a public show of confidence).

The safety/convenience tradeoff isn't straightforward, because perfectly convenient fully autonomous vehicles would mean fewer miles with inadequate or even drunk drivers at the controls. But you probably wouldn't shoot for full autonomy with no controls if safety were the real goal. Frankly, even if the responsible human driver did absolutely nothing useful, they'd probably still reduce road deaths on net, simply by ensuring self-driving technology doesn't massively inflate the number of miles driven per capita.
No, you don't. If you're not on a road, there's no need for any of those. You don't even need a license. Should those usecases be taken away?
> They would be building self driving cars if they cared about safety.
Imagine, for a moment, a car with all of that data that it collects distilled down into a HUD for the driver. Can you honestly tell me that it wouldn't be better than the status quo? Should we not ask for such a thing, since it doesn't remove the human from the equation?
There will never be a 100% safe automatic pilot - putting off adding safety features to cars today until we get that perfect system is, to me, really quite silly.
I've never heard of any of this and I think you must be oversimplifying. You certainly need a driver's license even if you're going to drive on your lawn or in a parking lot or off the road somewhere, surely? I'll go do my research but there's just no way you can drive a car around private property or something without a license.
> Imagine, for a moment, a car with all of that data that it collects distilled down into a HUD for the driver. Can you honestly tell me that it wouldn't be better than the status quo?
I don't know, it sounds worse. HUDs are distracting and I would certainly be a worse driver personally if I had that in my car. I'm not sure that solution is an automatic improvement in safety.
> There will never be a 100% safe automatic pilot - putting off adding safety features to cars today until we get that perfect system is, to me, really quite silly.
Of course we shouldn't put off safety features! No way should we do that. But if you have $50 billion to spend on the project, you absolutely should try to build a self-driving car with it. Obviously you can get a safer car with that much R&D capability directed towards autonomy than by just adding HUDs and better seatbelts or something.
Elon Musk does
https://www.inverse.com/article/38049-elon-musk-self-driving...
> Elon Musk Says Tesla's Self-Driving Tech Already as Good as a Human Driver
Which is the exact opposite of what I said above. Elon's quote implies he thinks his cars would still kill around 30,000 people per year. That's not what I'm talking about. I'm talking about a car that's thousands of times safer than a human. Not "as safe". "As safe" as a human is terrible performance and we should not allow those cars on streets without a driver if that is the case. No way.
If you want to stop 30,000 people from dying on the roads every year, don't allow multi-ton vehicles on the road.
If everyone was using motorized bikes, you would have almost no road deaths. Furthermore, people die choking on food every year. Please don't mandate us to use self-chewing teeth.
Okay, but the idea here is to drop the deaths without impacting the economy so much that it will tank. Yes we could remove all large vehicles, but then people would starve and die anyway because they wouldn't get food to grocery stores fast enough, etc.
The problem is that humans are simply terrible at driving: cellphones and other distractions, bad decision-making, emotional clouding, etc.
I've literally seen someone have to slam on their brakes and pull onto the shoulder to avoid a rear-end collision because they were tailgating and speeding... and then repeat the exact same mistake less than five minutes later. People are garbage at driving, and when a mistake can cost not only your own life but the lives of others, you don't have some intrinsic right to endanger me because you can't follow a posted speed limit or maintain a safe following distance.
I also notice that folks who think they're the best at driving are some of the worst. "I'm a good driver! I can talk on my cellphone and drive at the same time!"