I rarely use ride-sharing but other experiences include having been in a FSD Tesla Uber where the driver wasn't paying attention to the road the entire time (hands off the wheel, looking behind him, etc.).
I don't know if I trust Waymo cars with my life, but at least there are SOME standards, compared to the natural variance of humans.
I’ve ridden in a lot of Waymos – 800km I’m told! – and they’re great. The bit that impresses me most is that they drive like a confident city driver. Already in the intersection and it turns red? Floor it out of the way! Light just turned yellow and you don’t have time to stop? Continue calmly. Stuff like that.
Saw a lot of other AI cars get flustered and confused in those situations. Humans too.
I like Waymos because of the consistent social experience: there is none. With drivers, they're usually chatty at all the wrong moments, when I'm not in the mood or just want to catch up on email. Or I'm feeling chatty and the driver is not; it's rarely a perfect match. With Waymo it's just a ride.
This has been a 15+ year process and will probably take a few more years. I don't feel too bad if they didn't manage to pivot in that time period.
The one thing you can trust Waymo to do is spy on you. Hurray, more surveillance-on-wheels! Every one of these things has 29 visible-light cameras, 5 LIDARs, 4 RADARs, and is using four H100s to process all of its realtime imagery of you: https://thelastdriverlicenseholder.com/2024/10/27/waymos-5-6...
> A few months ago, I was in the city for a weekend and took Waymo for most of my rides.
> [...]
> I don't know if I trust Waymo cars with my life [...]
I'm sorry to be that guy, but didn't you already? These things must be saving lives, it's obvious. When my kids are riding their bikes around, I want the other cars to be Waymos, not human drivers.
Waymo's record is 100% so far, with zero fatalities.
But then again, the Concorde was the safest airplane ever built for nearly 30 years; after its first crash it became the most dangerous passenger jet ever, with 12.5 fatal events per million flights.[1]
Lies, damned lies, and statistics.[2]
0: https://assets.ctfassets.net/vz6nkkbc6q75/3yrO0aP4mPfTTvyaUZ...
1: https://www.airsafe.com/journal/issue14.htm
2: https://en.wikipedia.org/wiki/Lies,_damned_lies,_and_statist...
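For a sense of how a single crash over a small fleet's lifetime produces that headline number (the flight count below is an assumed, illustrative figure in the ballpark of Concorde's actual history, not from the linked sources):

```python
# One fatal event across an assumed ~80,000 lifetime flights.
fatal_events = 1
total_flights = 80_000

# Rate per million flights -- small denominators make rates swing wildly.
rate_per_million = fatal_events / total_flights * 1_000_000  # ≈ 12.5
```

One event flips the statistic from "safest ever" to "most dangerous ever," which is exactly the small-sample trap the comment is pointing at.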
Any different than with a human taxi driver?
It's not about absolute reliability, it's about how well it compares to the alternative, which is human taxi drivers. And the thing is, you don't hear about human car accidents because it's so common that it's not worth making the news.
These vehicles very regularly block traffic because they can’t maneuver in congested areas with the finesse of a human driver.
Aggressive driving isn’t always bad. Sometimes it’s to unblock others waiting behind you so they can get somewhere they need to be.
If you were running a private equity robotaxi firm and your bonus relied on 1% more rides wouldn't you be dialing up the aggressive driving? Repeat for a few quarters and the robot will be cutting the same corners that the human is forced to.
Some future Fight Club reboot will reference your ChatGPT logs that show you asked how much the corporation would need to pay to the people killed in crashes vs increased profit to find the profit maximising level of dangerous driving.
I get it. There basically aren't any laws for corporations anymore. Is there any way I can see anything about the safety of this at a statistical level?
Where is NHTSA? Oh right, no federal agencies exist anymore except for those that maintain the oligarchy.
And I don't give a crap if Uber has really good statistics and studies and evidence. We are talking about one of the least ethical companies in the last 20 years.
I want independent Federal testing.
Now, before you say this peer-reviewed paper is corporate propaganda, all self-driving companies are required by law to disclose accidents they are involved in, whether liable or not, in CA. You could access each raw accident report published by the CA DMV periodically and come up with your own statistics.
If you're driving 45 in a 40, that may sound like 12% faster, but once you add traffic, lights, stop signs, turns, etc - you'll find that the 12% all but evaporates. Even if you're really pushing it and going 15 over, at most speeds and for most typical commutes, it saves very little.
Most of the time speeding ends up saving on the order of seconds on ~30 minutes or shorter trips.
Just about the only time it can be noticeable is if you're really pushing it (going to get pulled over speeds) on a nearly empty highway for a commute of 1.5+ hours.
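A back-of-the-envelope version of the claim above (the trip length and the share of the trip actually spent at free-flow speed are assumptions for illustration):

```python
def trip_minutes(miles, mph):
    """Time to cover a distance at a constant speed."""
    return miles / mph * 60

# Hypothetical 10-mile commute.
at_limit = trip_minutes(10, 40)   # 15 minutes
speeding = trip_minutes(10, 45)   # ~13.3 minutes
best_case_savings = at_limit - speeding  # ~1.7 minutes

# But your speed only matters on free-flowing stretches. If lights,
# stops, and congestion fix half the trip's duration regardless of
# how fast you drive, the savings shrink to under a minute.
realistic_savings = best_case_savings * 0.5
```

The "12% faster" headline number turns into well under a minute saved on a typical commute.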
These sentences conflict. I recently took a taxi from JFK to Manhattan during rush hour, and I estimate that if the driver hadn't used all of the paved surface, it would have taken at least 10 more minutes to arrive. (And it wouldn't have been an authentic NYC experience.)
It's ok if you prefer the Waymo experience, and if you find it a better experience overall, but if a human driver saves you time, the Waymo wasn't better in every single way.
I am assuming the Lyft driver used the shoulder effectively. My experience with Lyft+Uber has been hit or miss... Some drivers are like traditional taxi drivers: it's an exciting ride because the driver knows the capabilities of their vehicle and uses them and they navigate obstacles within inches; some drivers are the opposite, it's an exciting ride because it feels like Star Tours (is this your first time? well, it's mine too) and they're using your ride to find the capabilities of their vehicle. The first type of driver is likely to use the shoulder effectively, and the second not so much.
Lived in New York for 10+ years and still go back regularly. This is unacceptable behaviour by a cabbie.
Given the amount of construction and thus police presence on that route right now, you’re lucky you didn’t get a 60-minute bonus when the cab got pulled over. (The pro move during rush hour and construction is (a) not to, but if you have to, (b) taking the AirTrain and LIRR.)
My hot take is that people who "use all of the paved surface" because their whiny passenger is "in a rush" (which of course everyone stuck in traffic is) should permanently lose their license on the very first offense.
It is just gobsmackingly antisocial behavior that is 1) locally unsafe and 2) indicative of a deep moral rot.
Obviously exceptions can be made for true emergencies and what not, but "I need to save 10 minutes" is not one of them.
It was a really scary experience and I couldn’t do much about it in the moment.
In New York it's not too difficult. Fidgetiness, twitchiness, a rambling series of non sequiturs that make even my ADD brain rattle. Screaming at traffic and running on the margin one second, and then asking me if I know that the archangel who visited Muhammed was actually a demon the next. (I'm not Muslim. The conversation wasn't addressed to anyone in the vehicle.)
Like, I guess I can't say they're taking too much of a substance. But if they aren't, they're taking too little.
Maybe my memory is failing me, but I seem to remember people saying the exact opposite here on HN when Tesla first announced/showed off their "self-driving but not really self-driving" features, saying it'll be very easy to get working on the highways, but then everything else is the tricky stuff.
On highways the kinetic energy is much greater (Waymo's reaction time is superhuman, but the car can't brake any harder), and there isn't the option to fail safe (stop in place) like there is on normal roads.
One thing that's hard with highways is the fact that vehicles move faster, so in a tenth of a second at 65 mph, a car has moved 9.5 feet. So if say a big rock fell off a truck onto the highway, to detect it early and proactively brake or change lanes to avoid it, it would need to be detected at quite a long distance, which demands a lot from sensors (eg. how many pixels/LIDAR returns do you get at say 300+ feet on an object that's smaller than a car, and how much do you need to detect it as an obstruction).
But those also happen quite infrequently, so a vehicle that doesn't handle road debris (or deer or rare obstructions) can work with supervision and appear to work autonomously, but one that's fully autonomous can't skip those scenarios.
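The arithmetic in that comment, plus a rough stopping-distance estimate (the deceleration and latency figures below are generic assumptions for illustration, not Waymo specs):

```python
FT_PER_MILE = 5280

def mph_to_fps(mph):
    """Convert miles per hour to feet per second."""
    return mph * FT_PER_MILE / 3600

v = mph_to_fps(65)       # ~95.3 ft/s
per_tenth = v * 0.1      # ~9.5 ft covered in a tenth of a second

# Assumed hard-braking deceleration of 0.8 g and 0.5 s of total
# detection-to-brake latency -- illustrative values only.
g = 32.17                          # ft/s^2
braking = v ** 2 / (2 * 0.8 * g)   # distance to scrub off all speed
latency = 0.5 * v                  # distance covered before braking starts
stopping_distance = braking + latency  # ~220 ft, before any sensor margin
```

So even under generous assumptions, a rock on the road has to be confidently classified well over 200 feet out, which is why long-range sensor resolution dominates the freeway problem.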
https://enewspaper.latimes.com/infinity/article_share.aspx?g...
It was a common but bad hypothesis.
"If you had asked me in 2018, when I first started working in the AV industry, I would’ve bet that driverless trucks would be the first vehicle type to achieve a million-mile driverless deployment. Aurora even pivoted their entire company to trucking in 2020, believing it to be easier than city driving.
...
Stopping in lane becomes much more dangerous with the possibility of a rear-end collision at high speed. All stopping should be planned well in advance, ideally exiting at the next ramp, or at least driving to the closest shoulder with enough room to park.
This greatly increases the scope of edge cases that need to be handled autonomously and at freeway speeds.
...
The features that make freeways simpler — controlled access, no intersections, one-way traffic — also make ‘interesting’ events more rare. This is a double-edged sword. While the simpler environment reduces the number of software features to be developed, it also increases the iteration time and cost.
During development, ‘interesting’ events are needed to train data-hungry ML models. For validation, each new software version to be qualified for driverless operation needs to encounter a minimum number of ‘interesting’ events before comparisons to a human safety level can have statistical significance. Overall, iteration becomes more expensive when it takes more vehicle-hours to collect each event.”
https://kevinchen.co/blog/autonomous-trucking-harder-than-ri...
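The validation-cost point in that excerpt can be sketched numerically (the event counts and rates below are made-up illustrative values, not Waymo or Aurora data):

```python
def miles_needed(events_required, events_per_million_miles):
    """Vehicle-miles needed to expect a given number of 'interesting' events."""
    return events_required / events_per_million_miles * 1_000_000

# Suppose qualifying a software release requires observing 100 critical
# events. If such events occur 5x less often on freeways than on city
# streets, each release needs 5x the vehicle-miles to validate.
city_miles = miles_needed(100, 50)     # 2 million miles
freeway_miles = miles_needed(100, 10)  # 10 million miles
```

That multiplier applies to every release you want to qualify, which is the iteration-cost argument in a nutshell.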
The real reason I see for not running freeways until now is that the physical operational domain for street-level autonomous operations was not large enough to warrant validating highway driving to their current standard.
> “Freeway driving is one of those things that’s very easy to learn, but very hard to master when we’re talking about full autonomy without a human driver as a backup, and at scale,” Waymo co-CEO Dmitri Dolgov said
and
> While many assume freeway driving is easier, it comes with its own set of challenges, principal software engineer Pierre Kreitmann said in a recent briefing. He noted that critical events happen less often on freeways, which means there are fewer opportunities to expose Waymo’s self-driving system to rare scenarios and prove how the system performs when it really matters.
Both point to freeway driving being easier to do well, but harder to be sure is being done well.
The emergency braking system gives you a lot of room for error in the rest of the system.
Once you're going faster than 35 mph this approach no longer works. You have lots of objects on the pavement that are false positives for the emergency braking system, so you have to turn it off.
Really it’s a common difficulty with utilitarianism. Tesla says “we will kill a small number of people with our self driving beta, but it is impossible to develop a self driving car without killing a few people in accidents, because cars crash, and overall the program will save a much larger number of lives than the number lost.”
And then it comes out that the true statement is “it is slightly more expensive to develop a self driving car without killing a few people in accidents” and the moral calculus tilts a bit
I think anyone back then would be totally shocked that urban and suburban driving launched to the public before freeway driving.
So then they pivoted to full-time automation with a safe stop for exceptions. That approach isn't useful for starting with highway driving. There are some freeway-routed mass transit lines, but for the most part people don't want to be picked up and dropped off at the freeway. On many stretches of freeway there's not a good place to stop and wait for assistance, and automated driving will need more assistance than normal driving. So it made a lot of sense to reduce scope to surface-street driving.
Note, in July of this year, Musk predicted robotaxi service for half the country by the end of 2025. It's November now and they haven't even removed the safety monitors, in any city!
Flying used to be like this - my grandpa had 3 airplanes, used one to fly a calf back to his farm. But flying got regulated till it's quite rare to meet a casual pilot with his own plane.
Though I’ve heard people treat it differently in the US
It makes less sense in an urban environment with 5 or more lanes in your direction. Vehicles will be traveling at varying speeds in all lanes, ideally with a monotonic gradient, but it just doesn't happen, and it's unlikely to.
In California, large trucks generally have a lower speed limit (however many trucks are not speed governed and do exceed the truck limit and sometimes the car limit) and lane restrictions on large highways. Waymo may do well if it tends toward staying in the lanes where trucks are allowed as those tend to flow closer to posted car speed limits. But sometimes there's left exits, and sometimes traffic flow is really poor on many right lanes because of upcoming exits. And during commute time, I think the HOV lane would be preferred; taxis are generally eligible for the HOV lane even when only the driver is present, but I don't know about self-driving with a single or no occupant.
If "we'll have too many cars on the freeway following the speed limit" ranks as a serious concern, I think we've really lost the plot.
I recently drove by a fatal accident that had just happened on the freeway. A man on the street had been ripped in half, and his body was lying on the road. I can't imagine the scene is all that unlike the 40 thousand other US road deaths that happen every year.
As a driver I'm willing to accept some minor inconvenience to improve the situation. As a rider I trust Waymo's more than human drivers.
Plenty of people do not follow the rules about staying to the right.
If you watch the videos more carefully, you will notice the people who speed by at 85 MPH later enter the screen again, because that is the nature of freeway traffic.
I predict that a few hundred of these on the road will measurably improve safety and decrease severe congestion by being that one sane driver that defuses stop-and-go catastrophes. In fact I think CHP should just contract with them to pace 101 in waves.
"Waves" are really what we would want them to prevent: https://en.wikipedia.org/wiki/Traffic_wave
The autonomous cars can prevent these waves from forming, which would get people to their destinations faster than speeding.
If you actually thought adoption would benefit us on its own, rather than seeing it as a roundabout way to enforce rules you want enforced without buy-in from the public, you'd want these cars to behave in a way that makes it easier for them to exist in typical traffic.
My dudes, I have been driving the speed limit, even on freeways, for decades.
Nothing bad happens. Your car doesn't explode. You don't instantly create thousand-car pileups.
You get passed slightly more often than when you are speeding. You pass fewer cars. You get to your destination a few minutes later.
A car going the speed limit on the freeway is not a problem.
There’s no making sense of it, people who speed will come up with infinite excuses why they are right and traffic engineers are wrong.
I’ve never been in an accident in over 40 years. I’m never late because I leave on time and plan ahead, and driving isn’t some stressful event.
Until it does.
The biggest problem in car accidents is speed differential. When you are not driving the prevailing speed, your speed differential is significantly higher and the accident will be worse than average.
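To put a number on the differential point (a crude model where collision severity scales with the square of the closing speed, masses held equal, speeds chosen for illustration):

```python
def closing_energy(flow_mph, your_mph):
    # Severity of a rear-end collision scales roughly with the square
    # of the speed differential between the two vehicles.
    return (flow_mph - your_mph) ** 2

# Traffic flowing at 75 mph:
mild = closing_energy(75, 65)    # 10 mph differential -> 100
severe = closing_energy(75, 55)  # 20 mph differential -> 400
# Doubling the differential quadruples the collision energy.
```

That quadratic scaling is why a car moving well below the prevailing speed is more dangerous than its absolute speed suggests.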
Or public transit on a track.
Did we get less dumb drivers starting in the '70's?
> Google said it has tweaked its software to "more deeply understand that buses and other large vehicles are less likely to yield to us than other types of vehicles."
Maybe that got lost.
[1] https://phys.org/news/2016-03-apnewsbreak-video-google-self-...
https://www.electronicsweekly.com/news/business/waymo-gets-a...
https://www.reuters.com/business/autos-transportation/how-te...
I keep seeing them around my home in Menlo Park (Redwood City), but they're still in testing phase and not available for booking yet.
Also, I appreciate many of the random human interactions I've had with Uber/Lyft drivers. Of course not every ride was great, but many drivers had stories and experiences that no one I usually meet would have. For me, the safe but bland experience of a self-driving car isn't worth losing the human touch, not to mention taking away income from human drivers.
So far the answer of the current economic system has been to invent new products/services and redirect the workforce there. It's been working so far, but isn't without issues - ever-increasing consumption is bad for the environment; the jobs are getting more and more pointless; people wonder why automation doesn't result in shorter working hours for everyone.
There is also a need for maintenance, cleaning, and so on. Lots of human labor is still needed to maintain a car.
I will say, I was surprised that the interior of the car was kind of dirty. I would imagine this is going to be a growing issue these FSD taxi fleets are going to have to deal with. Lots of people will behave poorly in them.
I have taken at least 50 Waymo rides and have never experienced anything remotely like what you have described here.
I am not saying it never happened, just that I expect that if a bone-headed move of this magnitude was at all commonplace with Waymo, we would be hearing about it and probably with a lot more details.
Otherwise the app frustratingly runs you through onboarding and then tells you it's unavailable in your area. I had tried because they were supposed to be coming to New Orleans.
in no particular order, my problems w/ lyft:
1) driver trying to talk to you when you just want some quiet time
2) unclean/smelly car, have no idea if it's some econobox or actually decent
3) sometimes questionable driving, talking on the phone or talking to you or watching some youtube video or using their phone trying to grab the next fare
waymo i just get in, it takes me where i need to go and i get out, no fuss. maybe i'll eat my words if i get into some catastrophic situation, but honestly i'll take that over the "feeling" i get when i step into current ride shares.
people don't drink starbucks because it's the best coffee, they drink it because it's consistent for the most part, and that's what i want. i don't want to roll the damn dice every time i call for a car.
There are cases where the onboard computer can't make a decision or needs "help" - in which case a support specialist is presented with options the onboard computer needs help deciding between. To be clear - the human is not driving it's more the car asks "Hey - there's something ahead and I am unsure if it's safe to proceed. Here's a video clip of the thing I'm seeing. Help?" Common cases might be an out of distribution thing like steam or an unidentifiable object in the road.
In a "worst case" mode - a human can remotely give the onboard computer a directed path to follow - eg "draw points and follow this path" to get back to where it needs to be. Even then - the onboard computer is following the path but still maintaining it's constraints "eg don't hit pedestrians."
You can read more about this here: https://waymo.com/blog/2024/05/fleet-response
it seems like these robotaxis have been around long enough to have conclusions now
What I’ve noticed from those other systems is that a human in the loop makes the system so much more comfortable. I’ve had times where I can see the red lights ahead and the system is not yet slowing because the car immediately in front of me isn’t slowing yet. It’s unsettling when the automated system brakes at the last moment.
Because of this experience the highway has been the line in the sand for me personally. Surface streets where you’re rarely traveling more than 45 mph are far less likely to lead to catastrophic injury vs a mistake at 70 mph.
I don’t think Waymo is necessarily playing fast and loose with their tech but it will be interesting how this plays out. A few fatal accidents could be a fatal PR blow to their roll out. I’m also very curious to see how the system will handle human takeover. Stopping in the middle of a freeway is extremely dangerous. Other drivers can have a lapse in attention and getting smoked by a semi traveling 65 mph is not going to be a good day.
The political climate is VERY suspicious of autonomous vehicles, but the most serious incident I can really recall was the recent one where a car ran over a cat. You can see the reaction here: https://www.reddit.com/r/cats/comments/1omortk/the_shrine_to...
If the biggest black mark against the company is running over a cat on the street at 11:40 PM (according to Waymo, after it darted under the car), I feel pretty good.
You may be thinking of the ACC these cars offer, which is a standard feature, but different than their premium "self-driving" services they offer.
We had Waymo and Cruise in SF at the same time for a while, and by god Cruise was shit and felt unsafe. Waymo is years ahead of Cruise and better in every manner.