- Try to accelerate to 45mph in a parking lot b/c it was within 10ft of the road
- Decelerate from highway speeds suddenly to 30mph, as though it saw something it might hit (I stopped it at 30-ish and hit the gas)
- Decelerate to 50mph because of "emergency vehicles" even though there were no vehicles around (sometimes it mistakes lights that strobe b/c they are seen through median dividers as "emergency lights")
- Take up two lanes because they gradually separated and the car thinks it should stay evenly between the left and right divider line
- Choose absolutely bonkers limits, like 30mph on two-lane country highways.
- Stop on the highway with a big red screen and a message that says "Take control now fatal error"
- Not so much a problem anymore, but when I was first getting used to it, it would beep a message at me, then scold me for looking at the message (and not the road), then ask me to do some kind of hand grip on the wheel to prove I'm paying attention, even though I have to look at the message to figure out what it wants.
My wife tells me "Just keep your foot on the gas to keep up the speed and your hands on the wheel to keep it in line" and I am just left wondering what FSD is for
I am now absolutely convinced that we will have full self-driving from Tesla when we have a beautiful wall all the way from the east to the west coast along both the Mexican and Canadian borders. Both will be beautiful.
At the time, 2016, I trusted their promotional video showing it driving hands-free; I'm not going to make the mistake of taking them at their word again after it was revealed to have not been as it appeared: https://www.businessinsider.com/tesla-faked-video-in-2016-pr...
> I am just left wondering what FSD is for
The vision and promise, or the actually demonstrated use case?
The demonstrated use case is to charge people more money for the same product.
The vision? That is exactly what Musk keeps saying: in principle, a self-driving car never gets tired or drunk, so it can be safer than the mean human even if it only operates at the level of the median human. And it wouldn't need to be limited to median human level, as the whole fleet could learn from every member, so gain experience a million times faster than any human.
But at this point, I'm sufficiently skeptical of all of this, that I think they (and everyone else) should be banned from direct observation of the entire fleet's cameras — it's a huge surveillance network operating on every public road and several private ones.
Self-driving in my opinion will require an AI that is, if not very close to, an AI capable of general intelligence.
Why?
Because in the real world to be able to drive a car as well as a human across all of the edge cases a human can you probably need something approaching general intelligence.
Humans understand that a person isn't just something with 4 limbs, but can also be that thing that looks like a white sheet with eyes by the side of the road on Oct 31st. And it's these types of weird edge cases that humans instinctively understand, because they have a deep world model to reason with, which the narrow FSD AI systems we currently have do not.
When you think about what humans need to do when driving it's so much beyond just watching the road and turning a wheel that it seems almost absurd to imagine our current AI is anywhere near capable of handling all of the edge cases humans currently are.
And I also don't buy the argument that the goal should simply be to reduce the total number of accidents per mile... I'd grant that it's very possible FSD could reduce the total number of accidents per mile driven, because most miles are driven in the much narrower environment of highway driving, and here AI probably could do a better job than a human on average once you factor in human tiredness and distractibility. But no one is going to be comfortable with FSD occasionally plowing into a group of kids outside a school just because, statistically, the total number of people who die in road traffic accidents is reduced on a per-mile basis.
I'd be interested if anyone strongly disagrees.
The market for such cars would be very limited IMO.
I really don't see why not. Since those deaths must be counted too, if it is still safer even with them counted, then such incidents can't be happening more than rarely.
Trouble is, when the company deliberately ties one hand behind its back by insisting on camera-only vision, it is never going to be perfect at not hitting stuff. Either multispectral imaging, radar, or lidar would help avoid edge cases like the Halloween costume. The camera might not even realize there's a three-dimensional object in front of it if there's snow on the ground. Stupid, stupid, stupid.
Meanwhile in the Midwest, we have potholes, uneven roads, sometimes roads with different surfaces mixed together (gray concrete with black asphalt patches). Lines are often badly worn by the weather and road salt and can be quite difficult to see.
I strongly suspect with no evidence that FSD likely has more problems on roads that are in poor condition.
Even my parents and sister use FSD v13 regularly now in their Teslas.
It's come a long way from the early days when I first started testing it.
It makes me wonder how many people are using Autopilot (included as standard) instead of FSD on a newer Tesla with the new AI hardware?
It's pretty wild to be able to start from park. Tap a button, and go.
Just the other day, it managed merging onto the interstate and then immediately changing 7 lanes to the left to merge onto the next interstate exit heading north. It performed flawlessly.
Really?
https://youtu.be/OIjp_NyAfMw?t=269
https://youtu.be/tax1THe_VO0?t=412
https://www.youtube.com/shorts/pU1mpmOa6RM
https://youtu.be/pHLPu6_tIag?t=128
https://youtu.be/-NwhtNjc4N8?t=30
https://www.youtube.com/shorts/OS_SiMZkc00
"Tesla Plans Robotaxis in Texas by June, in a State with No Regulations" - https://www.auto123.com/en/news/tesla-robotaxi-texas-regulat...
Except all you have to do is go try it and it becomes clear to any layperson that it's probably getting there but, and this is really crucial, it's not there yet.
I think FSD definitely has utility, but not in the hands of laypeople. There are still far too many edge cases that it just doesn't handle well, and your average person can't be trusted to stay alert and attentive while using a feature so heavily marketed as not needing either of these things.
Still, though, it has quirks.
On long trips, I LOVE it. LOVE it. Being able to just tap in and relax, make phone calls, listen to an audiobook, etc is so nice. The first time I ever used it I had to leave early from the All Things Open conference in Raleigh because I was getting sick. Having it essentially drive me home for 5 hours when I wasn’t well, including stopping to charge, was a huge relief.
It’s also great in traffic jams where you’d otherwise be dealing with stop and go traffic until you get through it. Just tap in and relax til you’re on the other side.
Day to day driving, it’s a little more iffy. I’ve dealt with seemingly random slowdowns on otherwise empty roads. It feels odd especially because it’s sudden.
Early on it would have difficulty on roads without well marked lines too.
I’ve never felt like it was going to run into an object though. Usually it errs on the “too cautious” side and I just take over to get where I’m going quicker.
My 12 year old Ford Focus does that
It was this guy's fault for not monitoring the car, but also Tesla's for using a double-speak name like Full Self Driving.
If FSD is a statistically significant enough risk factor for injury above Teslas that don't use it, it should be banned.
Really a human + AI hybrid experience.
I like the level 1 to level 3 features: Lane keeping, emergency braking (when there's something there), adaptive speed control, etc. But a new minivan has all those too.
For long highway driving it does remove 99% of the things I hate, but there's 1% of the time it just annoys the hell out of me, and it tarnishes the whole experience.
At what ratio of good anecdotes to bad anecdotes should we trust it? For me, the ratio has to be astonishingly high, such that if there are a few people in the discussion saying it did something suspect (much less dangerous), they're always going to be the ones I listen to. Not that I'm doubting your experience; it's just not enough to outweigh the other.
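The "ratio of good anecdotes" question has a concrete statistical answer: with only glowing reports and rare catastrophic failures, even a large pile of good anecdotes puts a surprisingly weak bound on the failure rate. A minimal sketch (the "rule of three" bound for zero observed failures; the trip counts are purely illustrative, not Tesla data):

```python
def failure_rate_upper_bound(n_good_trips: int, confidence: float = 0.95) -> float:
    """Upper bound on per-trip failure probability given n trips with zero failures.

    Solves P(0 failures in n trips) = (1 - p)^n = 1 - confidence for p.
    """
    return 1.0 - (1.0 - confidence) ** (1.0 / n_good_trips)

# A hundred glowing anecdotes still leave room for a ~3% per-trip failure rate.
print(f"{failure_rate_upper_bound(100):.3f}")     # ~0.030
print(f"{failure_rate_upper_bound(10_000):.5f}")  # ~0.00030
```

In other words, for a system where a single failure can be fatal, the number of clean anecdotes needed before they outweigh one credible bad report is enormous, which is roughly the intuition above.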
Obligatory xkcd: https://xkcd.com/937/
Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.
> I walked away without a scratch.
This could have easily killed an innocent pedestrian or bicyclist. How is this the best safety engineering? If FSD failed, there should have been some secondary system to detect an imminent collision and apply the brakes.
There actually is. Automatic Emergency Braking functions separately from FSD and can prevent collisions in some cases. It doesn't work 100% of the time, so I wouldn't rely on it, but it works at least as well as competitors' systems.
What you're asking for, though, is definitionally impossible: obviously the cameras didn't detect the obstacle, so FSD or no, they can't react to it. The actual solution would be to do what every other car maker with self-driving pretensions does and augment the cameras with LIDAR or other sensors.
Judging by the (illegal in Europe) design, passive safety is the only safety the Cybertruck has, and the safety of others has absolutely zero importance. Fits with how the rest of the world sees the typical American as well, so maybe not a big shocker.
> What you're asking for, though, is definitionally impossible
Why is it impossible for the car to stop (legally obviously) if it fails to merge, or even hit the curb, instead of continue straight forward like nothing happened?
Passive safety usually is defined as reducing the risk of injury or death to vehicle occupants in an accident AND also protecting other road users. You left off the second part.
My head hurts with how oxymoronic this is. My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner. “Thank you sir for doing great work and for fixing this problem in the future”
Well I imagine that since Musk was handed the keys to various government agencies and installed his henchmen you can see why you want to tread lightly and kiss the ring. Such a wonderful future this is becoming.
There's a lot of possible flavors to that ideology, it COULD be right wing political affinity, but it also could be a belief that technology is superior to human judgement, or that self driving cars are the future, or it could just be that spending 6 figures on an ugly pickup wasn't a waste of money.
> Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4.
> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.
> It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.
> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.
> @Tesla_AI how do I make sure you have the data you need from this incident? Service center etc has been less than responsive on this.
> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.
> Spread my message and help save others from the same fate or far worse.
Unless this is sarcasm, which, at best in the current times, can be construed as serious.
Not legal in my EU country either; however, they have already been spotted on the streets with valid license plates. There are loopholes everywhere, usually if you classify it as a commercial utility vehicle for a business instead of a passenger car. There's plenty of people with money and no scruples.
Very glad to hear no pedestrians got hit. Really hope the driver takes some kind of lesson away from this experience.
If I didn't know better, I'd think they were trying to farm engagement.
It's sad that we CT drivers seem to be caught in a crossfire between Tesla and Tesla/Elon haters, when all we want to do is enjoy our cars.
The good news is that it’s just the ones who drive their truck with the now-known-to-be-faulty FSD turned on who are caught in the crossfire. (I’d say they are actually and should be in the “crosshairs”, but that’s moot.) Doesn’t have to be you.
> you're all blind from Elon hate
This is very ironic.
Level 2+, though, is a big worry. It fails enough to be dangerous, but many of these systems fail too little for humans to effectively monitor them.
I'm still waiting for Waymo to safely drive in the snow.
At least they are not pretending to offer anything more than level-2 adaptive cruise control and lane centering.
> This paper investigates the effects of mandatory seat belt laws on driver behavior and traffic fatalities. Using a unique panel data set on seat belt usage rates in all U.S. jurisdictions, we analyze how such laws, by influencing seat belt use, affect traffic fatalities. Controlling for the endogeneity of seat belt usage, we find that it decreases overall traffic fatalities. The magnitude of this effect, however, is significantly smaller than the estimate used by the National Highway Traffic Safety Administration. Testing the compensating behavior theory, which suggests that seat belt use also has an adverse effect on fatalities by encouraging careless driving, we find that this theory is not supported by the data. Finally, we identify factors, especially the type of enforcement used, that make seat belt laws more effective in increasing seat belt usage.
[0] http://www.law.harvard.edu/programs/olin_center/papers/pdf/3...
Don’t mistake my post as a defense of FSD or Tesla. They’ve been lying about their capabilities for what feels like a decade.
I don’t want to see FSD and human drivers share a road. I want all cars to be meshed and communicating their intents with vehicles around them to avoid collisions. We will never see that in our lifetime
Snowball: "So FSD failed but you still managed to find a way to praise Tesla. You failed too for not taking over in time. But your concern isn't for the lives of third parties that You and FSD endangered. No, you are worried about Tesla getting bad publicity. You have misplaced priorities."
Jonathan Challinger (the driver who crashed): "I am rightly praising Tesla's automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. [...]"
Fair points from both sides I think.
It is worth noting that this picture is a reply to a screenshot of someone saying the following:
> I've lived in 8 different states in my life and most roads I've seen do everything they can to prevent human error (or at least they do once the human has shown them what they did wrong). The FSD should not have been fooled this easily, but the environment was the worst it could have been, also.
- Tweet source: https://x.com/MuscleIQ2/status/1888695047044124989
I point this out because I think probably the biggest takeaway here is how often people will bend over backwards to reach the conclusion that they want, rather than update their model to the new data (akin to Bayesian Updating for you math nerds). While this example is egregious, I think we all should take a hard look at ourselves and question where we do this too. There's not one among us that isn't resistant to changing our beliefs, yet it's probably one of the most important things we can do if we want to improve things. If we have any hope of not being easily fooled by hype, of differentiating real innovation from cons, of avoiding joining Cargo Cults, then this seems to be a necessity. It's easy to poke fun at this dude, but are we all certain that we're so different? I would like to think so, but I fear making such a claim is repeating the same mistake I/we are calling out.
In all fairness he really should have been paying attention.
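The "Bayesian Updating" aside can be made concrete with a toy calculation (the probabilities below are purely illustrative, not measured): even a strong prior that the system is safe should move substantially on evidence that is much more likely under the unsafe hypothesis.

```python
def bayes_update(prior: float, p_evidence_given_h: float, p_evidence_given_not_h: float) -> float:
    """Posterior P(H | E) via Bayes' rule for a binary hypothesis H."""
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1.0 - prior))

# Start 90% confident the system is safe; observe a crash report that is
# ten times more likely if it is unsafe (0.05 vs 0.50).
posterior = bayes_update(prior=0.90, p_evidence_given_h=0.05, p_evidence_given_not_h=0.50)
print(f"{posterior:.3f}")  # ~0.474: confidence should roughly halve
```

Refusing to update at all after such evidence, as in the tweet above, amounts to treating the crash as equally likely under both hypotheses.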
You don't get to abdicate your responsibility to Team Elon because reasons. At the end of the day you will be sitting in the defendant's chair while Tesla will just quietly settle out of court.
And just because it's the driver's responsibility doesn't mean it's not also Tesla's.
Just can't win as a light pole https://abc13.com/suspected-drunk-driver-milwaukee-wisconsin...
But if that's what you need to build a FSD product, then you shouldn't be releasing the existing FSD product onto public streets.
I could be wrong though.
Be afraid. Be very afraid.
Tesla is in a bind. They've been promising self driving Real Soon Now since 2016, with occasional fake demos. Meanwhile, Waymo slowly made it work, and is taking over the taxi and car service industry, city by city.
This is a huge problem for Tesla's stock price and Musk's net worth. Now that everybody in automotive makes electric cars, that's not a high-margin business any more. Tesla is having ordinary car company problems - market share, build quality, parts, service, unsold inventory. Tesla pretends they are a special snowflake and deserve a huge P/E ratio, but that's no longer the reality.
Tesla doesn't want to test in California because of "regulation". This is bogus. The California DMV is rather lenient on testing driverless cars, and California was the first state to allow them. There was no new legislation, so DMV just copied the procedures for human drivers with a few mods. Companies can get a "learner's permit" for testing with a safety driver easily, and quite a few companies have done that. The next step up is the permit for testing without a safety driver, which is comparable to a regular driver's license. It's harder to get, and there are tests. About a half dozen companies have reached that point. No driving for hire at that level. Finally there's the deployment license, which Waymo and Zoox have. That's like a commercial drivers license, and is hard to get and keep. Cruise had one, but it was revoked after a crash where someone was killed.
That's what really scares Tesla. The California DMV can and will revoke or suspend an autonomous driving license just like they'd revoke a human one. Tesla can't just pay off everyone involved and go on.
Waymos are all over San Francisco and Los Angeles, dealing with heavy traffic, working their way around double-parked cars, dodging bikes, skateboarders, and homeless crazies, backing out when faced with an oncoming truck in a one lane street, and doing OK in complex urban settings. Tesla has never demoed that level of performance. Not even close.
[1] https://www.reuters.com/technology/tesla-robotaxis-by-june-m...
Tesla's P/E ratio is currently 328. Ford is around 9. GM is around 8. Evaluated as a mature car company, TSLA is maybe 20x - 30x overpriced compared to the rest of the industry. A hype injection is needed to keep the price up. Optimus and fake self driving isn't enough.
[1] https://www.investors.com/news/tesla-stock-distraction-elon-...
I think if a car with an engine had hit it, the pole would have been knocked over rather than producing any sort of t-bone.
Hello? Whether it's Full Self-Driving or not, it's always your fault.
I try it every major release, and am disappointed every time. In situations where I'd be confident, it is overly cautious. In situations where I'd be cautious, it's overly confident and dangerous.
I think its best use is to keep the car in the lane while I'm distracted by something (pulling out a sandwich to eat, etc). And it seems like newer Teslas have eye tracking, so it might not even be useful for that.
I'm not saying it wasn't FSD, but it is a possibility FSD wasn't even enabled.
― George Carlin
FSD is clearly not even beta quality
People keep saying it's "trying to commit suicide",
and it's being fixed on the fly at everyone else's cost in life.
But now they are removing federal reporting requirements so buyers will NEVER know
Got a link for that one?
I couldn't find anything suggesting this has been put in place yet though.