The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.
In addition to the pedestrian’s death, another crash involved an injury, the agency said.
Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”
Eyes can’t see in low visibility.
Musk: “we drive with our eyes, cameras are eyes. we don’t need LiDAR”
FSD kills someone because of low visibility, just like eyes would
Musk reaction -
It’s worse than that, though. Our eyes are significantly better than cameras (with some exceptions at the high end) at adapting to varied lighting conditions, especially rapid changes.
Not only that, when we have trouble seeing things, we can adjust our speed to compensate (though tbf, not all human drivers do, but I don’t think FSD should be modelled after the worst of human drivers). Does Tesla’s FSD go into a “drive slower” mode when it gets less certain about what it sees? Or do its algorithms always treat its best guess with high confidence?
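The “drive slower when uncertain” idea is easy to state concretely. Here’s a toy sketch of what confidence-gated speed could look like; every name and threshold here is made up for illustration, and nothing in it reflects Tesla’s actual logic:

```python
# Toy illustration of confidence-gated speed. NOT Tesla's actual logic;
# all thresholds and names are invented for the example.

def speed_cap_mph(posted_limit: float, perception_confidence: float) -> float:
    """Scale the allowed speed down as perception confidence drops.

    perception_confidence: 0.0 (effectively blind) .. 1.0 (fully confident)
    """
    if perception_confidence >= 0.9:   # clear conditions: drive normally
        return posted_limit
    if perception_confidence < 0.3:    # can't see: slow to a stop / pull over
        return 0.0
    # Linear ramp between the two thresholds.
    scale = (perception_confidence - 0.3) / (0.9 - 0.3)
    return posted_limit * scale

print(round(speed_cap_mph(65, 0.95), 1))  # 65.0
print(round(speed_cap_mph(65, 0.6), 1))   # 32.5
print(round(speed_cap_mph(65, 0.2), 1))   # 0.0
```

The point of the sketch is only that uncertainty can be wired directly into a speed cap, the way a careful human slows down in fog; whether FSD does anything like this is exactly the open question.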
Hard to credit without a source; modern cameras have way more dynamic range than the human eye.
Not in one exposure. Human eyes are much better at dealing with extremely high contrasts.
Cameras can be much more sensitive, but at the cost of overexposing brighter regions in an image.
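Dynamic range is usually compared in “stops” (doublings of light between the darkest and brightest thing you can distinguish at once). A quick back-of-envelope with round numbers that are commonly quoted ballpark figures, not measurements:

```python
# Back-of-envelope dynamic range in stops = log2(brightest / darkest).
# The contrast ratios below are illustrative ballpark figures, not data.
import math

def stops(contrast_ratio: float) -> float:
    return math.log2(contrast_ratio)

print(round(stops(2 ** 14)))    # 14 -- roughly a good sensor, single exposure
print(round(stops(1_000_000)))  # 20 -- roughly the eye, given time to adapt
```

Which is the exchange above in numbers: a single camera exposure can rival or beat the eye’s static range, but the eye’s adaptation over time covers a far wider total range, and it’s the rapid-change case where cameras clip to black or white.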
What pisses me off about this is that, in conditions of low visibility, the pedestrian can’t even hear the damned thing coming.
I hear electric cars all the time; they are not much quieter than an ICE car. We don’t need to strap lawn mowers to our cars in the name of safety.
You can hear them, but manufacturers had to add external speakers to electric cars to make them louder.
https://en.wikipedia.org/wiki/Electric_vehicle_warning_sounds
I think they are a lot quieter. I’ve turned around and seen a car 5 meters away from me and been surprised. That never happens with fuel cars.
I think if you are young, maybe there isn’t a big difference, since you have perfect hearing. But middle-aged people lose quite a bit of that, unfortunately.
National Highway Traffic Safety Administration is now definitely on Musk’s list of departments to cut if Trump makes him a high-ranking swamp monster
Why do you think Musk is dumping so much cash into boosting Trump? The plan all along has been to get kickbacks like stopping investigations, lawsuits, and regulations against him. Plus subsidies.
Rich assholes don’t spend money without expectation of ROI
He knows Democrats will crack down on shady practices so Trump is his best bet.
He’s not hoping for a kickback; he’s been offered a position as secretary of cost-cutting.
He will be able to directly shut down everything he doesn’t like under the pretense of saving money.
Trump is literally campaigning on the fact that government positions are up for sale under his admin.
“I’m going to have Elon Musk — he is dying to do this… We’ll have a new position: secretary of cost-cutting, OK? Elon wants to do that,” the former president said.
This is legitimately one of the real reasons Musk is pushing for Trump so hard. NHTSA (and all the other regulatory agencies) were effectively gutted completely by the Trump admin and it’s basically the entire reason Elon could grift his way to where he is today. The moment Biden got into office, basically every single agency in existence began investigating him and pushing blocks out of the proverbial Jenga tower of the various Musk companies. He’s praying that Trump will get elected and allow him to keep grifting, because otherwise he’s almost definitely going to jail, or at a minimum losing the vast majority of his empire.
Tesla has repeatedly said the system cannot drive itself and that human drivers must be ready to intervene at all times.
how is it legal to label this “full self driving” ?
“I freely admit that the refreshing sparkling water I sell is poisonous and should not be consumed.”
“But to be clear, although I most certainly know for a fact that the refreshing sparkling water I sell is exceedingly poisonous and should in absolutely no way be consumed by any living (and most dead*) beings, I will nevertheless very heartily encourage you to buy it. What you do with it after is entirely up to you.”
*Exceptions may apply. You might be one.
If customers can’t assume that boneless wings don’t have bones in them, then they shouldn’t assume that Full Self Driving can self-drive the car.
The courts made it clear that words don’t matter, and that the company can’t be liable for you assuming that words have meaning.
Now go after Oscar Mayer and Burger King: I’m not getting any ham in my burger or any dog in my hot dogs. Buyers know full well before they complete the sale that the car cannot, and is not lawfully allowed to, autopilot itself around the country. The owner’s manuals give them a full breakdown as well, I’m sure. If you spend thousands of dollars on something and don’t know the basic rules and guidelines, you have much bigger issues. If anything, to register these vehicles to drive on the road, owners should have to be made aware.
If someone is that dumb or ignorant to jump through all the hoops and not know, let’s be honest: They shouldn’t be driving a car either.
That’s pretty clearly just a disclaimer meant to shield them from legal repercussions. They know people aren’t going to do that.
Last time I checked, that disclaimer was there because officially Teslas are SAE level 2, which lets them evade regulations that higher SAE levels have, and in practice Tesla FSD beta is SAE level 4.
and in practice Tesla FSD beta is SAE level 4.
In theory this is pure bull, and in practice it is level 4 bull.
It is a legal label; if it were safe it would be “safe full self driving”.
Tesla: Why would we need lidar? Just use visual cameras.
Humans know to drive more carefully in low visibility, and/or to take actions to improve visibility. Muskboxes don’t.
They also decided to use only cameras and visual cues for driving, instead of also using radar, heat cameras, or something like that.
It’s designed to be launched asap, not to be safe
I’m not so sure. Whenever there’s crappy weather conditions, I see a ton of accidents because so many people just assume they can drive at the posted speed limit safely. In fact, I tend to avoid the highway altogether for the first week or two of snow in my area because so many people get into accidents (the rest of the winter is generally fine).
So this is likely closer to what a human would do than not.
The question is: is Tesla FSD’s record better, worse, or about the same on average as a human driver’s under the same conditions? If it’s worse than the average human, it needs to be taken off the road. There are some accident statistics available, but you practically need a decoder ring to make sure you’re comparing like to like, even when whoever’s providing the numbers has no incentive to fudge them. And I trust Tesla about as far as I could throw a Model 3.
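The like-to-like arithmetic itself is trivial; the hard part is matching the denominators. A sketch with invented numbers, purely to show the shape of the comparison (real analysis would also need matched road types, weather, time of day, and driver demographics):

```python
# Invented numbers, purely to show the like-to-like arithmetic.
# A real comparison must match conditions (road type, weather, etc.)
# before the per-mile rates mean anything.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / miles * 1_000_000

# Hypothetical: FSD vs. human drivers, both in low-visibility conditions.
fsd_rate = crashes_per_million_miles(4, 2_000_000)       # 2.0
human_rate = crashes_per_million_miles(150, 100_000_000)  # 1.5

print(fsd_rate > human_rate)  # True under these made-up numbers
```

The decoder-ring problem is that published figures often mix highway-heavy FSD miles with all-conditions human miles, so the raw rates aren’t comparable even when nobody is fudging.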
On the other hand, the average human driver sucks too.
Yeah, I honestly don’t know. My point is merely that we should have the same standards for FSD vs human driving, at least initially, because they have a lot more potential for improvement than human drivers. If we set the bar too high, we’ll just delay safer transportation.
You can’t measure this, because it has drivers behind the wheel. Even if it made three “pedestrian-killing” mistakes every 10 miles, chances are the driver would catch every one of them and not let it crash.
But on the other hand, if we were to measure every time the driver takes over, the number would be artificially high, because we can’t predict the future and drivers are likely to be overcautious, taking over even in circumstances that would have turned out OK.
The only way to do this IMO is by
- measuring every driver intervention
- only letting it be driverless and marketable as self-driving when it achieves a very low number of interventions (< 1 per 10,000 miles?)
- in the meantime, market it as “driver assist” and have the responsibility fall on the driver, and treat it like the “somewhat advanced” cruise control that it is.
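The gate proposed above reduces to one number and one threshold. A minimal sketch, where the threshold is the commenter’s suggestion rather than any real regulatory standard:

```python
# Sketch of the intervention-rate gate proposed above.
# The threshold is the commenter's suggestion, not any real standard.

THRESHOLD_PER_MILE = 1 / 10_000  # "< 1 intervention per 10,000 miles"

def interventions_per_mile(interventions: int, miles: float) -> float:
    return interventions / miles

def may_market_as_self_driving(interventions: int, miles: float) -> bool:
    return interventions_per_mile(interventions, miles) < THRESHOLD_PER_MILE

print(may_market_as_self_driving(10, 50_000))  # False: 1 per 5,000 miles
print(may_market_as_self_driving(3, 50_000))   # True: 1 per ~16,667 miles
```

Note the overcaution problem from the previous comment still applies: if drivers take over preemptively, the measured rate overstates how often the system would actually have failed, so this gate is conservative by construction.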
low visibility, including sun glare, fog and airborne dust
I also see a ton of accidents when the sun is in the sky or if it is dusty out. \s
The median driver, sure, but the bottom couple percent never miss their exit and tend to do boneheaded shit like swerving into the next lane when there’s a stopped car at a crosswalk. >40,000 US fatalities in 2023. There are probably half a dozen fatalities in the US on any given day by the time the clock strikes 12:01 AM on the west coast.
Edit: some more food for thought as I’ve been pondering:
FSD may or may not be better than the median driver (maybe this investigation will add to knowledge), but it’s likely better than the worst drivers. Yet the worst drivers are the most likely to vastly overestimate their competence, which might lead to them actively avoiding the use of any such aids, despite their being the ones who would see the greatest benefit from using them. We might be forever stuck with boneheaded drivers doing boneheaded shit.