New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired
I hope the cops win. Autopilot allows a driver to completely disengage their attention from driving in a way that's just not possible with cruise control alone.
There’s no way you can drop a human into a life-threatening situation with 2.5 seconds to make a decision and expect them to make reasonable choices. Even stone cold sober, that’s a lot to ask of a person when the car makes a critical mistake like this.
On cruise control, the driver would still have to be aware that they were driving. With Autopilot, the driver had likely passed out and the car carried on its merry way.
This is stupid. Teslas can park themselves, they’re not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.
That being said, the driver knew about this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.
It’s actually NHTSA’s job to regulate car safety (the NTSB only investigates and makes recommendations), so if they don’t already have it, Congress needs to grant them the authority to regulate what AI behavior is acceptable and define safeguards against misbehaving AI.
Officers injured at the scene are blaming and suing Tesla over the incident.
…
And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.
I hope those officers got one of those “you don’t pay if we don’t win” lawyers. The responsibility ultimately resides with the driver and I’m not seeing them getting any money from Tesla.
Well, in the end it’s up to whether Tesla’s ADAS is compliant with laws and regulations. If there really were 150 warnings by the ADAS without it disengaging, this might be an indicator of faulty software and therefore Tesla being at least partially at fault. It goes without saying that the driver is mostly to blame but an ADAS shouldn’t just keep on driving when it senses that the driver is incapacitated.
Teslas are not safe.
A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.
In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
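Taking the figures quoted above at face value (they are Tesla's own reported numbers, not independently verified data), the ratios behind the "4 times" and "half as likely" claims can be checked directly:

```python
# Miles driven per recorded crash, as reported in Tesla's Q2 figures quoted above
autopilot_miles_per_crash = 4.41e6      # with Autopilot engaged
no_autopilot_miles_per_crash = 1.2e6    # Tesla drivers without Autopilot
us_average_miles_per_crash = 484e3      # NHTSA figure for all US vehicles

# Ratio of crash rates: higher miles-per-crash means fewer crashes per mile
autopilot_vs_manual = autopilot_miles_per_crash / no_autopilot_miles_per_crash
manual_tesla_vs_average = no_autopilot_miles_per_crash / us_average_miles_per_crash

print(f"Autopilot vs manual Tesla: {autopilot_vs_manual:.2f}x fewer crashes")   # ~3.7x
print(f"Manual Tesla vs US average: {manual_tesla_vs_average:.2f}x fewer")      # ~2.5x
```

So on Tesla's numbers the "almost 4 times" claim works out to about 3.7x, and a human-driven Tesla crashes roughly 2.5x less often than the US average. Note these comparisons don't control for where Autopilot is used (mostly highways, which have fewer crashes per mile anyway).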
So Tesla says. There is no independent verification of this data. It could all be bullshit.
Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on.