Remember that the auto industry was so resistant to putting speed governors in cars 100 years ago that they invented the term Jaywalking as a way of blaming the victims of their manslaughter.
The one rule I would dream of seeing is soft speed throttling to ensure that cars and trucks stay a safe three-second gap or more apart from each other. That should be relatively easy to do with basic distance sensing and calculations.
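A minimal sketch of the calculation I have in mind, in Python (every name and threshold here is hypothetical, not taken from any real system):

```python
# Illustrative sketch of "soft speed throttling" based on time gap.
# All thresholds and names are made up; a real system would use fused
# radar/camera data and far more careful control logic.

MIN_GAP_SECONDS = 3.0  # target following distance, expressed in time

def throttle_scale(distance_m: float, own_speed_mps: float) -> float:
    """Return a factor (0..1) to scale the driver's throttle input by.

    distance_m:    measured gap to the vehicle ahead, in metres
    own_speed_mps: our current speed, in metres per second
    """
    if own_speed_mps <= 0:
        return 1.0  # stationary or reversing: don't interfere
    gap_seconds = distance_m / own_speed_mps
    if gap_seconds >= MIN_GAP_SECONDS:
        return 1.0  # safe gap: full throttle authority
    # Below the target gap, taper throttle smoothly towards zero
    # instead of braking, so the gap opens back up gradually.
    return max(0.0, gap_seconds / MIN_GAP_SECONDS)

# Example: 40 m behind a car while doing 108 km/h (30 m/s) is only a
# 1.33 s gap, so throttle would be scaled to roughly 44% until it grows.
print(throttle_scale(40.0, 30.0))
```

The nice property is that a gap measured in seconds scales with speed automatically, so the same rule works in town and on the motorway.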
Fucking tailgaters. No idea why so few people seem to be aware of how dangerous and stupid it is to tailgate.
obligatory reference
Dunno about newer cars, but in a 2017-model BMW it tends to brake for parked cars quite often when using radar cruise control…
Was going to ask the same question - cruise control is for open roads like motorways. Not around town. No wonder they had issues with it.
Maybe you don't. But the ancestor post is suggesting making it mandatory to avoid tailgating, and then it had better work properly.
The Cupra Born I drove the other day while doing deliveries for a catering event (I don't own a car and rely on carsharing and rentals for my business) did this. It was really annoying driving on narrow streets with it braking for parked cars.
My 2017 Volvo just warns me if there's a parked car in a curve; I've never had it brake automatically for parked cars no matter the scenario, so I guess BMW's system just wasn't quite there yet at the time…
Ah true, yeah, I test drove a Polestar and a Hyundai Ioniq 5 before deciding to go with the BMW and they both worked a lot better, but they were also way more expensive since they were new and the BMW was second-hand 😅
Unfortunately there weren't any second-hand PHEV Volvos available in my area at the time.
That means it's the right call to make. Whatever the auto industry complains about, the opposite is beneficial to the consumer.
So they want self driving cars, which do not brake for pedestrians and cyclists? Do I understand this correctly?
They want dystopia. Ideally you should pay per door-handle use. Pay by kilometer, and horn sounds are extra DLC. If possible, you'd keep paying and wouldn't be allowed to change manufacturer or car for a number of years, so they don't have to be as competitive and innovative. If possible, the government should mandate that each human have at least one car.
Well, since most of it sounds stupid and exploitative, they take what they can. Rent a heated seat, pay extra for autopilot and other gadgets, etc. For the rest, they lobby like crazy: pushing against EVs, pushing against any zoning other than suburban sprawl, and so on. Hyperloop, anyone?
I think it’s worth thinking about this in a technical sense, not just in a political or capitalist sense: Yes, car companies want self driving cars, but self driving cars are immensely dangerous, and there’s no evidence that self driving cars will make roads safer. As such, legislation should be pushing very hard to stop self driving cars.
Also, the same technology used for self-driving is used for AEB. This actually makes self-driving more likely: since the car companies have to pay for all that equipment anyway, they may as well try to shoehorn in self-driving. On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, software getting confused) are not higher than the odds of a system correctly braking when it needs to.
This means someone can get into a situation where they are:
- in a car, on a road, nothing of interest in front of them
- the software determines that there is an imminent crash
- Car brakes hard (even at 90mph), perhaps losing traction depending on road conditions
- may be hit from behind or may hit an object
- Driver is liable even though they never actually pressed the brakes.
This is unacceptable on its face. Yes, cars are dangerous, yes we need to make them safer, but we should use better policies like slower speeds, safer roads, and transitioning to smaller lighter weight cars, not this AI automation bullshit.
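To make the odds comparison above concrete, here is the kind of back-of-the-envelope arithmetic I mean (every rate below is invented purely for illustration, not a measurement of any real AEB system):

```python
# Hypothetical comparison of true vs false AEB activations.
# Every number below is made up for illustration only.

miles_driven = 1_000_000

true_emergencies_per_mile = 1 / 200_000   # genuine "must brake now" events
false_triggers_per_mile   = 1 / 100_000   # dirty sensor, confused software, etc.

true_activations  = miles_driven * true_emergencies_per_mile   # 5
false_activations = miles_driven * false_triggers_per_mile     # 10

share_false = false_activations / (true_activations + false_activations)
print(f"{share_false:.0%} of activations would be false")  # ~67%
```

Because genuine emergencies are rare, even a modest false-trigger rate can mean most hard-braking events are spurious, which is the scenario in the list above.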
Under what circumstances does being hit from behind result in liability for the lead vehicle? It's the responsibility of the vehicle behind you to keep an appropriate distance. This sounds like you're regurgitating their talking points like a bot.
but self driving cars are immensely dangerous, and there’s no evidence that self driving cars will make roads safer.
This is a horrible take, and absolutely not true. Maybe for the current state of technology, but not as an always-true statement.
Humans are horrible at driving. It's not hard to be better at driving than the average human. Perfect doesn't exist, and computer-driven cars will always make some mistakes, but so do humans (and the media will report on self-driving cars much more than on the thousands of vehicle deaths caused by human error). AEB and other technologies have already made cars much safer over the previous decades.
On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, software getting confused) are not higher than the odds of a system correctly braking when it needs to.
Tell me you’ve never used or tested AEB without telling me.
Dirty sensors trigger a “dirty sensor warning”, not a full emergency brake. There’s more than one sensor, and it doesn’t emergency brake on one bad sensor reading. Again, perfect doesn’t exist, but it isn’t close to the 50/50 you’re trying to portray here.
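Roughly the kind of gating I'm describing, as a simplified and entirely hypothetical sketch (real systems fuse far more data than this):

```python
# Simplified, hypothetical sketch: require several healthy sensors to
# agree over consecutive cycles before an emergency brake is allowed.
from dataclasses import dataclass

@dataclass
class SensorReading:
    healthy: bool            # self-diagnostics passed (not blocked or dirty)
    obstacle_detected: bool  # this sensor sees something to brake for

def should_emergency_brake(history: list[list[SensorReading]],
                           required_cycles: int = 3) -> bool:
    """history holds per-cycle readings from all sensors, newest last."""
    recent = history[-required_cycles:]
    if len(recent) < required_cycles:
        return False
    for cycle in recent:
        healthy = [r for r in cycle if r.healthy]
        # A dirty sensor drops out here (and raises a warning elsewhere);
        # we still demand at least two healthy sensors that agree.
        if sum(r.obstacle_detected for r in healthy) < 2:
            return False
    return True
```

The point is that a single bad reading, or a single degraded sensor, can't clear a check like this on its own.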
- Car brakes hard (even at 90mph), perhaps losing traction depending on road conditions
Any car with AEB will also have ABS and traction control, so losing traction is unlikely. And being rear-ended is never the front car's liability.
Yes, cars are dangerous, yes we need to make them safer, but we should use better policies like slower speeds, safer roads, and transitioning to smaller lighter weight cars,
Absolutely agree on all of this. Slower speeds and safer roads make accidents less likely and less lethal, for human and computer drivers both.
As such, legislation should be pushing very hard to stop self driving cars.
Legislation should push hard for setting clear boundaries on when self-driving is good enough to be allowed on the road, and where the legal responsibilities are in case of problems. Just completely stopping it would be wasted potential for safer roads for everyone in the long run.
These rules are convoluted and near impossible to apply. Specific braking speeds for some objects compared to others? That requires reliable computer vision, which hasn’t been demonstrated anywhere yet.
And those speeds? 92mph is 148kph! Why the fuck are cars even permitted to be capable of that when no road in the country allows it? And why would you want to introduce unpredictable braking scenarios at such speeds?
What is feasible is a speed limiter based on the posted limit, but that’d be too practical.
What is feasible is a speed limiter based on the posted limit, but that’d be too practical.
I recently got a car that tells me the currently posted limit, and it is frequently wrong. It misses signposts and sometimes thinks a signpost for a side road applies to you.
It also has a speed limiter and a button to set the limiter to the detected speed, which I use a lot, but I wouldn't want it to do that by itself.
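In effect the button amounts to something like this, at least as I understand it (class and method names are my own, purely illustrative):

```python
# Purely illustrative: a speed limiter that only adopts the detected
# speed limit when the driver explicitly confirms it, since the
# sign detection is frequently wrong.

class SpeedLimiter:
    def __init__(self) -> None:
        self.limit_kph: float | None = None     # active limiter setting
        self.detected_kph: float | None = None  # latest sign recognition

    def on_sign_detected(self, kph: float) -> None:
        # Only remember the detection; never apply it automatically.
        self.detected_kph = kph

    def on_confirm_button(self) -> None:
        # Driver presses the "set to detected limit" button.
        if self.detected_kph is not None:
            self.limit_kph = self.detected_kph

    def max_allowed_speed(self, requested_kph: float) -> float:
        if self.limit_kph is None:
            return requested_kph
        return min(requested_kph, self.limit_kph)
```

Keeping the human confirmation in the loop is exactly what makes it usable despite the unreliable detection.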
Thing is, like none of our roads are properly tested for the posted speed limits. Interstates can often have a 75 mph limit and regular traffic will go 85 (because cops don't care until you're more than 10 over, and that difference adds up on long trips), with some people going 90+.