Am I an old fart for being concerned about future generations growing up with self driving cars?
I guess I grew up with a lot of automated things, but I feel like driving is something that people should be able to do for safety reasons.
My dad said the same thing about GPS vs. the ability to read a map
Nearly drove us off the road a couple of times while trying to read a map lol
I don’t know about your city, but I trust technology a lot more than the average driver. At least technology can tell a red light from a green light. I nearly got hit by a Ford mega truck in broad daylight whose driver thought the small green bicycle symbol was his cue to ignore the massive red “no left turn” indicator across a protected bike lane. :P
I don’t know about your city, but I trust technology a lot more than the average driver.
I don’t. Technology can be subject to glitches, bugs, hacking, deciding to plow right through pedestrians (hello Tesla!), etc.
While the case can be made that human drivers are worse at reaction time and paying attention, at least a “dumb” car can’t be hacked, won’t be driven off the road by a bug, won’t try to knock people over on its own without stopping, etc.
A human, when they catch these things happening, can correct them (even when they caused them themselves). But if a computer develops a fatal fault like that, or is hijacked, it cannot.
EDIT: It seems like this community is full of AI techbro yes-men. Any criticism or critical analysis of their ideas seems to be met with downvotes, but I’ve yet to get a reply justifying how what I said is wrong.
Plenty of dumb cars get recalls all the time for shitty parts or design. Remember that Prius with the brakes that would just decide to stop working?
I’m a driving instructor and I believe automated roads would be a godsend. I’d actually be happy the day I don’t have to do my job, not because I hate it - I actually love my job - but because automated roads would be so much safer overall.
I guess that’s fair. Machines would be much less accident prone than humans, but you can’t automate everything on the road (e.g. people, bikes, non-automated vehicles). People are going to have to be able to know how to get out of situations manually. What about emergencies where you have to do something that the automated roads aren’t programmed for?
Yes, you probably are. Please don’t forget that the currently available technology constantly improves, and that we actually don’t see many good examples of self-driving cars yet. The most prominent displays are from Tesla, and they arguably build the worst cars we’ve seen since Ford came up with the assembly line.
The technology used in autonomous vehicles, e.g. sensors, has been used in safety-critical applications for decades in other contexts, and a machine is capable of completely different reaction times. Also, if autonomous vehicles cooperate in traffic, sticking to their programmed behavior, observing traffic rules, etc., you will get less reckless driving, with traffic flow becoming more deterministic. These benefits will particularly increase once self-driving cars don’t have to share the road with human drivers.
I would always trust a well-engineered, self-driving car more than one driven by a human.
Disclaimer: I used to work on these things in a research lab. Also, we’re not quite there yet, so please have a little patience.
What about things on the road that are not automated? There will be situations where a machine’s ethics might override a human driver’s ethics. It would be good for us to be able to override the system and know how to safely drive in emergencies.
It’s not about everything being automated. We also have to differentiate between early incarnations of autonomous vehicles and the desired, final state.
A manual override will of course be part of early models for the foreseeable future, but the overall goal is for the machine to make better decisions than a human could.
I don’t have any quarrel with life-or-death decisions being made by a machine if they are made according to the morals and ethics of the people who designed it, and with the objective data that was available to the system at the time, which is often better than what would be available to a human in the same situation.
It’s the interpretation of said data that is still not fully there yet, and we humans will have to come to terms with the fact that a machine might indeed kill another human being in a situation where acting any differently would cause more harm.
I don’t subscribe to the notion that a machine’s decision is always worth less than the decision of another entity participating in the same situation, just because the second entity happens to be a human being.
I guess I grew up with a lot of automated things, but I feel like driving is something that people should be able to do for safety reasons.
Have you not met “people”? People drive like absolute idiots. I had to emergency brake just yesterday because the woman driving the car to my right was drifting into my lane at highway speeds while she negotiated a large 24oz can of some beverage in her left driving hand while simultaneously holding a cigarette. Her right hand and both eyes were busy doing something on her cell phone.
There is very little safe about people driving.
Yeah people suck at driving, but they’ll just get worse while the automation is being ironed out and eventually they won’t know how to drive at all. What about an emergency situation that the automated cars and/or roads aren’t programmed for?