I'll just let the early adopters work out all the bugs...that philosophy has served me well over the years.
> I'll just let the early adopters work out all the bugs...that philosophy has served me well over the years.

Fine with me too, except that innocent bystanders or other people sharing the road become "collateral damage".
> Fine with me too, except that innocent bystanders or other people sharing the road become "collateral damage".

"Innocent bystanders and other people sharing the road" are already "collateral damage" - every day. That's the point.
There aren't any true self driving cars commercially available yet. When there are, there will be many fewer victims when all is said and done, not more.
> I think the biggest challenges are not technical but legal and cultural...

I used to think so, but after learning more about the current technology, I became more interested in the technical difficulties. It turned out to be more complicated than I first thought. See my post #80 about the limitations of the sensors the computer uses for its eyes.
> There are ignoramuses out there who do not know (see the videos of those idiots leaning back pretending to sleep in a moving Tesla), and we do not want more of those on the road to make it more unsafe than it already is.

Maybe I'm wrong, but by highlighting only ignoramuses who own Teslas as "more of those", I think your argument is biased.
> OK. I see your arguments that stupid people are dangerous whether they drive a Tesla with autopilot or not. This I agree. I think we will both agree that the careful drivers should not be misled into trusting an infantile technology with their lives. Oui?

Absolutely. Careful drivers weren't misled. But Tesla marketing got out over its skis and was complicit, and inevitably the ignoramuses proved it. We'll never be free of ignoramuses in vehicles, but self-driving cars will do a lot to protect us from them in time (at least 10 years away, probably more).
...Self driving cars are "supposed" to be safer, but I've been a programmer long enough to know you can't foresee every possible outcome in the future. Not to mention things break. Even if a car can avoid kids, pets, icy roads, etc., what will it do if there's a flat tire, or the brakes fail, or the throttle sticks?
> By the way, another difficulty with self-driving cars is that they use software. And software is written by programmers. This brings us back to what was written by the OP.

Not disagreeing, but same theme. Software just has to be significantly better at driving than humans, not perfect. There will still be some accidents, but if software can deliver 90% fewer than today, as some estimates project, I'd consider that progress and a success.
Programmers are human, and they make mistakes. Hence, there has to be a lot of testing. People who work in aerospace would know about DO-178 guidelines for software testing in avionics. I don't know if there is anything like it for car software.
About hardware failures, of course everything can fail. Here also, autopilots for commercial jets with automatic landing capability have employed fault-tolerant systems for many decades. It only costs money.
PS. I guess because of my engineering background in aerospace, I was really alarmed by the cavalier attitude that Tesla demonstrated. If we were like that, none of you would ever have flown in an airplane after seeing them drop like flies. And of course the industry would also be bankrupt.
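The fault tolerance mentioned above can be illustrated with a toy example (a hypothetical sketch only, not any real DO-178-certified design): a triplex sensor arrangement in which a two-out-of-three majority vote masks a single failed channel.

```python
def vote_2oo3(a, b, c, tolerance=0.5):
    """Two-out-of-three majority voter over redundant sensor channels.

    Returns the average of the first pair of channels that agree within
    `tolerance` (a hypothetical threshold), or None when no two channels
    agree -- a condition a real system would flag as a fault.
    """
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tolerance:
            return (x + y) / 2.0
    return None

# One channel (104.0) has failed high; the two healthy channels out-vote it.
print(vote_2oo3(10.0, 10.0, 104.0))  # -> 10.0
```

The point is the cost: three sensors plus a voter instead of one sensor, which is exactly the "it only costs money" trade-off.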
> Not disagreeing, but same theme. Software just has to be significantly better at it than humans, not perfect. There will still be some accidents, but if software can deliver 90% fewer than today as some estimates project, I'd consider that progress and a success.
Pretty sure most are going for level 4 automation, not level 3 where you have to take over.
They know that people aren't going to be able to react in time to take over and avoid a collision.
As for software, the genie is already out of the bottle. We rely on software for a lot now, such as autopilot on commercial flights. Software is already deployed in cars for collision avoidance, automatic braking, and adaptive cruise control.
Software is used by doctors and other health care providers for medical records, and by banks.
Rather than bugs, the bigger threat is malware.
Here's an article on the Tesla Autopilot stats:
https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/
> NHTSA’s Office of Defects Investigation (ODI) reviewed crash rate data from Tesla’s vehicles before and after the introduction of Autosteer:
>
> ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles traveled prior to and after Autopilot installation.
>
> They came to the conclusion that "the data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation."
That 40 percent stat was just for autosteer, not the traffic-aware cruise control functions.
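To put that percentage in the rate terms the report uses (crashes per million miles), here is the arithmetic with made-up round numbers; only the 40 percent drop comes from the article, while the mileage and crash counts below are hypothetical:

```python
# Hypothetical fleet numbers, chosen only to make the arithmetic visible;
# these are not Tesla's actual figures.
miles_before, crashes_before = 50_000_000, 65   # before Autosteer
miles_after, crashes_after = 50_000_000, 39     # after Autosteer

rate_before = crashes_before / (miles_before / 1_000_000)  # 1.3 crashes per million miles
rate_after = crashes_after / (miles_after / 1_000_000)     # 0.78 crashes per million miles

drop = 1 - rate_after / rate_before
print(f"crash rate dropped by {drop:.0%}")  # -> crash rate dropped by 40%
```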
> If the user of a Tesla "autopilot" engages this mode as a safeguard instead of relying on it as an autonomous driver, then I can see how it can be safer than the driver alone. You have another set of eyes watching out for you for augmentation, even if these eyes have cataracts. ...

The certification of avionics software addresses exactly the issues above. The testing costs are tremendous.
If the user of a Tesla "autopilot" engages this mode as a safeguard instead of relying on it as an autonomous driver, then I can see how it can be safer than the driver alone. You have another set of eyes watching out for you for augmentation, even if these eyes have cataracts.
In watching the 3 accident videos that I shared above, I wonder how I would feel if I were the driver, when I saw an obstacle in the left lane approaching, approaching... I would wait for the car to steer itself, wait, and wait and wait....
At which point do I say "Darn, it really does not see that car stopped in the left lane!", and veer to the right myself?
Seems to me having to second-guess this "autopilot" all the time would cause me stress. It would be the same when I have to wait until the last minute for the car to steer clear of a bicyclist to my right. "Is it going to do it or not?"
I bring up the avionics software standard because people like to say airplanes are safely flown by automatic systems, therefore car systems will also be safe. They don't know what goes into an aircraft design, or the disciplines that were developed over many decades.
> What would a bank robber do with a self driving getaway car?

The self-driving car will likely have a remote kill switch, just like OnStar has now (designed to make theft more difficult): you call OnStar and they make the car stop.