Self Driving Cars?

I'll just let the early adopters work out all the bugs...that philosophy has served me well over the years.
 
I'll just let the early adopters work out all the bugs...that philosophy has served me well over the years.
Fine with me too, except that innocent bystanders or other people sharing the road become "collateral damage".
 
Fine with me too, except that innocent bystanders or other people sharing the road become "collateral damage".
"Innocent bystanders and other people sharing the road" are already "collateral damage" - every day. That's the point.

There aren't any true self driving cars commercially available yet. When there are, there will be many fewer victims when all is said and done, not more.
 
There aren't any true self driving cars commercially available yet. When there are, there will be many fewer victims when all is said and done, not more.

People who follow the current state of the art understand the point I highlight above.

There are ignoramuses out there who do not know better (see the videos of those idiots leaning back pretending to sleep in a moving Tesla), and we do not want more of those on the road making it even more unsafe than it already is. They should be prosecuted the same way drunk drivers are.

And Tesla should be held responsible for not disconnecting the "lane-keeping assistance feature" when the driver's hands are off the wheel, the same way other car makers do. If the media were more technically savvy, it would publicize the distinction to counter the propaganda from irresponsible car makers, but we know how dumb most reporters are.

When the truly self-driving cars arrive (nobody knows when) after being fully tested and certified, of course they will be safe. Who can argue with that?
 
I think the biggest challenges are not technical but legal and cultural.

There will be wrangling over liability laws. Many of the companies pushing this technology have deep pockets so if their product causes deaths and property damage, those companies will be sued.

So there are laws to be enacted to sort out liability issues.

Then of course, as people have noted, people have to accept the technology, which probably will take years if not decades.
 
I think the biggest challenges are not technical but legal and cultural...
I used to think so, but after learning more about the current technology, I became more interested in the technical difficulties. It turned out to be more complicated than I first thought. See my post #80 about the limitations of the sensors the computer uses for its eyes.

First things first: car makers must have something that works better than a careful driver (not an idiotic driver, mind you) before they need to worry about public acceptance.
 
There are ignoramuses out there who do not know better (see the videos of those idiots leaning back pretending to sleep in a moving Tesla), and we do not want more of those on the road making it even more unsafe than it already is.
Maybe I'm wrong, but by only highlighting ignoramuses who own Teslas as "more of those", I think your argument is biased.

If you want a balanced argument, you would also highlight all the ignoramuses in manually driven cars who text and drive, drive under the influence of drugs/alcohol, drive sleep deprived, turn to the backseat to discipline or help kids, are distracted by their nav/infotainment system, put on makeup, smoke and talk on the phone while driving, change lanes or speed excessively, fumble around for objects on the floor/glove compartment, and on and on.

But more important, the ignoramuses in Teslas you're selectively highlighting are the leading candidates to be the same ignoramuses engaged in the unsafe activities listed above in manually driven cars. It's not the car; they're going to be ignoramuses making us all less safe whatever car they're driving.

So you don't have any sound basis to summarily conclude there are "more of them on the road" due to Tesla. It's just your speculation, unless you have data beyond a couple of videos.

Sorry :horse:
 
OK. I see your argument that stupid people are dangerous whether they drive a Tesla with autopilot or not. This I agree with. :LOL:

I think we will both agree that careful drivers should not be misled into trusting their lives to a technology still in its infancy. Oui?
 
OK. I see your argument that stupid people are dangerous whether they drive a Tesla with autopilot or not. This I agree with. :LOL:

I think we will both agree that careful drivers should not be misled into trusting their lives to a technology still in its infancy. Oui?
Absolutely. Careful drivers weren't misled. But Tesla's marketing got out over its skis and was complicit, and inevitably the ignoramuses proved it. We'll never be free of ignoramuses in vehicles, but in time self-driving cars will do a lot to protect us from them (at least 10 years away, probably more).
 
By the way, another difficulty with self-driving cars is that they use software. And software is written by programmers. This brings us back to what the OP wrote.

...Self driving cars are "supposed" to be safer, but I've been a programmer long enough to know you can't foresee every possible outcome in the future. Not to mention things break. Even if a car can avoid kids, pets, icy roads, etc., what will it do if there's a flat tire, or the brakes fail, or the throttle sticks?

Programmers are human, and they make mistakes. Hence, there has to be a lot of testing. People who work in aerospace would know about DO-178 guidelines for software testing in avionics. I don't know if there is anything like it for car software.

About hardware failures, of course everything can fail. Here also, autopilots for commercial jets with automatic landing capability have employed fault-tolerant systems for many decades. It only costs money. :)
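
To give a flavor of what "fault-tolerant" means in practice, here is a toy sketch of the triplex-voting idea. This is my own illustration with made-up numbers, not real avionics code: three independent channels compute the same quantity, and a voter takes the median so a single bad channel cannot drive the output.

[code]
# Toy illustration of triplex redundancy with median voting.
# All values, names, and thresholds are made up for illustration.

def vote(a: float, b: float, c: float) -> float:
    """Median of three redundant channel outputs: one channel can fail
    high, low, or stale, and the voted value still comes from a healthy one."""
    return sorted([a, b, c])[1]

def miscompare(channel: float, voted: float, tolerance: float = 2.0) -> bool:
    """Flag a channel that disagrees with the voted value so it can be
    reported and excluded by the monitoring logic."""
    return abs(channel - voted) > tolerance

# Example: channel B has failed and reports garbage (say, pitch rate in deg/s).
a, b, c = 10.1, 57.3, 9.8
good = vote(a, b, c)                        # 10.1 -- the bad channel is outvoted
flags = [miscompare(x, good) for x in (a, b, c)]
print(good, flags)                          # 10.1 [False, True, False]
[/code]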

PS. I guess because of my engineering background in aerospace, I was really alarmed by the cavalier attitude that Tesla demonstrated. If we were like that, none of you would ever have flown in an airplane after seeing them drop like flies. :LOL: And of course the industry would also be bankrupt.
 
By the way, another difficulty with self-driving cars is that they use software. And software is written by programmers. This brings us back to what the OP wrote.



Programmers are human, and they make mistakes. Hence, there has to be a lot of testing. People who work in aerospace would know about DO-178 guidelines for software testing in avionics. I don't know if there is anything like it for car software.

About hardware failures, of course everything can fail. Here also, autopilots for commercial jets with automatic landing capability have employed fault-tolerant systems for many decades. It only costs money. :)

PS. I guess because of my engineering background in aerospace, I was really alarmed by the cavalier attitude that Tesla demonstrated. If we were like that, none of you would ever have flown in an airplane after seeing them drop like flies. :LOL: And of course the industry would also be bankrupt.
Not disagreeing, but same theme. Software just has to be significantly better at it than humans, not perfect. There will still be some accidents, but if software can deliver 90% fewer than today as some estimates project, I'd consider that progress and a success.
 
From what I've read, Ford is perhaps doing away with human intervention. They said the test drivers were falling asleep while testing the self-driving cars and couldn't intervene fast enough, and that was the reason for eliminating it.
 
I believe Tesla was able to show that driving with Autopilot was 40% safer than driving with Autopilot off. Don't have a reference at hand.

In addition to all the bad videos, there are also good videos of accidents avoided.

Currently Teslas are not very good at seeing stationary objects in their path, a common problem with radar. That seems to catch out many inattentive drivers. The latest was a Tesla that hit a barrier at a road construction lane shift that was poorly marked. The original lane markings were left in place, leading right into the barrier, and temporary lane markings merging to the right were simply added. I would have done a double take at that, and Autopilot won't change lanes on its own yet. An alert driver wouldn't be using Autopilot in a construction zone and would be ready to take over at any time. It's still not much more advanced than simple lane keeping.

Tesla does require drivers to move the steering wheel a little bit every minute or two. No more climbing into the back seat. Many really hate that requirement, but it seems to be fairly normal for most to keep hands on the wheel now. The bad videos are one factor contributing to that.
 
Pretty sure most are going for level 4 automation, not level 3 where you have to take over.

They know that people aren't going to be able to react in time to take over and avoid a collision.

As for software, the genie is already out of the bottle. We rely on software for a lot now, such as auto-pilot on commercial flights. Already software is deployed on cars with certain collision-avoidance or auto-braking or adaptive cruise control too.

Software is used by doctors and other health care providers for medical records, and by banks.

Rather than bugs, the bigger threat is malware.
 
Not disagreeing, but same theme. Software just has to be significantly better at it than humans, not perfect. There will still be some accidents, but if software can deliver 90% fewer than today as some estimates project, I'd consider that progress and a success.

Ah hah! To demonstrate that the software is better than humans requires proof, not hand waving. How do you quantify that? That's the difference between an engineer and a salesman.

The certification of avionics software is about the above. The testing costs are tremendous.
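
To put rough numbers on it, here's a back-of-the-envelope sketch. This is my own illustration, not from any standard; the human fatality rate is an approximate US figure, and the confidence math is the simple "rule of three" for zero observed events.

[code]
# Back-of-the-envelope: how many miles must a test fleet log, with zero
# fatalities, just to show its fatality rate is no worse than humans'
# at ~95% confidence?  "Rule of three": zero events in N miles bounds
# the rate at roughly 3/N per mile.  All figures are approximate.

human_fatality_rate = 1.0 / 100_000_000        # ~1 fatality per 100 million miles (rough US figure)

miles_needed = 3.0 / human_fatality_rate       # zero-fatality miles for the 95% bound
print(f"{miles_needed:,.0f} miles")            # 300,000,000 miles

# A hypothetical fleet of 100 test cars averaging 30 mph around the clock:
fleet_miles_per_year = 100 * 30 * 24 * 365
print(f"{miles_needed / fleet_miles_per_year:.1f} years of continuous testing")  # ~11.4 years
[/code]

And that only shows parity with the human rate; statistically demonstrating a specific improvement, like the 90% figure mentioned earlier, takes considerably more data.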
 
Here's an article on the Tesla Autopilot stats:

https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/

"NHTSA’s Office of Defects Investigation (ODI) reviewed crash rate data from Tesla’s vehicles before and after the introduction of Autosteer:

ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles traveled prior to and after Autopilot installation.

They came to the conclusion that “the data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.”"

That 40 percent stat was just for autosteer, not the traffic-aware cruise control functions.
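
For what it's worth, the "crash rates by miles traveled" comparison described there is simple arithmetic. A sketch with invented numbers (not Tesla's actual figures, which I don't have) looks like this:

[code]
# Illustration of the before/after crash-rate comparison NHTSA describes.
# Crash counts and mileages are invented; the method is just rate = crashes / miles.

before_crashes, before_miles = 130, 100_000_000   # hypothetical pre-Autosteer fleet totals
after_crashes,  after_miles  =  80, 100_000_000   # hypothetical post-Autosteer fleet totals

rate_before = before_crashes / (before_miles / 1_000_000)   # crashes per million miles
rate_after  = after_crashes  / (after_miles  / 1_000_000)

reduction = (rate_before - rate_after) / rate_before
print(f"{rate_before:.2f} -> {rate_after:.2f} crashes per million miles "
      f"({reduction:.0%} reduction)")              # ~38% with these made-up numbers
[/code]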
 
Pretty sure most are going for level 4 automation, not level 3 where you have to take over.

They know that people aren't going to be able to react in time to take over and avoid a collision.

As for software, the genie is already out of the bottle. We rely on software for a lot now, such as auto-pilot on commercial flights. Already software is deployed on cars with certain collision-avoidance or auto-braking or adaptive cruise control too.

Software is used by doctors and other health care providers for medical records, and by banks.

Rather than bugs, the bigger threat is malware.

Yes, we rely on software and computers a lot. The testing and certification rigor depends on the application. Record-keeping and accounting errors can be corrected after the fact. An aircraft autopilot is a much more critical application. I have mentioned the FAA guidelines that commercial jetliner software has to follow. We do not have any such standard for car software now.

Software to run a nuclear plant or to control an ICBM launch can have extremely catastrophic failures. You cannot just "try it and see what happens". If it fails, you do not get to reboot and try again.

About malware, yes, that is a serious threat too, but a separate one. First, we talk about what is needed to have a real working system. Then, we talk about protecting it from sabotage.
 
Here's an article on the Tesla Autopilot stats:

https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/

"NHTSA’s Office of Defects Investigation (ODI) reviewed crash rate data from Tesla’s vehicles before and after the introduction of Autosteer:

ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles traveled prior to and after Autopilot installation.

They came to the conclusion that “the data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.”"

That 40 percent stat was just for autosteer, not the traffic-aware cruise control functions.

If the user of a Tesla "autopilot" engages this mode as a safeguard instead of relying on it as an autonomous driver, then I can see how it can be safer than the driver alone. You have another set of eyes watching out for you for augmentation, even if these eyes have cataracts. :)

In watching the 3 accident videos that I shared above, I wonder how I would feel if I were the driver, when I saw an obstacle in the left lane approaching, approaching... I would wait for the car to steer itself, wait, and wait and wait....

At which point do I say "Darn, it really does not see that car stopped in the left lane!", and veer to the right myself?

Seems to me having to second-guess this "autopilot" all the time would cause me stress. It would be the same when I have to wait until the last minute for the car to steer clear of a bicyclist to my right. "Is it going to do it or not?"
 
The certification of avionics software is about the above. The testing costs are tremendous.

Not disagreeing, but will automotive software have to be held to that exacting standard? While Ford's decision on the Pinto gas tank design was a PR disaster, it didn't change the math. Some losses are acceptable, and they can be much higher than in aviation. Society makes that decision every day with manually operated cars. The question is, which losses and how many? If 30-40k people died every year in airplane crashes, flying would almost certainly be banned. And yet that toll is tolerated for cars.

Cars are a lot cheaper than airplanes (Duh!), but the cost of the automotive software will be spread over (eventually) millions of units sold vs. the hundreds or low thousands for aviation software, making it a lot cheaper per unit. And since a failure will not cost nearly as much as an airplane crash, a lower standard of performance will be acceptable. I think....
 
I do not suggest that a car system has to be at the same level as that for an airplane. But there is currently no standard at all.

I bring up the avionics software standard because people like to say airplanes are safely flown by automatic systems, therefore car systems will also be safe. They don't know what goes into an aircraft design, the disciplines that were developed over many decades.
 
If the user of a Tesla "autopilot" engages this mode as a safeguard instead of relying on it as an autonomous driver, then I can see how it can be safer than the driver alone. You have another set of eyes watching out for you for augmentation, even if these eyes have cataracts. :) ...

Yes, I think this whole discussion would be short-circuited, and we would all benefit, if Tesla got away from the "autopilot" terminology, and instead focused on keeping the driver involved, and augmenting the driver with warnings.

The system should monitor that you are actively steering, that your eyes are scanning the road, that you are moving your head from side to side to actively observe, and that you are correcting for the things it does see. Then it should warn you of things approaching that you may not see.
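
A crude sketch of what that kind of driver-involvement monitor might look like (purely illustrative: the thresholds and sensor hooks are invented, and this isn't how Tesla or anyone else actually implements it):

[code]
import time

# Purely illustrative driver-involvement monitor.  The thresholds and the
# sensor callbacks (steering torque, gaze detection) are invented stand-ins.

HANDS_OFF_LIMIT_S = 10.0    # warn if no meaningful steering input for this long
EYES_OFF_LIMIT_S = 2.0      # warn if the driver's gaze has been off the road this long

def monitor_driver(read_steering_torque, gaze_on_road, warn):
    """Escalate warnings when the driver stops participating."""
    last_torque = last_gaze = time.monotonic()
    while True:
        now = time.monotonic()
        if abs(read_steering_torque()) > 0.05:   # any real wheel input
            last_torque = now
        if gaze_on_road():
            last_gaze = now
        if now - last_torque > HANDS_OFF_LIMIT_S:
            warn("Please keep your hands on the wheel")
        if now - last_gaze > EYES_OFF_LIMIT_S:
            warn("Please keep your eyes on the road")
        time.sleep(0.1)
[/code]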

Letting the driver sort of 'drift off' and then flashing a warning isn't so great - the driver won't have time to 'catch up' to what happened, and may overreact or react the wrong way. Or, as NW-Bound describes, wait, expecting the car to react, which could then be too late.

I think keeping the driver involved would be the best of both worlds. And at some point, with enough data and enough technology, maybe we can go full autonomous. But one step at a time.

-ERD50
 
If the user of a Tesla "autopilot" engages this mode as a safeguard instead of relying on it as an autonomous driver, then I can see how it can be safer than the driver alone. You have another set of eyes watching out for you for augmentation, even if these eyes have cataracts. :)

In watching the 3 accident videos that I shared above, I wonder how I would feel if I were the driver, when I saw an obstacle in the left lane approaching, approaching... I would wait for the car to steer itself, wait, and wait and wait....

At which point do I say "Darn, it really does not see that car stopped in the left lane!", and veer to the right myself?

Seems to me having to second-guess this "autopilot" all the time would cause me stress. It would be the same when I have to wait until the last minute for the car to steer clear of a bicyclist to my right. "Is it going to do it or not?"

Maybe they can have a display indicating whether the autopilot system is detecting these obstacles ahead. If it doesn't highlight them, the driver is supposed to conclude that it won't take action.

But it appears every player is aiming for at least level 4 automation.
 
I bring up the avionics software standard because people like to say airplanes are safely flown by automatic systems, therefore car systems will also be safe. They don't know what goes into an aircraft design, the disciplines that were developed over many decades.

True enough. I have a private pilot's license (not active now) and am well aware that flying an airplane is mentally much easier than driving a car (at least VFR). Once when flying over Chesapeake Bay in calm air with the airplane trimmed out I sat there with my feet on the floor and my hands in my lap for at least ten minutes. Not so much as a wing-leveler on the airplane (Piper Tri-Pacer). Never been able to duplicate that driving a car.:LOL:
 
I have worked on flight control systems for commercial jetliners as well as military aircraft. The standards for commercial jetliners are a lot stricter. We are also more careful for selfish reasons: you get your ass sued if blamed for a crash. Military aircraft are different because there are no passengers involved. Military aircraft also get shot at, so there are other risk factors involved (the chance of being shot down can be a lot higher than that of a crash due to malfunction), plus pilots have ejection seats (except in helicopters :) ).

Working on missiles is the most fun, as you do not feel apprehensive about hurting passengers or pilots, plus missiles can pull a lot of G's and take off like a scared cat. It can still be stressful because a launch failure means a big loss in money, reputation, and future business. Generally, one does not fool around with stuff that flies.
 
What would a bank robber do with a self driving getaway car?:D
The self-driving car will likely have a remote kill switch, just like OnStar has now (designed to make theft more difficult): you call OnStar and they make the car stop.
 