Let's talk Self Driving Cars again!

While we have accidents all the time with "people" in control, it appears we will not tolerate a single error in self-driving cars. Each time an accident occurs when a "self-driving" car is involved, it's national news. That's a major hurdle to overcome. YMMV

We need to hold computer-driven vehicles to a higher standard, but I think the standard as it is emerging is going to be unattainable.

I agree about the "national news" aspect. You might as well say international news.

Emotion-driven news and decision making are currently at a high. In case you didn't notice, our new connectivity allows us to worry about every event that occurs in the far corners of the world. We then transfer and generalize this to our own experience, even if it doesn't make sense.

The point being that since we are in an emotion-driven world where "systems" and "cultures" are blamed for all the evils that befall us, the "system" of self-driving cars will have to be pristine to succeed. It may be an impossible task.
 
But "systems and cultures" are how our human minds work. We automatically and necessarily organize the world into categories and hierarchies and have since the beginning, including scientists and engineers :). We don't just say, "well, that was a chance death," we look for a cause that we can then approach. We have overcome our fears of mounting an animal, of driving a vehicle that we do not understand, of getting into trolleys that run without conductors, and of landing on the moon. We even ovrcome our fears of running into hails of bullets in wartime - all through mentally organizing the world into systems that justify our doing so.

[Aside: the Aristotelian universe still makes lots of sense to me - too bad it doesn't work, lol!]

Back to your point, I DO think people will not demand total perfection, since we do not demand it of airplanes. I'd be satisfied with an airplane standard. I get on a plane even though I know it *could* crash, and (after 10,000 people precede me), I'll do it with a car. :)
 
I'll stick in my $.02 worth, as a former software engineer. Years ago I attended a software development conference, and there was a speaker who talked about this subject. He said to imagine a scenario where the car is humming along, and then suddenly there is a situation where no matter what the car does, it is going to hit either car A or car B in front of it. The software has to choose (or not choose) which path to take. The car ends up hitting car A (which has children in it), and some of those occupants are killed. The lawyers for family A sue the software developers for not choosing to steer the car towards car B. Do you really want to be the software engineer who worked on that code where 1) children were killed, and 2) you were sued because of it?

Also, ALL software has bugs (is not perfect). Especially complex stuff like this. The question is, if it works 99.99% of the time, and saves more lives than it takes, are the occasional spectacular deaths acceptable by society?
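Here's a back-of-envelope version of that question in Python. Every number below is an assumption I made up for illustration (the human fatality rate is roughly the recent US figure), not actual SDC data:

```python
# Back-of-envelope look at "saves more lives than it takes".
# All numbers are assumptions for illustration, not measured SDC data.

HUMAN_FATALITIES_PER_100M_MILES = 1.3   # roughly the recent US average
SDC_FATALITIES_PER_100M_MILES = 0.9     # hypothetical: an SDC fleet 30% safer
US_VEHICLE_MILES_PER_YEAR = 3.2e12      # roughly 3.2 trillion miles per year

human_deaths = HUMAN_FATALITIES_PER_100M_MILES * US_VEHICLE_MILES_PER_YEAR / 1e8
sdc_deaths = SDC_FATALITIES_PER_100M_MILES * US_VEHICLE_MILES_PER_YEAR / 1e8

print(f"Human drivers:   ~{human_deaths:,.0f} deaths/year")
print(f"SDC fleet:       ~{sdc_deaths:,.0f} deaths/year")
print(f"Net lives saved: ~{human_deaths - sdc_deaths:,.0f} "
      f"(and every SDC death would still make the news)")
```

Even if the net saving were in the thousands, the question stands: will society trade anonymous deaths for newsworthy ones?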

The other day I was up north (pothole country) driving near dusk, on a road with lots of shadows. I hit a pothole that I simply could not see, and it nearly bent a rim (it was so deep it should have had its own zip code). I really think the software/hardware would not be able to see it either. But, guess what, the next time I went along that road, I went around it. I doubt software would do that.

When I drive my car, I frequently think about or experience situations where I wonder how software would handle this situation. My frequent personal conclusion is it simply cannot.
 
As a retired SW engineer myself, I totally understand your point.

However, why do we accept things like crankshafts failing and causing loss of control? Or even more common, why do we accept the shoddy work of auto techs causing wheel lug failure and wheels falling off?

I mean, mistakes (bugs) are made every day, resulting in hundreds of wheels falling off daily around the world*. Cars frequently and spectacularly careen out of control. Wheels become cruise missiles, bounding along and crossing the center line, causing impacts with oncoming traffic.

Yet we "accept" this.

* - Source: plenty of cam cars catching wheels falling off on the subreddit r/idiotsincars
 
I think there is a good possibility of total transportation replacement in small rural towns. Just have the cars share their knowledge of the town, and disable their ability to travel long distances.

Keep a supply of long-distance gas vehicles for the real road trips. All the local transportation could be absorbed by a shared fleet. This would help make rural small-town living more affordable and practical. Very little repair and maintenance cost, no personal insurance. Cheeeep.
 
Yes, things fail as a matter of fact. A bolt breaks under stress, and that's why we have design standards for what material, what kind of steel, a critical bolt is made of. Things like bearings wear out with time, and that's why we have inspections.

Most of my career was in avionics (electronic systems used on aircraft). I spent a few years on an autopilot with automatic landing capability. It's for a commercial jet liner, and not the same as autopilots on little aircraft that can only hold altitude or heading.

We had to show that the risk of a crash was down to 1 in a billion. This was an FAA requirement. It was impossible to run enough flight tests under all the various conditions to show the system could land an aircraft in adverse weather, so we did it via many Monte Carlo simulations. Hardware failures would be detected by the system architecture, which employed multiple channels for redundancy. To show that the design would catch hardware failures, we employed FMEA and FTA methods.
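Here is a toy Python sketch of that Monte Carlo idea. The wind, noise, and touchdown models are placeholders I invented, nothing like the real aircraft dynamics or the FAA test conditions; the point is only that simulation lets you estimate a rare-event rate that flight testing alone never could:

```python
import random

# Toy sketch of Monte Carlo certification. All models and limits below are
# invented placeholders, not real autopilot dynamics or FAA conditions.

RUNS = 1_000_000
MAX_LATERAL_OFFSET_FT = 27.0   # hypothetical "safe touchdown" lateral limit

def simulate_landing() -> float:
    """Return the lateral touchdown offset (ft) of one simulated approach."""
    steady_crosswind = random.uniform(0.0, 15.0)   # knots, per the 15 kt limit
    gust = random.gauss(0.0, 3.0)                  # random gust component
    sensor_noise = random.gauss(0.0, 1.0)          # ILS/radio altimeter noise
    # Placeholder response model: offset grows with what the autopilot must trim out
    return abs(0.8 * steady_crosswind + 2.0 * gust + 3.0 * sensor_noise)

exceedances = sum(simulate_landing() > MAX_LATERAL_OFFSET_FT for _ in range(RUNS))
print(f"{exceedances} exceedances in {RUNS:,} landings "
      f"(rate ~{exceedances / RUNS:.1e})")
```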

FMEA: Failure mode and effects analysis is the process of reviewing as many components, assemblies, and subsystems as possible to identify potential failure modes in a system and their causes and effects.

FTA: Fault tree analysis is a type of failure analysis in which an undesired state of a system is examined.
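To illustrate how a fault tree turns into numbers, here is a minimal sketch with invented probabilities (not taken from any real analysis):

```python
# Toy fault-tree arithmetic with invented probabilities. An AND gate
# (all redundant paths must fail) multiplies probabilities; an OR gate
# (any one cause suffices) combines them as 1 - prod(1 - p).

P_CHANNEL_FAILS = 1e-4    # assumed per-flight-hour channel failure probability
P_MONITOR_MISSES = 1e-3   # assumed chance the comparator misses the failure

# Top event for one axis: a channel fails AND the monitor misses it
p_undetected = P_CHANNEL_FAILS * P_MONITOR_MISSES

# OR across two independent axes (say pitch and roll)
p_top = 1 - (1 - p_undetected) ** 2

print(f"Undetected failure, one axis: {p_undetected:.1e}")
print(f"Top event, either axis:       {p_top:.1e}")
```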

That combination of simulation and failure analysis was the established method for certifying a critical flight control system that could land an aircraft in zero-visibility weather, when the pilots could not see the runway to land manually. I have not worked in this field for quite a few years, but I don't think the methodology has changed much.

Car driving is a lot more complex than flying an aircraft. An aircraft does not negotiate with another about which one gets to land first. The air traffic controller decides who gets to land, and who has to wait. Similarly, the environment that a car finds itself in is a lot more diverse and complex than the atmosphere that an aircraft flies in. Our Autoland autopilot was certified to land the aircraft in a maximum gusty crosswind of 15 knots. How did it know whether the crosswind was higher than that? Again, it had help from the air traffic control advisory. What kind of external help can an SDC count on?

The SAE has defined different levels of sophistication for car driving automation, from simple lane keeping to the ultimate robot taxi. But has anyone been able to devise tests to see which systems would qualify?

A car maker keeps bragging that its system is safer than human drivers, yet it keeps plowing into other vehicles, rear-ending motorcyclists, and running into barricades. Oh, I am sure that there are human drivers that do the above daily. But most people, myself and you too hopefully, do not do that routinely. If you are asking me whether this system is good enough, I won't know whether to laugh or not. What kind of joke is that?

Now, how good should a Level 4 SDC be? How to define the requirements? And how to verify that a system meets the requirements? These are really tough questions. I am sure many people are working on finding some answers, but I have not seen any.
 
We used to lament that the Autoland autopilot had to be so safe that the risk of it crashing the plane was less than that of a wing breaking off the plane. :)

People who are curious can look up FAA Advisory Circular AC 25.1309-1A.

A synopsis can be found on Wikipedia here: https://en.wikipedia.org/wiki/AC_25... Regulations. Revision A was released in 1988.

AC 25.1309–1 establishes the principle that the more severe the hazard resulting from a system or equipment failure, the less likely that failure must be. Failures that are catastrophic must be extremely improbable.


Extreme improbability is then defined to be 10^-9, which is called "ten to the minus 9," meaning a risk of one crash per billion flights.
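For a feel of what that means at fleet scale, a quick sketch in Python (the worldwide flight count is my rough assumption, not an official statistic):

```python
# What "extremely improbable" looks like at fleet scale, taking the 10^-9
# per-flight figure above at face value.

p_catastrophic = 1e-9            # per flight, per the AC
flights_per_year = 35_000_000    # assumed order of worldwide airline flights

# Chance of at least one catastrophic system failure somewhere in a year
p_any = 1 - (1 - p_catastrophic) ** flights_per_year
print(f"P(at least one in a year) ~ {p_any:.1%}")   # about 3.4%
```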
 
I don't see the parallels with aircraft auto-land systems. Aircraft have a set of people with sophisticated systems keeping them far apart. Aircraft (generally) do not have little kids, skateboarders, potholes, deer, people opening doors, etc., on the runway. If something does go wrong, aircraft have TCAS, where the planes communicate with each other and agree on paths that keep them from colliding.
 
Right!

In a past thread, people said if we had autopilots for planes and they had proven safe, then why not for cars too.

My point has been that Autoland autopilots are safe because a lot of work and money goes into a system that is solving a simpler problem than an autopilot for cars faces.
 
Yeah, I flew 125 hours in my brief flying career. I had one loss-of-engine emergency in that period of time and ended up in a winter wheat field. My math is rusty, but I think that's on the order of 10^-2. 10^-9 is totally unachievable with humans or machines. If you used an electrical appliance 10^9 times, you would likely burn down your house! YMMV
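For the record, the arithmetic behind my rusty math (one data point proves nothing statistically, but the order of magnitude checks out):

```python
# The arithmetic behind that rough 10^-2 estimate: one engine failure
# in 125 flight hours (a single data point, so take it loosely).

failures, hours = 1, 125
rate = failures / hours
print(f"~{rate:.0e} failures per flight hour")            # 8e-03, order 10^-2
print(f"That is {rate / 1e-9:.0e} times the 10^-9 bar")   # ~8e+06 times
```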
 
Back to your point, I DO think people will not demand total perfection, since we do not demand it of airplanes. I'd be satisfied with an airplane standard. I get on a plane even though I know it *could* crash, and (after 10,000 people precede me), I'll do it with a car. :)

If we see an actual drop in accidents and severity, I think that will help convince people. But it has to happen and be sustainable.
 
It will be a long, long time before I trust self driving cars. I still don't see the need for any car I buy for myself to have antilock brakes, traction control, lane departure warning, back up cameras or even seatbelt buzzers. Although with the design of some of the newer vehicles, blind spot warning is helpful.

I do miss the availability of simplicity and the affordability of the cars of years gone by.
 
Respectfully disagree about not needing your list of safety stuff. These are proven to be effective - especially for folks who aren't as savvy as those who learned to drive back in the 60s ("Pump the brakes, Ko'olau, PUMP the brakes on ice, you idiot!"). Now it's "Stomp and steer!" It really works, too.

I'd rather go back to manual window-winders to save money than give up (especially) ABS. But YMMV.
 
It will be a long, long time before I trust self driving cars. I still don't see the need for any car I buy for myself to have antilock brakes, traction control, lane departure warning, back up cameras or even seatbelt buzzers. Although with the design of some of the newer vehicles, blind spot warning is helpful.

I do miss the availability of simplicity and the affordability of the cars of years gone by.

+1000

I'm sure I'll never buy a "self driving" car. Sounds like an invitation to disaster, to me.

I remember when a brand new VW bug cost $2,000 or so, and the teenagers I knew who had one would repair it at home. (sigh) Those days are long gone, and it's a shame young people these days will never get the chance to experience those times and cars.
 
Heh, heh, my first car was a '49 Chevy. I paid $50 for it. I could keep it running with a Sears set of tools (sockets, screwdrivers, torque wrench, etc.). I eventually ground the valves on it. Sold it for $75. Heh, heh, paid $100 for my last oil change. :facepalm: What's wrong with this picture? :(
 
Sounds like a wonderful first car! I'll bet you learned a lot keeping it repaired and running like that. Great educational experience. :D
 
Yeah, I learned I never wanted to buy another car that I had to work on all the time! :LOL: But up to the 90's, I still did stuff like oil changes and tune-ups. Not anymore! I'm too old and cars are too complicated for my limited (and perishable) skills.
 
Yeah, I flew 125 hours in my brief flying career. I had one loss-of-engine emergency in that period of time and ended up in a winter wheat field. My math is rusty, but I think that's on the order of 10^-2. 10^-9 is totally unachievable with humans or machines. If you used an electrical appliance 10^9 times, you would likely burn down your house! YMMV

So, do you think the 10^-9 that the FAA requires is BS? :D

One thing I need to make clear. It is not that the Autoland autopilot is allowed to fail only once per billion flights. No, electronics fail more frequently than that. A lot more frequently.

A crash by an autopilot is of course a failure, but a failure of the autopilot does not mean a crash has to happen.

There's no electrical appliance that will work for 1 billion days or 1 billion hours, or even 1 billion seconds (1 billion seconds is 32 years!). But if your coffee pot passes UL testing, then when it goes kaput after 1 or 2 years (and that's nowhere near 24/7 operation), it is not likely to set itself on fire. It means the coffee pot is designed and built to a certain safety standard to be UL listed.

For the Autoland autopilot, it is standard practice to use multiple sensors, and multiple computers for redundancy. It is so that a single failure will likely result in a discrepancy that will be picked up by a monitoring system. The system then warns the pilots to take over, and shuts itself down. In a landing, the pilots then execute a go-around if they cannot see the runway in zero visibility weather, and divert to another airport.
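In very rough Python terms, the idea looks something like this. It's a toy sketch only: the threshold and values are invented, and real monitors are far more involved:

```python
# Toy sketch of the multi-channel idea described above: three independent
# channels compute a command, a mid-value select masks one outlier, and a
# miscompare beyond tolerance trips a warning and disconnect.

MISCOMPARE_LIMIT = 0.5   # hypothetical allowed spread between channels

def autopilot_command(ch_a: float, ch_b: float, ch_c: float):
    """Return (command, engaged) given three redundant channel outputs."""
    lo, mid, hi = sorted([ch_a, ch_b, ch_c])
    if hi - lo > MISCOMPARE_LIMIT:
        # Channels disagree: warn the pilots to take over and disconnect
        return None, False
    return mid, True   # mid-value select rejects a single drifting channel

print(autopilot_command(1.00, 1.02, 0.98))   # healthy: (1.0, True)
print(autopilot_command(1.00, 1.02, 9.99))   # one channel hard-over: (None, False)
```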

I am not aware of any Autoland autopilot that has crashed its airplane. Part of it is that pilots usually prefer to land the plane themselves. Most of them only use the Autoland system in zero-visibility weather, and this does not happen frequently.

So, I doubt that there have been 1 billion landings made with an Autoland system for us to have a crash caused by an Autoland autopilot. There are plenty of crashes for other reasons, as we know.
 
I should think that the safety systems on a $50M airplane cost many tens of thousands of dollars, maybe $100K or more. As we have been discussing, the systems in self-driving cars would have to be even more sophisticated. How on earth are they going to get the price down to maybe $1K?
 
I'm sure I'll never buy a "self driving" car. Sounds like an invitation to disaster, to me...


Maybe wait until they make the car safe enough that they will guarantee to assume all responsibility if any accident happens, and pay a zillion dollars to the victims that your car kills, and also to your estate if you also, ahem, perish.
 
I should think that the safety systems on a $50M airplane cost many tens of thousands of dollars, maybe $100K or more. As we have been discussing, the systems in self-driving cars would have to be even more sophisticated. How on earth are they going to get the price down to maybe $1K?


Perhaps you were not around in the first SDC thread back then, when I brought up the same issue. Some posters said technology is so good now, electronics are getting cheaper all the time, and we have economies of scale with a lot more cars being produced than airplanes, so it will be cheap.

I dunno. I expressed my doubts, but was called a Luddite. :D

Now, I never said a Level 4 and Level 5 SDC would be impossible. Never say never.

I was just trying to say it would be more difficult and more expensive than some bozo kept promising. And I also listened to the researchers at Carnegie Mellon who were the pioneers of this technology. They all said the problem was tough, and they could not tell when they would have it solved.

By the way, the Autoland autopilot I participated in developing in the early 80s cost a few hundred thousand dollars. That price does not include all the sensors on the airframe that the autopilot interfaced with. It should be less expensive now.
 
So, do you think the 10^-9 that the FAA requires is BS? :D

Not sure I understand your question. But if the FAA (or anyone) suggests ANYTHING can have a no-fault record of one in a billion - THAT's BS. Nothing happens a billion times without a fault. YMMV
 
Did you read my earlier post?

The Autoland autopilot is not expected to make 1 billion landings without a fault!

What is expected is that in the case of a fault, it will detect that condition, warn the pilots to take over, and disconnect.
 