Self Driving Cars?

Speaking of tickets, here's an interesting angle: Who makes up for the loss of the significant revenue stream from traffic fines that many states and towns depend on?

Maybe a related issue: When was the last time you saw a police officer? I'd be willing to bet that the majority of the time, the officer was in their car, with radar, waiting for speeders.

What will all those policemen do when all cars obey the law? How about all the judges, clerks, and attorneys in traffic court? The millions spent on mandatory driver safety classes for drivers who accumulate too many points? The whole industry of red-light traffic cameras?

There are a lot of entrenched bureaucracies that depend on traffic violations.
They'll attempt to resist, but ultimately they'll just have to change. They'll adapt, just as all industries adapt when faced with a sea change. For example, the music industry has been transformed for artists, distributors, etc. And I am sure people asked what would happen to all the services supporting horses before automobiles replaced them. Early in the history of the US, something like 90% of the population was involved in agriculture; now it's about 3%. We worry needlessly and think of imagined barriers, but we adapt...nothing new.
 
On the topic of punishment of corporations vs. individuals, it seems like the current model will prevail. We all remember the Pinto, which occasionally burst into flames when rear-ended. Ford knew it, but did the math. Lawsuits were going to cost less than a recall, so they decided to forego the recall. But they have it easier with SDCs...a software push is essentially free. Oh, but here's a snag...what if we make them go through a laboriously long process to verify every new version? Think along the lines of the drug approval process. There needs to be lots of shareholder pain for messing up. That's certainly missing from what we have today.
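For anyone curious how that math penciled out, here's a rough reconstruction using the figures usually cited from Ford's internal cost-benefit memo. The numbers are from memory and approximate, so treat this as illustrative only:

```python
# Rough reconstruction of the oft-cited Pinto cost-benefit math.
# Figures are the commonly quoted ones, from memory; approximate only.

# Option 1: fix the fuel tank on every affected vehicle.
vehicles = 12_500_000            # cars and light trucks affected
fix_cost_per_vehicle = 11.00     # dollars per vehicle
recall_cost = vehicles * fix_cost_per_vehicle

# Option 2: do nothing and pay out the expected claims.
deaths, injuries, burned_cars = 180, 180, 2_100
payout = (deaths * 200_000       # per fatality
          + injuries * 67_000    # per serious burn injury
          + burned_cars * 700)   # per burned vehicle

print(f"Recall:  ${recall_cost:>12,.0f}")  # ~$137,500,000
print(f"Payouts: ${payout:>12,.0f}")       # ~$49,530,000
```

Roughly $137M to fix versus $50M to settle, which is how "doing the math" came out the way it did.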
 
Tesla started modulating their autopilot claims after the May 2016 crash with a semi, and they seem to be (wisely) sticking with the more conservative stance - as they should IMO. Too bad they got out over their skis several years ago. In this recent video, Musk acknowledges SDC will never be perfect, but the goal is still a 90% reduction in accidents/fatalities. Seems to be the shared goal of most leading SDC teams.

https://youtu.be/AO33rOofFpg
 
He's doing damage control.

They've been aggressive about blaming the driver in the recent Model X fatal crash, breaking with the NTSB, which is still investigating.
 
If what Musk says about the NTSB taking a year to conclude their investigation is true, I can understand why the company doesn't want questions hanging out there that long, and therefore why they released information themselves. That doesn't preclude or contradict what the NTSB does, does it? So I looked back: it took over a month for the NTSB to release information on the May 2016 Tesla autopilot/semi truck crash/fatality, and over 8 months to issue their final report.

And it seems like Tesla has consistently said drivers must pay attention and intervene as needed with their “autopilot” since the semi fatality, so the driver in the latest crash would be to “blame.”

I’m not particularly a fan or foe of Tesla, but I can understand both sides.

https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.pdf
 
Tesla started modulating their autopilot claims after the May 2016 crash with a semi, and they seem to be (wisely) sticking with the more conservative stance - as they should IMO. Too bad they got out over their skis several years ago. In this recent video, Musk acknowledges SDC will never be perfect, but the goal is still a 90% reduction in accidents/fatalities. Seems to be the shared goal of most leading SDC teams.


The reporter who rode in the car with Musk later said that he ran through a few stop signs! I guess that's the danger of driving while distracting yourself by talking.

He's doing damage control.

They've been aggressive about blaming the driver in the recent Model X fatal crash, breaking with the NTSB, which is still investigating.
+1

Tesla said that the driver had his hands off the wheel for 6 seconds prior to the crash. Indeed, that was way too long: I saw a YouTube video where a Tesla owner tested his car against a similar divider. The car headed straight for the divider, and if the owner had not overridden it within 2 seconds, he would have been dead.
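To put those intervals in perspective, here's a quick back-of-the-envelope calculation. The actual speeds weren't reported here, so I'm assuming a typical 65 mph freeway speed:

```python
# Distance covered during each hands-off interval at an assumed speed.
# The actual crash/test speeds were not reported; 65 mph is an assumption.

MPH_TO_FPS = 5280 / 3600      # 1 mph is about 1.47 ft/s

speed_fps = 65 * MPH_TO_FPS   # ~95 ft/s at 65 mph

for seconds in (6, 2):
    print(f"{seconds} s hands-off = {seconds * speed_fps:,.0f} ft traveled")
# 6 s -> ~572 ft (nearly two football fields)
# 2 s -> ~191 ft (the reaction window in the YouTube test)
```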


 
Tesla started modulating their autopilot claims after the May 2016 crash with a semi, and they seem to be (wisely) sticking with the more conservative stance - as they should IMO. Too bad they got out over their skis several years ago. In this recent video, Musk acknowledges SDC will never be perfect, but the goal is still a 90% reduction in accidents/fatalities. Seems to be the shared goal of most leading SDC teams.

https://youtu.be/AO33rOofFpg

I have to disagree with this statement. This video, dated Nov 2016, is on their website right now. It emphasizes that the "driver" is only there for legal reasons and his/her hands remain off the steering wheel throughout the demo. It even states he/she is "not doing anything" (i.e., not paying attention or monitoring AP).
This leads to expectations that autopilot's capability is greater than these accidents would suggest.
https://www.tesla.com/videos/autopilot-self-driving-hardware-neighborhood-short
 
Yes. We do not know how they rehearsed that route, and how many takes it took for them to get a perfect demo video. Actual real-life videos posted by car owners are not so impressive.

And that's why the NTSB chairman, in the report on the Florida accident where a Tesla drove under a semi-trailer, said that Tesla seemed to talk out of both sides of its mouth.

In the demo with the reporter riding in the car as linked in post #1231, the reporter said that the car drove through several stop signs, with Musk at the wheel! And he had his hands off the wheel quite often.

Was the car on autopilot when it ran the stop signs? Why wasn't Musk paying attention and ready to apply the brake? If it had run into another car, that would have made a very exciting piece for the reporter to broadcast.
 
In this recent video, Musk acknowledges SDC will never be perfect, but the goal is still a 90% reduction in accidents/fatalities.

The reporter who rode in the car with Musk later said that he ran through a few stop signs! I guess that's the danger of driving while distracting yourself by talking.
Maybe Musk thinks it will be easy for SDCs to reduce fatalities by 90%--because he thinks other drivers operate a vehicle just as proficiently as he does. :)
 
If the autopilot was on, both it and Musk failed to see the stop signs! The two together are worse than one normal driver.

Well, there will always be people who believe in video demo pitches, just as there are always people who believe in buying newsletters with stock tips.

The difference is that the latter group simply impoverishes itself. Somebody else enjoys their money, so it is not really wasted. On the other hand, the former group may hurt innocent bystanders in addition to themselves.
 
I have to disagree with this statement. This video, dated Nov 2016, is on their website right now. It emphasizes that the "driver" is only there for legal reasons and his/her hands remain off the steering wheel throughout the demo. It even states he/she is "not doing anything" (i.e., not paying attention or monitoring AP).
This leads to expectations that autopilot's capability is greater than these accidents would suggest.
https://www.tesla.com/videos/autopilot-self-driving-hardware-neighborhood-short
Fair point. As I noted earlier, it's probably because of Tesla legal, but where Tesla (and some owners on YouTube) got way ahead of themselves regarding their autopilot claims before the May 2016 semi crash, they have walked back their claims repeatedly since then. Here's a more current statement that clearly says Autopilot is only Level 2 technology (partial automation, with the driver supervising at all times), where true SDC is Level 4/5 (high/full automation). https://www.tesla.com/presskit#autopilot
Tesla said:
Tesla Autopilot is an increasingly capable suite of safety and convenience features that make personal transportation safer and more enjoyable. Since September 2014, Autopilot hardware has come standard in all Tesla vehicles, and Tesla has continued to refine and enhance the Autopilot system since its features were first enabled in cars in October 2015 via over-the-air software updates. Data shows that, when used properly, drivers supported by Autopilot are safer than those operating without assistance. Eventually, full autonomy will enable a Tesla to be substantially safer than a human driver.

In its current form, Autopilot is an advanced driver assistance system (ADAS) that classifies as a Level 2 automated system by the National Highway Traffic Safety Administration (NHTSA). It is designed as a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable by reducing the driver's workload.

Autopilot’s safety and convenience capabilities are designed to be additive to the driver’s by augmenting their perception, improving their decision making, and assisting in their control of the vehicle. Its user interface has been carefully designed to encourage proper use and to give drivers intuitive access to the information the car is using to inform its actions, via a detailed visual display on the instrument panel and clear audible cues. As Autopilot technology continues to be developed, more advanced functionality will be made available to Tesla owners over time nearing full self-driving capabilities; however, until truly driverless cars are developed and approved by regulators, the driver is responsible for and must remain in control of their car at all times.
This probably still summarizes best:
And that's why the NTSB chairman, in the report on the Florida accident where a Tesla drove under a semi-trailer, said that Tesla seemed to talk out of both sides of its mouth.
 
https://www.msn.com/en-us/autos/news/teslas-autopilot-boss-jim-keller-leaves-the-company/ar-AAwoaQs

It sure does seem like Tesla can't catch a break lately, what with the high-profile Autopilot-related fatality, its Model 3 production woes, and a boss that sleeps at the factory. Things are about to get a little more complicated, because Tesla's head of Autopilot development, Jim Keller, has just flown the coop.

Keller joined Tesla two years ago from chip-maker AMD, where he was a legendary chip designer. His departure isn't a huge surprise given the mad rush by Silicon Valley and Detroit alike to hire anyone with autonomous vehicle system engineering experience, though he has apparently returned to his previous work designing semiconductors.

Keller will be replaced by Pete Bannon, a two-year Tesla vet who came over from Apple, as head of Autopilot hardware while Andrej Karpathy, director of AI and Autopilot Vision, will take over the development of Autopilot software.
 
I have not read all the posts in the thread, and someone may have posted this earlier, but one big worry is that bad actors can use self-driving cars as WMDs. Probably the most lethal kind, given the ubiquity of cars and crowds.
 
I have not read all the posts in the thread, and someone may have posted this earlier, but one big worry is that bad actors can use self-driving cars as WMDs. Probably the most lethal kind, given the ubiquity of cars and crowds.
I also am joining the discussion late, but I believe if there is profit to be made, no concerns will prevent auto-driving cars or auto-driving trucks for that matter.
 
I have not read all the posts in the thread, and someone may have posted this earlier, but one big worry is that bad actors can use self-driving cars as WMDs. Probably the most lethal kind, given the ubiquity of cars and crowds.
SDCs are still a work in progress, but they're being designed to stay on roads and avoid obstacles, including pedestrians, other vehicles, fixed objects, etc. I'm no tech wizard, but I'm having trouble figuring out how they'd be deployed as WMDs without a sophisticated and prohibitively technical effort. Hitting one person, much less plowing into a crowd, would seem completely counter to any programming I've read about. Fully autonomous cars like Waymo's have driven millions of miles and encountered lots of people without hitting anyone (the Uber Arizona car had a driver whose sole purpose was to intervene if necessary, but who wasn't paying attention).
 
I have not read all the posts in the thread, and someone may have posted this earlier, but one big worry is that bad actors can use self-driving cars as WMDs. Probably the most lethal kind, given the ubiquity of cars and crowds.

They could do that today.
 
According to some sources, in the fatal accident in Tempe where an Uber experimental SDC hit a pedestrian crossing the road, the car's software decided to ignore what the sensors reported.

Uber has determined that the likely cause of a fatal collision involving one of its prototype self-driving cars in Arizona in March was a problem with the software that decides how the car should react to objects it detects, according to two people briefed about the matter.

The car’s sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber’s software decided it didn’t need to react right away. That’s a result of how the software was tuned. Like other autonomous vehicle systems, Uber’s software has the ability to ignore “false positives,” or objects in its path that wouldn’t actually be a problem for the vehicle, such as a plastic bag floating over a road...

See: https://www.theinformation.com/arti...sed-by-software-set-to-ignore-objects-on-road.
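For anyone wondering what "tuned to ignore false positives" looks like in practice, here's a minimal sketch of the general idea. To be clear, this is purely illustrative, not Uber's actual software; the names and the threshold value are invented. The perception system assigns each detected object a confidence score, and the planner only reacts to detections that clear a tunable threshold:

```python
# Minimal sketch of false-positive suppression in a detection pipeline.
# Purely hypothetical: not Uber's code; names and threshold are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier's best guess ("pedestrian", "plastic bag", ...)
    confidence: float  # classifier confidence, 0.0 to 1.0
    in_path: bool      # is the object in the vehicle's planned path?

BRAKE_THRESHOLD = 0.8  # tune it up: fewer phantom stops, but later reactions

def should_brake(det: Detection) -> bool:
    """Brake only for confident, in-path detections."""
    return det.in_path and det.confidence >= BRAKE_THRESHOLD

# A real hazard first classified with low confidence gets ignored:
print(should_brake(Detection("unknown object", 0.55, True)))  # False
print(should_brake(Detection("pedestrian", 0.90, True)))      # True
```

The trade-off is exactly the one the article describes: set the threshold low and the car slams the brakes for every plastic bag; set it high and it can be slow to react to a real pedestrian.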
 
This will happen over generations. Eventually there will be a generation who's never driven a car; all they'll know will be SDCs. It's the transition that'll be messy and contentious for some, but Boomers may never face having driving privileges taken away.

Except as they get older and can no longer see well enough to drive, etc. Consider that the average car is now 11 years old, so once self-driving cars become available in lower-priced vehicles that work on rural dirt roads, it will probably take 15 years or more before human-driven cars begin to be banned, starting with interstates and divided highways (banning will start where alternate, slower routes are available).
 
I can't understand why Uber isn't banned from any public testing. They're dragging the whole initiative down.
 
Except as they get older and can no longer see well enough to drive, etc. Consider that the average car is now 11 years old, so once self-driving cars become available in lower-priced vehicles that work on rural dirt roads, it will probably take 15 years or more before human-driven cars begin to be banned, starting with interstates and divided highways (banning will start where alternate, slower routes are available).

Those who look forward to banning non-self-driving cars shouldn't be so hasty as to make the assumption that everyone else agrees with that ideology.
 
Those who look forward to banning non-self-driving cars shouldn't be so hasty as to make the assumption that everyone else agrees with that ideology.
Who made that assumption?
 
There have been plenty of comments in this thread from people suggesting that they can't wait until self-driving cars take over and driving is banned.

Very true.
But did anyone say that everyone else agrees with them?
 