Let's talk Self Driving Cars again!

I think one problem with acceptance of AI accidents will be that autonomous-car accidents will be different from human-car accidents. Most likely, humans will think they could have avoided most of the AI accidents; at the same time, AI will probably avoid most human-style accidents. That can be a difficult sell, even if AI has a lower overall accident rate.

Jumping in and suddenly taking over the driving can be a challenge. First and foremost, your hands must be on the wheel. That allows you to feel the car turn before you notice it visually, and you know where your hands are on the wheel. Second, your right foot should be over the accelerator pedal, a position you are accustomed to, ready to accelerate or brake with your usual muscle-memory actions. And you absolutely should be observing everything around you, as if you were driving the car.

That's the elephant in the room with self-driving cars: you must be fully aware and ready to intervene. But even qualified test drivers, who clearly know that the software is still in development, have been involved in collisions due to their distraction or failure to act quickly.

What happens to the average driver in an emergency who hasn't taken control of the vehicle in a year and suddenly has to intervene? What happens to a driver with diminished capacity in the same situation? Even on this board, which is filled with a lot of intelligent people, there are those looking forward to self-driving cars for when their driving skills diminish.

The only way a self-driving car can work is if it can operate under the assumption that the driver can't or won't intervene in time. Because that will be the reality.
 
The only way a self-driving car can work is if it can operate under the assumption that the driver can't or won't intervene in time. Because that will be the reality.
And that's why Google (now Waymo) concluded back in 2012 that they would not offer Level 2 or 3 cars to consumers, and most automakers now say they plan to skip that gray zone between driver assist and Level 4/5. Tesla is the exception among automakers so far. Waymo has not allowed a consumer in the driver's seat; it's either a Waymo employee or no one.

And fully autonomous cars still aren't allowed in most parts of the country. It remains to be seen whether Level 3 cars will be permitted everywhere. Tesla seems to be skirting the issue with legislators, relying on disclaimers.

I'm not aware of examples of "qualified test drivers" being involved in avoidable accidents; do you have one? The handful of accidents with qualified drivers I've read about involved autonomous cars that were hit, some not even underway at the time of the accident.
The then-Google prototype car followed an experiment by Google in 2012 when the search engine giant allowed a group of employees to borrow self-driving Lexuses the company had designed. The employees were warned that at all times they had to be ready to take back control of the car when prompted, especially in the event of an emergency. Instead, they often climbed in the backseat, watched videos, or otherwise didn't pay attention. Google determined that humans simply couldn't be trusted to monitor the car's controls once driverless control was switched on. In a report, Google described this as "automation bias."

We saw human nature at work: people trust technology very quickly once they see it works. As a result, it's difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax.

Given the nascent stage of driverless cars, it is impossible to quantify the exact percentage of Level 3 cars that will be rolled out ahead of Level 4 models. This uncertainty is compounded by how Level 3 cars, which require input from the driver when prompted to take control of the vehicle, as well as Level 4 models, which auto-pilot the vehicle from point A to B without human intervention, have yet to become legal to commercially sell and operate. But according to suppliers, industry analysts, and OEM executives speaking on background whom Driverless has interviewed, most carmakers are following Waymo's lead by skipping Level 3 altogether.

"We have spoken with the majority of carmakers and we can confirm that the majority of carmakers or skipping level III," Louay Eldada, CEO and founder of Quanergy, told Driverless. "Not all OEMs are skipping Level III, but the majority of them are."

The main issue with Level 3 is that in emergency situations, humans just cannot be trusted to take over, Egil Juliussen, an analyst for IHS Automotive, told Driverless.

The fundamental problem with Level 3 is that even when the driver is warned to take over, humans' tendency to trust technology means they may not properly heed the warnings. This is especially true in an emergency, when the driver would not have time to take over. There have been lots of tests showing that the average wake-up time is anywhere from a couple of seconds to 10 seconds or longer, but the time available to react in an emergency is typically less than a second. So OEMs are saying: we are not going to go there.
https://driverless.wonderhowto.com/...-every-car-maker-should-skip-level-3-0178497/
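To put those wake-up times in perspective, here is a quick back-of-envelope sketch. The 65 mph speed is my illustrative assumption; the 2-to-10-second wake-up range comes from the excerpt above:

```python
# How far a car travels while a disengaged driver "wakes up."
# The 65 mph speed is an illustrative assumption; the wake-up times
# are the ~2 s to 10+ s range cited in the excerpt above.
MPS_PER_MPH = 0.44704

def distance_m(speed_mph: float, seconds: float) -> float:
    """Distance covered in meters at a constant speed."""
    return speed_mph * MPS_PER_MPH * seconds

for wake_up_s in (1, 2, 10):
    print(f"{wake_up_s:>2} s at 65 mph -> {distance_m(65, wake_up_s):6.1f} m")
# Output:
#  1 s at 65 mph ->   29.1 m
#  2 s at 65 mph ->   58.1 m
# 10 s at 65 mph ->  290.6 m  (roughly three football fields)
```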
 
... even qualified test drivers, who clearly know that the software is still in development, have been involved in collisions due to their distraction or failure to act quickly...


There has been only one fatal accident involving an experimental SDC car with a test driver on board. It was a test car by Uber, and the accident happened in Tempe, AZ, about 10 miles from my house. Yes, Uber also chose to test its cars here, along with Waymo.

The test driver was an employee of Uber. The accident happened at night in March 2018, when the Uber test car hit and killed a homeless woman who was crossing the road pushing a bicycle. The victim was jaywalking, meaning not using a marked crosswalk, but the accident would not have happened if the driver had not been looking down at her phone, watching a streamed video.

After a lengthy investigation involving the NTSB, the test driver was charged with negligent homicide in September 2020, more than two years after the accident. The DA's office decided not to press criminal charges against Uber. The company settled a civil lawsuit brought by the victim's family quite early, a mere two weeks after the accident, back in 2018.

The above accident was the topic of a long thread on this forum. I am surprised that people's memory is so short. :)

See: https://www.bbc.com/news/technology-54175359, which has a good video of the accident.

For more details, see: https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg.


PS. The Uber test cars had a lidar in their sensor suite, like Waymo cars do. Recorded data shows that the lidar picked up the jaywalker more than 6 seconds before impact, but for various reasons the software did not call for braking until 1.3 seconds before impact. The car was traveling at 39 mph, within the speed limit of that street.

From an excerpt of the above Wikipedia article, which quotes the NTSB, it is clear that Uber relied on its test driver to be constantly alert and to monitor the experimental system, and she failed.

The recorded telemetry showed the system had detected Herzberg six seconds before the crash, and classified her first as an unknown object, then as a vehicle, and finally as a bicycle, each of which had a different predicted path according to the autonomy logic. 1.3 seconds prior to the impact, the system determined that emergency braking was required, which is normally performed by the vehicle operator. However, the system was not designed to alert the operator, and did not make an emergency stop on its own accord, as "emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior", according to NTSB.
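Plugging the NTSB timeline into simple kinematics shows how little margin was left by the 1.3-second mark. This is a rough sketch: only the 39 mph speed and the timestamps come from the report; the braking deceleration is my assumption of a typical dry-road hard stop:

```python
# Rough kinematics of the Tempe timeline. The 39 mph speed and the
# 6.0 s / 1.3 s timestamps are from the NTSB; the 7 m/s^2 braking
# deceleration is my assumption (a typical dry-road hard stop).
MPS_PER_MPH = 0.44704
v = 39 * MPS_PER_MPH                 # ~17.4 m/s

gap_at_detection = v * 6.0           # ~104.6 m when the lidar first saw her
gap_at_decision = v * 1.3            # ~22.7 m when braking was deemed needed
full_brake_stop = v**2 / (2 * 7.0)   # ~21.7 m to stop under hard braking

print(f"Gap at first detection : {gap_at_detection:5.1f} m")
print(f"Gap at braking decision: {gap_at_decision:5.1f} m")
print(f"Hard-braking distance  : {full_brake_stop:5.1f} m")
# By 1.3 s the remaining gap (~23 m) barely exceeded the car's own
# stopping distance, leaving essentially zero driver reaction time.
```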
 
The above Uber fatal accident was the only one so far involving a test driver employed by a company in the testing of its experimental cars.

On the other hand, there have been at least 4 fatal accidents involving Tesla cars with Autopilot engaged, as far as I know. Two were in Florida, where in separate accidents the cars drove under semi-trailers. One was in California, where the car ran into a freeway barrier. The fourth was in China, where the car ran into a street sweeper.

There have been other serious accidents involving Tesla AP, but they did not result in a fatality, hence got little media coverage.

The above accidents did not involve an official test driver employed by the developer, hence did not get the same coverage as the Uber accident. They also killed only the driver ("it is his fault!"), not a bystander, hence did not stir the same public attention.
 
...even qualified test drivers, who clearly know that the software is still in development, have been involved in collisions due to their distraction or failure to act quickly.
The above Uber fatal accident was the only one so far involving a test driver employed by a company in the testing of its experimental cars.
Sorry, "employee" does not automatically confer "qualified." The Uber driver in the Tempe, AZ accident was not qualified; she was looking at her phone at the time of the accident, and her background doesn't suggest "qualified." And it doesn't help that Uber chose to deactivate the automatic emergency braking system on its vehicle. 'An accident waiting to happen.'
The NTSB noted that the Uber vehicle had been moving for 31 minutes prior to the crash, with the driver spending 34 percent of that time on her phone, which goes against Uber policy. Vasquez had previously spent more than four years in prison for two felony convictions (making false statements when obtaining unemployment benefits, and attempted armed robbery) before starting work as an Uber driver, according to court records.
 
I'm pretty sure, IIRC, that all of the Tesla fatalities were caused by drivers who acted as if the car were self-driving, which is definitely not the case with the current Autopilot system. Some of that seemed to be due to salespeople at Tesla. I value my life more than that.
 
I'm pretty sure, IIRC, that all of the Tesla fatalities were caused by drivers who acted as if the car were self-driving, which is definitely not the case with the current Autopilot system. Some of that seemed to be due to salespeople at Tesla. I value my life more than that.
While true, it doesn't help that the company uses the terms "Autopilot" and "Full Self-Driving" for what's been categorized as Level 2 capability. Or that Elon himself has chronically overpromised, the latest example being his claim that Tesla would be "close to Level 5" by the end of 2021. Or his suggestion that Tesla owners would be able to play Minecraft while driving.

If Tesla ever does reach Level 4/5 capability, I wonder what they'll market it as.

That said, it's probably just a matter of time before I buy a Tesla, because I want an EV. I'm not holding my breath that they'll ever reach Level 4, much less Level 5. But someone will, sooner or later...
 
Some things are just different, and we humans resist them, me included. My Model Y doesn't have FSD; it's of little value where I live. If I were still in KC I probably would have bought it, but on two-lane roads it doesn't add enough value over base Autopilot. Is Autopilot perfect? No, but it does add another level of security.

I had one experience early on where the vehicle went into emergency collision avoidance and kept me from T-boning a guy who had stopped in the middle of the highway. However, when a deer jumped in front of me, it did nothing, just like me. I didn't hit the brakes until the deer was flying through the air like Rudolph on a bad drunk. (When I put Autopilot on, I was thinking I could spend more time looking for game; ten seconds later, boom....)


Is it possible that "animal detection and braking" is a feature only available with FSD, which you don't have?

Good to hear about your car braking for the car in front. Yet here's a video of a Tesla Model 3 on Autopilot plowing into an overturned truck on the highway; it was obvious the driver was busy texting, sleeping, or watching a streamed video. This happened in June 2020, on a Taiwanese highway. The truck bounced off the road surface from the force of the impact. :) Miraculously, the Tesla driver lived. Tesla cars appear very crashworthy. :)

The driver said that he applied the brakes too late, and you can see it in the video just as the car is about to pass the truck driver standing by the side of the road.

PS. Note that this accident is similar to the two fatal accidents in Florida, where in two different accidents Teslas went under semi-trailers, shearing off their tops.

 
Before the other threads that got closed occurred, we had this one from 2015:
Are you looking forward to self driving cars?


harley opened that thread in response to an article in Wired, which came amid a flurry of publicity activity from various component and vehicle manufacturers, many of which targeted 2020 for full self-driving vehicles. In reviewing that thread, just about everyone here pumped the brakes on the rosy predictions. We were right.

2015 was a year when many advances were made, and for some reason, there was an escalation war in claims by various manufacturers. Many of us engineers saw it for what it was: the publicity department does what the publicity department does. :LOL:

So here we are in 2021, and advances have been made. They will continue to be made. But this is going to be a long process and I think that realization is coming through to most. Tesla is one thing, and focusing there just closes threads. How about all the other assist features that dixoge mentions above? We are essentially beta testers for all this.

In my view, all these assist features are going to have to work flawlessly before we can think about moving to higher levels of autonomous control. That will happen over the next few years. It is just a step.

I worked my entire damn career as an incrementalist on products. It is generally how development works. Marketing and publicity always tout breakthroughs; the reality is incrementalism.

So this short IEEE (electrical engineering society) article gives a recent view of things. It is a nice little read. Surprise! 2020 Is Not the Year for Self-Driving Cars

Here's an excerpt. The embedded live links to "Nissan" and "Toyota" still work and describe claims by both manufacturers that have, uh, fallen short. They'll probably claim that the assist features are what they were talking about. Not so fast. I'm surprised the Nissan link is still live; it is directly from their site.

Yes, SDC technology caused a stir back in 2015. People on this forum were also excited. Of course, I was much interested (and still am), hence watched and read a lot on the Web to learn what developers had achieved and what different approaches they took.

I particularly liked Waymo (Google), mostly because of the TED presentations by Chris Urmson, the lead of the SDC effort at that company. As a graduate student, Urmson was part of the team at CMU that built the car that won the DARPA "Urban Challenge" in 2007. He had been involved in this technology since its infancy. He had credentials, and worked on this stuff himself, not just managed people.

He showed many convincing videos of his system's performance to date, and also pointed out some tough problems that remained to be solved. He demonstrated a great understanding of the problems, hence I gave him more credit than someone who simply made promises.

Urmson was confident that in the future, all cars would be driverless, and manual driving would be banned. Indeed, Waymo's position has been that an SDC that relies on people to take over in some circumstances will open itself up to other problems.

Sure, but when could people expect that super SDC?

Urmson said that his son was 11 years old at the time, and could get his driver's permit within the next four and a half years. Then he concluded, "My team and I are committed to making sure that doesn't happen." Big applause.

Now, more than 5 years later, where are we? Yes, I read the IEEE article. Thanks. I let my membership in the society lapse when I retired, and have also thrown out all the periodicals.

To be continued...
 
When I heard the promise of a real SDC in 5 years back then, I was dubious. I thought that perhaps Waymo would have the capability in that time frame (a Level 4), but a car in production and offered to the public in 5 years? No way. They did not even have a pre-production prototype.

In 5 years they might have an experimental car that qualifies, but what would it cost? How much maintenance would it need, with all the sophisticated sensors and hardware? To build an SDC where money is no object is one thing, but to sell it to the public, and not just to the DoD (DARPA was the agency that started all this), you have to build it cheaply enough for the masses. What's the market for a $200K car?

Two years later, at a 2017 auto show, Waymo's CEO John Krafcik revealed the sensors that Waymo had developed in house. They were impressive. And the number of sensors on their car made other experimental cars look like toys. While Tesla has no lidar (because they "don't need it"), Waymo has multiple lidars: not just the main high-resolution lidar on the roof looking all around, but also a small chin-mounted lidar looking down right in front of the car. Ditto for down-looking side lidars. A down-looking rear lidar too, of course.

And high-resolution cameras for driving, plus smaller peripheral cameras. A forward roof-mounted radar and peripheral radar sensors too. Sensors pointing every which way. It's only money. :)

Soon after that, I started encountering Waymo cars (built on the Chrysler Pacifica) running all around my town with these sensors protruding from them. Since then, I have seen their appearance change with time. Of course, they are going through evolutions, trying to make the sensors better, cheaper, etc...

More sensors are better, unless you have software smart enough to make do with a few 1.2-megapixel cameras, like those on the Tesla Model 3. But how far can the software go? Can the software make up for the lesser sensors and still be safe? Ah hah, that's not so easy to answer, is it?

And that's why I said earlier that I don't know how smart Waymo's software is compared to Tesla's. Maybe Waymo's software is not as good, hence they need the "crutches," as Tesla puts it? Or perhaps Waymo's software is just as good or even better, and the expensive, elaborate sensors are still needed to do the job? Can Tesla keep improving its software until it gets to at least Level 4?

That's why I like to watch YouTube videos, to deduce these things for myself.
 
Back on Chris Urmson, where is he now?

He quit Waymo some time after that 2015 TED presentation (with mucho options vested, from what I read) and started his own SDC company called Aurora.

After Uber got bad publicity from the fatal accident with the pedestrian, and then other scandals, including the lawsuit in which Google charged that a former employee had stolen its IP and brought it to Uber, the new Uber CEO withdrew from the SDC business and sold the development operation to Aurora. I don't follow the company, and don't know what Aurora has achieved.
 
I think it is safe to say autonomous driving still has a long way to go, but the tech keeps advancing to solve new problems and questions. I am buying and holding ARKQ, since it is likely to grow.
 
I suggest paying extra might make sense... might. If the sensors/actuators/controls become standardized pieces, then the software becomes the limiting factor, and that can be updated. My first Mac years ago was like that: as the system software advanced, the computer ran BETTER than new. And it continued to improve over the years with incremental releases.
On the other hand, if improvements are contingent on new hardware... you are obsolete, and it was a waste of money.
 
The big upgrade in tech will come when cars communicate with the roadway system: when they are alerted to the actions of nearby cars, possible accidents several miles ahead, etc. When the car I am in can tell that several cars ahead of me are hitting their brakes (animal on the shoulder, etc.), then the technology is getting where it needs to be. The car can't really do it all by itself at this point; it needs to operate in an intelligent corridor.
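For the curious, here is a toy sketch of the kind of vehicle-to-vehicle "hard braking ahead" message that idea implies. The message fields and thresholds are invented for illustration; real V2X stacks (DSRC, C-V2X) define their own formats:

```python
# Toy vehicle-to-vehicle "hard braking ahead" broadcast. The fields,
# units, and thresholds are invented for illustration, not any real
# DSRC / C-V2X message format.
from dataclasses import dataclass

@dataclass
class BrakeAlert:
    sender_id: str
    mile_marker: float    # sender's position along the corridor
    decel_mps2: float     # how hard the sender is braking

def should_warn(alert: BrakeAlert, my_mile: float,
                lookahead_mi: float = 3.0, hard_brake: float = 4.0) -> bool:
    """Warn the driver if a car within `lookahead_mi` ahead brakes hard."""
    miles_ahead = alert.mile_marker - my_mile
    return 0.0 <= miles_ahead <= lookahead_mi and alert.decel_mps2 >= hard_brake

alert = BrakeAlert("car-42", mile_marker=101.5, decel_mps2=6.5)
print(should_warn(alert, my_mile=99.8))   # True: hard braking ~1.7 mi ahead
```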
 
I am a non-believer simply because people's lives are at stake and the system has to be perfect.

For example, what if your self-driving car is in a dangerous, critical situation and at the exact same time the computer freezes up or the battery voltage regulator fails? Without electrical power or computing power, the system does not work. Even the backup system depends on electrical power. The human must intervene in those imperfect, critical cases. If the human does not intervene during a loss of power or loss of the computer, the car may go off a 200-foot cliff or hit a semi-truck head-on.

No thank you. I will use the system, but I will be ready to intervene if necessary, and I will NEVER let my life depend on a computer. My PC and smartphone freeze up occasionally, so I do not believe a self-driving car will be any different. Did people forget the movie "Fail-Safe", or "2001: A Space Odyssey"?
 
I am a non-believer simply because people's lives are at stake and the system has to be perfect.
They don't have to be perfect. Human drivers certainly aren't; when/if autonomous cars are significantly better, you'd have to be awfully short-sighted to cling to human drivers. And if they're at all affordable, insurance companies, among others, will easily induce us to accept self-driving cars.

It's well documented that human error is the root cause of over 90% of accidents: DUI, distraction, drowsiness, and other factors.

There are about 37,000 fatal accidents annually in the US, and 1.35 million people are killed on roadways around the world each year. It is estimated that fatal and nonfatal crash injuries will cost the world economy approximately $1.8 trillion from 2015 to 2030.

Nothing is perfect, but if, for example, autonomous cars can reduce fatal accidents by 80%, that would save almost 30,000 lives per year in the US and over 1 million worldwide, along with billions of dollars.
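The arithmetic behind that, using the figures quoted above:

```python
# Checking the 80%-reduction numbers against the figures in this post.
us_fatal_per_year = 37_000         # annual US fatal accidents (from the post)
world_deaths_per_year = 1_350_000  # annual worldwide road deaths (from the post)
reduction = 0.80

print(f"US:    {us_fatal_per_year * reduction:>9,.0f} saved/yr")     # ~29,600
print(f"World: {world_deaths_per_year * reduction:>9,.0f} saved/yr")  # ~1,080,000
```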

And if autonomous cars never do better than humans, consumers won't buy them. But I wouldn't bet on that; it may take (quite) a while, but it's just a matter of time, IMO.

Waiting for “perfect” would be a huge mistake...

Don’t let perfect be the enemy of good.
 
I am a non-believer simply because people's lives are at stake and the system has to be perfect.

For example, what if your self-driving car is in a dangerous, critical situation and at the exact same time the computer freezes up or the battery voltage regulator fails?...

But we already know human drivers are far from perfect. Some are drunk, or texting, or talking on the phone, or sleepy, or have kids in the back yelling, or are still learning to drive, or losing their eyesight, or inexperienced in a bad storm... I believe the "human failings" what-ifs will far exceed the SDC what-ifs long before SDCs are ready for mass-market consumption.

But there will always be some folks who won't adopt the new systems (in everything), which is fine. Freedom of choice and all that.
 
I suggest paying extra might make sense... might. If the sensors/actuators/controls become standardized pieces, then the software becomes the limiting factor, and that can be updated. My first Mac years ago was like that: as the system software advanced, the computer ran BETTER than new. And it continued to improve over the years with incremental releases.
On the other hand, if improvements are contingent on new hardware... you are obsolete, and it was a waste of money.

Tesla claims that the current Model 3 has all the hardware it needs, and that new software will get it to Level 5. At Level 5, you could just rip out the steering wheel and the brake and accelerator pedals.

I am highly skeptical of the above claim. :) I think Tesla's sensor suite is inadequate. And when you upgrade the sensors, for example putting in cameras better than the current 1.2-megapixel ones, you will need more powerful and faster CPUs to process the extra pixels and to run more sophisticated AI software.

You cannot upgrade a Pentium II PC to be useful with Windows 10. :) I don't think Tesla knows yet, nor does anyone, what level of sensors and CPU power a true Level 5 SDC will need.
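To give a rough sense of the compute problem, here is a small sketch; the 8-camera and 36 fps figures are my assumptions for illustration. Doubling the linear resolution quadruples the pixels the computer must chew through every second:

```python
# Pixel throughput grows linearly with camera count and frame rate, and
# quadratically with linear resolution. The 8-camera, 36 fps figures
# are my assumptions for illustration.
def gigapixels_per_sec(megapixels: float, cameras: int, fps: int) -> float:
    return megapixels * 1e-3 * cameras * fps  # MP * 1e6 px / 1e9 px/Gpx

today = gigapixels_per_sec(1.2, cameras=8, fps=36)    # ~0.35 Gpx/s
doubled = gigapixels_per_sec(4.8, cameras=8, fps=36)  # 2x linear res = 4x px
print(f"{today:.2f} Gpx/s -> {doubled:.2f} Gpx/s ({doubled/today:.0f}x more)")
```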

It is exactly because of my skepticism that I am interested in following Tesla's FSD updates, to see how far they have gotten. If they prove me wrong, I will not mind. A cheap, affordable SDC would be very useful to the world, and Tesla's FSD is a lot cheaper than all the hardware I see on Waymo cars, which are only Level 4 now.
 
Back on Chris Urmson, where is he now?

He quit Waymo some time after that 2015 TED presentation (with mucho options vested, from what I read) and started his own SDC company called Aurora.

After Uber got bad publicity from the fatal accident with the pedestrian, and then other scandals, including the lawsuit in which Google charged that a former employee had stolen its IP and brought it to Uber, the new Uber CEO withdrew from the SDC business and sold the development operation to Aurora. I don't follow the company, and don't know what Aurora has achieved.


Ah, just saw that Aurora is in the news again. Volvo chose to partner with Aurora to make its electric semi-truck driverless. This is for intercity highway transit only; a human driver will climb onboard to drive the truck into urban areas if needed. I guess such a restriction makes it a Level 4 autonomous vehicle.

Volvo has been working on electric semi-trucks for a few years, and has working prototypes.

See: https://www.msn.com/en-us/autos/new...?ocid=finance-verthp-feeds#image=BB1f7Ufd_1|7
 
I'll chime in, as I am waiting patiently for a car that is electric and actually functions well as an SDC. I have a 2016 VW Passat with all the passive assistance functions, which together come very close to being able to drive the car itself. It helps me a great deal, and I don't do a whole lot of anything when going down a highway. The systems are all radar-based: the car has something like 12 radar sensors, plus a camera system that reads road signs and even moves the headlights at night to light up signs, which is kind of weird. It also redirects the headlights if I stupidly leave the brights on when it senses an oncoming car, and warns me with flashing lights on the side mirrors if I attempt a lane change while someone is in my blind spot.

It reads the lanes fairly well and will not let you change lanes without signaling; you can override it easily enough, but it provides steering-wheel resistance if you try. However, the lane sensing is the weak point. It fails without warning when the lane paint doesn't match what it expects, often because of soil on the lane edge, and it never reads the lanes well in sharper curves, such as on smaller back-country roads. It also has serious problems in construction zones, with painted-over lane lines or the usual bizarre, haphazard lanes; it fights you if you follow the correct lanes, which can be somewhat disconcerting.

On the other hand, it warns you with a sharp vibration if it thinks you fell asleep, and if you don't respond, it pulls over to the side of the road and stops. It also warns you if you aren't making enough steering-wheel inputs and it thinks you are nodding off. It has excellent warnings for pedestrians walking in front of or behind you, and it slams on the brakes if it thinks a collision is imminent. The adaptive cruise control works perfectly, and you can adjust how far you want to be behind the car in front. All in all, it works perfectly, with the exception of lane assist, which is where it falls short.

IMHO, Musk made a huge mistake in refusing to use radar technology, and he did this as a conscious decision, some kind of attempt to show he is better and smarter than everyone else by using only cameras. Typical for someone with his personality defects. It was an idiotic decision, and I think it is why Tesla will never succeed at being fully autonomous. Every collision I have seen with the system could have been easily avoided with radar technology. My relatively simple car is actually better at it, with far simpler systems and zero AI. At some point, someone will convince him to adopt radar, and then we will see a large improvement.

However, I do not see any systems coming that can figure out road-construction issues or idiotic driver behavior, such as passing on the right on the shoulder (I saw this often in Maryland among youthful drivers). Here in Hungary, where I live, drivers have this insane habit of passing you and jamming back into your lane a few feet in front of you, which always sends my radar and brake system into a panic. It can only be prevented by tailgating the car in front of you, which I never do, so this is a constant problem on the large highways. Hungarians also never use turn signals, and going the speed limit is, to them, insane. On the other hand, I have never gotten a speeding ticket with this car. Best of all, it really helps in the dense fog we get a lot of here.

So, I am patiently waiting for a car that self-drives reliably, is fully electric, and has a 500-mile range. It is close to becoming a real thing, but not quite there yet. Perhaps in 3 to 5 years? I am okay with our diesel Passat until then, as we drive less than 10,000 km a year. I mostly bicycle everywhere local unless I have to haul a large amount of stuff.
 
I'll chime in, as I am waiting patiently for a car that is electric and actually functions well as an SDC. I have a 2016 VW Passat with all the passive assistance functions, which together come very close to being able to drive the car itself. It helps me a great deal, and I don't do a whole lot of anything when going down a highway. The systems are all radar-based: the car has something like 12 radar sensors, plus a camera system that reads road signs and even moves the headlights at night to light up signs, which is kind of weird. It also redirects the headlights if I stupidly leave the brights on when it senses an oncoming car, and warns me with flashing lights on the side mirrors if I attempt a lane change while someone is in my blind spot.

This is my concern with SDCs. I'm not singling you out, as I have no idea what your capabilities are. But too many people will want these cars as their skills diminish, and if they have to intervene in an emergency, then someone with limited capacity will be forced to make an emergency maneuver with no warning.

IMO, if you require the car's additional safety features for daily driving (blind-spot warning, lane-drift alerts, etc.), then you shouldn't be on the road in the first place.
 
<mod note> Can we please have a thread on self driving cars without it being infected with politics
 
I'll chime in, as I am waiting patiently for a car that is electric and actually functions well as an SDC. I have a 2016 VW Passat with all the passive assistance functions, which together come very close to being able to drive the car itself... The systems are all radar-based: the car has something like 12 radar sensors...

IMHO, Musk made a huge mistake in refusing to use radar technology, and he did this as a conscious decision... Every collision I have seen with the system could have been easily avoided with radar technology. At some point, someone will convince him to adopt radar, and then we will see a large improvement...
Maybe you're confusing technologies. Elon Musk has criticized lidar, not radar. "Currently, all of Tesla's vehicles have a forward-facing radar hidden at the front of the car," and Tesla has filed with the FCC to use a new type of "millimeter-wave radar sensor."

And none of the commercially available driver-assist systems with cameras and/or radar is anywhere near as reliable as you're suggesting. We have a 2018 Subaru and a 2019 Honda with all the features you describe (and more), and they're nice and sometimes helpful, but very hit-or-miss in terms of effectiveness. The reviews of the 2016-17 Passat driver-assist features are mediocre: https://www.techradar.com/reviews/car-tech/volkswagen-passat-1310816/review/2
 
Tesla not truly self-driving

"Harris County Precinct 4 Constable Mark Herman told KPRC 2 that the investigation showed “no one was driving” the fully-electric 2019 Tesla when the accident happened. There was a person in the passenger seat of the front of the car and in the rear passenger seat of the car."
Two more for the Darwin Awards, in The Woodlands area of Texas.
https://www.click2houston.com/news/...ium=social&utm_campaign=snd&utm_content=kprc2
 
They might just be the winners of the Darwin Awards for this year.
There is a lot of work you have to do to get the car to keep moving with no one in the driver's seat.
However, this doesn't really apply to the topic of self-driving cars.
 