Let's talk Self Driving Cars again!

ERD50
Seems the previous threads on this went off the rails, into the weeds, off course, or some other non-clever analogy? And I haven't kept up with what I assume are many, many youtube videos of the latest incarnations of Tesla's Auto-Pilot (or whatever they are calling it now) software.

But a few weeks ago, I spent about two hours as a passenger in a Tesla (about 80% highway driving), the owner said it had all the latest updates (and it had "Ludicrous" mode, so you know this owner didn't skimp on anything). As I previously thought, I am super impressed with what the system can do. At the same time, it was very enlightening to experience quite a few of these routine conditions (for a human driver) that the car just seemed to ignore, or handle poorly. As a few of us have been saying, getting these situations handled is a whole 'nother challenge. Some examples:

Shortly after getting on the expressway, a large chunk of tire from a semi was lying in the road, just about centered between our lane and the adjacent lane. A normal human driver would veer off a little bit if space allowed (it did) - you never know, a gust of wind, or draft from another vehicle might toss that tire chunk right in your path, and it could do considerable damage. It was unnerving to have the Tesla just plow straight ahead and take no action at all.

It had snowed the day before, the roads were clear, but several of the semis had large accumulations of snow on their roofs. Several times, we would be following a truck, and large clumps of snow would blow off the truck into our path. Again, a human driver is thinking "This isn't good, that snow could obstruct my vision for a second or two, or maybe there is a big ice chunk embedded in that snow - I'm gonna get away from that truck". But no, the Tesla just didn't seem to care at all. The driver took over, and slowed down to get some space, and then passed quickly two lanes over when able to do so, to limit our exposure to that snow. Again, very unnerving to be following a truck with stuff falling off the roof, a totally unnatural feeling, your instincts just tell you to take action, get away from that.

At another point, he let the Tesla make a lane change, and it had handled a few of these very well, but this time, all of a sudden the Tesla is putting on the brakes fairly hard. As a passenger, I'm saying "What the heck, what's going on??!!", there was plenty of room ahead of us, why the sudden braking? Well, apparently the car ahead of the open spot slowed down, and the Tesla slowed down (pretty aggressively) to match speeds to that open spot. It just felt weird. A human driver would have re-evaluated, maybe sped up to change lanes ahead, or waited for another opening, or dropped back slowly. But the Tesla just seemed to be 'committed' to doing that lane change in that spot, regardless.

The owner mentioned he has had the car slow down to what he felt were dangerously slow speeds on the highway, with concerns of getting rear ended, for no apparent reason at all.

Again, the capabilities were very impressive. And I stand by my earlier assertions that if confidence in the system leads the driver to be less attentive, I'd expect to see some accidents caused by the car just not knowing how to handle the myriad situations that humans can figure out. And until/if affordable cars get there, I feel the system needs to keep the driver fully engaged, so it is more like "two sets of eyes are better than one", rather than the car saying "I got this for now", until it doesn't.

It's possible that a self driving system that doesn't handle these cases might on average still be safer than the average driver. But I don't see it as an either/or. Both working together makes the most sense to me.

-ERD50
 
That's the current Autopilot system. And yes, that's all true. I'm ready for most of the quirks most of the time since they are relatively predictable.

However, that is not what's used in Tesla's coming (eventually) FSD system. There's a whole new vision and driving system that is in limited testing now, with very little in common with Autopilot. It will require the Tesla FSD computer, with something like 10x the capability of the computer Autopilot runs on. Judging by the tester's videos it's a credible FSD attempt but has a long way to go.

I'm pretty sure the neural networks can be trained to handle any particular problem they may encounter while driving. However, we won't know how many special cases the current hardware can accommodate. Tesla may find they need another step up in computer power in order to complete the task.

I bought FSD (in 2017 for $3k, and again in 2019 for $2k for our second car) and am having fun with what's in it so far. I'm looking forward to trying out the latest when it becomes available. I've always thought it would be a drawn out process, but the cost was OK even if it just resulted in a few new features. If it can drive me to doctor appointments in the future, so much the better.
 
Not to head into the weeds too early, but it's my opinion that so-called self-driving cars will be held to a higher standard than will human driven cars. What do I mean? It's been BIG news when a self-driven car has been involved in an accident. I don't know whether subsequent evaluation ever concluded that a human could have done it better (either reacting and maneuvering OR BETTER anticipating and avoiding.)

Humans are involved in incidents, accidents and fatalities every day. It's LOCAL news unless a VIP is involved. But, when a self-driver is involved, it's national news. I don't see how we can ever expect that a self-driver will prevent all accidents.

Take an absurd case that has, I'm sure, happened. You are driving along the freeway with good weather, good roads, moderate traffic. You are beside and passing a slower 18 wheeler. It blows a tire, the load (maybe pigs) shifts, and the rig heads rapidly for the median (and you.) Your choices are to get to the median faster than the rig, finish your pass - very quickly - or brake hard enough to NOT be there when the big rig would otherwise crush you. I submit that there are cases where you're just a goner. If you were driving, the accident report would maybe cite the rig for poor condition of tires - maybe not. If the car were driving, the news would likely be "another self-driven car fails its owner."

I could be wrong about this, but I think the natural tendency would be to "blame" the new technology for all accidents - even when a human could not have avoided the inevitable. I have NO data so admit to pure speculation. Probably weed-patch territory, but I bring it up because it's the first thing I thought about when I heard about the very first accident involving self-driven cars. YMMV
 
^^^ When cars first appeared, back when horses dominated the roads, many people objected on the grounds of safety and every car accident was highlighted. But today we accept humans driving cars without a second thought. These transitions take time...

All truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as being self-evident. Arthur Schopenhauer

 
Gosh, this doesn't sound any smarter than the "smart cruise control" on my 2015 Hyundai Sonata, which isn't smart at all and which in fact I hate. It slows down drastically for trucks, including trucks that aren't even in our lane, in order to maintain what I consider a ridiculously long interval - which every other driver immediately speeds up to pass me and slide into, which leads to the car slowing down even more... At one point everyone else was doing 80-something in a 70 zone, but my dumb car had me down to 62 mph.

"At another point, he let the Tesla make a lane change, and it had handled a few of these very well, but this time, all the sudden the Tesla is putting on the brakes fairly hard. As a passenger, I'm saying "What the heck, what's going on??!!", there was plenty of room ahead of us, why the sudden braking? "
 
I feel that the Achilles heel of current "assistive driving technology" is the assumption that the human will leap into the breach when the car gets confused. But who can pay such close attention for miles and miles, without actually doing the driving oneself? I mean, we call such people backseat drivers, and make fun of them.

This, the loss of driver control and "responsibility," is what will make everyone blame the car. It really is the driver's responsibility to be ready to take over at any second - like a driving instructor. Yet, that defeats the purpose of assistive driving - which is to let the driver relax.

Not to head into the weeds too early, but it's my opinion that so-called self-driving cars will be held to a higher standard than will human driven cars. ...
 
...
It's possible that a self driving system that doesn't handle these cases might on average still be safer than the average driver...

Umm... The cases you described are quite benign compared to what else I have seen on Youtube videos, and these are observed with the latest FSD beta version to boot.

That's the current Autopilot system. And yes, that's all true. I'm ready for most of the quirks most of the time since they are relatively predictable.

However, that is not what's used in Tesla's coming (eventually) FSD system. There's a whole new vision and driving system that is in limited testing now, with very little in common with Autopilot. It will require the Tesla FSD computer, with something like 10x the capability of the computer Autopilot runs on. Judging by the tester's videos it's a credible FSD attempt but has a long way to go.

I'm pretty sure the neural networks can be trained to handle any particular problem they may encounter while driving. However, we won't know how many special cases the current hardware can accommodate. Tesla may find they need another step up in computer power in order to complete the task.

I bought FSD (in 2017 for $3k, and again in 2019 for $2k for our second car) and am having fun with what's in it so far. I'm looking forward to trying out the latest when it becomes available. I've always thought it would be a drawn out process, but the cost was OK even if it just resulted in a few new features. If it can drive me to doctor appointments in the future, so much the better.

I am not a Tesla owner, nor a company shareholder (nor a stock shorter :) ). However, I have always been very interested in SDC (self-driving-car) technology, and try to keep up with who's doing what. I knew about Tesla's releasing FSD beta software to selected owners, and a few of them are Youtubers who uploaded their test drives to share with the public (they are making some money from their effort via Google-inserted ads). And I have watched quite a few videos.

I have to say that Tesla's software has shown substantial improvements. However, it occasionally screws up, and the error could have resulted in a head-on collision with vehicles approaching in the opposite direction if the driver had not intervened. In lesser incidents, the car might have jumped the curb, or collided with a car parked by the side of the road. The driver took over in all these cases, and no accidents have been recorded on Youtube, but you can imagine the stress it caused.

If one is interested, he can search Youtube for these videos. It is very interesting to me to see where the car acted up for no obvious reason, and the poster/driver said so.

There are also several videos where the poster declared that a short 20-minute test drive went perfectly, and it was true.

The problem is that the AI software is erratic and unpredictable, and Tesla acknowledges this by saying that the software may make the wrong decision at the worst possible time, and the test drivers must be fully alert at all times to override and to take over.

For me, watching these videos is enough to learn what the current Tesla FSD can do. I do not care to be a volunteer test driver for no pay. It's stressful and takes a lot of work and attention. Imagine waiting for the car to make a left turn and you don't know if/when the car will decide to lurch into oncoming traffic. :) Yes, it happened.
 
Not to head into the weeds too early, but it's my opinion that so-called self-driving cars will be held to a higher standard than will human driven cars. What do I mean? It's been BIG news when a self-driven car has been involved in an accident. I don't know whether subsequent evaluation ever concluded that a human could have done it better (either reacting and maneuvering OR BETTER anticipating and avoiding.)


In the fatal cases so far, the drivers were obviously asleep or not watching the road (driving under a semi-trailer, into a highway barrier, into parked fire trucks). Both man and machine failed in these accidents. But I don't think a reasonably decent driver would fail like the above, do you?

There are many more accidents with Tesla Autopilot, but if there's no fatality, it has no media coverage. It would be the driver's fault for not paying attention to what his car is doing. Tesla says so, and the driver agrees to it when he engages the Autopilot.

Would he do better if he was in control? I would hope so. How was he able to drive before he bought his Tesla with the AP?


Gosh, this doesn't sound any smarter than the "smart cruise control" on my 2015 Hyundai Sonata, which isn't smart at all and which in fact I hate...

No, the Tesla FSD is better than that. Way better.

But would it be able to drive without human intervention? Nobody believes so. Not Tesla, nor any of the testers.
 
... I could be wrong about this, but I think the natural tendency would be to "blame" the new technology for all accidents - even when a human could not have avoided the inevitable. I have NO data so admit to pure speculation. Probably weed-patch territory, but I bring it up because it's the first thing I thought about when I heard about the very first accident involving self-driven cars. YMMV

I don't think you are wrong at all. The new technology gets the attention, because it is new (and different). That is to be expected. Sure, it may not seem "fair", and in some ways it isn't. But each new bit of data informs us of what may be coming as the tech becomes more widespread.

Similarly, you don't hear about all the 100% safe airplane take-offs and landings each day. Even though airplanes have been around for about 100 years, a crash still makes big news, despite the fact that more people probably die in cars in the US every day than die in the once-in-a-blue-moon air crash.


...

Take an absurd case that has, I'm sure, happened. You are driving along the freeway with good weather, good roads, moderate traffic. You are beside and passing a slower 18 wheeler. It blows a tire, ...

But it is not absurd, because things like this do happen. On average, will an SDC react better than a human to the almost uncountable variations on that theme?

-ERD50
 
....
For me, watching these videos is enough to learn what the current Tesla FSD can do. I do not care to be a volunteer test driver for no pay. It's stressful and takes a lot of work and attention. Imagine waiting for the car to make a left turn and you don't know if/when the car will decide to lurch into oncoming traffic. :) Yes, it happened.

Yes, this is a very important issue, IMO.

I agree it would be very stressful to be responsible for staying alert at all times and ready to take over. The problem is, it may look like the car is doing the right thing, then a fraction of a second later you think, wait, this looks dicey. But then you give it another fraction of a second, thinking it will right itself, or maybe you just didn't know what it planned to do.

By the time you realize the car is maybe doing the wrong thing, you are now a half-second behind where you would have been if you were just driving the car yourself. Now you have to take over and try to correct. That half-second could be the difference between life and death.
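
To put a rough number on that half-second (a back-of-the-envelope sketch; the speed and lag values are just my assumptions):

```python
# Rough illustration: distance covered while you hesitate before taking over.
# All numbers here are assumptions for illustration, not measurements.
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600

def lag_distance_ft(speed_mph: float, lag_s: float) -> float:
    """Feet traveled during a takeover lag at a given speed."""
    return speed_mph * FT_PER_MILE / SEC_PER_HOUR * lag_s

for lag in (0.5, 1.0, 1.5):
    print(f"At 70 mph, a {lag:.1f} s lag covers about {lag_distance_ft(70, lag):.0f} ft")
# A half-second at 70 mph is roughly 51 ft -- several car lengths gone before you react.
```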

Come to think of it, it's kind of like teaching a teenager to drive. That is stressful: you are ready to yell "STOP!", or maybe even grab the wheel if that can help. You are on high alert at all times, needing to second-guess what the driver is doing, rather than being confident and understanding what you are doing as a driver. Very stressful!

-ERD50
 
... You are on high alert at all times, needing to second guess what the driver is doing, rather than being confident and understanding what you are doing as a driver. Very stressful!

-ERD50


Yes, very stressful, as anyone could imagine without even watching these Youtube videos. :)

I hope these free beta testers make enough money from their Youtube videos to make it worth their effort, because Tesla is not paying them any.


PS. In the case of the left turn, many video commenters said that it would be extremely helpful if the computer announced when it decided to "go for it". That way, the test driver would not have to second-guess the computer's intention.
 
...
For me, watching these videos is enough to learn what the current Tesla FSD can do. I do not care to be a volunteer test driver for no pay. It's stressful and takes a lot of work and attention. Imagine waiting for the car to make a left turn and you don't know if/when the car will decide to lurch into oncoming traffic. :) Yes, it happened.

Fair enough, if you aren't comfortable testing the FSD software you certainly shouldn't.
As to stressful, it really isn't any more stressful than the typical drive without any assistance.
A driver always has to be on the lookout for the unexpected.
Driving any vehicle always takes a lot of work and attention. Most of us allow that attention to lapse from time to time.
For most of us, it is just for a split second, and most of the time that doesn't result in an accident.

But about 25,000 times a year, someone lets their attention lapse at just the wrong time, and there are deadly consequences.

There are no cars on the road today that don't require the driver to be aware and in control.
Some day there will be, but not until the cars have accidents at less than 10% of the rate of human-caused ones.

And yes, even then, a car-caused accident may happen that would have been avoided by a human. However, as long as there are 10 accidents that humans would have gotten into but the technology avoided for every one it causes, society should be OK with that.
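
To put toy numbers on that trade-off (the rates and mileage below are made up, only to show the arithmetic):

```python
# Purely hypothetical rates, just to show the arithmetic of the trade-off.
human_rate = 1.0                # accidents per million miles (made up)
tech_rate = 0.1 * human_rate    # the "10% of the human rate" threshold above
fleet_miles = 100_000_000       # miles driven (made up)

human_accidents = human_rate * fleet_miles / 1_000_000
tech_accidents = tech_rate * fleet_miles / 1_000_000

print(f"Human drivers: {human_accidents:.0f} accidents")
print(f"Technology:    {tech_accidents:.0f} accidents, roughly "
      f"{human_accidents - tech_accidents:.0f} avoided, even if some of the "
      f"{tech_accidents:.0f} are ones a human would not have had")
```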
 
I’m a big fan of Tesla vehicles. We own two Model 3’s, one with FSD, one with basic autopilot. I think Tesla vehicles are great for a variety of reasons. But FSD is not one of them. Every time we turn it on it makes me nervous, so I rarely use it.

All of the observations in the OP are accurate. It’s not ready to be used in a busy high traffic area unless you are capable of monitoring it extremely closely. I consider it similar to driving with a student driver where you are letting them drive but are constantly required to be ready to take over if they do something stupid. For me, I’d rather just drive the car myself than try to supervise the car driving on its own.

At the current price of $10,000 for FSD I think people are just being downright silly buying it. Who would pay $10,000 for beta software that doesn't really work?
 
... However, as long as there are 10 accidents that humans would have gotten into that the technology avoided, society should be ok with that.


This contention keeps arising in any thread about this topic. But it does not have to. :)

There are aircraft accidents, yet we all fly. The death rate due to flying is lower than that of cars.

But my point is that we are nowhere near that point with SDC, at least as demonstrated by Tesla. I am talking about the current Tesla FSD, as reported by Youtubers just a few days ago.

When the tester has to override a dozen times in a drive lasting 15-20 minutes, in a not-so-challenging street scene at that, we are nowhere near the point where we can say SDC is safer than humans. So, why bring that up? When SDC is good and affordable, I will be a buyer. I don't care to have to drive myself.
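
As a crude back-of-the-envelope conversion (the average speed is my guess, not a figure from the videos):

```python
# Crude miles-per-intervention estimate from the figures above.
overrides = 12            # "a dozen times"
drive_minutes = 17.5      # midpoint of a 15-20 minute drive
avg_speed_mph = 25        # assumed average city/suburban speed (my guess)

miles = avg_speed_mph * drive_minutes / 60
print(f"About {miles:.1f} miles driven, or {miles / overrides:.2f} miles per intervention")
# Roughly one intervention every 0.6 miles -- nowhere near "safer than a human".
```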

When will SDC be ready? Maybe Waymo already is, even if Tesla is not; I don't know. However, Waymo's system is expensive, and perhaps they don't even know the real production cost yet.

One thing about Tesla FSD is that they let outsiders, meaning Youtube beta testers, have access to their system, so that we know about the shortcomings. Waymo is very tight-lipped about their stuff, so I don't know what their real capability is.
 
Last edited:
I’m a big fan of Tesla vehicles. We own two Model 3’s, one with FSD, one with basic autopilot. I think Tesla vehicles are great for a variety of reasons. But FSD is not one of them. Every time we turn it on it makes me nervous, so I rarely use it.

All of the observations in the OP are accurate. It’s not ready to be used in a busy high traffic area unless you are capable of monitoring it extremely closely. I consider it similar to driving with a student driver where you are letting them drive but are constantly required to be ready to take over if they do something stupid. For me, I’d rather just drive the car myself than try to supervise the car driving on its own.

At the current price of $10,000 for FSD I think people are just being downright silly buying it. Who would pay $10,000 for beta software that doesn't really work?


Tesla says that all its cars now come with the hardware for FSD. For $10K, they will enable the software. Recently, Musk talked about the idea of "renting" the software for short periods; for example, an owner might pay to have FSD enabled for a vacation trip of a few weeks, or something like that.

Some owners are irate that they paid for software that was promised but not delivered. And some of them are avid Tesla owners who upgrade to newer models, and they want that FSD feature to be transferred to their new cars. It sounds reasonable and fair to me, but I don't think Tesla allows that yet.
 
Some things are just different and we humans resist them, me included. My Model Y doesn't have FSD; it's of little value where I live. If I were still in KC I probably would have bought it, but on two-lane roads it doesn't add enough value over base Autopilot. Is Autopilot perfect? No, but it does add another level of security.

I had one experience early on when the vehicle went into emergency collision avoidance and it kept me from T-boning a guy who stopped in the middle of the highway. However, when a deer jumped in front of me it did nothing, just like me. I didn't hit the brakes until the deer was flying through the air like Rudolph on a bad drunk. (When I put Autopilot on, I was thinking I could spend more time looking for game; ten seconds later, boom....)

I find driving with Autopilot to be more relaxed. I do find those nitpick things that the vehicle does differently from me. Who is to say I was doing it right?

ETA: There's a certain amount of trust you develop to make it easier to believe the vehicle will slow/stop in traffic. Sometimes I remember when I was a programmer and managed to crash online systems on a daily basis from a two-line code change that took a team of experts months to find. Sure it'll stop.
 
^^^ When cars first appeared, back when horses dominated the roads, many people objected on the grounds of safety and every car accident was highlighted. But today we accept humans driving cars without a second thought. These transitions take time...






We’re probably in late stage 1
 
Fair enough, if you aren't comfortable testing the FSD software you certainly shouldn't.
As to stressful, it really isn't any more stressful than the typical drive without any assistance.
A driver always has to be on the lookout for the unexpected...

I forgot to address the above in my earlier reply.

Yes, I try to be on the lookout for unexpected things while driving, and not always successfully.

But what ERD50, Ready (an FSD owner), and I are talking about is that we normally do not have to watch out for "someone" suddenly stepping on the accelerator pedal while we are waiting to make a left turn at an intersection with oncoming traffic.

Nor do we have to be constantly on the alert for someone invisible suddenly yanking on the steering wheel to steer the car onto the curb, or into a parked car by the side of the road.

Yet, the above scenarios were among the mishaps I saw on Youtube videos.

Again, there are also videos of 15-20 min test drives that went perfectly well. And then, the next day, the tester reported something weird again. Tesla FSD is quite unpredictable. Is Waymo like that? I don't know, but they are offering limited driverless taxi service, so I hope Waymo is better. But what are Waymo's limitations? We don't know, and they don't tell.
 
About SDC projects around the world: it's not just Waymo. A Chinese company called AutoX offered robotaxi rides to the public in Shenzhen earlier this year. Its fleet is only 25 cars, far smaller than Waymo's fleet.

Chinese self-driving pioneer AutoX is opening up its fully driverless robotaxi pilot program to the public in Shenzhen this morning. This is the first time the general public in China can book a ride in a robotaxi that doesn’t have a safety driver, AutoX claimed.

Interested parties can sign up for a ride via this registration page. Shenzhen has the highest level of population density of any city in China. To start, AutoX told The Robot Report its fleet of 25 Pacifica minivans will operate within Shenzhen’s Pingshan district. Pingshan is 65 square miles in size. For comparison’s sake, Waymo’s robotaxi service operates within a 50-square-mile radius of Chandler, Ariz.

AutoX’s vehicles will have teleoperators standing by in case the systems encounter situations they can’t handle. AutoX said the “teleoperator can give very high level instructions to help in situations when the car is stuck, such as giving a routing suggestion. The onboard AI still decides how the car drives.”

AutoX is also testing a robotaxi service in Shanghai, but uses safety drivers.


Note that AutoX and Waymo both operate their limited service in a geofenced area. They don't yet have enough confidence to allow unlimited operation, nor sell the vehicles to the public.

Interested people can search Youtube for videos covering both above companies' vehicles as well as Tesla's FSD drives by beta testers.

PS. Waymo also employs teleoperators to monitor each of their driverless vehicles, to help out when the onboard computer encounters a situation it does not know how to handle. The assistance is in the form of high-level instructions such as "Back up, do a U-turn, and take a different route".

In one Youtube video, a Tesla beta tester had his car go into a dead-end street, and the FSD computer stopped the car and asked him to take over. This kind of thing, although not safety related, is the sort of situation that SDC designers have to allow for.
 
My experience with Autopilot is that I'm very comfortable letting it drive under normal conditions. That's primarily freeway driving. There just aren't a whole lot of weird situations on a freeway. It's not going to suddenly make a hard left turn while traveling on a normal freeway. A construction zone with re-painted temporary lane lines, or a road with faded lane lines can sometimes be a problem. For me too. I've also had good success with FSD stopping for red lights and handling yellow lights. That's pretty fun.

I think one problem with AI accident acceptance will be that autonomous car accidents will be different than human car accidents. Most likely humans will think they could have avoided most of the AI accidents. At the same time AI will probably avoid most human-style accidents. That can be a difficult sell, even if AI has a lower overall accident rate.
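
A toy example of that mismatch (completely made-up numbers, just to illustrate the point):

```python
# Made-up accident counts per 100 million miles, split by scenario type.
human = {"inattention / rear-end": 60, "ran a light": 25, "odd edge case": 5}
ai    = {"inattention / rear-end": 2,  "ran a light": 1,  "odd edge case": 27}

print("Human total:", sum(human.values()))   # 90
print("AI total:   ", sum(ai.values()))      # 30 -- lower overall...
# ...but most of the AI's 30 are crashes a typical human feels sure they would have avoided.
```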

Jumping in and suddenly taking over the driving responsibility can be a challenge. First and foremost is that your hands must be on the wheel. That allows you to feel the car turn before you notice it visually, and you know where your hands are on the wheel. Second, your right foot should be over the accelerator pedal. That's a place you should be accustomed to, ready to accelerate or brake with your usual muscle-memory actions. You absolutely should be observing everything around you, as if you were driving the car. With Autopilot I don't worry too much about following the lanes and modulating my speed. I can pay a little extra attention farther down the road and watch traffic. It's much less tiring.

Taking over is easy enough. If I turn the steering wheel (with a little extra effort) the car will stop steering and let me drive. If I hit the brakes Autopilot will fully disable and I'm driving manually. (I can go faster at any time by hitting the accelerator while Autopilot is still operating).

The problem is if you are thinking "maybe it's just braking a little later than usual", or "it'll move over for that truck that's inches into my lane" or "what will happen if I just let it go". If you're a little late taking over the results might be a little more dramatic than they could have been. That is somewhat a problem with this type of assisted driving, but we've handled quite a few such situations now. The car is also pretty good at moving over if a car is really making a move into your lane.

As an example, lane changes can require some intervention. The car sometimes "freaks out" and aborts a lane change halfway through. Sometimes it seems to be a car in the second lane over. Sometimes it's a fast moving car coming up from behind. Sometimes who knows? If the car tries to abort and I'm sure it is still safe to change lanes I'll just keep the steering wheel from turning us back into the original lane. Something I'm prepared to do and just requires holding the wheel steady.

As far as FSD, if it can avoid crashing into anything, including during left turns, then I can handle the other quirks. Staying current on the Tesla forum will be important to learn what to expect when using it.
 
Yes, it is stressful, even with a careful (basically, scared) teen. That's why I compared driving the SD car to being a driving instructor - but they have separate controls!

The thing is, a teen is still getting used to the whole idea of piloting a heavy machine, where 20 mph suddenly seems veryveryfast. The SD cars are supposed to already know how to drive :facepalm:

Plus, being computers with wheels, their reactions - and thus, their self-corrections - ought to be many times faster than a human's.

Perhaps the sensor technology is the real issue. Nothing can compare with the mechanics of the human eye, hooked to a lifetime of experience in the human brain.

Come to think of it, it's kind of like teaching a teenager to drive. That is stressful: you are ready to yell "STOP!", or maybe even grab the wheel if that can help. You are on high alert at all times, needing to second-guess what the driver is doing, rather than being confident and understanding what you are doing as a driver. Very stressful!

-ERD50
 
Animorph, what has been your experience with low-visibility situations, such as a pop-up heavy rainstorm? These can and do occur on freeways, and cannot always be predicted/avoided.

I either put on flashers and inch along, hoping nobody hits me from behind, or (if I can see well enough) inch my way over to the verge, stop the car, put on the flashers, and hope nobody hits me before the rain thins out. Neither "solution" is ideal... What does an SD car, with its faster reactions, do?

My experience with Autopilot is that I'm very comfortable letting it drive under normal conditions. That's primarily freeway driving. There just aren't a whole lot of weird situations on a freeway. ...
 
Our 2020 Honda CR-V has several Driver Assist features:


Forward Collision Warning
Lane Departure Warning
Collision Mitigation Braking System™
Road Departure Mitigation System
Lane Keeping Assist System
Adaptive Cruise Control


I like that it makes a noise if I am not braking when I get too close to a car in front of me. I like when it gives haptic feedback if I get too close to the edge of the lane I'm in. I've always been impressed with these features. But even with this system, which has never been touted as leading towards autonomous driving, I've had a few surprises.



It is annoying that even when I am changing lanes because my lane is *ending*, if I pass over some stripes the steering wheel is sometimes rather aggressive in trying to keep me in the lane. On one occasion it applied the brakes enough to pull me out of cruise control! If I use the turn signal, it ignores lane changes (as it should).


I can't imagine a more 'intelligent', aggressive system like Tesla's taking over. I don't think I'll still be driving by the time such a thing becomes mainstream.
 
Before the other threads that got closed occurred, we had this one from 2015:
Are you looking forward to self driving cars?

harley opened that thread in response to an article in Wired that appeared amid a flurry of publicity from various component and vehicle manufacturers, many of which targeted 2020 for full self-driving vehicles. In reviewing that thread, just about everyone here pumped the brakes on rosy predictions. We were right.

2015 was a year where many advances were made, and for some reason, there was an escalation war in claims by various manufacturers. Many of us engineers saw it for what it was: the publicity department does what the publicity department does. :LOL:

So here we are in 2021, and advances have been made. They will continue to be made. But this is going to be a long process and I think that realization is coming through to most. Tesla is one thing, and focusing there just closes threads. How about all the other assist features that dixoge mentions above? We are essentially beta testers for all this.

In my view, all these assist features are going to have to work flawlessly before we can think about moving to higher levels of self control. This will happen over the next few years. It is just a step.

I worked my entire damn career as an incrementalist on products. It is generally how development works. Marketing and publicity always touts breakthroughs. The reality is incrementalism.

So this short IEEE (electrical engineering society) article gives a recent view of things. It is a nice little read. Surprise! 2020 Is Not the Year for Self-Driving Cars

Here's an excerpt. The embedded live links to "Nissan" and "Toyota" still work and describe claims by both manufacturers that have, uh, fallen short. They'll probably claim that the assist features are what they were talking about. Not so fast. I'm surprised the Nissan link is still live; it is directly from their site.
Five years ago, several companies including Nissan and Toyota promised self-driving cars in 2020. Lauren Isaac, the Denver-based director of business initiatives at the French self-driving vehicle company EasyMile, says AV hype was “at its peak” back then—and those predictions turned out to be far too rosy.

Now, Isaac says, many companies have turned their immediate attention away from developing fully autonomous Level 5 vehicles, which can operate in any conditions. Instead, the companies are focused on Level 4 automation, which refers to fully automated vehicles that operate within very specific geographical areas or weather conditions.
 