Let's talk Self Driving Cars again!

I think I mentioned something along these lines way back, though in a somewhat different form.

My thought was that big rigs would have the FSD capability, and its use would be limited to certain highways at certain times (mainly weather-related), but they would share the road with other vehicles.

Big rigs make sense: they cost a lot and run many more miles per year than most cars, so the FSD investment would be a smaller fraction of the vehicle's cost and a smaller cost per mile. And it must be hard for a long-haul driver to maintain attention, so being able to rest for long stretches would be a big help.

-ERD50

A few years ago, it was already envisioned that self-driving 18-wheelers would ply the interstates at night, running between major cities. They would pull into a large lot outside a major city center, where a driver would hop on board to drive them into town.

I thought that scheme was already in use, but perhaps not. It certainly sounded feasible, as the driving environment is benign and the cost of instrumenting a big rig with expensive lidars and other sensors can be justified.
 
We went from a few seconds of powered flight in an airplane to landing a man on the moon in 66 years, a bit less than the average lifetime of a person born in the mid-twentieth century.

I imagine self driving cars may take less than 66 years. But, who knows for sure? Building an airplane must be child's play compared to building an artificial human brain for one's car.

We've done lots of impressive stuff with humans in control, but this is totally different. Way too many edge cases for AI to handle and no real regs or legal framework in the near future.

Won't be available to the average consumer in this decade, IMO.
 
The New York Times recently released a video documentary on Tesla's self-driving effort. I don't know where it was originally broadcast, but I saw an upload of it on YouTube. Being interested in SDC technology, I watched the entire documentary. Basically, it told how Musk overpromised and hyped up the capabilities of Tesla's Autopilot, and then FSD (Full Self Driving), to sell cars.

I saw nothing that I or anyone else who has been watching the news did not already know about, but some Tesla ex-engineers confirmed two things that I have always been suspicious of.

1) Misleading Promo Video:

Back in 2016 or so, Tesla's Web site showed a video in which a Tesla car demonstrated a run with full autonomy. The car left a home garage, drove itself to an office building, and parked itself. Totally hands-off. Very impressive.

Musk said the Model 3 would have the necessary hardware all built-in, and any other car would be obsolete in a few years like horses. At the same time, or shortly after, Musk said private Tesla owners could sign up for their cars to make autonomous taxi runs and earn lots of money.

The ex-engineers said that they ran lap after lap in order to get enough good footage to edit into the final cut. The car even hit a fence at one point, and they just patched it up and continued.

That video was so impressive, yet six years later the final product has still not been released. However, the video served its purpose: it generated good sales of the Model 3 to keep the company alive.

I just went on Tesla's Web site to look for this video, and have not found it. They either moved it elsewhere or deleted it.


2) Tesla's advantage of having 100K's of cars collecting data in real life:

The ex-engineer said it was all bogus, because the production cars had neither the hardware nor the software to collect and report the data needed to understand the shortcomings and fix them. Musk said that with fleet learning, when one car made a mistake, the rest would learn from it. That is totally bogus, as I suspected.

Back then, I said that Waymo and other SDC developers had fewer cars out testing their software, but those test cars were fully instrumented and could capture high-bandwidth video along with live sensor readings for the engineers to analyze later. This has always been how vehicle developers tune their designs: by using special test vehicles with heavy instrumentation carried onboard, not regular production vehicles. And that is for engine tuning, suspension tuning, etc., which is a lot simpler than tuning something as complex as a self-driving system.

Now, with the beta release of FSD, Tesla has put in a soft button the driver can use to report an anomalous FSD action. I am curious what data actually gets sent back to Tesla. Perhaps all it is used for is counting how many times a day Tesla owners encounter an unsafe situation or narrowly avert a crash.

Indeed, I watched many YouTube videos posted by Tesla owners who were allowed to download the FSD beta software, and I saw the cars making the same mistakes again and again across several software releases, even though the owners kept pressing the button to report each instance. It was just a "placebo" button. :)


PS. By the way, one of the YouTubers was a Tesla employee. He posted many videos in which he was able to react and override the self-driving system when it was about to hit a barricade, head down a rail track, etc... But after he posted the video of his car hitting a bollard because he was not quick enough that last time, Tesla fired him. Free speech, and certainly showing the truth, are not allowed at Tesla. :)



Part 1:

Part 2:
 
It was a great piece. I need to re-watch it, though. I’m not sure any vehicle would’ve been able to avoid running under those semi-trailers. Two similar accidents. It was disappointing that no Gov’t Agency was willing or able to prevent repeat tragedies.
 
It was a great piece. I need to re-watch it, though. I’m not sure any vehicle would’ve been able to avoid running under those semi-trailers. Two similar accidents. It was disappointing that no Gov’t Agency was willing or able to prevent repeat tragedies.

Ugh, the videos whose links I shared were pirated to post on YouTube. They have been taken down.

About self-driving cars running under semi-trailers, I am quite certain that cars with lidars will not do that. Lidars will see a solid object in front. A computer is needed to interpret that huge solid object as a truck, or a wall, or a hovering UFO, but it is straightforward logic to stomp on the brake. No object identification is really needed.
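To illustrate what I mean by "no object identification needed", here is a bare-bones sketch (my own illustration, with made-up parameter names and thresholds, not any carmaker's actual code). All it asks is whether enough lidar points sit in the corridor ahead and whether we would reach them soon:

```python
# Bare-bones illustration (my own, not any manufacturer's code) of braking
# on raw lidar returns alone: if enough points sit in the corridor the car
# is about to sweep through, and we would reach them within a few seconds,
# brake. No need to decide whether it is a truck, a wall, or a UFO.

def should_brake(lidar_points, ego_speed_mps,
                 corridor_half_width_m=1.5, max_height_m=3.0,
                 min_points=20, ttc_threshold_s=3.0):
    """lidar_points: iterable of (x, y, z) in meters; x forward, y left, z up."""
    # Keep only returns inside the forward corridor, above the road surface.
    ahead = [p for p in lidar_points
             if p[0] > 0.0
             and abs(p[1]) < corridor_half_width_m
             and 0.2 < p[2] < max_height_m]
    if len(ahead) < min_points:           # too few points: probably noise
        return False
    nearest_m = min(p[0] for p in ahead)  # closest solid return straight ahead
    # Treat the object as stationary (like a trailer across the road).
    time_to_collision_s = nearest_m / max(ego_speed_mps, 0.1)
    return time_to_collision_s < ttc_threshold_s
```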

On the other hand, cars with only vision cameras have to identify all the objects in the field of view and deduce or compute the distance to them. Getting depth data from vision is a tough problem, and Tesla back then did not have software or hardware powerful enough for the job. Tesla has since done quite a bit more work on it.

By the way, the Tesla cars that drove under the semi-trailers and had their tops sheared off did have radar sensors installed. So how did the radar fail to detect a huge metallic object? Easily. These radars don't work the way traditional dish radars track aircraft. There are many different kinds of radar, and the type used in cars is cheap and low-resolution. On top of that, the driving software typically discards radar returns from objects that appear stationary, so the car doesn't brake for every overhead sign and bridge; a trailer sitting across the road can be thrown out by that same filter.
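A rough sketch of the kind of filtering I mean (my assumption about how these cheap units get used, not Tesla's actual code): the radar reports a range and a closing speed for each target, and targets that appear to be standing still in the world get thrown out.

```python
# My assumption of how a cheap automotive radar gets used (not Tesla's code):
# targets whose closing speed roughly equals our own road speed are standing
# still in the world frame -- usually signs, bridges, or guardrails -- and are
# discarded as clutter. A trailer parked across the lane falls into the same
# bucket and gets discarded too.

def moving_targets_only(targets, ego_speed_mps, tolerance_mps=1.0):
    """targets: list of dicts with 'range_m' and 'closing_speed_mps'
    (positive closing speed means we are approaching the target)."""
    kept = []
    for t in targets:
        ground_speed = ego_speed_mps - t["closing_speed_mps"]  # target's own speed
        if abs(ground_speed) > tolerance_mps:
            kept.append(t)   # a moving vehicle: worth tracking
        # else: treated as roadside clutter and ignored
    return kept
```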
 
Ugh, the videos whose links I shared were pirated to post on YouTube. They have been taken down.

I typically don't consume many NYT products, so watching it is low on my viewing list, but it sounds good enough I may have to find time.

As a public service to those who do like their products, I will mention that Hulu has the piece. Lots of people have Hulu and don't check for new stuff. You can always also try a free trial.

It is also on FX. If you have a cable subscription, I think you can watch it using your credentials on the FX web page.

These are all legal and give NYT their advertising dollars.
 
I typically don't consume many NYT products, so watching it is low on my viewing list, but it sounds good enough I may have to find time.

As a public service to those who do like their products, I will mention that Hulu has the piece. Lots of people have Hulu and don't check for new stuff. You can always also try a free trial.

It is also on FX. If you have a cable subscription, I think you can watch it using your credentials on the FX web page.

These are all legal and give NYT their advertising dollars.

Thanks. We don't have cable, but we do have Sling TV (DW uses it; I could hardly remember the name!) and FX shows up there. I somehow found "Elon Musk's Crash Course" - it shows as S2:E1. Does that mean there is another video for S1, or was S1 a different subject? I don't know my way around the interface, so I can't see the other seasons.

IIRC, it was two videos on YouTube, so was that just split for time, or are there two that I should look for? This one is 1 hr 15 min, so I hope it's only one!

OK, now I'm curious about the timing from the NYT. Musk has stepped on some toes lately with his Twitter takeover attempt (or PR stunt - could have the same effect?). Is this retribution from some media outlets? I thought Musk was adored by this group, so why the sudden negative PR? Perhaps this was put together before any of that, but maybe it wouldn't have been released otherwise? Who knows, but I wonder?

-ERD50
 
Ugh, the videos whose links I shared were pirated to post on YouTube. They have been taken down.

About self-driving cars running under semi-trailers, I am quite certain that cars with lidars will not do that. Lidars will see a solid object in front. A computer is needed to interpret that huge solid object as a truck, or a wall, or a hovering UFO, but it is straightforward logic to stomp on the brake. No object identification is really needed.




It’s one thing to be certain that lidar will ‘see’ an object, but that doesn’t mean the vehicle will have time to stop.
 
Thanks. We don't have cable, but we do have Sling TV (DW uses it; I could hardly remember the name!) and FX shows up there. I somehow found "Elon Musk's Crash Course" - it shows as S2:E1. Does that mean there is another video for S1, or was S1 a different subject? I don't know my way around the interface, so I can't see the other seasons.

IIRC, it was two videos on YouTube, so was that just split for time, or are there two that I should look for? This one is 1 hr 15 min, so I hope it's only one!

It is a series called "New York Times Presents". Season 1 had 11 episodes that cover various social and political themes of interest to the NYT's audience.

This Musk piece is the first episode of season 2.


OK, now I'm curious about the timing from the NYT. Musk has stepped on some toes lately with his Twitter takeover attempt (or PR stunt - could have the same effect?). Is this retribution from some media outlets? I thought Musk was adored by this group, so why the sudden negative PR? Perhaps this was put together before any of that, but maybe it wouldn't have been released otherwise? Who knows, but I wonder?

-ERD50

It seems timed to me and clearly a clap back at Musk. Although I will likely watch it, I'll watch it through the lens of the NYT's bias.
 
Musk said the Model 3 would have the necessary hardware all built-in, and any other car would be obsolete in a few years like horses.

How interesting Musk mentioned horses being obsolete. My dad worked a lot of j*bs in the depression, sending money home to his mom and younger siblings. One j*b he took in Texas was rounding up stray cattle as a cowboy. He really knew little about horses, but he needed the w*rk. One day, early in his tenure, he'd been out all day and realized he had no idea where he was nor how to get back to the bunk house. So he threw down the reins, grabbed the saddle horn and held on as the horse took him home at a near gallop. That was almost 90 years ago. ymmv
 
It’s one thing to be certain that lidar will ‘see’ an object, but that doesn’t mean the vehicle will have time to stop.


Waymo said that its lidar can see 1,500 ft ahead. In these cases, there was plenty of time for the cars to stop.
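As a rough sanity check on that (my own back-of-the-envelope numbers, assuming 70 mph and ordinary braking on dry pavement, not Waymo's figures):

```python
# Back-of-the-envelope check: is 1,500 ft of lidar range enough to stop from
# highway speed? Assumptions are mine: 70 mph, 0.6 g of braking, 1 s of delay.

speed_fps = 70.0 * 5280 / 3600              # ~103 ft/s at 70 mph
decel_fps2 = 0.6 * 32.2                     # 0.6 g ~= 19.3 ft/s^2

reaction_dist_ft = speed_fps * 1.0          # distance covered during a 1 s delay
braking_dist_ft = speed_fps**2 / (2 * decel_fps2)   # v^2 / (2a), ~273 ft

print(f"Total stopping distance: about {reaction_dist_ft + braking_dist_ft:.0f} ft "
      "versus 1,500 ft of claimed lidar range")      # roughly 375 ft
```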

In both cases, there were absolutely no skid marks. This means the self-driving cars did not see the trailers at all. No attempt to brake.

I followed the first case very closely when it was first reported. The car kept going for a few hundred feet after the impact. Obviously, there was no sensor to tell the car that its top had been lopped off, and there was no software logic to say "if the car's top gets scraped off, then brake". So the car kept on going.

The windshield-mounted camera was obliterated, but perhaps it took a while for the "brain" to accept that its eye was gone.


PS. Even recently, in the videos released by FSD beta testers with the more advanced hardware and software, there were instances where the cars were about to drive into concrete columns, walls, bollards, road barricades, what have you. It was only by being on their toes that the drivers were able to yank the steering wheel to avert a collision.

The camera sees the objects, but the "brain" does not recognize them as obstacles. It's like you and me looking at the Chinese characters on a posted sign: the words may say "Stop or we will shoot", but we do not understand them.
 
How interesting Musk mentioned horses being obsolete. My dad worked a lot of j*bs in the depression, sending money home to his mom and younger siblings. One j*b he took in Texas was rounding up stray cattle as a cowboy. He really knew little about horses, but he needed the w*rk. One day, early in his tenure, he'd been out all day and realized he had no idea where he was nor how to get back to the bunk house. So he threw down the reins, grabbed the saddle horn and held on as the horse took him home at a near gallop. That was almost 90 years ago. ymmv

Horses are not "brainless". :)
 
Speaking of horses: when I was a kid, my grandfather used to pick me up at the railway station when I got parked at their farm for the summer. He usually was well "under the weather". After I got up on the wagon, he would tell the horses to go home, and promptly fell asleep. The horses did take us home, several miles on country roads. After getting to the farm, they would stop and kick the wagon a few times to wake him up.
 
By the way, the Tesla cars that drove under the semi-trailers and had their tops sheared off did have radar sensors installed. So how did the radar fail to detect a huge metallic object? Easily. These radars don't work the way traditional dish radars track aircraft. There are many different kinds of radar, and the type used in cars is cheap and low-resolution. On top of that, the driving software typically discards radar returns from objects that appear stationary, so the car doesn't brake for every overhead sign and bridge; a trailer sitting across the road can be thrown out by that same filter.

Tesla took the radars out of the Models 3 and Y last year, and out of the S and X early this year.
 
Tesla took the radars out of the Models 3 and Y last year, and out of the S and X early this year.

Yes.

Until the radar was removed, Tesla touted how useful it was: the radar signal could bounce underneath the car immediately ahead, and the system could read the distance and speed of the next car beyond it.

Quite a few Tesla owners reported that this feature worked very well, giving an earlier warning when the motorist immediately ahead of them failed to pay attention and slow down. If you only watch the car directly ahead, you may not stop in time either; by looking beyond it, the system has more time to slow down.
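To put some toy numbers on that advantage (my own illustrative figures, not Tesla's logic):

```python
# Toy illustration (my own numbers) of why seeing the car two ahead helps.
# Everyone is doing about 65 mph and the car two ahead brakes hard. A driver
# watching only the lead car's brake lights reacts ~1.5 s later; a system that
# already tracks the braking car beyond it can start slowing almost at once.

speed_mps = 29.0              # ~65 mph
human_reaction_s = 1.5        # waiting for the lead car's brake lights
lookahead_reaction_s = 0.2    # radar already watching the car two ahead

def distance_covered_before_braking(reaction_s):
    return speed_mps * reaction_s   # ground covered at full speed before braking

print(f"Brake-light follower: {distance_covered_before_braking(human_reaction_s):.0f} m "
      f"gone before braking; look-ahead system: "
      f"{distance_covered_before_braking(lookahead_reaction_s):.0f} m")
# ~44 m versus ~6 m -- nearly 38 m of extra margin.
```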

This was a feature unique to Tesla cars, and I don't know whether other carmakers have implemented it (the radar itself is off-the-shelf and available to multiple carmakers). I thought it was pretty neat.

Then, boom, Tesla declared the radar useless and removed it, saying it often caused nuisance alerts and false detections and was not as reliable as the vision cameras. And this may be true. When the two disagreed, they trusted the vision cameras more.

PS. My GM car has no radar and uses a single camera mounted at the rearview mirror to provide forward collision alert. It seems to work; it alerted when a car ahead changed lanes and cut in front of me.
 
Nobody's interested in self-driving cars anymore? :)

I admit, after a while searching the Web for development news from different makers, and watching YouTube videos, I got a bit worn out. Difficult things like SDC are not going to happen overnight, and public attention wanes.

But, last night I happened to see a Web headline about a Tesla rear-ending a motorcyclist and killing him in California on July 7. I searched the Web for more info to see if the Tesla driver engaged autopilot and dozed off or what.

What I found instead was another Tesla rear-ending a motorcyclist in Utah just yesterday Sunday 7/24, also killing him. This time, the Tesla driver admitted to the cops that the autopilot was engaged (and he was not paying attention obviously).

Aye, aye, aye...

So, how is Tesla FSD doing? After looking at YouTube videos posted by beta testers and not seeing much improvement from version to version, I got bored and stopped following them. I looked again for a recent video, and the following is the first one I found.

Tesla FSD performance is just as goofy as ever. At 11:45, it tried to drive into a totally barricaded street. Aye, aye, aye... I shared a video last year where Tesla FSD was about to drive into a barricade, and it is still doing that!

If it cannot see BIG barricades, anyone can understand it may miss a motorcyclist ahead. Sad!

PS. Read the comments to the video too. It tells you much about the current state of the Tesla "robotaxis" promised back in 2016.


 
I have to admit I’m less optimistic now than I was 5 years ago, though I won’t discount the possibility of Level 5 SDCs. If you just search “self driving will never work” there are lots of interesting articles on the subject. One of the articles noted how much society had to change for trains, and then for cars versus horses - not insignificant, and the SDC transition may require changes on a similar scale.

SDC on highways under typical circumstances is one thing; that's being done successfully in controlled situations now.

Dealing with the almost infinite variables of city/congested driving is quite another. Many stationary and (unpredictable) moving obstacles in endless combinations. Endless construction situations, just dealing with flag “persons.” Visibility from near total darkness to looking directly into the sun. Inconsistent road markings, signs, lights, signals, dynamic road rules. Unpredictable human drivers defying logic - every day! Weather including ice and snow eliminating road markings and visibility - none of the high profile SDC projects seem to be grappling with that. Balancing caution and assertiveness for a SDC. We could all go on and on. Automating chaos…

Tesla and Waymo got off to a great start, but progress seems to have slowed to a crawl, despite Musk’s “optimistic spin?”

Funny example?
Traffic is hell, but what if the cars clogging up the roadways are all robots? That’s what some residents on one quiet street in San Francisco have discovered, with a seemingly endless parade of autonomous vehicles from Waymo driving down a dead-end street in the city’s residential Richmond district, turning around, and driving away.

“There are some days where it can be up to 50,” resident Jennifer King told KPIX 5 in this hilarious scene report. “It’s literally every five minutes. And we’re all working from home, so this is what we hear.”

Waymo is still using safety drivers for its tests in San Francisco. In the video, you can see Waymo’s safety drivers turning the steering wheels, which suggests that the vehicles haven’t quite mastered the [K] turn.
https://www.theverge.com/2021/10/14/22726534/waymo-autonomous-vehicles-stuck-san-francisco-dead-end
 
And what is "self-driving" anyhow? I'd love to be driven about by "James," my autopilot, while I'm reading a book and sipping a drink - but that's not what's meant. It's really that we are sitting upright in the driver's seat, our hands hovering, or resting lightly on the wheel (and it takes effort to hold the hands off, even lightly, of the wheel). It means the foot is doing the same - somewhere - either above the accelerator or the brake. (I asked the AARP driving instructor WTH one did with the foot when the partial self driving features - cruise control, lane keeping, etc. - as I found it uncomfortable to "hover" the foot above the pedals, and resting the foot on the floor didn't seem to offer the instant reaction time I'd like. He basically didn't seem to understand my question, so maybe it's peculiar to me.

But - it's not my idea of self-driving if the driver is fully engaged and attentive, expected to watch the road as ever.
 
"never" is too absolute. A bunch of us engineers were pushing back 5 years ago because we've seen stuff, and the 5 year plan was clearly a marketing idea that caught incredible viral spread.

Now that time has passed, we are closer. Progress will continue to be made. It takes time.
 
My neighbor once said he wanted a self driving car so he planned to buy more lottery tickets and then hire a full time chauffeur. :D
 
And what is "self-driving" anyhow? I'd love to be driven about by "James," my autopilot, while I'm reading a book and sipping a drink - but that's not what's meant. It's really that we are sitting upright in the driver's seat, our hands hovering, or resting lightly on the wheel (and it takes effort to hold the hands off, even lightly, of the wheel). It means the foot is doing the same - somewhere - either above the accelerator or the brake. (I asked the AARP driving instructor WTH one did with the foot when the partial self driving features - cruise control, lane keeping, etc. - as I found it uncomfortable to "hover" the foot above the pedals, and resting the foot on the floor didn't seem to offer the instant reaction time I'd like. He basically didn't seem to understand my question, so maybe it's peculiar to me.

But - it's not my idea of self-driving if the driver is fully engaged and attentive, expected to watch the road as ever.
It was defined at least 4 years ago. No one considers a “fully engaged and attentive” driver to be self-driving.

[Infographic: the SAE levels of driving automation, from Synopsys]
 
And what is "self-driving" anyhow? I'd love to be driven about by "James," my autopilot, while I'm reading a book and sipping a drink - but that's not what's meant. It's really that we are sitting upright in the driver's seat, our hands hovering, or resting lightly on the wheel (and it takes effort to hold the hands off, even lightly, of the wheel). It means the foot is doing the same - somewhere - either above the accelerator or the brake. (I asked the AARP driving instructor WTH one did with the foot when the partial self driving features - cruise control, lane keeping, etc. - as I found it uncomfortable to "hover" the foot above the pedals, and resting the foot on the floor didn't seem to offer the instant reaction time I'd like. He basically didn't seem to understand my question, so maybe it's peculiar to me.

But - it's not my idea of self-driving if the driver is fully engaged and attentive, expected to watch the road as ever.


I agree with you; I don't think it is fun either. If I had a Tesla with FSD (Full Self Driving), the novelty would wear off pretty fast, as several owners have attested. It's way too stressful, particularly for in-city street driving. Youngsters seeking an adrenaline rush may find it fun, but I am old enough not to find this a fun game.

Still, some people keep trying it and posting YouTube videos about their experience. It's good to see how it works; otherwise we would not know.

What is scary is the ignoramuses who really think the system is already truly FSD (Level 4) and kill other people or themselves with the half-baked product.

What I find most disheartening is that after so much hype, the car still cannot recognize a road barricade. And even in a highway setting, the car still does not recognize a motorcyclist. Actually, it does, but not reliably. With just one failure out of 10, or 50, or 100 encounters, you are going to kill that poor motorcyclist.

And it has been known to head for concrete columns, turn left in front of a bus, etc... Not always, but once in a while. Once is enough to get you killed. Plenty of examples on YouTube.
 
Keeping a foot hovering or hands held ready on the wheel sounds fatiguing. Pellice, your question made perfect sense to me.

This is different from switching to an automatic transmission from a manual. It isn't like you have a clutch you need to use every once in a while; it is just gone. Some manufacturers thought it would be fun to put the high-beam switch there to give the left foot something to do. Eventually, that idea was jettisoned too. If you don't need the left foot, you don't need it.
 
I suppose if folks want self driving enough, we'll eventually get it. Having said that, it looks like it's gonna be a lot harder than we thought. The other issue is that folks make no allowance for errors. While we have accidents all the time with "people" in control, it appears we will not tolerate a single error in self-driving cars. Each time an accident occurs when a "self-driving" car is involved, it's national news. That's a major hurdle to overcome. YMMV
 
I suppose if folks want self driving enough, we'll eventually get it. Having said that, it looks like it's gonna be a lot harder than we thought. The other issue is that folks make no allowance for errors. While we have accidents all the time with "people" in control, it appears we will not tolerate a single error in self-driving cars. Each time an accident occurs when a "self-driving" car is involved, it's national news. That's a major hurdle to overcome. YMMV


While the above may be true, the current breed of SDCs is worse than human drivers, and has a long way to go. You have to be a new student driver to do worse than the current Tesla SDC.

Tesla SDC is a lot worse than a human driver. The reason Tesla SDC has not killed more people is that most of the errors were noticed and overridden in time by the human driver. Plenty of real-life experience has shown that; one only needs to look on YouTube.

The other SDCs, such as Waymo, Cruise, and Mobileye, seem to be better, as demonstrated by their robotaxis already cruising in some cities. However, these SDCs are geo-fenced, meaning they only operate in areas the developers have already mapped and validated. If turned loose in a random area or in inclement weather, they will do worse. How much worse is anybody's guess. These developers are careful not to let the public try their half-baked products, so we just don't know.

Would the lidars of the other SDCs, which Tesla cars do not have, keep them from running into barricades or rear-ending a motorcyclist like Tesla SDC does? I think lidars will help tremendously, but this has to be proven.
 