Let's talk Self Driving Cars again!

I disagree. Given that our solar photovoltaic system overproduces energy relative to our home car charging (almost 100% of our charging) I believe I can claim that our cars are solar powered. After an "offset" period, to cover production emissions, we get to point where we are emission free. On top of that, the cars have proven to be almost maintenance free as well. I don't miss the gas station or oil changes one bit.
Do you take advantage of net metering? If so, you’re not quite zero emissions, but you’re much closer than many of the people who mistakenly claim EVs are zero emissions. Congrats on that. If you’re completely off grid, your EV is zero emission - I suspect that’s a small population.
 
I disagree. Given that our solar photovoltaic system overproduces energy relative to our home car charging (almost 100% of our charging) I believe I can claim that our cars are solar powered. After an "offset" period, to cover production emissions, we get to point where we are emission free. On top of that, the cars have proven to be almost maintenance free as well. I don't miss the gas station or oil changes one bit.

This is one of those things that appears to be true, so it is widely accepted. But if you peel the onion back a layer, you can see the fallacy in it.

Here's the disconnect - your solar panels don't exist in a vacuum, you need to look at the larger picture. If they were not charging your EV, they would be feeding the grid. For simplicity of the illustration, let's say your average monthly solar kWh production is equal to your average monthly EV kWh usage. So they net out. Fine. Let's just call that 1 unit of electrical energy.

So let's analyze the scenarios where you have your panels w/o the EV, and the EV w/o the panels.

Scenario 1: You installed the panels before you bought the EV. You are sending 1 unit of energy to the grid each month. The fossil fuel plants on that grid run a little less. OK, that's good.

A few months later, you buy the EV. Now you use all the solar you produce. The fossil plants on the grid no longer run any less than they did before you installed the panels. So whatever fossil fuel savings the grid experienced are gone; it is back to the pre-panels-and-EV level. In effect, the EV is running on fossil fuel.


Scenario 2: You buy the EV before you installed the panels. You are drawing 1 extra unit of energy from the grid each month. The fossil fuel plants on that grid run a little more to power your EV. The EV is running on fossil fuel.

A few months later, you add the panels. You have now offset your demand, and the grid is back to where it was before you added the panels and the EV. The panels saved the fossil fuel, not your EV.

So the analysis is:

A) Solar panels reduce fossil fuel use on the grid.
B) EVs increase fossil fuel use on the grid.

There really is no connection between the two; each does its own thing.
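The two scenarios can be condensed into a bit of back-of-envelope accounting. This is a minimal sketch using the post's own "1 unit" simplification and its own premise that marginal grid demand is met by fossil plants; it is not a model of any real grid:

```python
# Back-of-envelope marginal accounting for the scenarios above.
# Quantities are in the post's abstract "units" of monthly energy.
# Premise (from the post, not measured): extra grid demand is met
# by fossil fuel plants, and displaced demand idles them.

def grid_fossil_delta(has_panels: bool, has_ev: bool) -> int:
    """Change in monthly fossil generation vs. the no-panels, no-EV baseline."""
    delta = 0
    if has_panels:
        delta -= 1   # 1 unit of solar displaces 1 unit of fossil output
    if has_ev:
        delta += 1   # 1 unit of EV charging adds 1 unit of demand
    return delta

assert grid_fossil_delta(False, False) == 0    # baseline
assert grid_fossil_delta(True,  False) == -1   # panels alone: fossil use drops
assert grid_fossil_delta(False, True)  == +1   # EV alone: fossil use rises
assert grid_fossil_delta(True,  True)  == 0    # both: right back to baseline
```

Whichever order you add them in, the end state is the same: the panels subtract a unit, the EV adds one, and the combination nets to the baseline.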

Since I can have solar panels and drive a non-EV, what it boils down to is: does an EV, powered by fossil fuel on the grid (the only real source we have for extra demand), really produce less pollution than an ICE/hybrid? And since the people involved are considering the environment, I think the comparison should be to a modern high-mpg hybrid, not the "fleet average" that so many comparisons use.

-ERD50
 
Last edited:
I will disagree also. The sun overproduced energy for hundreds of millions of years, terraforming the earth into a steamy green mess. Only the occasional meteor strike cooled things off enough to make the planet habitable again. Hundreds of millions of years of organic crud was eventually crushed via plate tectonics from residual heat and convection energy trapped during the Earth's formation. After the invention and commercialization of steel, explosives, and drilling/mining... Today I can access this overproduced energy for almost nothing, since tax costs exceed material costs in many cases. I'm not a fan of the maintenance care and feeding costs, but I enjoy my zero emission vehicles.
 
I disagree. Given that our solar photovoltaic system overproduces energy relative to our home car charging (almost 100% of our charging) I believe I can claim that our cars are solar powered. After an "offset" period, to cover production emissions, we get to point where we are emission free. On top of that, the cars have proven to be almost maintenance free as well. I don't miss the gas station or oil changes one bit.

Okay, there is at least ONE zero emission car (YOURS:flowers:)

On the Big Island, my son's Tesla was going to be zero emissions too, but he found the sun didn't always cooperate, then the neighborhood vandals stole his equipment and then he moved to a location without solar panels back on the mainland. Honestly, I'm not poo-poo'ing the idea of zero emissions vehicles. It's a worthy goal. It's just not always a straight line from A to B. "Life" sometimes interferes with our best laid plans. I think, in general, we should celebrate our victories along the path of being less and less dependent upon fossil fuels. Congratulations on giving us a "win!"
 
My apologies. I was replying to Music Lover's comment about ZEVs being a fallacy. For us, the switch to EVs was simultaneous with the installation of our solar system, which uses an inverter that is also a car charger (DC-to-DC direct charging when the sun is shining). We are also on a mostly nuclear grid. My point is that whatever charging we need is at least offset by our contribution to the grid.

I do love the FSD in our Tesla, but like the LEAF just as much. When FSD is a finished product I hope it delivers everything I need from it.
 
I just saw this news article that relates to this thread.

Amazon.com Inc. has placed an order for 1,000 autonomous driving systems from self-driving truck technology startup Plus and has acquired the option to buy a stake of as much as 20%, Plus said in a regulatory filing, confirming an earlier Bloomberg report...

Founded by a group of Stanford University classmates in 2016, Plus is backed by investors including Shanghai Automotive Industry Corp., GSR Ventures Management and a Chinese long-haul company known in English as Full Truck Alliance. It also has a partnership with European truckmaker Iveco SpA and is working with Cummins Inc. on using autonomous technology in trucks powered by natural gas.

I never heard of this company. The field is really crowded.

For more, see: https://finance.yahoo.com/news/amazon-talks-buy-stake-ai-115612141.html
 
Tesla released the long-awaited V9 of FSD (Full Self Driving) software download to some beta testers last Saturday.

Here's one YouTuber's experience with it. There were several driver overrides in a video of about 10 minutes, where the car made quick swerves left or right for no apparent reason. It was a good thing the driver had his hands on the steering wheel, and steered back really quick.

One viewer commented: "Super bizarre experience. It’s like driving a car that’s trying to kill you every few minutes."

It is interesting that the YouTuber commented that this V9 showed a lot of improvements over the last one despite these incidents.

Another beta tester showed a bizarre scenario where the car kept circling the block by repeatedly making right turns, instead of making a left turn according to the autorouting shown on the display map. He let the car go around the block several times before aborting. What's stranger is that it's the same route he has been using to test every software version. The day before, also with V9 FSD, the car did make the left turn, but it was not a safe one. He thought it did not see an approaching car. On this day, he tried again, and the car now refused to turn left there.

 
It will be a long time before I will pay a premium for any kind of self driving feature on any vehicle. I enjoy driving and rarely even use cruise control. Being very alert and being able to predict what others will do in a given situation is the key to an accident free driving record.
 
It will be a long time before I will pay a premium for any kind of self driving feature on any vehicle. I enjoy driving and rarely even use cruise control. Being very alert and being able to predict what others will do in a given situation is the key to an accident free driving record.

While it may be a long time until I would trust a self driving vehicle to actually drive itself, I would appreciate things like lane-management. I occasionally get distracted and begin to drift into another lane. Having a car that managed my mistakes would be a good thing. I've ridden in my son's Model 3 and it is pretty good at keeping the car between the lines without much attention from the driver. YMMV
 
Tesla released the long-awaited V9 of FSD (Full Self Driving) software download to some beta testers last Saturday.

Here's one YouTuber's experience with it. There were several driver overrides in a video of about 10 minutes, where the car made quick swerves left or right for no apparent reason. It was a good thing the driver had his hands on the steering wheel, and steered back really quick.

One viewer commented: "Super bizarre experience. It’s like driving a car that’s trying to kill you every few minutes." ...

It was nerve racking just to watch. As I've always said to the fans who just say "But if the car gets in trouble, you just take over", that puts you a step or two behind. A couple times, he waited to see if the car was going to correct itself, and one of these times, that's going to be too late. I'd rather just drive than to be constantly monitoring this system and trying to decide if I have to take over or not. It's tough enough trying to anticipate what other cars are going to do, now you have to wonder what your car is going to do too? No thanks!

And about 3 minutes in, he was too late. The car didn't know to swerve a little (like the car ahead did) to avoid those overgrown bushes. The car swiped those bushes before the driver pulled it away. The owner clearly was not happy that the car didn't care as much about its paint job as the owner did!

... It is interesting that the YouTuber commented that this V9 showed a lot of improvements over the last one despite these incidents. ...

Yes, he kept saying that this or that action was impressive or "super impressive". On one hand I agree - like I've said from the start, it's impressive what the system can do. But from the perspective of what a FSD system has to do, these "impressive" feats are just the basic, routine things we need to be able to expect to be done near-perfectly.

Wait, didn't Musk promise one million self-driving Tesla taxis by the end of last year?

https://www.thedrive.com/news/38129...a-robotaxis-by-the-end-of-2020-where-are-they

"Next year for sure, we will have over a million robotaxis on the road," said Musk on October 21, 2019. "The fleet wakes up with an over-the-air update. That's all it takes."

Pranay Pathole (@PPathole), Apr 12, 2020, replying to @elonmusk and @Tesla:

"How long for the first robotaxi release/deployment? 2023?"

Elon Musk (@elonmusk), 12:38 AM, Apr 12, 2020:

"Functionality still looking good for this year. Regulatory approval is the big unknown."

Yeah, that pesky regulatory approval. As if the car in that video is ready for a regulatory review.

-ERD50
 
While it may be a long time until I would trust a self driving vehicle to actually drive itself, I would appreciate things like lane-management. I occasionally get distracted and begin to drift into another lane. Having a car that managed my mistakes would be a good thing. I've ridden in my son's Model 3 and it is pretty good at keeping the car between the lines without much attention from the driver. YMMV

Many (most?) new cars have lane detection. My 2017 Buick Encore will beep at me if it senses I've drifted (in stereo, the speaker side matches the side you are drifting to). It does not take any action, just a warning, and that's the way I like it. It occasionally false alarms, I don't want to have to wrestle the wheel away from it doing something it shouldn't.

So far, I don't think I've ever had it warn me of anything I wasn't aware of. Maybe a few times, I've drifted enough to have it alarm while pointing out something in the distance to a passenger, but I was monitoring traffic, knew everything was safe, and the little bit of drift wasn't a problem.

Same with the front collision warning - it will alarm if it detects I'm closing the gap too fast with something in front of me (not distance alone, some combination of speed and rate of change). A few false alarms, but so far, every time it has alerted me, I was aware I was getting close, and prepared to act.
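A common way systems like this combine speed and rate of change is time-to-collision (TTC): warn when, at the current closing rate, the gap would close within some threshold. A minimal sketch of the idea follows; the 2.5-second threshold is a made-up illustrative number, not the actual calibration of any Buick system:

```python
# Illustrative time-to-collision (TTC) warning heuristic.
# A pure distance check would false-alarm constantly; dividing the gap
# by the closing speed captures "closing the gap too fast".

def should_warn(gap_m: float, closing_speed_mps: float,
                ttc_threshold_s: float = 2.5) -> bool:
    """Warn if the gap to the lead vehicle would close within the threshold."""
    if closing_speed_mps <= 0:        # not closing (or pulling away): no warning
        return False
    ttc = gap_m / closing_speed_mps   # seconds until contact at current rates
    return ttc < ttc_threshold_s

# 50 m gap closing at 10 m/s -> TTC = 5 s: no warning
assert not should_warn(50.0, 10.0)
# 20 m gap closing at 10 m/s -> TTC = 2 s: warn
assert should_warn(20.0, 10.0)
# A small gap with no closing speed (steady following) stays quiet
assert not should_warn(10.0, 0.0)
```

This also explains the occasional false alarm: a car ahead that is braking into a turn looks, for a moment, exactly like a closing hazard.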

But I appreciate the warnings, just in case I do miss something. It's nice to have a second set of "eyes" watching out for me. But I don't want it to take over. I don't think any of these systems are ready for that. Exception might be an imminent crash - OK, hit the brakes to reduce impact speed.

And if I was getting more than one actual warning I was not prepared for on a drive, that would be a sign that maybe I shouldn't be driving (tired, impaired?).

-ERD50
 
While it may be a long time until I would trust a self driving vehicle to actually drive itself, I would appreciate things like lane-management. I occasionally get distracted and begin to drift into another lane. Having a car that managed my mistakes would be a good thing. I've ridden in my son's Model 3 and it is pretty good at keeping the car between the lines without much attention from the driver. YMMV
+1

I became a believer when my Y slammed on the anti-lock brakes at 65 mph because a car had stopped in the middle of the road less than 100 yards in front of me. Without the technology I would have T-boned that guy at a high rate of speed. To be clear, that is standard Autopilot, not FSD.
 
While it may be a long time until I would trust a self driving vehicle to actually drive itself, I would appreciate things like lane-management. I occasionally get distracted and begin to drift into another lane. Having a car that managed my mistakes would be a good thing. I've ridden in my son's Model 3 and it is pretty good at keeping the car between the lines without much attention from the driver. YMMV

DW has a 2020 Lexus RX 350 that we took on a road trip to FL in April. Even though she had owned it for over 6 months, she had never used the cruise control, since all her driving had been local. She did, however, use the lane management.

Lo and behold, when you activate both on the highway, the car is basically self driving. The radar cruise control slows you down when approaching the car ahead, AND the lane management feature actually will steer the car, keeping you centered in your lane. But you must keep your hands on the wheel, or it will yell at you (well, actually just beep a loud warning).

That's as close as I need to a self driving car.
 
In March 2015, speaking at an Nvidia conference, Musk stated:

"I don't think we have to worry about autonomous cars because it's a sort of a narrow form of AI. It's not something I think is very difficult. To do autonomous driving that is to a degree much safer than a person, is much easier than people think... I almost view it like a solved problem."

And now, just prior to releasing V9 FSD software, Musk admitted in a tweet that

"Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.

Nothing has more degrees of freedom than reality."



When watching YouTube videos, I usually do not read viewer comments, except for specific cases. In this case, I do in order to see what people think. It's very interesting to see many say how good the system is, and that the self-learning system will "keep on learning and learning" and it will be a lot better soon. Hands-off driving any time now.

I don't know how these AI self-driving systems are trained, particularly Tesla FSD. But there have been several incidents of Tesla cars running into semitrailers and trucks stopped on the freeway, and these occurred over the course of a few years. Why is it so hard for Tesla to "train" its system to correct this fundamental flaw? Can't Tesla just park trucks on a speedway and test-drive the car over and over until the software recognizes the truck in its path? Why does the public have to do this "AI training" for Tesla?

As mentioned, I am very interested in this technology, and have watched many videos on different systems. There are just more videos about Tesla SDC, so I have seen more of it. I have seen a Tesla owner turn irate when he was testing the car's "Summon" mode in an empty parking lot, and it drove directly at his son and almost killed him. I have seen several videos where a Tesla failed to see a handicap parking sign in a parking lot, and would have mowed it down if the owner had not stopped the car.

All of the above happened in broad daylight, not in rain or fog. The computer simply failed to see these objects, including a child, in these cases. Supporters will say that it does see them in so many other cases. The problem is that if it fails 1 in 1,000 times, it has the potential to kill a lot more people than a human driver.
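Some rough arithmetic shows why "1 in 1,000" is nowhere near good enough. Every figure here is an assumed round number chosen for illustration, not measured data:

```python
# Back-of-envelope comparison of a 1-in-1,000 failure rate vs. an
# assumed human benchmark. All numbers are illustrative assumptions.

maneuvers_per_mile = 10          # assumed: turns, merges, stops, lane changes
failure_rate = 1 / 1000          # one botched maneuver per thousand attempts
miles_between_failures = 1 / (maneuvers_per_mile * failure_rate)

# Assumed human benchmark (order of magnitude only): roughly one
# police-reported crash per 500,000 miles driven.
human_miles_between_crashes = 500_000

print(miles_between_failures)    # a failure every 100 miles
print(human_miles_between_crashes / miles_between_failures)
```

Under these assumptions the system fails every 100 miles, on the order of thousands of times more often than the assumed human crash rate; even if only a small fraction of those failures were dangerous, it would still compare badly.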

PS. By the way, in the video that I posted above about the V9 FSD, a viewer commented that the YouTuber should not have intervened when the car acted up, in order to see if the car would correct itself at the last minute. Don't you see how idiotic people can get?
 
In March 2015, speaking at an Nvidia conference, Musk stated:

"I don't think we have to worry about autonomous cars because it's a sort of a narrow form of AI. It's not something I think is very difficult. To do autonomous driving that is to a degree much safer than a person, is much easier than people think... I almost view it like a solved problem."

And now, just prior to releasing V9 FSD software, Musk admitted in a tweet that

"Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.

Nothing has more degrees of freedom than reality."


...

There was a recent discussion on this forum of the The Dunning–Kruger effect (from wiki):

The Dunning–Kruger effect is a hypothetical cognitive bias stating that people with low ability at a task overestimate their ability.

So how much experience did Elon have with AI and Self Driving? Per Dunning–Kruger, probably very little!

-ERD50
 
+1

I became a believer when my Y slammed on the anti-lock brakes at 65 mph because a car had stopped in the middle of the road less than 100 yards in front of me. Without the technology I would have T-boned that guy at a high rate of speed. To be clear, that is standard Autopilot, not FSD.

FSD has all the features of the lower grades.

Yes, the system does work in some cases, but fails occasionally.

The above in itself is not bad, if this is a system to aid the driver, because the driver is still responsible. He knows that he has to drive.

However, if you are going to take away the steering wheel and brake pedal, or tell the driver he can go to sleep, it's an entirely different ball game.

Or if you are a robotaxi, hauling a family of 4 to the airport for their vacation trip. :)
 
In March 2015, speaking at an Nvidia conference, Musk stated:

"I don't think we have to worry about autonomous cars because it's a sort of a narrow form of AI. It's not something I think is very difficult. To do autonomous driving that is to a degree much safer than a person, is much easier than people think... I almost view it like a solved problem."

And now, just prior to releasing V9 FSD software, Musk admitted in a tweet that

"Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.

Nothing has more degrees of freedom than reality."


As mentioned, I am very interested in this technology, and have watched many videos on different systems. There are just more videos about Tesla SDC, so I have seen more of it. I have seen a Tesla owner turn irate when he was testing the car's "Summon" mode in an empty parking lot, and it drove directly at his son and almost killed him. I have seen several videos where a Tesla failed to see a handicap parking sign in a parking lot, and would have mowed it down if the owner had not stopped the car.

All of the above happened in broad daylight, not in rain or fog. The computer simply failed to see these objects, including a child, in these cases. Supporters will say that it does see them in so many other cases. The problem is that if it fails 1 in 1,000 times, it has the potential to kill a lot more people than a human driver.

PS. By the way, in the video that I posted above about the V9 FSD, a viewer commented that the YouTuber should not have intervened when the car acted up, in order to see if the car would correct itself at the last minute. Don't you see how idiotic people can get?


Self driving cars will never be perfect in our lifetime. Even now, my PC crashes occasionally, which is an inconvenience. However, when a self driving car has a hiccup, it can be life threatening. Don't get me wrong. I am for electric cars using AI technology to assist the driver. However, the driver must be ready to intervene.

This is why airplanes have pilots. A modern commercial airliner is high tech, but you must have a pilot to fly the plane in an emergency if there is a mechanical or electrical malfunction. The FAA will never state in our lifetime that pilots are no longer necessary. Similarly, people should never state that a self driving car is 100% reliable, that a driver is no longer necessary, and that the driver can fall asleep at the wheel.

You can fall asleep at the wheel if you want, but you will be endangering yourself and the public until the technology is 100% perfect, which is not going to happen soon. This is because cars will always have electrical and mechanical malfunctions. To be perfect, all cars must never have electrical and mechanical malfunctions. Yeah... right.

Until then, the early adopters are the guinea pigs. I will stick to having fun driving my human-controlled sports car during my retirement. Driving a self driving car is not my idea of having fun.
 
There was a recent discussion on this forum of the The Dunning–Kruger effect (from wiki):

So how much experience did Elon have with AI and Self Driving? Per Dunning–Kruger, probably very little!

-ERD50

A refresher on the Dunning-Kruger curve.

[Image: Dunning–Kruger effect curve]
 
My lady friend's Subaru has a lane-keeping feature, in addition to radar for following cars. I find them interesting gizmos when driving it. For the most part they are annoying features, as are the lane-departure beeps. It is more crap I need to pay attention to, rather than just the traffic. Using a dash-mounted GPS as a "heads-up speed display" in my vehicles is great. I find the best installed inventions are ABS and cruise control. The backup cameras are nice for parking within an inch of another car's bumper; otherwise I still back up using the side-view mirrors. I would find it far more stressful to monitor a "self driving" feature than to just drive.
Driving is not like aircraft piloting, where automation can fly the craft. Even then, aircraft have TCAS for monitoring other traffic, which gives each aircraft instructions on what direction to diverge. Besides, routes, altitudes, and speeds are assigned by ATC, which monitors separation, and long-haul flights rarely have conflicting traffic.
 
Self driving cars will never be perfect in our lifetime. Even now, my PC crashes occasionally, which is an inconvenience. However, when a self driving car has a hiccup, it can be life threatening. Don't get me wrong. I am for electric cars using AI technology to assist the driver. However, the driver must be ready to intervene.

This is why airplanes have pilots. A modern commercial airliner is high tech, but you must have a pilot to fly the plane in an emergency if there is a mechanical or electrical malfunction. The FAA will never state in our lifetime that pilots are no longer necessary. Similarly, people should never state that a self driving car is 100% reliable, that a driver is no longer necessary, and that the driver can fall asleep at the wheel.

You can fall asleep at the wheel if you want, but you will be endangering yourself and the public until the technology is 100% perfect, which is not going to happen soon. This is because cars will always have electrical and mechanical malfunctions. To be perfect, all cars must never have electrical and mechanical malfunctions. Yeah... right.

Until then, the early adopters are the guinea pigs. I will stick to having fun driving my human-controlled sports car during my retirement. Driving a self driving car is not my idea of having fun.


Nothing in life is perfect. Most things we use are just "good enough".

SDC technology does not have to be perfect either. But I don't see anyone who will step up and say that the system in the above video is close to "good enough" for a robotaxi.

PS. By the way, the FAA has regulations about what "good enough" means for aircraft. On the other hand, the NHTSA has not set down any guidelines on SDCs. The technology is still too new. Anything goes, and a manufacturer can claim anything. If the driver dies, that's tough luck; just a sacrifice for technological advancement. If the driver, or should I say non-driver, kills a bystander, let the driver be sued.
 
Tesla released the long-awaited V9 of FSD (Full Self Driving) software download to some beta testers last Saturday.

Here's one YouTuber's experience with it. There were several driver overrides in a video of about 10 minutes, where the car made quick swerves left or right for no apparent reason. It was a good thing the driver had his hands on the steering wheel, and steered back really quick.

One viewer commented: "Super bizarre experience. It’s like driving a car that’s trying to kill you every few minutes."

It is interesting that the YouTuber commented that this V9 showed a lot of improvements over the last one despite these incidents.

Another beta tester showed a bizarre scenario where the car kept circling the block by repeatedly making right turns, instead of making a left turn according to the autorouting shown on the display map. He let the car go around the block several times before aborting. What's stranger is that it's the same route he has been using to test every software version. The day before, also with V9 FSD, the car did make the left turn, but it was not a safe one. He thought it did not see an approaching car. On this day, he tried again, and the car now refused to turn left there.




Watching the above YouTube video again, I noticed an error by the FSD vision system.

At 2:30, the traffic lights were out, and the car slowed down a bit. The YouTuber commented that FSD acted properly and treated the dead lights as a stop. However, that was not true.

The car slowed down but did not stop. And on the screen, you can see that it saw the lights as being green! The system misread the lights! This could have severe ramifications in different circumstances.
 
And here's another video from a beta FSD tester. The car was about to run into a barrier, and was overridden by the driver at 2:50.

Later, when traffic was slow, it attempted several times to overtake the car in front, even trying to squeeze in front of a bus, as shown by the planned path on the screen. And it made a bad left turn into the lane of oncoming traffic.

When encountering a road closed by barricades, one time it knew to reroute, but another time it headed right for the barricades.

 
Here's another FSD beta tester, this one in Chicago. V9 does not seem to have much improvement over V8.2 in driving skills, other than a better UI to show the driving situation on the display screen.

The car has a lot of problems dealing with cars parked on both sides of the road. Beta testers out in the countryside or on the highway have a lot better luck, but then again, that is the kind of driving Tesla has already been doing reasonably well.

The urban driving environment is tough, even though we do not see really heavy traffic in any of these videos. That's why Waymo said, as long as five years ago, that it had done plenty of highway testing and would focus on city testing only.

Note that we are still looking at perfect daylight driving conditions. Nothing yet about rain, snow, fog, dust storm, etc...

 
Here's another FSD beta tester, this one in Chicago. ...

While these beta testers are somewhat balanced (not total fan-boys), you can see their bias coming through in these videos. They give FSD way too much credit, and don't see the reality.

I haven't watched the whole thing, but in the Chicago video, at 6:00 that bus had its signal on and its brake lights flashing, and the Tesla was still trying to squeeze in. Really bad. And the driver tried to say the car would have handled it, that he was just being cautious with his intervention. BS. Sounds like he has been bitten by the Musk version of Steve Jobs' "Reality Distortion Field".

I've spent a fair amount of time driving in Chicago, and you don't mess with CTA buses. You give them wide berth. They will pull out and you better move. In their defense, driving that big bus on a crowded city street is a major challenge - if they yielded the ROW and drove "politely" all the time, they'd be sitting there forever. Passengers would get out and walk! And in this case, the bus driver actually did use his signal (that's not a given either, though the bus drivers I think are pretty good about it).

Or, if you are an aggressive/reckless driver, you anticipate that bus lane change, and accelerate like a bat out of hell to clear it before it can cut you off. But you don't meander in the space that bus wants to occupy - the bus will win!


You made a really great observation in the previous video. That guy compliments the Tesla for stopping at and recognizing the dead traffic signals, but as you point out, it did neither. It was a "rolling stop", and it saw the lights as green. FYI, I've never seen an "X" on a traffic light - when they are down around here, they flash red (drivers treat it the same as a stop sign), or someone might manually drop down a STOP sign that is permanently mounted to the pole but normally folded up.

These drivers give the Tesla way too much credit, IMO.

-ERD50
 