Thoughts on TESLA

Status
Not open for further replies.
How does any car brand's autopilot make and process eye contact with another driver or pedestrian? I am not beating on Tesla; the issue is generic.

My qualification to ask: 45 years of driving and over 1 million miles with no at-fault accidents.

Defensive driving often comes down to subtle human traits and behaviors.


In a presentation at an auto expo perhaps a year ago, when Waymo announced the testing of their Pacifica fleet in Phoenix, AZ, the CEO revealed their in-house lidar/radar/vision-camera sensor suite for the first time. I just happened to see a video of it on YouTube.

He showed that their lidar had superior resolution, far better than anything on the market. He demonstrated that the resolution was good enough to "see" the facial features of a pedestrian from a distance I cannot remember. And yes, anyone could definitely make out the pedestrian's nose and ears on the lidar scan.

He then added that being able to tell which way the pedestrian was facing would help determine his intention, namely whether he was going to step off the sidewalk.

I was of course impressed with the sensor performance. I have not seen anything more about this subject, but then I have not been looking on the Web.

I don't know if anyone has claimed to be able to do the same with vision cameras, but Waymo has good cameras too, and they rely heavily on the lidar.

Well, if the pedestrian has a ski mask on, it is going to be tough for any sensor (or person). :)

PS. Note that lidar will not be able to "see" through the windshield to scan the face of a driver. Back to square one with the vision cameras. :)
 
In a presentation at an auto expo perhaps a year ago, when Waymo announced the testing of their Pacifica fleet in Phoenix, AZ, the CEO revealed their in-house lidar/radar/vision-camera sensor suite for the first time...

I have no doubt about the system's ability to see facial features on a pedestrian.

How does the self-driving system use non-verbal communication (driver to driver, or driver to pedestrian)? Remember your driver training back in high school, and the crossing guard's admonishment: "DO YOU HAVE EYE-TO-EYE CONTACT?"

Maybe the self-driving car needs an android driver that can express its intention to humans outside the vehicle?
 
I have no doubt about the system's ability to see facial features on a pedestrian...

Eh, don't be so sure. Remember that Tesla cars do not detect parked cars reliably, let alone pedestrian faces. Nothing is as easy as it seems.

... How does the self-driving system use non-verbal communication (driver to driver, or driver to pedestrian)? Remember your driver training back in high school, and the crossing guard's admonishment: "DO YOU HAVE EYE-TO-EYE CONTACT?"

Maybe the self-driving car needs an android driver that can express its intention to humans outside the vehicle?

A very valid question. I just happened to see an article on this subject, something that Waymo appears to be still struggling with after a year of extensive in-town road testing. I will share it if I find it again, perhaps in the other old thread about SDCs and not here.

By the way, I just learned that their test center is in my part of town. That's why, although metropolitan Phoenix is quite large, I have seen their test cars so often. Several times I saw their cars while taking a walk around my neighborhood, on quiet residential streets. Pretty weird, as there was no vehicle traffic there. I was very curious about what they were trying to test.
 
I find the autopilot's recognition of roadside structures interesting. I was working with a group of firms on object recognition about 10 years ago. At that time, they had a library of signs in a sign database.
My Tesla AutoPilot1/AP1 car uses Mobileye's cameras and initial interpretation, and did speed-sign recognition. It worked quite well, but it was complicated, as it had to read something like 10,000 types of signs, including signs from several parts of the world. Non-trivial, apparently.

My Tesla AutoPilot2/AP2 car uses GPS-based speed limits, which are not only less accurate but also slower to get updated/fixed.

High Definition (HD) maps bring functions such as high-precision localization: based on my distance to several signs/poles/etc., the car knows exactly what lane it is in, down to something like 10 cm. They also help with planning and decision making, such as knowing what is around that blind bend.

With as many Tesla cars being sold as there are now, they are 'mapping' a lot of streets/highways with HD data. Mobileye (which Intel now owns) also has its hardware in a lot of cars and collects this data.

This basic search would keep you busy for many hours. (HD maps + localization)
https://www.google.com/search?q="hd+maps"+localization
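The localization idea described above can be sketched in a few lines: given the surveyed map positions of a few signs/poles and the measured distances to them, a least-squares fit recovers the car's position. This is only a toy illustration with made-up landmark coordinates, not how any production HD-map stack actually works:

```python
import math

# Toy 2-D localization from distances to known landmarks (hypothetical
# sign/pole positions; production HD-map localization is far richer).
landmarks = [(0.0, 0.0), (20.0, 0.0), (0.0, 30.0)]      # map survey, meters
true_pos = (12.0, 7.0)                                  # where the car really is
ranges = [math.dist(true_pos, lm) for lm in landmarks]  # "sensor" distances

def locate(landmarks, ranges, guess=(1.0, 1.0), iters=50):
    """Gauss-Newton fit of position to the measured ranges."""
    x, y = guess
    for _ in range(iters):
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (lx, ly), r in zip(landmarks, ranges):
            d = math.hypot(x - lx, y - ly) or 1e-9
            jx, jy = (x - lx) / d, (y - ly) / d         # gradient of |p - lm|
            res = d - r                                 # range residual
            a11 += jx * jx; a12 += jx * jy; a22 += jy * jy
            b1 -= jx * res; b2 -= jy * res
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-12:
            break
        x += (a22 * b1 - a12 * b2) / det                # solve 2x2 normal equations
        y += (a11 * b2 - a12 * b1) / det
    return x, y

x, y = locate(landmarks, ranges)
```

With three well-spread landmarks the fit pins the position down uniquely; real systems fuse many more landmarks plus odometry, which is what gets the error down to the ~10 cm level.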
 
In one video clip where the car exited the freeway and took the ramp, commenters debated whether the AP was able to recognize the traffic light at the end of the ramp and stop, because it was not clear. Well, that's far short of Waymo's cars, as far as I could tell.
Tesla recently came out with another level of their Autopilot (v9, aka Navigate on Autopilot (NoA)) that:
a) automatically turns on your blinker and takes the exit. I use it regularly.
b) suggests lane changes when it sees that other lanes are moving X% faster than yours *and* you have your speed limit set higher than you are going *and* you have your 'temperament' (I forget the setting name) set to a certain level (disabled, mild, average, 'Mad Max').
c) suggests lanes to be in as your highway Y's/splits in 1.5-ish miles or your exit is coming up.
d) does some other subtle things.

Tesla does not officially recognize stop signs or stop lights, but has that in beta software (Elon tweeted about this recently). However, after an exit is taken, it clearly decrements your speed from the highway speed in 10-15 mph increments, finally coming to a stop. This is what your video shows; Tesla does not document it.

 
Thanks. That explains the confusion of the video poster and his commenters in that YouTube video. They were not sure what was supposed to happen, I guess.

And about this:

Not sure if this post of mine got lost; it goes along with your point.

Owner (who is an electrical engineer):
There indeed are zero fuses on the 12V side. All circuit protection is performed solid-state, which in practice means a transistor (MOSFET) is used to switch all the loads. The body controllers monitor the current going through each of these and, in the event of an overload, simply switch off the transistor. It's much faster and safer, and allows a more reliable and easier-to-diagnose car. It is more expensive, but this was needed to make the M3 fault-tolerant, to achieve full self-driving with confidence.
Quote:
Owner (who is an aero engineer):
Wow. That's possibly a bigger change to how a car works than anything else Tesla has done to date.



I am afraid the above EE is even more out-of-date than I am. :) I retired in 2012.

Back in about 2008 or so, while searching for a MOSFET suitable for a design job I was doing, I ran across an article about a part designed by a semiconductor house (I forget which) for use in automobiles as a smart fuse. I thought that was neat, but I had no application for it. I did not follow up to see which car makers were already using it, or were going to adopt it, and forgot all about it until now.

Anyway, searched the Web just now, and found that Texas Instruments and ON Semiconductor call their parts "eFuses", and STMicroelectronics calls theirs "E-fuses". EDN, a popular trade magazine, calls the part "e-fuse".

With so many semiconductor houses building this kind of part, I would think many car makers use these smart fuses by now.

PS. Another search on the Web shows that back in 2003 Volvo built a prototype fuse block that they called an "active fuse". It was done with discrete parts, meaning without the much cheaper and better chips that the semiconductor houses are now cranking out.

PPS. One should not confuse the power-protection "eFuse" with the "eFUSE" used for programming logic chips.
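For what it's worth, the trip logic of such a smart fuse is conceptually trivial; the hard engineering is in the power MOSFET and the current sensing. A hypothetical sketch of the behavior (thresholds made up for illustration, not from any real datasheet):

```python
# Hypothetical eFuse trip logic: switch the load off when current
# exceeds a limit for longer than a blanking time (to ride through
# inrush). All numbers are illustrative.
class EFuse:
    def __init__(self, limit_amps=10.0, blank_ms=5.0):
        self.limit = limit_amps     # overcurrent threshold
        self.blank = blank_ms       # how long the overload must persist
        self.over_ms = 0.0
        self.on = True

    def sample(self, amps, dt_ms=1.0):
        """Feed one current sample; returns True while the load is on."""
        if not self.on:
            return False
        if amps > self.limit:
            self.over_ms += dt_ms
            if self.over_ms >= self.blank:
                self.on = False     # trip: MOSFET switched off
        else:
            self.over_ms = 0.0      # overload must be sustained to trip
        return self.on

fuse = EFuse()
for amps in [3, 4, 25, 25, 4, 25, 25, 25, 25, 25]:
    fuse.sample(amps)
print("tripped" if not fuse.on else "still on")  # sustained overload trips it
```

Unlike a melting fuse, this "resets" for free when the brief spike passes, and the trip event can be logged for diagnostics, which is presumably the reliability/diagnosability benefit the owner above was describing.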
 
A Tesla owner says that Autopilot automatically applied the emergency brake to avoid a crash caught on his TeslaCam last week.

https://electrek.co/2018/12/22/tesla-autopilot-avoids-crash-video/




Just curious why they call it the emergency brake. :confused: Isn't it just the brake?


IOW, should it not be that the Tesla applied the brakes in this emergency situation?


The emergency brake (at least in all cars that I know of) is a secondary brake system for when the primary brakes fail... AFAIK it is cable-actuated... I do not think Tesla is using that secondary braking system...
 
The above video was reported earlier in post #1497 by Ready.

The confusion over the term "emergency brake" comes from AEB, which stands for Automatic Emergency Braking. It has been available in some cars in the US since 2006, and many new cars in 2017 have it. Automakers have pledged to have it in all of their cars by 2022.

So, one can see that AEB is separate from the autopilot or autosteering function, just as ABS is.

In the Tesla, I recall reading that they made it part of the Autopilot package. IIRC, there's some lawsuit about that. For the Autopilot's reliability in avoiding collisions, see my post #1499. :)

WASHINGTON, DC — Four of 20 automakers report that automatic emergency braking (AEB) is standard on more than half of their 2017 model year vehicles, the National Highway Traffic Safety Administration (NHTSA) and the Insurance Institute for Highway Safety (IIHS) announced in the first update of manufacturer progress toward equipping every new vehicle with the crash avoidance technology. Even without making it standard, another five automakers report that more than 30 percent of vehicles they produced in 2017 were equipped with AEB.

“The growing number of vehicles offering automated emergency braking is good news for America’s motorists and passengers,” says U.S. Department of Transportation Secretary Elaine L. Chao. “With each model year, manufacturers will increasingly utilize technology to allow vehicles to ‘see’ the world around them and navigate it more safely.”

Twenty automakers pledged to voluntarily equip virtually all new passenger vehicles by September 1, 2022, with a low-speed AEB system that includes forward collision warning (FCW), technology proven to help prevent and mitigate front-to-rear crashes. The commitment is intended to get the technology into a wider swath of the vehicle fleet faster than otherwise possible today.
 
And about this:

Quote: Owner (who is an electrical engineer):
There indeed are zero fuses on the 12v side. All circuit protection is performed solid-state, ...

Quote:
Owner (who is an aero engineer):
Wow. That's possibly a bigger change to how a car works than anything else Tesla has done to date.
I am afraid the above EE is even more out-of-date than I am. :) I retired in 2012.

....

I also wondered about any engineer who would be that impressed and put that much weight on the use of an electronic fuse (for once, I refrained from commenting! :) well, at least until you brought it up again). Another example of the fans looking at everything through rose-colored glasses.

Electronic fuses/current-limiters are very old tech. They aren't anything amazing; they're just a design choice. To say "That's possibly a bigger change to how a car works than anything else Tesla has done to date" is insulting to all the hard work the Tesla engineers have done.


A Tesla owner says that Autopilot automatically applied the emergency brake to avoid a crash caught on his TeslaCam last week.

https://electrek.co/2018/12/22/tesla-autopilot-avoids-crash-video/

I was not impressed. There were flashing lights ahead; the Tesla driver stayed in the empty blocked lane, then pulled in right behind a string of cars that were pretty close together, and you could see lots of brake lights ahead before he did so.

Any reasonably good driver would have left plenty of space ahead when coming up to that situation and changing lanes. There would be no 'emergency', no need to rely on the car's system.

I have a 'collision alert' system on my 2017 car. It does not apply the brakes (I think it does prep them, taking up slack in the pedal so they respond a bit faster if you do press the pedal); it just warns me with beeps and a red flashing heads-up light. I keep it on the most sensitive setting. It has flashed a few dozen times, but it has yet to flash when I wasn't already aware that I was coming up somewhat fast on the car ahead (except for just 2~3 false alarms), with my foot already hovering over the brake, ready to respond if the car ahead slowed and I got closer. I have never come close to actually hitting anyone this way.

-ERD50
 
? How does any car brand autopilot make and process eye contact with another driver or pedestrian. Not beating on Tesla, the issue is generic.

My qualification to ask : 45 years of driving and over 1 million miles with no at fault accidents .

Defensive driving often comes down to subtle human traits and behaviors.
There’s tons written earlier in this thread and online (just search) about how autonomous cars deal with stationary and moving obstacles, and the answer is not generic anyway: Waymo and Tesla use different tech. When in doubt, autonomous cars are programmed to use caution; they don’t guess and play chicken. And autonomous cars can learn, and teach all other cars what they’ve learned, almost instantly.

And remember, the goal doesn’t have to be “perfect”; unfortunately there will still be some accidents and some fatalities. You can always conjure up a failure mode. If a pedestrian without illumination crosses the street at night outside a crosswalk, it’s not a given that a human driver wouldn’t hit them either. Name one common mode of human-operated transport that is perfect. There were over 37,000 traffic fatalities and many more accidents in the US in 2017, and 94% were caused by human error - the status quo is far from perfect. If autonomous cars reduce accidents and fatalities by 90%, as some consider their goal, it’s a good trade IMO. I’d welcome 3,700 fatalities vs. 37,000 - and millions worldwide.
 
Electronic fuses/current-limiters are very old tech. They aren't anything amazing; they're just a design choice...

It is a cost-driven consideration. When semiconductor companies crank parts out by the zillion, everybody uses them.

I recently had to buy some old-style 3AG glass fuses for my electronic instruments. Expensive stuff! I also bought some MOSFETs. Dirt cheap! I could not get over it.
 
My conclusion from watching the video is that the Tesla AP is not as good as a fully alert driver, who would have seen the accident situation developing a few seconds earlier. But it is better than a distracted driver, who would have simply plowed into the back of the car ahead.

It basically reinforces my belief that automated systems are currently tools to help us drive better and more safely, but not a substitute for an alert human. Still, it is one of the few reasons I would trade in my older but still very good used vehicle for a new one.
 
My conclusion from watching the video is that the Tesla AP is not as good as a fully alert driver, who would have seen the accident situation developing a few seconds earlier. But it is better than a distracted driver, who would have simply plowed into the back of the car ahead.

It basically reinforces my belief that automated systems are currently tools to help us drive better and more safely, but not a substitute for an alert human. That is one of the few reasons I would trade in my older but still very good used vehicle for a new one.
That’s a reasonable conclusion today. But no automaker is selling anything like an autonomous car yet (though Tesla got a little (or way) over their marketing skis a while ago with vague autonomy claims, and they’ve backed off since the Tesla vs. white truck fatality).

You’re on the money with the distracted-driver remark. 94% of US auto fatalities involve human error: distracted, intoxicated, tired, etc. drivers. That’s where autonomous cars can most easily improve over humans. For the sake of discussion, when/if an autonomous car can just equal an alert human and avoid most or all of the human-error fatalities, I’d certainly make that 90% improvement trade. The autonomous-vs-alert-human argument is only 10% of the fatality-reduction opportunity; many detractors seem to lose sight of that, aiming for “perfect” when the status quo is far from perfect.

And you haven’t seen a true Level 3, much less a Level 4 or 5, vehicle. You might change your mind then; wait and see. You may be a late adopter, but that’s your choice. And no one is asking you to take a leap of faith; they’ll have to prove themselves like any new tech.
 
But until we have a truly autonomous system, AEB is available on many cars now, and will be standard by 2022.

See my post #1509. It is not exclusive to Tesla cars. Don't get fooled and mix AEB up with any "autopilot".
 
But until we have a truly autonomous system, AEB is available on many cars now, and will be standard by 2022.
Helpful technology, but as your link notes, the pledge is for “a low-speed AEB system that includes forward collision warning.” A few can be effective up to 50-55 mph, but many only work up to about 35 mph, and only under some conditions. So they’ll reduce accidents, and that’s good, but relatively fewer fatalities, as fatalities more often occur at higher speeds. I have FCW and PCB on my cars; they only work under some circumstances.
 
The confusion over the term "emergency brake" comes from AEB, which stands for Automatic Emergency Braking. It has been available in some cars in the US since 2006, and many new cars in 2017 have it. Automakers have pledged to have it in all of their cars by 2022.

It is more than just having AEB. This really breaks down into low-speed AEB (i.e., up to 37 or maybe 50 MPH) and high-speed AEB (highway speeds, i.e., up to 90 MPH). Teslas have high-speed AEB. More engineering and hardware are required for this!

My research a while back into the GM Volt and Bolt found that they only go up to 37 MPH (low-speed automatic emergency braking), though supposedly the Volt with ACC/Adaptive Cruise Control will do high-speed AEB. The Bolt does not offer ACC that I know of. The Nissan LEAF has low-speed AEB up to 50 MPH... so not that good on the highway!

Still, roads around Chicagoland, even in the suburbs, are 40 and 45 MPH, so the 37 MPH ceiling in the Volt/Bolt is not great.

The Model 3 originally shipped with low-speed automatic emergency braking up to 50 MPH, but then received an Over-The-Air (OTA) update to get high-speed AEB up to 90 MPH, just like the X/S.
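The speed ceilings listed above can be summarized as a quick lookup (these figures are from my own research as stated; they are approximate and may be out of date):

```python
# Reported AEB operating ceilings, mph (approximate, per the post above).
aeb_ceiling_mph = {
    "GM Volt/Bolt (low-speed AEB)": 37,
    "Nissan LEAF (low-speed AEB)": 50,
    "Tesla Model 3 at launch": 50,
    "Tesla Model 3 post-OTA / S / X": 90,
}

def aeb_active(car, speed_mph):
    """True if AEB can still engage at this speed for the given car."""
    return speed_mph <= aeb_ceiling_mph[car]

# On a 45 mph suburban road, which systems are still in play?
print([car for car in aeb_ceiling_mph if aeb_active(car, 45)])
```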
 
Helpful technology, but as your link notes, the pledge is for “a low-speed AEB system that includes forward collision warning.” A few can be effective up to 50-55 mph, but many only work up to about 35 mph, and only under some conditions. So they’ll reduce accidents, and that’s good, but relatively fewer fatalities, as fatalities more often occur at higher speeds. I have FCW and PCB on my cars; they only work under some circumstances.

True. It's still no panacea.

Note that the Tesla near-miss incident in that video was also at a low speed.

Note also that AEB on Tesla cars did not prevent the fatal accidents where the cars drove into fixed objects. In fact, it did not deploy at all, as there were no skid marks!

Does AEB protect against hitting pedestrians and bicyclists? Only the newer types, which I don't think are in production yet; they use vision cameras rather than radar.

The problem with AEB is that a false detection will cause it to slam on the brakes for the wrong reason. Disaster can ensue, as anyone can imagine.

PS. Cross-posted with Eroscott on the new Tesla high-speed AEB. If effective, that would certainly be helpful.
 
The problem with AEB is that a false detection will cause it to slam on the brakes for the wrong reason. Disaster can ensue, as anyone can imagine.
True. My primitive low-speed AEB has been fooled several times: a few times by a car turning in front of me, and once (inexplicably) going around a turn at night at about 30 mph - I can only guess it took the reflection off a sign at the apex of the turn for a vehicle. AEB systems will undoubtedly improve, and it sounds like the Tesla system is far more evolved than most current AEB driver-assistance systems.

However, if it’s daytime in good weather and I have ACC on, and a car ahead of me slows gradually and stops, even my primitive driver-assistance-equipped car will slow and stop comfortably. I don’t rely on it; I’ve just experimented a couple of times.
 
...
The problem with AEB is that a false detection will cause it to slam on the brakes for the wrong reason. Disaster can ensue, as anyone can imagine.
...

There are multiple ways to detect this and deal with hard-to-identify places. Some of these places are identified in 'tile' files for GPS coordinates that have a problem; a train overpass at the bottom of a hill in my area is one example. The car can also 'trigger' this data feedback: if it activates AEB, even for a short time, and the driver presses the accelerator pedal to override it... aha... perhaps this was an area of false AEB, so trigger an investigation. (I've verified this with GPS coordinates and Google Street View when others have pointed instances out.)

Some info I wrote up for my son a while back, based on research. There is a ton more to this; I'm just giving an overview:

Autopilot also has ADAS (advanced driver-assist system) map tiles that are fetched from a server (<geohash>.tile). Each tile covers a certain fixed geographic area (a 5-character geohash), and the tiles contain information that Autopilot uses to assist its decision making (e.g., extra info beyond the cameras, slowing for turns being an obvious one). Tile files around your commonly driven areas are cached on the SSD in the CID (Center Instrument Display) hardware.
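The 5-character geohash naming mentioned above is a standard public encoding (latitude/longitude bits interleaved, then packed into base-32 characters; a 5-character cell is roughly 5 km x 5 km). The .tile payload itself is Tesla-internal and not shown here, but the cell naming can be sketched as:

```python
# Standard geohash encoding (public algorithm). Only the 5-character
# naming scheme mentioned above is assumed; the tile contents are not.
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash(lat, lon, length=5):
    """Encode a lat/lon into a geohash cell id of the given length."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    code, even = [], True           # even bits bisect longitude
    bit_count, ch = 0, 0
    while len(code) < length:
        if even:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                ch = (ch << 1) | 1
                lon_lo = mid
            else:
                ch <<= 1
                lon_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                ch = (ch << 1) | 1
                lat_lo = mid
            else:
                ch <<= 1
                lat_hi = mid
        even = not even
        bit_count += 1
        if bit_count == 5:          # 5 bits -> one base-32 character
            code.append(BASE32[ch])
            bit_count, ch = 0, 0
    return "".join(code)

# A 5-character geohash names one ~5 km x 5 km cell:
print(geohash(33.4484, -112.0740))  # central Phoenix, AZ
```

A nice property for a tile cache: nearby locations share geohash prefixes, so the car can fetch and cache the handful of cells around your commute by name.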

Google Street View of the train overpass at the bottom of a hill. [image]

Tesla tile data (unofficial viewer of data from the official geohash.tile file). [image]
 
There are multiple ways to detect this and deal with hard-to-identify places. Some of these places are identified in 'tile' files for GPS coordinates that have a problem...

If the system needs GPS and some pre-recorded info to know that it's an overpass and not something it needs to brake for, that just shows how weak it is.

So what if someone does brake hard there? Does the system tell you it is ignoring things at that point? Then you look up and say, "Wha?"?

Like I've said time and time again in the self-driving-car thread, I fear these systems will lull many into complacency, and we may see even more accidents as people pay less and less attention. I want to see a system that forces the driver to pay attention - until we are fully autonomous, with no steering wheel.

-ERD50
 
I totally agree with ERD50 that any system requiring external assistance is not really autonomous, and can still be fooled.

I recently did a long RV trek of more than 10,000 miles through Alaska and the Yukon, traveling over dirt and bad roads that I had never driven on. I survived navigating over pot holes and frost heaves, using only my eyes.

An SDC has an array of sensors, and yet it still needs to be told in advance what to do. This shows how hard it is to use sensors and computers to replace a human brain; no, make that any animal brain. It is really harder than people think.

True. My primitive low speed AEB has been fooled several times. Several times due to a car turning in front of me, and once (inexplicably) going around a turn at night at about 30 mph - I can only guess it saw the reflection on a sign at the apex of the turn as a vehicle. AEB systems will undoubtedly improve, and it sounds like the Tesla system is way more evolved than most current AEB driver assistance systems.

However, if it’s daytime, good weather and I have ACC on and a car ahead of me slows gradually and stops, even my primitive driver assistance featured car will slow and stop comfortably. I don’t rely on it, just experimented a couple times.

How often can such a false detection occur before you stop tolerating it? Once a week? Once a month? Once a year?

Because there will never be a perfect system - not for airplanes, nor spacecraft, nor autos - we will have to tolerate some misbehavior. But has anyone defined what level of malfunction we can tolerate? And is there an easy way to test that a system sold to the public meets it?

There is a trade-off between false detections and failure to detect a real potential accident. It's not too different from medical tests, which can give a false positive diagnosis or fail to detect a serious illness, except that in the case of AEB you do not have time for a second opinion.

To have something work and to have it work reliably are two very different things. People can be easily impressed by a one-time demo. It is just not as simple and easy as people imagine.
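That trade-off can be made concrete with a toy model: a detector scores each scene, real threats and harmless clutter overlap in score, and moving the alarm threshold only trades one error type for the other. (Entirely synthetic numbers, for illustration only.)

```python
import random

random.seed(1)

# Toy detector model: a noisy "threat score" per scene. Real obstacles
# score high, clutter (shadows, signs, reflections) scores low, but the
# distributions overlap, so no threshold zeroes out both error types.
clutter = [random.gauss(0.3, 0.15) for _ in range(10_000)]  # no obstacle
threats = [random.gauss(0.7, 0.15) for _ in range(10_000)]  # real obstacle

for thresh in (0.4, 0.5, 0.6):
    false_alarms = sum(s >= thresh for s in clutter) / len(clutter)
    misses = sum(s < thresh for s in threats) / len(threats)
    print(f"threshold {thresh}: false-alarm rate {false_alarms:.1%}, "
          f"miss rate {misses:.1%}")
```

Lower the threshold and the car phantom-brakes more; raise it and it misses more real hazards. The car maker has to pick a point on that curve, which is exactly the medical-test analogy above.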


PS. In the case of Midpack's AEB, it is a system to assist him, not replace him. And so it is still Midpack's responsibility to keep a safe distance and brake as required. The car maker may already bias the system threshold toward preventing false detections, and it still brakes wrongly on occasion.

Imagine if Midpack no longer had a brake pedal or steering wheel. How much more reliable would the computer/sensor suite have to be? Would it be so touchy as to brake for a mere shadow on the road?
 
An SDC has an array of sensors, and yet it still needs to be told in advance what to do. This shows how hard it is to use sensors and computers to replace a human brain; no, make that any animal brain. It is really harder than people think.
Do some research: all these systems use external information to *complement* their sensors. They can travel on roads they have not been on, or in an area without connectivity; they just do it more "cautiously". There is a lot of redundancy and supplemental information. It is not black and white. You and some others are looking at it very naively, from a 10,000-ft level. :) A lot of advanced work is happening in this field. HD maps, as an example, have taken off. Check out the Mobileye (now Intel) presentations; they are truly impressive and enlightening. It is starting to be clear you don't appreciate some of my insights and information, so perhaps I'll just move on and save my time on these topics for others.
 
Well, I do not see any problem with using external data such as traffic congestion, construction-zone info, or weather data to help route the car. That's all good stuff! I love it, even when I have to look it up on a smartphone during a trip.

But when you need external data over the Web to help you brake or swerve, I have to question the capability of the system when such data is not available. How safe is the unaided system? Is it wrong to ask that?

Musk has said he does not use lidar, as Waymo and nearly everyone else do, because it is a "crutch". But if lidar is a crutch, it is still an onboard sensor, and the car is still autonomous. I do not know what Musk would call external aiding data that helps him brake.
 