Self-Driving Cars -- Needs of the Many vs Numero Uno

The enhanced safety measures could be implemented with blind-spot detection, lane-deviation warning, adaptive cruise control, etc. It did not have to be a "false" autopilot that is not yet ready for prime time, one that lured the driver into watching DVDs, texting, or dozing off.

I agree with that. I haven't watched the Tesla promos, but the end result is that some people seem encouraged to let the car take over, rather than think of this as an 'assist', and that's a problem.

I'll try to catch the Tesla promo info later, but it does seem they need to change their approach towards this.

-ERD50
 
I agree with that. I haven't watched the Tesla promos, but the end result is that some people seem encouraged to let the car take over, rather than think of this as an 'assist', and that's a problem.

I'll try to catch the Tesla promo info later, but it does seem they need to change their approach towards this.

-ERD50

I think you're right.

Having now driven many miles with the assist on, it's VERY easy and tempting to treat it as self-driving.

I think Tesla is pretty good at preventing this. If you don't touch the wheel for a few minutes, it starts warning you aggressively and then turns off.
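To make that escalation concrete, here is a rough Python sketch of the kind of hands-off-wheel policy being described. The thresholds, action names, and the `torque_on_wheel` signal are my own illustrative assumptions; Tesla's actual logic is not public.

```python
# Hypothetical sketch of a hands-off-wheel escalation policy like the one
# described above. Thresholds and action names are illustrative assumptions,
# not Tesla's actual implementation.
from dataclasses import dataclass


@dataclass
class HandsOffMonitor:
    warn_after_s: float = 60.0        # show a visual "hold the wheel" nag
    escalate_after_s: float = 120.0   # add an audible warning
    disengage_after_s: float = 180.0  # give up: disengage and slow the car
    hands_off_s: float = 0.0          # running timer since last wheel input

    def update(self, dt_s: float, torque_on_wheel: bool) -> str:
        """Advance the timer by dt_s and return the action to take."""
        if torque_on_wheel:
            self.hands_off_s = 0.0
            return "ok"
        self.hands_off_s += dt_s
        if self.hands_off_s >= self.disengage_after_s:
            return "disengage_autopilot"
        if self.hands_off_s >= self.escalate_after_s:
            return "audible_warning"
        if self.hands_off_s >= self.warn_after_s:
            return "visual_warning"
        return "ok"


# Example: after 130 seconds with no hands on the wheel, the monitor escalates.
monitor = HandsOffMonitor()
action = "ok"
for _ in range(130):
    action = monitor.update(dt_s=1.0, torque_on_wheel=False)
print(action)  # -> "audible_warning"
```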

At the sales center they're very specific on this.

As a driver I'm lazy and WANT it to be self-driving when it isn't, so I think it'll be hard to overcome.

That said, in stop-and-go traffic I think it's a better driver than I am. In high-speed highway driving it's more dangerous because of the false sense of security. Numerous situations come up that it can't deal with... obstacles in the road... narrow side barriers, crappy painted lines, steep hills, and so on. I watch it the whole time... but I can be more "passive" and thus I'm less tired after an hour or two of driving.

Sent from my HTC One_M8 using Early Retirement Forum mobile app
 
The enhanced safety measures could be implemented with blind-spot detection, lane-deviation warning, adaptive cruise control, etc. It did not have to be a "false" autopilot that is not yet ready for prime time, one that lured the driver into watching DVDs, texting, or dozing off.

Agreed, and I would add that all of these features first shake the steering wheel, sound a buzzer, put a message on a screen in the middle of the instrument panel, or do some combination of these. For details, you can typically download the owner's manual for a car before buying it and read the included warnings (car manuals now run over 500 pages). I found this piece in The Verge that quotes the Tesla owner's manual: http://www.theverge.com/2016/6/30/12073240/tesla-autopilot-warnings-fatal-crash
and to quote the sections: "
Warning: Do not depend on Traffic-Aware Cruise Control to adequately and appropriately slow down Model S. Always watch the road in front of you and stay prepared to brake at all times. Traffic-Aware Cruise Control does not eliminate the need to apply the brakes as needed, even at slow speeds.
Warning: Traffic-Aware Cruise Control can not detect all objects and may not detect a stationary vehicle or other object in the lane of travel. There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
Warning: Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.
Warning: Traffic-Aware Cruise Control may misjudge the distance from a vehicle ahead. Always watch the road in front of you. It is the driver's responsibility to maintain a safe distance from a vehicle ahead of you.
Warning: When you enable Traffic-Aware Cruise Control in a situation where you are closely following the vehicle in front of you, Model S may apply the brakes to maintain the selected distance.
Warning: Traffic-Aware Cruise Control has limited deceleration ability and may be unable to apply enough braking to avoid a collision if a vehicle in front slows suddenly, or if a vehicle enters your driving lane in front of you. Never depend on Traffic-Aware Cruise Control to slow down the vehicle enough to prevent a collision. Always keep your eyes on the road when driving and be prepared to take corrective action as needed. Depending on Traffic-Aware Cruise Control to slow the vehicle down enough to prevent a collision can result in serious injury or death.
Warning: Driving downhill can increase driving speed, causing Model S to exceed your set speed. Hills can also make it more difficult for Model S to slow down enough to maintain the chosen following distance from the vehicle ahead.
Warning: Traffic-Aware Cruise Control may occasionally brake Model S when not required based on the distance from a vehicle ahead. This can be caused by vehicles in adjacent lanes (especially on curves), or by stationary objects."
and a second section:
"

Traffic-Aware Cruise Control is particularly unlikely to operate as intended in the following types of situations:

  • The road has sharp curves.
  • Visibility is poor (due to heavy rain, snow, fog, etc.).
  • Bright light (oncoming headlights or direct sunlight) is interfering with the camera's view.
  • The radar sensor in the center of the front grill is obstructed (dirty, covered, etc.).
  • The windshield area in the camera's field of view is obstructed (fogged over, dirty, covered by a sticker, etc.).
Caution: If your Model S is equipped with Traffic-Aware Cruise Control, you must take your vehicle to Tesla Service if a windshield replacement is needed. Failure to do so can cause Traffic-Aware Cruise Control to malfunction.
Warning: Many unforeseen circumstances can impair the operation of Traffic-Aware Cruise Control. Always keep this in mind and remember that as a result, Traffic-Aware Cruise Control may not slow down or may brake or accelerate Model S inappropriately. Always drive attentively and be prepared to take immediate action.
Warning: Traffic-aware cruise control may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
"
So it would appear that the driver was not following the instructions in the owner's manual.

 
Thanks to meierlde for posting the above info. These caveats are about what I expected.

That's a far cry from the truly autonomous autopilot that the public may perceive the Tesla to be capable of. People who do not heed the above warnings must not value their lives highly, nor those of innocent bystanders.
 
When a vehicle approaches it is visible in the rear view mirror first, then both the rear view and side mirror, and then in both the side mirror and my peripheral vision. At no point is another vehicle ever out of my sight.

I have demonstrated this to several people by adjusting their mirrors and then walking toward their car from behind while they sit in the driver's seat. At no point do they lose sight of me. If you lost sight of a vehicle, then your mirrors must have been adjusted differently than mine.

I know we are in the middle of another thread... but if you read my description of how it happened to me... there never was a time when the vehicle approached me from behind... I merged onto a highway from another highway and had to concentrate on a driver in front of me, as they seemed to want to slow down or stop... by the first time I could look at the other two lanes, the other vehicle was already IN my blind spot... doing the same speed as me... he was far enough up that his headlights did not show in my side mirrors, but not far enough up that I could see him out of the passenger window... and his car was black, which blended in with the night...

As I said, it has worked for you until it does not... and as I also said, I was a believer that you could eliminate the blind spot but was proved wrong... lucky for me, I did not change lanes, as I had a feeling that all was not clear... IOW, I could not SEE that the lane was empty like I wanted to...
 
The 130-million-mile statistic is just spin by the company, which is acting more and more like a politician. It's bull ****. Do not fall for it.
Musk is a HUGE personality, and a master of spin. Smart dude, but also a smart marketeer.

He not only spins his products, he becomes the leader of spinning whole industries.
 
... I found this piece in The Verge that quotes the Tesla owner's manual: Tesla’s own Autopilot warnings outlined deadly crash scenario | The Verge
and to quote the sections: "
Warning: Warning: Warning: .....

So it would appear that the driver was not following the instructions in the owner's manual.

Thanks to meierlde for posting the above info. These caveats are about what I expected.

That's a far cry from the truly autonomous autopilot that the public may perceive the Tesla to be capable of. People who do not heed the above warnings must not value their lives highly, nor those of innocent bystanders.

Yes, the caveats are there, but we also have human nature at work. Look at the feedback from this poster who is an actual Tesla driver, and appears to be cautious and approach this with care, yet... (bold mine)

....
Having now driven many miles with the assist on, it's VERY easy and tempting to treat it as self-driving.

...

As a driver I'm lazy and WANT it to be self-driving when it isn't, so I think it'll be hard to overcome. ...

And many other drivers will be more casual about this. That is why I wish we'd focus on all the warning/assist features plus additional tests to ensure the driver is paying attention (eye movement scans, head movement scans, in addition to steering wheel input).

I recall in Driver's Ed, after we had been on a highway for a while, the instructor blocked my rear view mirror with his notebook and said, "Quick, tell me the color of the car behind us." It was a test of whether I had 'situational awareness.' These smart systems could do that every once in a while: ask the driver something about the environment. Keep them involved.
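As a thought experiment, here is a rough Python sketch of that "quiz the driver" idea: every so often, ask a question the car can already answer from its own sensors, and treat a slow or wrong answer as a sign of inattention. The `perception` fields and the `ask` callback are hypothetical placeholders, not anything a real car exposes.

```python
# Sketch of the "ask the driver about the environment" idea discussed above.
# The perception fields and the ask() callback are hypothetical placeholders.
import random
import time


def pick_question(perception: dict) -> tuple[str, str]:
    """Choose a question whose answer the car already knows from its sensors."""
    questions = {
        "What color is the car behind you?": perception.get("rear_car_color", "none"),
        "How many lanes are to your left?": str(perception.get("lanes_to_left", 0)),
    }
    prompt = random.choice(list(questions))
    return prompt, questions[prompt]


def driver_seems_attentive(perception: dict, ask, max_answer_s: float = 5.0) -> bool:
    """Return True if the driver answers correctly within the time limit."""
    prompt, expected = pick_question(perception)
    start = time.monotonic()
    answer = ask(prompt)  # e.g. spoken prompt + speech-to-text, supplied by the car
    elapsed = time.monotonic() - start
    return elapsed <= max_answer_s and answer.strip().lower() == expected.lower()


# Example with a canned answer standing in for a real voice interface.
perception = {"rear_car_color": "blue", "lanes_to_left": 1}
print(driver_seems_attentive(
    perception, ask=lambda q: "blue" if "color" in q else "1"))  # -> True
```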

That should be the norm until these systems can take over w/o all the caveats, not before.

-ERD50
 
Yes, the caveats are there, but we also have human nature at work. Look at the feedback from this poster who is an actual Tesla driver, and appears to be cautious and approach this with care, yet... (bold mine)



And many other drivers will be more casual about this. That is why I wish we'd focus on all the warning/assist features plus additional tests to ensure the driver is paying attention (eye movement scans, head movement scans, in addition to steering wheel input).

I recall in Driver's Ed, after we had been on a highway for a while, the instructor blocked my rear view mirror with his notebook and said, "Quick, tell me the color of the car behind us." It was a test of whether I had 'situational awareness.' These smart systems could do that every once in a while: ask the driver something about the environment. Keep them involved.

That should be the norm until these systems can take over w/o all the caveats, not before.

-ERD50

Yup.
No one reads manuals and less than no one follows directions.

Most people don't eat healthy, and they get fat, develop heart disease, etc.

Most people don't get annual checkups and risk catching things late... like cancer.

Most people text/use phone while driving and risk getting killed (or killing someone else).

And on and on.

It's just another piece of technology that adds convenience and will be abused.

What matters to me is that, in aggregate, it is safer than not having it. That has yet to be seen, but I'm fairly sure it will prove to be true, especially as it improves with time.

I suspect that over the next 20-30 years you'll see vehicles become more and more networked, and collisions will drop substantially. Of course, risks like hacking will increase, but the % of people dying in accidents will continue to drop, IMO.

I see my Model X as a piece of beta software/hardware, so I'm pretty careful.

Actually, more dangerous than the auto drive are the doors.

The button on top of the key closes them all, and I've accidentally hit it many times. I've closed the door on my wife while she was putting in the kids, closed it on my father-in-law's hand, and smacked myself in the head at least 3 times :).

None of that has caused injury because the doors sense obstacles and stop... but if a door closes on you under the wrong circumstances (holding a knife, holding a baby, standing near a cliff) I could imagine some not-so-funny Wile E. Coyote scenarios...

Sent from my HTC One_M8 using Early Retirement Forum mobile app
 
Yes, the caveats are there, but we also have human nature at work. Look at the feedback from this poster who is an actual Tesla driver, and appears to be cautious and approach this with care, yet... (bold mine)



And many other drivers will be more casual about this. That is why I wish we'd focus on all the warning/assist features plus additional tests to ensure the driver is paying attention (eye movement scans, head movement scans, in addition to steering wheel input).

I recall in Driver's Ed, after we had been on a highway for a while, the instructor blocked my rear view mirror with his notebook and said, "Quick, tell me the color of the car behind us." It was a test of whether I had 'situational awareness.' These smart systems could do that every once in a while: ask the driver something about the environment. Keep them involved.

That should be the norm until these systems can take over w/o all the caveats, not before.

-ERD50

I would have told the instructor to screw it... I do not care what color the car behind is, I just care IF there is a car and what it is doing....

Even today I do not look long enough, nor is my brain programmed to process the color... the only time I actually look is when stopped because my DW has this game called 'punch buggy' and I want a leg up when I can actually pay attention....
 
I would have told the instructor to screw it... I do not care what color the car behind is, I just care IF there is a car and what it is doing....

Even today I do not look long enough, nor is my brain programmed to process the color... the only time I actually look is when stopped because my DW has this game called 'punch buggy' and I want a leg up when I can actually pay attention....

Well, perhaps the correct answer was "No, there's no car behind me"? I thought it was a good thing for him to do; it made the point to me (a dumb teenager) about always being aware of where the cars around you are, and the lesson stuck with me (as a dumb adult).

-ERD50
 
People who do not heed the above warnings must not value their lives highly, nor those of innocent bystanders.

Just like most drivers of non-autonomous vehicles...

Careless people will always find ways to harm themselves, and others.

The point I have been trying to make, in vain it seems, is that the public has been so excited about this so-called "autopilot" that they expect it to let them go to sleep or watch DVDs, or to drive them to the hospital when they are infirm or invalid.

I surely hope I will live to see that happen. But the current technology ain't there yet. Stop dreaming. When the "real" stuff happens, we will know.

In the meantime, people who perpetuate the myth of the current autopilot are doing harm to the public. Lots of people do not know the limitations of the current technology, and they also do not read manuals. They prefer to watch YouTube videos of "Look Ma, no hands".
 
A while back, I asked the following question:

If the computer in your car fails, is it going to accelerate hard, or stomp on the brakes? Does it swerve hard left or right? Keep coasting straight at the same speed? Or is it going to be different and unpredictable for every failure? I guess eventually the car will stop when encountering sufficient obstacles.

It was not strictly the computer that failed in the fatal Florida Tesla accident, but the camera, which was supposedly blinded by the bright sky (according to Tesla).

I believe the camera was mounted inside and high up (by the rearview mirror?), and the top of the car was scraped off by the underside of the trailer. The owner of the house where the car stopped said that its top was peeled back like the lid of a can of sardines. Of course, the camera was obliterated, which means the "autopilot" was now blinded.

Still, the car kept on careening down the road, veered off and went through two fences until it was stopped by an electric pole. See photo below.

So, I guess I have my answer, which is really obvious. All cars eventually stop when encountering enough obstacles.


[Photo: tesla-truck-accident-3-e1467386967417.jpg]
 
The point I have been trying to make, in vain it seems, is that the public has been so excited about this so-called "autopilot" that they expect it to let them go to sleep or watch DVDs, or to drive them to the hospital when they are infirm or invalid.

NW: relax. It is not in vain. I like ER forums, but a lot of the time people just like to talk. All these "introverts" in real life really like to spout off here, sometimes just to babble. They may be listening, but they still need to just talk.

Now, some will disagree with your point and do feel that, since self-driving cars are going to be a clear revolution in transportation, they are "wish-casting" it along. This is normal. See space flight as an example.

What I find interesting in this discussion is that most of us asking for patience are engineers who have seen the reality of developing products. Conversely, some of my engineer friends have even bigger blinders on, and worship at the altar of Elon Musk. It is strange.

One of my upper-level managers called us into a meeting. During the meeting, the guy got off track and spent nearly half an hour evangelizing about his Tesla and how he was excited to get another software update. Frankly, it bordered on creepy.

I'm a technologist, but I don't like it when people worship technologists.
 
The followers worship Musk because he dares go where others did not. That's fine, as long as nobody has to lose life or limb, particularly bystanders.

Not being into autos, I never follow news about any car, Tesla included. But just since last week, I have started to get more curious about exactly what sensors the car has that allow it to be an "autopilot". Here's what I found out.

It has a long-range front radar, mounted in the nose grille. It has a front-looking camera, most likely mounted high behind the windshield, probably by the rearview mirror. Around the car's perimeter are several ultrasonic sensors. These serve as proximity sensors and are supposedly good to 16 ft. The latter are the ones that failed to pick up an adjacent tall truck, as noted by a Tesla owner in an earlier post in this thread.

So, if the front-looking camera failed to detect the white semitrailer against the sky in the fatal accident, then why did the front radar not help? This has been bothering me.

I found the answer. Musk himself in a tweet said that the radar picked up the semi, but this reading could not be used because high overhead traffic signs would have the same radar signature.

So, in order to declare that there's an obstacle ahead, both the camera and the radar have to agree. Otherwise, there would be a lot of false positives: the car would brake all the time on the road and get rear-ended.

One should now understand why the driver's eyes must always be on the road, to act as the tie-breaker. And he must act fast enough to save himself.
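To illustrate the trade-off being described, here is a minimal Python sketch of a "both sensors must agree" rule. The `Detection` fields and all thresholds are assumptions for illustration only; the car's real fusion logic is far more sophisticated and not public.

```python
# Minimal sketch of the "camera and radar must agree" idea described above.
# The Detection fields and all thresholds are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Detection:
    distance_m: float   # estimated distance to the object
    height_m: float     # estimated height of the return above the road
    confidence: float   # 0.0 .. 1.0


def should_auto_brake(radar: Optional[Detection], camera: Optional[Detection]) -> bool:
    """Brake automatically only when radar AND camera agree on an obstacle.

    A radar return alone is rejected because an overhead sign or bridge can
    look like a stopped vehicle; a camera detection alone is rejected because
    vision can be fooled by glare or low contrast (e.g. a white trailer
    against a bright sky). Disagreement is left to the human tie-breaker.
    """
    if radar is None or camera is None:
        return False
    if radar.height_m > 2.0:  # probably an overhead structure, not an obstacle
        return False
    distances_agree = abs(radar.distance_m - camera.distance_m) < 5.0
    return distances_agree and radar.confidence > 0.5 and camera.confidence > 0.5


# In the crash scenario as described: the radar saw something, the camera did not.
print(should_auto_brake(Detection(40.0, 1.5, 0.8), None))  # -> False, no braking
```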

By the way, Mobileye, the company that supplies Tesla with the camera vision software, says that its system does not yet deal with cars crossing the road, but its new software will in 2018. They call this "Lateral Turn Across Path (LTAP) Detection".

That's what I have found so far. Much of this hot self-driving car software is too new and proprietary, so it is difficult to know exactly what it can do. However, the above info is from the horse's mouth.
 
As the technology advances, self-driving vehicles should be able to coordinate with each other. In the case of the Tesla accident, a self-driving semi could have detected that the Tesla was approaching at high speed, and then either taken evasive action or communicated back to the Tesla that it was about to get into trouble. Even if some vehicles are operated manually, they could be "smart" vehicles that provide that sort of feedback to other vehicles.
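A toy Python sketch of that vehicle-to-vehicle idea, under the big assumption that both vehicles broadcast position and velocity in a shared coordinate frame; the message fields, the 3-second horizon, and the 3-meter radius are made up for illustration, not any real V2V standard.

```python
# Toy sketch of the vehicle-to-vehicle warning idea above. Message fields,
# coordinate frame, and the 3-second horizon are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class V2VState:
    vehicle_id: str
    x_m: float     # position east, shared map frame
    y_m: float     # position north
    vx_mps: float  # velocity east
    vy_mps: float  # velocity north


def collision_predicted(a: V2VState, b: V2VState,
                        horizon_s: float = 3.0, radius_m: float = 3.0) -> bool:
    """Project both vehicles forward at constant velocity and check for overlap."""
    steps = 30
    for i in range(steps + 1):
        t = horizon_s * i / steps
        dx = (a.x_m + a.vx_mps * t) - (b.x_m + b.vx_mps * t)
        dy = (a.y_m + a.vy_mps * t) - (b.y_m + b.vy_mps * t)
        if (dx * dx + dy * dy) ** 0.5 < radius_m:
            return True
    return False


# Example: a semi crossing east-west while a car approaches northbound at speed.
semi = V2VState("semi", x_m=-5.0, y_m=30.0, vx_mps=5.0, vy_mps=0.0)
car = V2VState("car", x_m=0.0, y_m=0.0, vx_mps=0.0, vy_mps=30.0)
print(collision_predicted(semi, car))  # -> True; either vehicle could react
```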
 
A truly autonomous vehicle is more desirable. A tree may fall across the road, or a load of bricks might have fallen off a truck. A flash flood may have washed out half the road. A human driver would have no problem stopping for these. We would want a car with no steering wheel and no brakes to do no less. Anything less is just "driver assistance".
 
Careless people will always find ways to harm themselves, and others.

The point I have been trying to make, in vain it seems, is that the public has been so excited about this so-called "autopilot" that they expect it to let them go to sleep or watch DVDs, or to drive them to the hospital when they are infirm or invalid.

I surely hope I will live to see that happen. But the current technology ain't there yet. Stop dreaming. When the "real" stuff happens, we will know.

In the meantime, people who perpetuate the myth of the current autopilot are doing harm to the public. Lots of people do not know the limitations of the current technology, and they also do not read manuals. They prefer to watch YouTube videos of "Look Ma, no hands".


Apparently my quips do not always evoke the expected response...

I think "self-driving" autos would perform well in situations where the infrastructure is designed for said autos, say an interstate highway, with known access points, well-marked lanes, the capability for autos to communicate with each other, etc.

Dealing with texting, sleeping, drunken, careless, stupid, or otherwise occupied humans, on busy streets with many obstacles, is, as you've mentioned, another matter.
 
Well, if something works as well as everybody, including myself, wants it to, why not?

I have often said I stopped being a car enthusiast somewhere in my 30s, a geezer before my time. And I do not care to drive. But that does not mean I will endorse or use something that has the potential to give me a heart attack when it hiccups.

See my post right above talking about what sensors are in the Tesla, and their limitations.
 
Could it be more instructive to research how fully autonomous cars like Google's function, instead of focusing solely on the Tesla Model S? The Model S has 'autopilot' features that are only intended to assist a fully engaged driver. It's clear that with this controversial Model S fatality, the driver was expecting more from the 'autopilot' feature than Tesla intended. While Tesla is working toward full autonomy, as are many automakers, I don't get the sense Tesla is leading the way technically (even though they seem to be marketing/promoting more than others).
 
Could it be more instructive to research how fully autonomous cars like Google's function, instead of focusing solely on the Tesla Model S?
Well, Midpack, I think they are both instructive because they take different points of view.

Your timing on this question is good. The LATimes examines that exact comparison today:
Tesla and Google are both driving toward autonomous vehicles. Which company is taking the better route? - LA Times
“Having developed software and hardware products … I can point to the incredible inventiveness of customers in doing things you just never, ever considered possible, even when you tried to take the ridiculous and stupid into account,” said Paul Reynolds, a former vice president of engineering at wireless charging technology developer Ubeam. “If customer education is the only thing stopping your product from being dangerous in normal use, then your real problem is a company without proper consideration for safety.”
 
More from the LA Times article.
The NHTSA ranks self-driving cars based on the level they cede to the vehicle, with 1 being the lowest and 5 the highest. Tesla’s autopilot feature is classified as level 2, which means it is capable of staying in the center of a lane, changing lanes and adjusting speed according to traffic. Google is aiming for levels 4 and 5 — the former requires a driver to input navigation instructions, but relinquishes all other control to the vehicle, while level 5 autonomy does not involve a driver at all.
I didn't know the NHTSA already has a scale in mind. This is good, because I think there is a disconnect in our discussions over what a self-driving car is. I see "self-driving cars" as a level 5 activity only. Others see the dream being realized at level 3 or 4.
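For quick reference, here is that scale reduced to a small Python lookup, paraphrasing only the levels the LA Times excerpt actually describes (levels 1 and 3 are omitted because the excerpt doesn't define them).

```python
# Paraphrase of the autonomy levels described in the LA Times excerpt above.
# Only the levels the excerpt defines are listed; 1 and 3 are omitted here.
LEVELS_FROM_ARTICLE = {
    2: "stays centered in its lane, changes lanes, adjusts speed to traffic "
       "(where Tesla Autopilot is classified)",
    4: "driver only inputs navigation; all other control is ceded to the car "
       "(one of Google's targets)",
    5: "no driver involvement at all (Google's other target)",
}

# The disconnect in the thread, in one line: is "self-driving" level 2 or level 5?
print(LEVELS_FROM_ARTICLE[5])
```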
 
Wonder what self-driving cars will be called?

Horseless carriage became auto-mobile, and then car & auto.

Probably still cars. Maybe scars? (self-cars? sar?).
 
Could it be more instructive to research how fully autonomous cars like Google's function, instead of focusing solely on the Tesla Model S? The Model S has 'autopilot' features that are only intended to assist a fully engaged driver. It's clear that with this controversial Model S fatality, the driver was expecting more from the 'autopilot' feature than Tesla intended. While Tesla is working toward full autonomy, as are many automakers, I don't get the sense Tesla is leading the way technically (even though they seem to be marketing/promoting more than others).

Thank you!

I have been trying to compare Google's approach throughout the thread, and that just did not catch people's attention (that's why politicians know to keep repeating something about 100 times before people get their message).

The LA Times article does a good job of comparing the two companies' philosophies. But let me add a bit of technical info for the curious.

Basically, Google has a far superior sensor in the lidar. Being a laser beam, it can pinpoint distances to obstacles along very narrow lines of sight, unlike a radar beam, which would need a rotating dish like the aircraft-tracking ones for precise tracking (not something like the Tesla's radar, which cannot tell a semi from an overhead highway sign).

They then use the vision camera to look at the objects and try to identify them. They try to track all objects around the car, and for moving vehicles they check whether those paths would intersect with their own. They would not miss the semi-trailer crossing in front of them.
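To give a flavor of why the lidar matters here, below is a simplified Python sketch that turns (angle, range) lidar returns into points around the car and flags anything inside the lane-width corridor ahead. It illustrates the general principle only; the function names, corridor width, and lookahead distance are my own assumptions, not Google's pipeline.

```python
# Simplified illustration of the general principle, not Google's pipeline:
# each lidar return is a precise (angle, range) pair, so obstacles can be
# located directly and checked against the corridor the car intends to drive.
import math


def returns_to_points(returns: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Convert (angle_rad, range_m) returns to x/y points; car at origin, x forward."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in returns]


def obstacle_in_corridor(returns: list[tuple[float, float]],
                         half_width_m: float = 1.5,
                         lookahead_m: float = 40.0) -> bool:
    """Flag any return that lands inside the corridor directly ahead of the car."""
    for x, y in returns_to_points(returns):
        if 0.0 < x < lookahead_m and abs(y) < half_width_m:
            return True
    return False


# A crossing trailer would show up as a string of returns straight ahead at ~25 m.
trailer = [(math.radians(deg), 25.0) for deg in range(-3, 4)]
print(obstacle_in_corridor(trailer))  # -> True
```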

So, why is Google not yet releasing their software? It's because they want to build a truly autonomous car, one that needs no steering wheel. For that, the job becomes much, much tougher, as they need to watch out for everything that can go wrong. They need to drive around obstacles, and read the hand gestures of policemen and construction workers directing traffic. They need to read street signs and temporary construction signs. Can their lidar detect potholes, for example? I saw some examples of the lidar scan, and I was still dubious, because curbs of perhaps 5" were barely visible.

They know they are not ready. When will they be? They say soon, but I am still a bit skeptical. Their software is very impressive though.

By the way, as I repeatedly pointed out, the current lidar is an expensive dome ($75K initially) that sits on top of the car. Tesla owners who want a sleek-looking car would not care for it.

PS. When I was still working, there were companies looking to use laser beams on helicopters for wire detection (a chopper rotor getting tangled in wires or power lines is very bad news). They reported that the laser worked well at detecting old, dull wires, which scattered the laser back to the emitter. A new, shiny wire would reflect the beam like a mirror (not scatter it), and the reflected beam would not hit the receptor. So, shiny objects are invisible to the lidar.

I do not know how this problem is solved with current lidars. What I described happened 25 years ago, but it's hard to cheat the laws of physics.

[Photo: google-new-self-driving-car-prototype.jpg]
 
I do not know how this problem is solved with current lidars. What I described happened 25 years ago, but it's hard to cheat the laws of physics.

I think Google's car also has a short-range radar. That might help?
 