Self-Driving Cars?

But what about folks on bicycles, walking, running, on skates, scooters, etc.? The signs are still needed.
Like I said, we may still have signs as a courtesy to humans, though many or most bikers, walkers, runners, skaters, and some scooter riders ignore or don't need road signs today. And they're already restricted from many roads; the restrictions might extend to more roadways.
 
I guess I just don't see the big savings that others see. There may be some insurance and licensing savings, but all of the other costs will remain the same. Self-driving cars will not magically maintain or repair themselves for less, and they will still depreciate at the same rate as regular cars.
 
I'm not sure about, or even concerned about, savings. I just want to make sure I can make it to FL or the beer store and back without having to ask another person to drive me, and without killing anybody because I've gotten too dangerous to be allowed behind the wheel. I hope the bugs are mostly worked out in the next 15 years or so.
 
AI (Artificial Intelligence) is already employed by Google to interpret hand signals from bicyclists, gestures from cops directing traffic, etc. They are not saying how well that works, but at least they acknowledge the problems they have to solve. Other companies don't even bring these problems up.

Here's a case where AI would help. Remember the Tesla that went under the semitrailer that crossed the road? Tesla says that its vision camera could not distinguish the white side of the trailer against the bright sky. Would a human have that problem?

No, because a human knows what a semitrailer is. He sees the front cab crossing the road, and he recognises it as a semitrailer truck. He sees the black front wheels, 3 of them, to his right. He sees the 2 rear wheels to his left. These wheels are still moving across his line of vision so he does not miss them, and he is headed for the gap between them. It is unlikely that he would fail to see the trailer body spanning the front and rear axles, but even if he does, he knows there is a trailer body between the axles, and he would not drive between the axles.

The human mind is not as fast as a computer executing an algorithm, but it is far more flexible. Never confuse speed with smarts. A calculator bought for $1 at a dollar store can multiply two 5-digit numbers far faster than you can, but do you think it is smarter than you? A computer only knows about the cases that the programmer puts in his software. That's why Google spends so much drive time in cities collecting data and learning all the scenarios they have to deal with. Google got highway driving down long ago; it's a piece of cake compared to city driving, which they have been working on for the last several years.

I like technology and would not mind having a self-driving car. I am just not holding my breath for one. These companies are currently hyping up their work in order to attract investors' money. They may get there some time in the future, but a truly autonomous vehicle is not coming to a car dealer in the next few years - only driver-assist features like the current Tesla's.
 
(water-filled pothole)
No, a self-driving car cannot, the same way a person cannot. Deciding whether to drive over it depends on many factors. Maybe a smart car can eventually consider all these factors in making its decision.

Car (thinking): I just learned from the last 15 cars that passed this way that there is an unknown object in the road. There was a suspension upset in the ones that didn't slow down or alter course. I'm going to slow and alter course.
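Just for fun, here's a minimal sketch (in Python) of the kind of fleet-learned rule a car like that might apply. Every name, field, and threshold in it is invented for illustration; it's not any actual manufacturer's system.

```python
# Hypothetical sketch of a fleet-learned hazard rule. All names, fields, and
# thresholds are invented for illustration, not any real carmaker's system.

def react_to_shared_reports(reports, min_reports=10, upset_ratio=0.5):
    """Decide how to react based on what recent cars on this stretch reported.

    reports: list of dicts like {"saw_object": bool, "altered_course": bool,
                                 "suspension_upset": bool}
    """
    relevant = [r for r in reports if r["saw_object"]]
    if len(relevant) < min_reports:
        return "proceed normally"

    # Among cars that did NOT slow or alter course, how many felt a jolt?
    unaltered = [r for r in relevant if not r["altered_course"]]
    upsets = sum(r["suspension_upset"] for r in unaltered)
    if unaltered and upsets / len(unaltered) >= upset_ratio:
        return "slow and alter course"
    return "proceed with caution"


if __name__ == "__main__":
    # 15 cars saw something; the 5 that didn't swerve all got a suspension jolt.
    reports = [{"saw_object": True, "altered_course": i < 10, "suspension_upset": i >= 10}
               for i in range(15)]
    print(react_to_shared_reports(reports))  # -> "slow and alter course"
```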
 
(water-filled pothole)


Car (thinking): I just learned from the last 15 cars that passed this way that there is an unknown object in the road. There was a suspension upset in the ones that didn't slow down or alter course. I'm going to slow and alter course.
Man, that's how Skynet starts!
 
(water-filled pothole)


Car (thinking): I just learned from the last 15 cars that passed this way that there is an unknown object in the road. There was a suspension upset in the ones that didn't slow down or alter course. I'm going to slow and alter course.
As you undoubtedly know, cars 'talking to one another' is already in development, so you paint a likely picture.
 
I guess I just don't see the big savings that others see. There may be some insurance and licensing savings, but all of the other costs will remain the same. Self-driving cars will not magically maintain or repair themselves for less, and they will still depreciate at the same rate as regular cars.
On your island, everyone has their own car. On my island, nobody owns their own car, and the car-to-person ratio is one to ten. They're all members of a transportation coop. Your cars are all maintained sporadically, or at least not as efficiently as a fleet, so they will cost a bit more to keep running. Each car owner on your island has separate transactions with the maintenance company, tire company, fuel company, governing taxation authority, licensing authority, insurance company, etc. My presumption here is that more, separate, smaller transactions mean higher overhead to handle all of that. On my island, that's all contracted or brought in-house by the fleet coop. On my island, the coop takes the equivalent of the money that your island spends on 90% of its cars and puts it into something that earns money (this is to make the two scenarios comparable). That income goes back to the fleet coop to reduce the rates that members pay.

It's hard for me to see how my island would not have a significantly lower per-capita transportation burden.
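To put rough numbers on that, here's a back-of-envelope sketch. Every figure in it (population, annual cost per car, the fleet-efficiency discount) is an assumption pulled out of the air to make the per-capita point concrete, not data.

```python
# Back-of-envelope comparison of the two islands. Every figure below
# (population, annual cost per car, the fleet-efficiency discount) is an
# assumption for illustration only.

PEOPLE = 10_000
ANNUAL_COST_PER_CAR = 8_000   # maintenance, insurance, fuel, depreciation, fees
FLEET_DISCOUNT = 0.85         # assume fleet-scale contracts run ~15% cheaper per car

# Island A: one car per person, each owner transacting separately.
island_a_total = PEOPLE * ANNUAL_COST_PER_CAR

# Island B: one shared car per ten people, run by the coop at fleet rates.
island_b_cars = PEOPLE // 10
island_b_total = island_b_cars * ANNUAL_COST_PER_CAR * FLEET_DISCOUNT

print(f"Island A per capita: ${island_a_total / PEOPLE:,.0f}")  # $8,000
print(f"Island B per capita: ${island_b_total / PEOPLE:,.0f}")  # $680
```

Of course a shared car racks up roughly ten times the miles, so its real per-car running costs would be higher than an owned car's; the sketch only shows how the overhead and idle-capital argument plays out per person.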
 
In theory, everything is possible, including living on the Moon or on Mars. :) In practice, there are hindrances, such as the limitations of physics, and quite often cost.

It's hard to predict technology advances. We have made a lot of progress in some areas, and yet some technologies that supposedly solved their problems were abandoned. For example, the SST Concorde and the Space Shuttle were fully developed, deployed, and then abandoned.

Again, if the self-driving car arrives on the scene, I will be happy to use it. I am not going to predict when it will happen.
 
.......... A calculator bought for $1 at a dollar store can multiply two 5-digit numbers far faster than you can, but do you think it is smarter than you?.......
Would this be before or after I have the first cup of coffee in the morning?
 
Yeah but people with money won't want to share cars.

After a few rides getting cars with odors, where people have just eaten or whatever, people will get hyperconscious about hygiene issues.
And now where would you store golf clubs?
 
How would one distinguish between a puddle and a pothole filled with water?
Within a 3-week span last month, I blew out both front tires in freeway potholes while watching traffic instead of the road surface. Given how close the cars were and the speed we were going, I doubt I could have avoided them even when I first saw them.
 
Would they do away with the need for designated drivers?
 
What would a bank robber do with a self-driving getaway car? :D
By then the police will have self-driving cars too, so no problem. :LOL:

And there probably won't be brick-n-mortar banks by then either (seriously).
 
As I mentioned earlier, Waymo (Google) has the lead in the technology behind self-driving cars. Yet, Google said it was not ready and more testing was needed, even though it uses a Lidar to supplement the vision camera. A Lidar would not fail to see the obstacles that Tesla's autopilot is blind to.

There was no video of the infamous Tesla that clipped its roof driving under a semitrailer in Florida. But, I just saw this video of an accident in China.


And then, this one.


And then this one.


Now, one can see how a bicyclist would not be safe sharing a road with a Tesla on the so-called "autopilot".

Google said it would not release its system before it was fully checked out, because enthused and impressed drivers would not heed the warnings to keep their hands on the steering wheel and to monitor the car's movement.

Just watch these idiots. Most cars with driver-assisting features will sound a warning and disconnect if they sense that the driver's hand is not on the wheel. Obviously, Tesla allows or tolerates "Look ma, no hands" despite its legalese to the contrary. In the video below, watch how an alert driver saved his life by quickly taking over when the "autopilot" malfunctioned at 1:25.

 
As I mentioned earlier, Waymo (Google) has the lead in the technology behind self-driving cars. Yet, Google said it was not ready and more testing was needed, even though it uses a Lidar to supplement the vision camera. A Lidar would not fail to see the obstacles that Tesla's autopilot is blind to.

There was no video of the infamous Tesla that clipped its roof driving under a semitrailer in Florida. But, I just saw this video of an accident in China.

And then, this one.

And then this one.

Now, one can see how a bicyclist would not be safe sharing a road with a Tesla on the so-called "autopilot".

Google said it would not release its system before it was fully checked out, because enthused and impressed drivers would not heed the warnings to keep their hands on the steering wheel and to monitor the car's movement.

Just watch these idiots. Most cars with driver-assisting features will sound a warning and disconnect if they sense that the driver's hand is not on the wheel. Obviously, Tesla allows or tolerates "Look ma, no hands" despite its legalese to the contrary. In the video below, watch how an alert driver saved his life by quickly taking over when the "autopilot" malfunctioned at 1:25.
Not entirely sure exactly what point(s) you're making?

You found 3 Tesla accidents worldwide over an unknown period. Meanwhile we average 92 fatal auto accidents per day in the US alone. I'd be at least as concerned about that. It's estimated that, when fully developed, self-driving cars will reduce accidents by 90% vs. today's incident rate. I'd take that, even if it turns out to be 50%. All accidents are tragic. Again, I hope we won't let the perfect be the enemy of the good.

This isn't going to happen fast; we'll all have lots of time to adapt. The first self-driving cars will be too expensive for wide adoption, and some drivers will be skeptical and spout hyperbole no matter what the reality is. If drivers are ever forced to adopt, it won't be until the technology is very mature and manual drivers have clearly become a hazard on the roads. Not in most of our lifetimes, I'd guess.

Decades from now, there may be a small group who refuse to get on board with self driving cars, just as there are some people today who absolutely refuse to fly (I know a couple).

It does seem clear that Tesla let its marketing get ahead of the car's autonomous capability, and they've backtracked. An unfortunate setback for self-driving, even if not fully deserved. As you point out, a few drivers have acted irresponsibly too.
 
Not entirely sure exactly what point(s) you're making?

You found 3 Tesla accidents worldwide over an unknown period. Meanwhile we average 92 fatal auto accidents per day in the US alone. I'd be at least as concerned about that. It's estimated that, when fully developed, self-driving cars will reduce accidents by 90% vs. today's incident rate. I'd take that, even if it turns out to be 50%. All accidents are tragic. Again, I hope we won't let the perfect be the enemy of the good...

My point has always been that the technology in self-driving cars is still very immature. Tesla, in racing to "beat" the other carmakers by releasing its elementary lane-keeping features, has done a disservice to the industry and to the public.

These 3 accidents happened to be videotaped. We don't know how many happened and simply went unreported. We also do not know how many close calls there were, where drivers saved themselves and innocent bystanders by reacting quickly. See an instance in the last video at 1:25! Look on YouTube; there are plenty more.

Anyway, Tesla has been claiming hundreds of thousands of miles safely traveled by its "autopilot". We do not know how many close calls it caused that were caught in time by the drivers. Pure spin and BS.

As for self-driving cars eventually reducing accidents and fatalities, well, when it happens it will be good. I would like to see an outside agency like the NHTSA test and certify these cars, instead of listening to boastful and irresponsible makers like Tesla.
 
Speaking of statistics, here's the one that I pay attention to, but most people do not read.

The carmaker’s autonomous vehicles traveled a total of 550 miles on California public roads in October and November 2016 and reported 182 “disengagements,” or episodes when a human driver needs to take control to avoid an accident or respond to technical problems, according to a filing with the California Department of Motor Vehicles. That’s 0.33 disengagements per autonomous mile...

Waymo had a much lower rate of disengagements in 2016, improving to about 0.2 disengagements per thousand miles from 0.8 a year earlier.

It is clearly shown that Tesla test vehicles (not the production ones!) require human intervention once every 3 miles.

Google (Waymo) cars require human driver takeovers once every 5,000 miles.

Google cars are roughly 1,700 times better. But is that good enough? No. If a driver does not catch the computer's screw-up in time, he will have an accident and may not live through it if it happens on a highway. I do not want to run a risk of dying every 5,000 miles.
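For anyone who wants to check that ratio, it falls straight out of the two rates quoted above; nothing beyond the 182/550 and 0.2-per-thousand-miles figures is assumed.

```python
# Quick check of the ratio, using only the figures quoted above.

tesla_disengagements = 182
tesla_miles = 550
tesla_rate = tesla_disengagements / tesla_miles   # ~0.33 per mile, i.e. ~1 every 3 miles

waymo_rate = 0.2 / 1000                           # 0.2 per thousand miles, i.e. 1 every 5,000 miles

print(f"Tesla: one disengagement every {1 / tesla_rate:.1f} miles")
print(f"Waymo: one disengagement every {1 / waymo_rate:,.0f} miles")
print(f"Ratio: ~{tesla_rate / waymo_rate:,.0f}x")  # ~1,655 -- the 'roughly 1,700x' above
```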

The point, again and again, is that I am not holding my breath waiting for a self-driving car. I am not excited about this. When it happens, I will know.
 
My point has always been that the technology in self-driving cars is still very immature. Tesla, in racing to "beat" the other carmakers by releasing its elementary lane-keeping features, has done a disservice to the industry and to the public.

These 3 accidents happened to be videotaped. We don't know how many happened and simply went unreported. We also do not know how many close calls there were, where drivers saved themselves and innocent bystanders by reacting quickly. See an instance in the last video at 1:25! Look on YouTube; there are plenty more.

Anyway, Tesla has been claiming hundreds of thousands of miles safely traveled by its "autopilot". We do not know how many close calls it caused that were caught in time by the drivers. Pure spin and BS.

As for self-driving cars eventually reducing accidents and fatalities, well, when it happens it will be good. I would like to see an outside agency like the NHTSA test and certify these cars, instead of listening to boastful and irresponsible makers like Tesla.
Fair enough. As long as we keep in mind how many lives are lost, how many accidents there are, and how many close calls there are with the status quo. Automobile accidents are a significant cause of death in the US and worldwide. Some people are judging precursors to self driving cars, like Tesla, against an unattainable standard of perfect, instead of comparing to the status quo.

I appreciate Tesla and others for pushing the envelope, but I agree they (and some owners) got tragically ahead of themselves.
 
I'd love to have a self-driving car if it were reliable enough for me to nap in while it's in self-driving mode.
 
By the time they are good and proven, it will be about the time I'd be getting to the "should we take away her keys?" age. I live in South Florida, and I see the terror in the faces of some older drivers who clearly should not be on the road.

So, self-driving cars should extend my freedom to get around even after it's no longer practical for me to do all the driving. Especially once it becomes the new normal and a majority are driverless (that might take more than a generation though, unless converting existing cars becomes a cheap thing).

While today they might seem scary and have a lot of work to do, and it's easy to say "no way, not me, never" - yeah, I said that about reading books on a Kindle 10 years ago.
 
Fair enough. As long as we keep in mind how many lives are lost, how many accidents there are, and how many close calls there are with the status quo. Automobile accidents are a significant cause of death in the US and worldwide. Some people are judging precursors to self driving cars, like Tesla, against an unattainable standard of perfect, instead of comparing to the status quo.

I appreciate Tesla and others for pushing the envelope, but I agree they (and some owners) got tragically ahead of themselves.

You also need to look at the number of vehicles and the number of vehicle miles driven by the Teslas to be able to compare their safety. Raw numbers of accidents on their own aren't enough. It seems that Tesla is pushing the system out too soon. Just one of many discussions of the "AutoPilot" accident rate - Tesla's own numbers show Autopilot has higher crash rate than human drivers
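To make the "per mile, not raw counts" point concrete, here's a tiny sketch with entirely made-up fleets; the numbers are not Tesla's or anyone else's actual figures.

```python
# Why raw accident counts mean little without exposure data. Both fleets and
# all numbers below are hypothetical, invented purely for illustration.

def crashes_per_million_miles(crashes: int, vehicle_miles: int) -> float:
    return crashes / vehicle_miles * 1_000_000

small_fleet = crashes_per_million_miles(crashes=3, vehicle_miles=2_000_000)      # 1.5
large_fleet = crashes_per_million_miles(crashes=300, vehicle_miles=400_000_000)  # 0.75

# The fleet with 100x more crashes has half the crash *rate* once miles are counted.
print(small_fleet, large_fleet)
```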

Another issue with the limited Tesla system is the car's ability to detect and respond to motorcycles. One of the motorcycle boards that I read had a discussion of a Tesla under "AutoPilot" rear-ending a motorcycle in Norway.

Can Autonomous Cars Detect Motorcycles? | Cycle World
 
You also need to look at the number of vehicles and the number of vehicle miles driven by the Teslas to be able to compare their safety. Raw numbers of accidents on their own aren't enough. It seems that Tesla is pushing the system out too soon. Just one of many discussions of the "AutoPilot" accident rate - Tesla's own numbers show Autopilot has higher crash rate than human drivers

What I want to see are the statistics on driver 'assisted' vehicles versus vehicles that depend on the driver to always be alert and right.
 