The death of taxicabs?

What do these things do if they run into an obstruction and don't have a real driver available to take over? Are they smart enough to do a three-point turn and go back? What if they're on a one-way street?
 
I don't think Uber needs autonomous cars to kill off taxis.
As someone who has used Uber, I cannot imagine ever using a taxi in Denver again.
Uber is more reliable.
Uber gives you a fixed price.
Uber already has your credit card, so you don't have to pay the driver directly.
Uber allows you to not tip.
Uber cars are nicer than taxis.
 
No drivers would have saved Uber $8.9M

"Colorado regulators slapped Uber with an $8.9 million penalty for allowing 57 people with past criminal or motor vehicle offenses to drive for the company, the state’s Public Utilities Commission announced Monday.

The PUC said the drivers should have been disqualified. They had issues ranging from felony convictions to driving under the influence and reckless driving. In some cases, drivers were working with revoked, suspended or canceled licenses, the state said. A similar investigation of smaller competitor Lyft found no violations."

Uber fined $8.9 million by Colorado for allowing drivers with felony convictions, other driver's license issues
 
I've seen quite a few of the Uber Volvos on the roads in Tempe, Arizona since I've been here the last couple of months. They're testing self-driving mode, but with engineers in the seats. Apparently their autonomous test fleet is located here.

https://www.azcentral.com/story/mon...cars-arrive-arizona-tempe-debut-scl/98208998/

I saw 3 or 4 of them following each other through downtown Scottsdale not too long ago. Probably same group that you saw.

Taxis will be a thing of the past unless they get on board with 21st century technology.
 
What do these things do if they run into an obstruction and don't have a real driver available to take over? Are they smart enough to do a three-point turn and go back? What if they're on a one-way street?

I read an article by a reporter who rode along in one of the early autonomous cars, which really helped me understand the magnitude of this problem.

The autonomous car was trying to make a turn into a small parking lot. There was a center line in the entrance which separated the "in" and "out" lanes. But a delivery truck was parked, blocking part of the "in" lane.

Any human driver would routinely check to be sure no-one was coming out, then drive in, encroaching into the "out" lane at that one spot. Probably without giving it a second thought.

The autonomous car refused to cross the center line, and just sat there, tying up traffic.

This brings us down an interesting rabbit-hole. We will ultimately have to teach the self-driving cars when to break the law. This will require them to make what amounts to a moral decision. It is clearly better to stray over the white line than to tie up traffic, or worse yet, than to hit a pedestrian or even a pothole.

Similarly, it's better to hit a deer than an oncoming car, but it may be better to drive off the road than to hit a deer.

We (as drivers) make these value decisions all the time.

But what if the choices were even worse? Is it better to hit a crowd of people, or a brick wall? Should the car be programmed to protect the life of its passenger, or sacrifice that life for the "greater good?"
 
Somebody must be good enough at granularizing human behavior to break down our daily "common sense + experience" decisions into code so the car can adopt them. Sure, the hard part is assessing, in any particular case along any particular road, whether the deer or the ditch will cause less harm; but that's a very hard decision for a human, too, given the small amount of time you have to grasp the situation and react to it.

At least the car is unlikely to have been texting, arguing, or bopping to music, when the deer leaps in front of it.

(We could try wiping out deer, too - I'd never miss them, bwah ha >:D.)


This brings us down an interesting rabbit-hole. We will ultimately have to teach the self-driving cars when to break the law. This will require them to make what amounts to a moral decision. It is clearly better to stray over the white line than to tie up traffic, or worse yet, than to hit a pedestrian or even a pothole.

Similarly, it's better to hit a deer than an oncoming car, but it may be better to drive off the road than to hit a deer.
 
Following on from Capt Tom's comments, my understanding is that the driverless cars are like very timid drivers. They'll leave pretty safe gaps, and if someone cuts in they will back off. Can you imagine what that would be like in NYC? Human taxi drivers will be cutting them off all the time, knowing that they can get away with it. The driverless cars will take forever to get around with everyone else cutting them off. It would work well in an all-driverless environment, but not a mixed one. If you're in a hurry to get uptown, are you going to get in the very slow, polite, driverless one, or the one with the aggressive cabbie?
 
It's not just about how driverless cars behave towards other cars. Scientific American had an interesting article a few months ago about the interactions between pedestrians and cars in cities. Right now pedestrians and drivers engage in a constant unspoken negotiation, as peds try to make eye contact with drivers, wait for gaps in traffic or lights to go green, use crosswalks, etc. There's a lot more human-to-human interaction going on than we realize.

When pedestrians know cars will always "see" them and always stop for them, they start feeling free to ignore crosswalks and lights because the technology will keep them safe. But then you have cars stopping and waiting for pedestrians even when the cars have the right of way, so traffic gets even more chaotic and riding in a car is less efficient than it ever was.

I thought it was an interesting perspective on the driverless car that I hadn't considered before.
 
Uber's lack of transparency, sitting on information about a data breach for more than a year, doesn't give me much confidence in a company that wants me to trust it with bleeding-edge technology.
 
Uber and taxis aside, I’m looking forward to autonomous cars, though there will be growing pains.

In this thread, like all others, it’s fun to read posts about how autonomous cars may not handle some uncommon hypothetical situations well - as if human drivers never make mistakes. Human drivers are far from perfect; autonomous cars don’t have to be perfect to be better or safer. Autonomous cars will learn and instantly share what they learn with other cars; I’m not sure humans do. 90% of accidents are attributed to human error; that won’t be the case with autonomous cars once they’re fully developed. For every mistake an autonomous car makes, humans would have made far more avoidable ones.

There were over 37,000 auto fatalities in the USA in 2016, and after years of decline, auto fatalities have been on the rise again since 2010. There are many, many more worldwide. Why defend that status quo?

I’d rather sit in traffic waiting once in a while than wonder which human drivers around me are drunk, high, texting, reading, lost, sleepy, smoking, angry at their girlfriend, or otherwise distracted. There are unsafe human drivers around you every day you drive...
 
I’d rather sit in traffic waiting once in a while than wonder which human drivers around me are drunk, high, texting, reading, lost, sleepy, smoking, angry at their girlfriend, or otherwise distracted. There are unsafe human drivers around you every day you drive...
I agree, but it needs most or all vehicles to be autonomous before it would be effective. I suspect that may take a while. I could see a highly traffic-controlled city like London banning manned cars at some point in the next couple of decades.
 
I agree, but it needs most or all vehicles to be autonomous before it would be effective. I suspect that may take a while. I could see a highly traffic-controlled city like London banning manned cars at some point in the next couple of decades.
Agreed, that’s what I meant by growing pains - once all cars are autonomous, traffic will be safer and flow more smoothly than during the (long) period where unpredictable humans share the road with them. There was an accident a few weeks ago where a human driver backed into an autonomous car on the road. A human can deliberately cause an accident with an autonomous car, just as they can with a human-driven car today.

I’ve read several studies suggesting that autonomous cars may start in small but densely populated urban areas and expand outward. Not immediately, but eventually human drivers may be prohibited first from urban areas and then from broader areas. There may come a time when human drivers are only permitted on rural roads. For those now alarmed - those studies envision this happening over 30-50 years; it will take time for a variety of reasons.
 
I probably found the reference here, but I recently read a fascinating article about high-tech truck convoying. Several trucks using compatible tech line up on an interstate, and the following trucks draft the lead truck at a separation of mere feet. The technology signals the following trucks when and how hard to brake, etc. The drafting saves them a fortune in fuel.
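The "signal the followers when and how hard to brake" idea can be sketched in a few lines of code. This is purely illustrative: the class names, fields, and the 5% braking margin below are all invented, not taken from the article.

```python
# Hypothetical sketch of a platooning brake relay: the lead truck
# broadcasts one command and every follower reacts immediately,
# instead of waiting to see brake lights through a windshield.

from dataclasses import dataclass

@dataclass
class BrakeCommand:
    deceleration_mps2: float  # how hard the leader is braking, in m/s^2
    timestamp_ms: int         # when the leader began braking

class FollowerTruck:
    def __init__(self, truck_id: str, gap_m: float):
        self.truck_id = truck_id
        self.gap_m = gap_m          # following distance in meters
        self.current_decel = 0.0    # this truck's commanded deceleration

    def on_brake_command(self, cmd: BrakeCommand) -> None:
        # Followers brake slightly harder than the leader so the
        # tiny drafting gap never closes while the platoon slows.
        margin = 1.05
        self.current_decel = cmd.deceleration_mps2 * margin

# One broadcast from the lead truck updates the whole platoon.
platoon = [FollowerTruck("T2", gap_m=4.5), FollowerTruck("T3", gap_m=4.5)]
cmd = BrakeCommand(deceleration_mps2=3.0, timestamp_ms=0)
for truck in platoon:
    truck.on_brake_command(cmd)
```

The point of the sketch is the latency win: an electronic broadcast reaches every truck in milliseconds, which is what makes following at "mere feet" survivable at all.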
 
I’d rather sit in traffic waiting once in awhile than having to wonder which human drivers around me are drunk, high, texting, reading, lost, sleepy, smoking, angry at their girlfriend, or otherwise distracted.

....not to mention road rage.
 
So, what would road rage look like with driverless cars? Twitter wars between passengers?
 
Uber and taxis aside, I’m looking forward to autonomous cars, though there will be growing pains.

In this thread, like all others, it’s fun to read posts about how autonomous cars may not handle some uncommon hypothetical situations well - as if human drivers never make mistakes. Human drivers are far from perfect; autonomous cars don’t have to be perfect to be better or safer. Autonomous cars will learn and instantly share what they learn with other cars; I’m not sure humans do. 90% of accidents are attributed to human error; that won’t be the case with autonomous cars once they’re fully developed. For every mistake an autonomous car makes, humans would have made far more avoidable ones.

There were over 37,000 auto fatalities in the USA in 2016, and after years of decline, auto fatalities have been on the rise again since 2010. There are many, many more worldwide. Why defend that status quo?

I’d rather sit in traffic waiting once in a while than wonder which human drivers around me are drunk, high, texting, reading, lost, sleepy, smoking, angry at their girlfriend, or otherwise distracted. There are unsafe human drivers around you every day you drive...

Oh, I think it'll be safer and more efficient once established, and I'm looking forward to that. I'm just pointing out why I think it's not going to be as easy for them to get established as some people are thinking.

Also, human nature will be to point out any deaths or accidents a driverless car gets in as "proof" it isn't safe, ignoring all those driver-caused accidents. I wonder how long it'll take for that to go away.
 