Wow, so glad I don't live anywhere near there! Way too much traffic and too many people. The car navigation is impressive though.
Quote:
I apologize if this Chevrolet Bolt video was already posted. I thought it was pretty impressive and there are dozens of subtle things it dealt with.

Particularly impressive was scooting over the double yellow near the end to get by that stopped delivery van. It thought about it for a while, though. Nothing wrong with caution.
Quote:
But how can we be sure that the current test cars are better? Just hand waving, saying that computers are fast and technology is cool?

Just because you don't know doesn't make it "hand waving." I know they're working on every circumstance that's been mentioned here and many, many more; I don't just assume otherwise.
Quote:
Admittedly, I am less than impressed by the most visible project, namely Tesla, just from seeing the videos of all the mishaps on youtube.

I think your perspective/ruler is wrong for the AutoPilot1/HardWare1 version. As a driver-assistance feature mainly intended for highway driving, it is very impressive. I think there are like 1.3 billion autopilot miles now (and a few billion non-autopilot). Of my 27K miles, I probably have 17K on highways. It performs impressively there. I drove from IL to MT last summer; a couple of days were over 400 miles, and I would never have done that before. It is a much more relaxing (and safe) way to drive!
As we reported before, Tesla is now gathering more data from autonomous miles driven in a day than Google’s program has logged since its inception in 2009, but the two companies are gathering fairly different datasets.
Quote:
Nothing wrong with any of that.

It is true that I don't know about all the things happening, and all the different groups of people doing this.
And being an engineer, I ask myself some questions. Why does it fail in those cases? What is the cause? What will they do to fix it?
Quote:
Self-driving cars have the potential to dramatically reduce accidents and fatalities, and everyone working on their development has that as a primary objective. A handful of missteps doesn't prove otherwise.

Quote:
We don't know that.

I said potential, not know. Do you know the potential doesn't exist?
Quote:
Wasn't there an earlier post with data saying that the self-driving capabilities currently require several human interventions per mile?

There are no Level 5 cars available yet, so human interventions are required; that's a given.
Quote:
How much better does it need to be to be better than the average human?

90% of accidents are attributed to human error (links earlier).

Quote:
How much that can be reduced, we don't know yet. And how hard is it to fill that gap with technology?

TBD. Early data suggested Tesla autosteer reduced accidents by 40% under some conditions (links earlier). And that's just one piece of many autonomous features.

Quote:
I really don't think those are known.

Of course not; it's in development, but there's been a lot of progress so far.
Quote:
If you restate it, and say that driver assistance technology has a lot of potential to dramatically reduce accidents and fatalities, I very much agree.

What else would self-driving/autonomous mean?
-ERD50
A little late to the thread; a lot of this has been well covered, so briefly:
That was me. See post #66. It's an official report.
No, not several interventions per mile. It was one in 3 miles for Tesla, and one in 5,000 miles for Waymo.
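For scale, the gap between those two figures is worth putting in per-mile terms. A quick back-of-the-envelope check, using only the numbers quoted above (the variable names here are just for illustration):

```python
# Disengagement figures quoted above: roughly one human intervention
# every 3 miles (Tesla) vs. one every 5,000 miles (Waymo).
tesla_miles_per_intervention = 3
waymo_miles_per_intervention = 5_000

ratio = waymo_miles_per_intervention / tesla_miles_per_intervention
print(f"Waymo goes ~{ratio:.0f}x farther between interventions")  # ~1667x
```

So "several interventions per mile" overstates it for both, but the two programs are still three orders of magnitude apart on this metric.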
This is the third time I've brought this up.
Quote:
I want to reduce accidents too, and as mentioned several times elsewhere, I do not care to drive and do not love my car like some people do. I do not even mind an ugly car with a Lidar on top.

Nothing wrong with any of that.
But to keep your mistrust of Tesla in perspective, do you also ask yourself why there are 17,249 (that's 12 every minute) auto accidents with 88 fatalities per day in the US? That's the "bogey" - not perfect.
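The "12 every minute" figure checks out against the daily totals. A quick sanity check, using only the numbers quoted above:

```python
# Sanity-check the US crash statistics quoted above:
# 17,249 accidents and 88 fatalities per day.
accidents_per_day = 17_249
fatalities_per_day = 88

minutes_per_day = 24 * 60
print(f"{accidents_per_day / minutes_per_day:.1f} accidents per minute")  # 12.0

# Annualized, from the same daily figures:
print(f"{accidents_per_day * 365:,} accidents per year")    # 6,295,885
print(f"{fatalities_per_day * 365:,} fatalities per year")  # 32,120
```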
Self-driving cars have the potential to dramatically reduce accidents and fatalities, and everyone working on their development has that as a primary objective. A handful of missteps doesn't prove otherwise.
I had a long career in engineering as well. I ask myself the same questions.
https://crashstats.nhtsa.dot.gov/Api/Public/Publication/812376
A Web site said it was indeed two Lidars, made by Velodyne. For redundancy?
I've worked with Lidar datasets, and I'm confident that self-driving cars could use such existing-conditions data to drive safely in a static environment. The software should be able to handle roadway lanes, bridges, etc. IMO, the moving objects and road conditions present the greater challenge to safety. These would not be in the existing-conditions dataset, and the car would have to recognize and deal with them as they arise. Lots of variables there for the software to handle.
Quote:
I've worked with Lidar datasets, and I'm confident that self driving cars could use such existing conditions data to drive safely in a static environment. ...

Lidars are far more reliable than radar. The problem with them is that they do not work well in rain and snow.
From what I remember, ghosting can be problematic in the data. Passing cars, pedestrians - anything moving presents problems, since the software has to decipher what is static and what is moving. The data is scrubbed to remove the ghosting. Perhaps the two units provide additional data to facilitate the scrubbing. And certainly redundancy wouldn't hurt.
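The static-vs-moving separation described above can be sketched very crudely: compare each return in a new scan against a prior existing-conditions map, and flag anything that doesn't match as a candidate moving object. This is a toy illustration only, not any vendor's actual pipeline; the point format, the nearest-neighbor test, and the 0.5 m threshold are all arbitrary choices for the example.

```python
import math

def split_scan(scan, static_map, tol=0.5):
    """Split a scan's (x, y) returns into points that match the prior
    static map and points that don't (candidate moving objects)."""
    static, dynamic = [], []
    for p in scan:
        # A return counts as "static" if it lies near any prior map point.
        if any(math.dist(p, q) <= tol for q in static_map):
            static.append(p)
        else:
            dynamic.append(p)
    return static, dynamic

prior_map = [(0.0, 5.0), (1.0, 5.0), (2.0, 5.0)]  # e.g. a building facade
scan = [(0.1, 5.0), (1.0, 4.9), (8.0, 2.0)]       # last return: something new
static, dynamic = split_scan(scan, prior_map)
print(dynamic)  # [(8.0, 2.0)] -> not in the map, so track it as an object
```

Real systems work in 3D, at much higher point densities, and with far more robust matching, but the basic idea of differencing a live scan against a prior map is the same.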
Passing cars and pedestrians are not ghosts here. They are objects of interest. Waymo identifies them, tracks them, and tries to predict where they are going to be.
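The "track and predict" step can be sketched with the simplest possible motion model. This assumes constant velocity, which is a deliberate oversimplification: production trackers use far more sophisticated filters (Kalman-family and learned models), and the `predict` function and track layout here are hypothetical.

```python
def predict(track, dt):
    """Extrapolate a tracked object's (x, y) position dt seconds ahead,
    assuming it keeps its current velocity (vx, vy)."""
    x, y, vx, vy = track
    return (x + vx * dt, y + vy * dt)

# A pedestrian at (2 m, 0 m) walking in +y at 1.5 m/s:
ped = (2.0, 0.0, 0.0, 1.5)
print(predict(ped, 2.0))  # (2.0, 3.0) -> expected position in 2 seconds
```

Even this crude extrapolation shows why tracking matters: the planner cares where the pedestrian will be when the car arrives, not where they are now.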
Please watch the video whose link I posted earlier. You will understand.