The New York Times recently released a video documentary on Tesla's self-driving effort. I don't know where it was originally broadcast, but I saw an upload of it on YouTube. Being interested in SDC technology, I watched the entire documentary. Basically, it told how Musk overpromised and hyped up the capabilities of Tesla's Autopilot, and later FSD (Full Self Driving), to sell cars.
I saw nothing that I, or anyone else who has been following the news, did not already know, but some Tesla ex-engineers confirmed two things that I have always been suspicious of.
1) Misleading Promo Video:
Back in 2016 or so, Tesla's Web site showed a video in which a Tesla car demonstrated a run with full autonomy. The car left a home garage, drove itself to an office building, and parked itself. Totally hands-off. Very impressive.
Musk said the Model 3 would have all the necessary hardware built in, and that any other car would be obsolete in a few years, like horses. At the same time, or shortly after, Musk said private Tesla owners could sign up for their cars to make autonomous taxi runs and earn lots of money.
The ex-engineers said that they drove several laps in order to get enough good footage to edit into the final cut. The car even hit a fence at one point, and they just patched it up and continued.
That video was so impressive, yet 6 years later, the final product is still not released. However, the video served its purpose: it generated good sales for the Model 3 and kept the company alive.
I just went on Tesla's Web site to look for this video, and could not find it. They either moved it elsewhere or deleted it.
2) Tesla's claimed advantage of having 100K's of cars collecting data in real life:
The ex-engineer said it was all bogus: the production cars had neither the hardware nor the software to collect and report the data needed to understand the shortcomings and fix them. Musk said that with fleet learning, when one car made a mistake, the rest would learn from it. That is totally bogus, as I suspected.
Back then, I pointed out that Waymo and other SDC developers had fewer cars out testing their software, but those test cars were fully instrumented and could capture high-bandwidth video along with live sensor readings for the engineers to analyze later. This has always been how vehicle developers tune their designs: with special test vehicles carrying heavy instrumentation onboard, not regular production vehicles. And that is for engine tuning, suspension tuning, etc., which is a lot simpler than tuning something as complex as a self-driving system.
Now, with the beta release of FSD, Tesla has put in a soft button the driver can press to report an anomalous FSD action. I am curious what data actually gets sent back to Tesla. Perhaps all it is used for is counting how many times a day Tesla owners encounter an unsafe situation or narrowly avert a crash.
Indeed, I watched many YouTube videos posted by Tesla owners who had been given the FSD beta software, and I saw the cars making the same mistakes again and again across several software releases, even though the owners kept pressing the button to report each instance. It was just a "placebo" button.
PS. By the way, one of the YouTubers was a Tesla employee. He posted many videos in which he managed to react and override the self-driving system when it was about to hit a barricade, head down a rail track, and so on. But after he posted a video of his car hitting a bollard because he was not quick enough that one time, Tesla fired him. Free speech, and certainly showing the truth, are not allowed at Tesla.
Part 1:
Part 2: