One thing I'm wondering is what a self-driving car will do if it suddenly can't see what's around it. The EyeSight feature on my Subaru disables itself when I drive into very dense fog or near-whiteout snow. If the car can't see, it can't react. It can't pull over to the shoulder if it can't find the shoulder. It can't just stop, because it might get rear-ended. When I've encountered these situations, I've been able to see well enough to proceed. But maybe my vehicle is just being conservative when it warns me, or maybe a fully self-driving car will have more sophisticated vision and will be able to see better than a human in any condition.
Some companies are betting on cameras alone, while others add Lidars and other sensors.
People like to say self-driving cars are safer than human drivers because computers are indefatigable and fast-acting.
That overlooks the difficulty developers keep running into: the limitations of the sensors that let the computer see in the first place.
All self-driving cars use vision cameras. They are cheap, and they are necessary for the computer to read traffic lights, road signs, and hand gestures from a bicyclist, a traffic cop, or a road maintenance crew. They are also needed to read the colors painted on curbs for parking rules, parking-lane markings, and the like.
Supplementing the vision cameras are other sensors, such as the Lidars always visible on Google car roofs, millimeter-wave radars, and ultrasonic sensors. Each has its own limitations. Lidars, for example, are excellent at mapping the obstacles and vehicles around the car, but they struggle in certain weather, such as fog, rain, or dust storms.
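To make the Lidar's job concrete, here is a toy sketch (plain Python, invented data layout, nothing vendor-specific) of how angle-and-range returns become a 2D picture of nearby obstacles:

```python
import math

def lidar_points(returns):
    """Convert raw returns (angle in radians, range in meters) into
    2D points in the car's frame. A toy model: real sensors add a
    third dimension, intensity, and heavy filtering."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in returns]

def obstacles_within(points, radius_m):
    """Flag points closer than radius_m -- a crude stand-in for the
    occupancy maps real systems build from the full point cloud."""
    return [p for p in points if math.hypot(*p) < radius_m]

# Three hypothetical returns; only the one 2 m dead ahead is flagged.
scan = [(0.0, 2.0), (math.pi / 2, 15.0), (math.pi, 40.0)]
print(obstacles_within(lidar_points(scan), radius_m=5.0))  # [(2.0, 0.0)]
```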
Ultrasonic sensors are dirt cheap and good for close-range detection, which is why they show up in cars with self-parking features. A self-driving car should have them too, because a Lidar on the rooftop cannot see down to obstacles right next to the wheels.
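The measurement principle is textbook time-of-flight: time the echo and halve the round trip. (Illustrative only; the function and numbers are made up, not any sensor's firmware.)

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at ~20 C; varies with temperature

def ultrasonic_distance_m(echo_time_s):
    """Distance from an echo: the pulse travels out and back, so
    halve the round-trip time. Parking sensors work below ~5 m."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2

print(ultrasonic_distance_m(0.01))  # ~1.7 m: typical parking range
```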
Millimeter-wave radars are good at measuring speed relative to other vehicles, but they lack the spatial resolution of a Lidar.
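The speed reading comes straight from the Doppler shift, which is why radar reports relative velocity directly instead of inferring it from successive position fixes. A textbook version of the relation (77 GHz is a common automotive band; the numbers are purely illustrative):

```python
C_M_S = 3.0e8  # speed of light, m/s

def relative_speed_m_s(doppler_shift_hz, carrier_hz=77e9):
    """Radar Doppler relation: the factor of 2 accounts for the
    two-way trip to the target and back."""
    return doppler_shift_hz * C_M_S / (2 * carrier_hz)

# A 10 kHz shift at 77 GHz is roughly 19.5 m/s (about 70 km/h).
print(relative_speed_m_s(10e3))
```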
The Tesla that drove under a semi-trailer, shearing its top off and killing its inattentive driver, relied on a vision camera that failed to distinguish the white trailer against the white, cloudy sky. It also had a millimeter-wave radar, but its output was discounted because the radar could not tell a tall trailer from an overhead highway sign.
So a combination of sensors will be needed, plus very sophisticated software to fuse them all into one picture of the surrounding environment. A fast computer is essential, but right now the limiting factor seems to be the sensors rather than the computing.
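What that fusion software does can be hinted at with the simplest possible version: weight each sensor's estimate by how much you trust it, so a fog-blinded camera contributes little while the radar dominates. (A deliberately minimal sketch with made-up numbers; production systems use Kalman filters and probabilistic occupancy grids, not a one-liner.)

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted average, the simplest flavor of
    sensor fusion. `estimates` is a list of (value, variance) pairs;
    a less trusted sensor (larger variance) pulls less weight."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for w, (v, _) in zip(weights, estimates))
    return fused / sum(weights)

# Distance to the car ahead: the camera is degraded by fog,
# the radar barely notices it.
camera = (22.0, 9.0)   # meters, large variance in fog
radar = (20.0, 0.25)   # radar stays sharp
print(fuse_estimates([camera, radar]))  # ~20.05 m, dominated by radar
```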