People in Virginia who follow developments in the autonomous car industry may be aware of a fatal accident in March involving a pedestrian and a self-driving Uber vehicle. The pedestrian reportedly stepped into a dark stretch of road where there was no crosswalk moments before the car arrived.
Although local police say the car was likely not at fault because a human driver probably could not have stopped in time, an Arizona State University professor of computer science argues that teaching cars to drive like humans is a flawed approach. According to the professor, people hold autonomous vehicles to a higher standard than human drivers: a single fatal accident involving a self-driving car could destroy the industry, yet the cars are taught to drive in a way that leads them to make the same kinds of errors human drivers make.
In this accident, the professor says, the car proceeded on the assumption that the road ahead was clear even though it had no visual confirmation. This is the same assumption a human driver makes on a dark road. In the professor's view, autonomous vehicles should instead travel at a speed that allows them to stop if an obstacle enters their range of vision.
Many motor vehicle accidents happen because of human error, and although autonomous cars are expected to significantly reduce the accident rate, it will be years before they are in widespread use. When a driver causes an accident that injures someone else, that driver may be financially liable even if no charges are filed. The driver's insurance company should compensate injured passengers and other motorists, but the coverage may be insufficient, or the driver may be uninsured. In such cases, it may be necessary to file a civil lawsuit to obtain adequate compensation.