Tesla rejected more advanced driver monitoring features on its cars

Engineers within Tesla wanted to add robust driver monitoring systems to the company's cars to make sure drivers used Autopilot safely, and Tesla even worked with suppliers on possible solutions, according to The Wall Street Journal. But executives – including Elon Musk – reportedly rejected the idea over fears that the options would not work well enough, could be costly, and that drivers would be bothered by an overly intrusive system.

Tesla considered a few different types of monitoring: one that would track a driver's eyes using a camera and infrared sensors, and another that involved adding more sensors to the steering wheel to make sure the driver was holding on. Both approaches would help the car determine whether the driver had stopped paying attention, which could reduce the chance of a crash in situations where Autopilot disengages or is unable to prevent the car from crashing.

Musk later confirmed on Twitter that the eye tracking option was "rejected as ineffective, not cost."

While a name like "Autopilot" might suggest that there are no situations a Tesla can't handle, accidents still happen even when Autopilot is engaged, and three people have died while using the feature. Tesla promises that Autopilot will one day be capable of fully driving the car, but the current system looks more like the limited driver assistance packages offered by GM, Nissan, and others.

Tesla's cars lightly monitor drivers by using a sensor to measure small movements of the steering wheel. If drivers don't have their hands on the wheel, they are repeatedly warned and, eventually, the car pulls over to the side of the road and must be restarted before Autopilot will engage again. That capability, however, was only added months after Autopilot launched in 2015, after a number of drivers posted videos of themselves using the driver assistance feature recklessly. Even now, there is evidence that the steering wheel sensor can be fooled.

In contrast, GM's semi-autonomous system, Super Cruise, watches a driver's face to make sure they are paying attention to the road. It also allows hands-free driving.

More broadly, the National Transportation Safety Board said last September that the entire industry needs to do better at installing safeguards that help ensure these driver assistance features are not misused.

The NTSB's statements came at the conclusion of the safety board's investigation into the June 2016 death of Joshua Brown, the first person to die while using Autopilot in the United States. (A driver who died while using Autopilot in China in January 2016 is believed to be the first person in the world to die while using the driver assistance feature.) At the time, the safety board specifically recommended that Tesla find ways beyond steering wheel sensors to monitor drivers. The NTSB is also investigating the most recent Autopilot death, which occurred in March in California.

Tesla often points out that the number of crashes involving Autopilot is small compared to the scale and frequency of more typical car crashes. And Musk recently committed to regularly publishing data on Autopilot's performance, starting at the end of this financial quarter. But Musk has also said recently that Autopilot crashes tend to occur because drivers' attention can wander, something that better driver monitoring could address.

"When there's a serious accident, almost always, in fact, maybe always, it's an experienced user, and the problem is more one of complacency," Musk said on a quarterly earnings call earlier this month. "They just get too used to it. That tends to be more of a problem. It is not a lack of understanding of what Autopilot can do. It's [drivers] thinking that they know more about Autopilot than they do."