Fatality in Tesla Crash Highlights Risks of Automated Driver Assist Features

July 1, 2016

The internet has been filling with videos of people demonstrating Tesla's Autopilot feature. Many of the practices caught on video are dangerous and discouraged by the company. Nonetheless, when you tell people the car will keep itself in its lane, change lanes for you at times, and park itself, it should come as no surprise that drivers would grow overconfident in the feature. Unfortunately, today we learned of the first fatal accident involving a Tesla sedan that was operating on autopilot.

Reports indicate the 2015 Tesla Model S was traveling down a divided central Florida highway with the autopilot engaged when a white paneled truck pulled out in front of it. Present speculation suggests the autopilot failed to detect the white truck against the bright sunlight behind it. We have no way of knowing what the driver was doing immediately before the crash, although there are reports that the highway patrol recovered a DVD player from the wreckage. Regardless, it appears clear he too failed to notice the truck crossing in front of him. At this point it is unknown how many other collisions may have occurred while autopilot was engaged, but Tesla states this is the first fatality in over 130 million miles of driving on autopilot.

Sadly, the driver involved in this fatal crash had previously posted a video to his YouTube account highlighting a "near miss" with a utility vehicle.

National Highway Traffic Safety Administration (NHTSA) Opens Preliminary Evaluation

The NHTSA is now investigating the event in what Tesla describes as a "preliminary evaluation" to determine whether the autopilot system was working properly, an inquiry that could lead to a recall. In a statement to The Verge, the NHTSA said in part:

NHTSA's Office of Defects Investigation will examine the design and performance of the automated driving systems in use at the time of the crash. During the Preliminary Evaluation, NHTSA will gather additional data regarding this incident and other information regarding the automated driving systems.

The opening of the Preliminary Evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles.

If the administration's evaluation leads to further action, it will not be the first recall of vehicles with automated driver assist features. In November of last year, for example, Toyota was forced to recall 31,000 Lexus and Toyota vehicles because of defects in their automatic braking systems. Ford, also last fall, recalled nearly 40,000 F-150 pickup trucks whose radar systems mistook reflections from passing trucks for obstacles, causing the vehicles to brake with nothing in the way.

This tragic collision serves as a warning to all drivers whose vehicles have advanced warning systems or some version of an autopilot. Regardless of the faith one has in this technology, it remains the driver's responsibility to stay vigilant. Even as it boasts of the technology's reliability, Tesla also warns drivers to pay attention at all times and to keep their hands on the wheel.

In addition to serving as a warning to present owners and drivers of these vehicles, this also foreshadows what may come to our communities as more and more people are able to afford vehicles with these capabilities. For now, this is a luxury few can enjoy. However, as with many prior luxuries, the auto industry will strive to make these desirable features available to the masses. As more vehicles with advanced technology or autopilot capabilities enter our roads and highways, we as a society will need to consider how much responsibility we place on the companies that produce the technology and on the drivers who rely on it.