Tesla Crash Hearing Reignites Debate About Partially Autonomous Cars

The NTSB argues that Tesla needs to do a better job of making sure drivers remain alert when its Autopilot system is running.

A government hearing on a fatal Tesla (TSLA) vehicle crash is reigniting debates about the safety of advanced driver-assistance systems (ADAS) that can take over from human drivers in some, but not all, situations.

At a Tuesday hearing, the National Transportation Safety Board (NTSB) harshly criticized both Tesla and the National Highway Traffic Safety Administration (NHTSA) over a March 2018 crash that killed Apple engineer Walter Huang, who was driving a Tesla Model X crossover. Huang had engaged Tesla’s Autopilot ADAS and is believed to have been playing a mobile game at the time of the crash.

Huang’s Model X accelerated into a highway barrier at a spot where a left-lane exit opened up. The NTSB asserted that Autopilot failed to identify the correct lane of travel, and that the Model X’s collision-avoidance and automatic emergency braking systems failed to do their jobs. It also argued that Tesla, which has Autopilot issue warnings when a driver’s hands aren’t on the steering wheel, needs to do a better job of making sure that drivers are alert and engaged when Autopilot is running.

Separately, the NTSB criticized the NHTSA, which, unlike the NTSB, has the power to set vehicle regulations and order recalls. The NTSB argued that the NHTSA needs to do a better job of guaranteeing that driver-assistance systems both work as advertised and aren’t used in unintended ways. It also took issue with an NHTSA tweet declaring that the agency is “working to keep regulations reasonable so cars, trucks and SUVs — with the latest safety features — are more affordable and families can be safer on the roads.”

Apple also received a bit of criticism, with the NTSB noting the company didn’t have a distracted-driving policy in place.

In 2018, Tesla defended Autopilot in the wake of the crash, stating that the system gave Huang multiple warnings that his hands were off his Model X’s steering wheel.

Elon Musk’s company has also frequently defended Autopilot’s broader safety record. In its Q3 shareholder letter, Tesla claimed that it “registered one accident for every 4.34 million miles driven in which drivers had Autopilot engaged,” compared with a national average (per NHTSA data) of one accident for every 500,000 miles.

Nonetheless, the March 2018 crash, along with other crashes that have occurred with Autopilot engaged, has sparked plenty of arguments about whether cars supporting partial autonomy can lull drivers into a false sense of security.

Alphabet’s (GOOGL) Waymo self-driving unit has long been critical of cars promising partial autonomy for this reason. Waymo’s driverless taxi service, which for now only operates on select roads in the Phoenix metro area, features cars that are meant to fully drive themselves. A “safety driver” is typically present behind the wheel, but that might change soon.

Tesla, for its part, is pushing ahead with the development of a full self-driving (FSD) mode for cars equipped with its latest Autopilot hardware (time will tell just how comprehensive its self-driving abilities prove to be). And various other automakers are working on Level 2+ systems that can automate more driving activity than their current ADAS-equipped vehicles can.

All of this guarantees that the debate about partial autonomy won’t be ending anytime soon.
