Tesla Fatality Reflects Need for Stronger Regulatory Oversight of Driverless Tech

The National Highway Traffic Safety Administration has opened a preliminary investigation into the accident, a collision between a Model S sedan and a semi-trailer in Williston, Fla.
By Doron Levin

The inevitable has happened: A motorist relying on autonomous driving technology was killed in an accident apparently caused by some combination of the system's shortcomings and driver inattention.

No doubt at least some owners of Tesla (TSLA) vehicles equipped with Autopilot have been driving more carefully since last Thursday, when the automaker disclosed details of the May 7 accident on its website. Others may be pushing the boundaries of Autopilot's capabilities, like the driver who was caught on camera sleeping behind the wheel.

Tesla disclosed that the National Highway Traffic Safety Administration had, the day before the company's June 30 announcement, opened a preliminary investigation into the accident, a collision between a Model S sedan and a semi-trailer in Williston, Fla. In its disclosure, Tesla explained:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

Tesla underscored the many warnings and cautions that come with Autopilot, especially that the driver must always pay attention to the road and any obstacles. But the accident wasn't the first in which the system may have been complicit, perhaps because owners have lulled themselves into thinking of their cars as self-driving. In May, a video of a Tesla vehicle, possibly on a Swiss highway, colliding with a panel van was posted online.

Since the first reports of autonomous systems testing, automakers and research scientists have warned that the circumstances a human driver can see and understand are numerous and often complex, and remain beyond the capability of current autonomous systems to emulate. One can only wonder what other "extremely rare circumstances," beyond a semi-trailer making a left turn on a divided highway, can't be detected by the software and vision systems that are the basis of Tesla's Autopilot.

In May, Andrew Ng, who formerly led Google's deep-learning project at Alphabet and is now chief scientist at Baidu (BIDU), said it was "plainly irresponsible" for Tesla to ship Autopilot before it was ready for safe driverless operation. Ng works from Baidu's Silicon Valley laboratories; the company has said it is developing its own self-driving technology.

Tesla's Autopilot is similar to systems from many rival manufacturers: it combines adaptive cruise control with lane-keeping assistance. Such systems are impressive, yet far from the driverless "set it and forget it" technology that may arrive in a few years. BMW last week forecast that it will introduce a fully driverless vehicle in 2021. Microsoft co-founder Bill Gates has declared that such a vehicle won't be ready for another 15 years.

Whether the driver was responsible for his own death, or whether Tesla had a hand in it, will be sorted out over the coming months or years. In the meantime, NHTSA, or some other responsible agency, should step forward and take a role in deciding how much driverless technology can be tested by ordinary drivers rather than by company researchers.

Doron Levin is the host of "In the Driver Seat," broadcast on SiriusXM Insight 121, Saturdays at noon, with an encore Sundays at 9 a.m.

This article is commentary by an independent contributor. At the time of publication, the author held no positions in the stocks mentioned.