
The National Transportation Safety Board (NTSB) has issued a preliminary report regarding a Tesla (TSLA) Autopilot fatality that took place in Florida on March 1, 2019. This report raises multiple troubling questions for Tesla, including questions about the different kinds of liability Tesla may face. 

The report is only two pages long, so I encourage every reader to go to the original source, linked above. In summary, the Tesla was traveling on Autopilot on a road with a 55 MPH speed limit -- but going 68 MPH. A large truck was crossing the road, and for whatever reason the Tesla did not stop.

As a result, the Tesla hit the truck, which sliced the roof off the car, and the person in the Tesla died. The remains of the car were found 1,600 feet away, where it had come to a stop. 

Autopilot or not, why didn't the car brake or evade the obviously huge obstacle? I imagine that this will be the source of further investigation. (Full disclosure: I'm short TSLA stock.)

You would think that if any driver assistance system, such as auto-braking or Tesla's Autopilot, had any value at all, it would at least be able to avoid driving straight into a giant truck. If not, we are really nowhere on the path to a driverless car of any sort, such as a robotaxi. A small child knows to avoid driving into a large truck at high speed.

We are not talking about any advanced James Bond style maneuver here. We are talking about the vehicle seeing a giant truck ahead, and at least braking.

Anyone in the mood for having your car drive you straight into a giant truck at 68 MPH? Me, neither.

This incident took place on March 1. On May 2 -- only two weeks ago -- Tesla went to the market to raise over $2 billion in new financing -- mostly in convertible debt, but also a little equity. At that time, Tesla hosted a call with investors where CEO Elon Musk said that driverless robotaxis are the dominant part of its future value, and that we would see a million of these Tesla driverless robotaxis by next year.

Most experts in the industry believe that a truly driverless car, which can go anywhere a human can drive, at those kinds of full legal speeds, is very far out into the future -- not anywhere near next year. Given this, it would have been good for an investor to know the basic facts surrounding this fatal Tesla Autopilot accident.

But did Tesla mention this during that investor call? Did Tesla say that this car was operating in Autopilot mode at the time of the crash, and that the car evidently failed to even brake for the truck? No, it did not. It kept its real and would-be investors in the dark, as it passed the hat for over $2 billion in new funding.



This may raise questions of liability. Should real and would-be (in the May 2 deal) Tesla shareholders have been told what Tesla already knew at that time about this Autopilot accident? That's a rhetorical question.

The other kind of liability here -- which would of course also hurt shareholders in the end -- is if some action following this event would cause Tesla to have to turn off its Autopilot system. It could certainly do so via over-the-air (OTA) update, for all of the hundreds of thousands of such cars already in the field. 

If that's what Tesla had to do, then it would presumably have to refund the thousands of dollars per car that it charged for this functionality. Using the roundest of math here, let's say it would have to refund $2,000 per car (very conservative), multiplied by 500,000 cars. That's $1 billion. Having to refund that kind of cash to customers would bring Tesla's balance sheet to its knees.
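The back-of-the-envelope refund math above can be sketched in a few lines. Note that the per-car refund and the fleet size are the article's rough assumptions, not figures reported by Tesla:

```python
# Rough refund-liability estimate using the article's assumed figures.
refund_per_car = 2_000   # assumed conservative refund per Autopilot purchase, USD
cars_in_field = 500_000  # assumed number of Autopilot-equipped cars on the road

total_refund = refund_per_car * cars_in_field
print(f"Estimated refund liability: ${total_refund:,}")
# prints: Estimated refund liability: $1,000,000,000
```

Even halving either assumption still leaves a liability in the hundreds of millions of dollars.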

One may also ask whether it is Tesla's amazing ability to slam into things that is causing insurance rates for Teslas to skyrocket. GEICO, for example, is raising prices on Tesla insurance by 25%.

Even if people are not hurt, Teslas seem to have a magical propensity to go from parked straight into a shop window. One such incident occurred just a week ago.

Of course, a driver ought not to allow the car to go haywire. The driver should keep his hands on the Tesla steering wheel at all times and detect the things in front of (and around) the car that the Tesla evidently misses. However, Tesla's CEO is not setting a good example in this regard. Consider his appearance on 60 Minutes half a year ago, where he was ever so proud to be looking away while not having his hands on the steering wheel.

If you find that messaging from CEO Musk to be, ahem, unwise -- then just go to Tesla's own Autopilot page. Yes, the very landing page for this functionality. What is the first thing you see on that page? It's a video of a Tesla "driving itself" in which the person behind the wheel isn't touching it. The text on the screen at the beginning of the video reads: "He is not doing anything. The car is driving itself."

Okay then! I don't think the customer can be faulted for not knowing how to behave differently: The CEO goes on 60 Minutes showing that it's great to take your hands off the wheel and look away. Then Tesla's own web site features a long video explicitly showing that you can leave your hands off the wheel for -- not seconds -- but minutes!

With all of that, investors also can't be faulted for starting to see the risk of multiple liability events that could severely imperil Tesla's driverless robotaxi plans, as well as, yet again, its balance sheet. Tesla should at least have come clean on this before it raised those billions of dollars two weeks ago.


At the time of submitting this article for publication, the author was short TSLA. However, positions can change at any time. The author regularly attends press conferences, new vehicle launches and equivalent, hosted by most major automakers.