That's not going to change on Thursday.
Just last month in Mountain View, Calif., a Tesla Model X slammed into a highway barrier, killing the driver. The company has said the driver was using its Autopilot feature and failed to respond to several alerts to take hold of the wheel.
Not long after that fatal accident, another Tesla using Autopilot appeared to steer toward the same barrier and might have hit it if not for the driver's intervention. Luckily, no accident resulted. But it clearly demonstrates what some are calling the "fatal flaws" of Tesla's system.
Here's a look for yourself:
To Tesla's credit, the company has tried to be upfront about the issue -- both about what happened and about what it's doing to make the system safer. For instance, its latest hardware and software vastly improve the vehicle's view of its surroundings.
But here's the problem with Tesla: It's a catch-all.
Here's a statement from its website regarding Autopilot:
"Enhanced Autopilot adds these new capabilities to the Tesla Autopilot driving experience. Your Tesla will match speed to traffic conditions, keep within a lane, automatically change lanes without requiring driver input, transition from one freeway to another, exit the freeway when your destination is near, self-park when near a parking spot and be summoned to and from your garage.
Tesla's Enhanced Autopilot software has begun rolling out and features will continue to be introduced as validation is completed, subject to regulatory approval. Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time."
Basically, it says Autopilot can do this, this, this, a little of that, a touch of this, and a little more of that. BUT it's up to you to make sure it doesn't crash.
That's not cool.
Driver-assist features like lane-departure warnings (which beep when you drift) or automatic emergency braking are one thing. But when a car can park itself, drive itself, change lanes, and enter and exit a freeway on its own, among a host of other things, it's self-driving at that point.
The problem is that it's self-driving, but not full-time. It's not capable -- at least not with near-100% dependability -- of going from Point A to Point B while navigating every obstacle it faces. That's clearly demonstrated in the video above.
Alphabet Inc. (GOOGL) has a self-driving car unit, Waymo, led by CEO John Krafcik. He recently said it's the driver's responsibility to prevent these types of accidents. That said, he also argued that a fully-autonomous-or-nothing approach is the best way to go, because keeping drivers in the loop makes the system less safe.
It's hard to plunk people down in a driver's seat where -- say, on the freeway -- they don't need to do anything, yet expect them to keep their hands on the wheel and their eyes on the road. It's just not realistic.
They're going to text, take phone calls and send emails. That's just human nature. If you're not physically steering, how can you be as focused as you would be if you were actually driving?
Tesla's advances with Autopilot have been impressive, no doubt. And just as with Uber's fatal crash last month, there are bound to be fatalities as autonomous driving develops. The solution for Tesla is not readily apparent, but it either needs to eliminate these predictable risks or force drivers to stay attentive (via alerts or alarms, perhaps) until the automaker figures it out.
Until then, Tesla's stock will always be one accident away from going up in smoke (again).