Autonomous driving -- and the car accidents that come with it -- is becoming bigger and bigger news.
Uber and Tesla Inc. (TSLA) have both made headlines after an Uber self-driving car struck and killed a pedestrian in March and a driver using Tesla's driver-assist feature, Autopilot, was killed when his car hit a highway barrier.
We can add Alphabet's (GOOGL) (GOOG) self-driving segment, Waymo, to the accident list after this weekend, too. Like Uber's, the accident took place in Arizona. Unlike in Uber's case, however, the Waymo vehicle was being manually operated at the time, and, more importantly, the crash did not result in a fatality.
Here's the story: A sedan drove through an intersection as its light changed from yellow to red. When another vehicle entered the intersection (on a green light), the sedan swerved to avoid a collision and ended up hitting the Waymo vehicle.
Minor injuries were reported, and that's seemingly the end of that -- at least from the standpoint of "what does this mean for Waymo?"
Shares of Alphabet, a holding in Jim Cramer's Action Alerts PLUS Charitable Trust Portfolio, were up about 1% Monday, in line with the Nasdaq's bounce. The stock was down 3.40% to $1,056.06 Tuesday morning.
But the accident touches on larger questions around autonomous driving, namely why its accidents get so much attention. When I hopped online this morning, the Waymo news was one of the first stories I saw. But there were no fatalities or serious injuries and the vehicle wasn't even driving itself. So why does it get any attention?
Tesla CEO Elon Musk recently touched on this topic, criticizing the coverage autonomous driving gets vs. a regular car accident -- one that likely wouldn't even have been reported on. On the company's most recent earnings call, he said:
"If it's an autonomous situation, it's headline news, and the media fails to mention that -- actually they shouldn't really be writing the story, they should be writing the story about how autonomous cars are really safe, but that's not the story that people want to click on...
And, yeah, it's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe. Because people might actually turn it off, and then die. So anyway, I'm really upset by this."
He's not wrong. Danny Shapiro, the senior director of automotive at Action Alerts PLUS holding Nvidia Corporation (NVDA), asked me, rhetorically: How much safer than humans do autonomous cars need to be for the public to accept them?
It wasn't a rhetorical question because it was dumb, but because it was obvious that neither I nor anyone else has the answer. Is it 10 times safer? How about 100 times safer? There is no clear transition point, and until we're farther along in the development of autonomous cars, that change in perception may be a difficult hurdle to clear.
The media writes these stories on autonomous car accidents because that's what sells; that's what people want to read. They don't want to read that Waymo's disengagement rate -- how often a human safety driver must take control of the car -- has improved to roughly one disengagement per 5,600 miles. Think about that: once per 5,600 miles. That's roughly the distance of driving from LA to NYC and back again.
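As a rough sanity check of that comparison, the arithmetic can be sketched as follows. The ~2,800-mile one-way LA-to-NYC driving distance is an assumption for illustration, not a figure from the article:

```python
# Back-of-envelope check: does ~5,600 miles between disengagements
# really compare to an LA-NYC round trip?
ONE_WAY_LA_TO_NYC = 2_800          # miles, assumed approximate driving distance
MILES_PER_DISENGAGEMENT = 5_600    # roughly one human takeover per 5,600 miles

round_trip = 2 * ONE_WAY_LA_TO_NYC
print(f"LA-NYC round trip:            {round_trip:,} miles")
print(f"Miles between disengagements: {MILES_PER_DISENGAGEMENT:,} miles")
print(f"Round-trip comparison holds:  {MILES_PER_DISENGAGEMENT >= round_trip}")
```

Under that assumed distance, the round trip works out to about 5,600 miles, so the comparison in the text is in the right ballpark.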
They don't want to consider that more than 1 million people die annually in car accidents, or that more than 90% of auto accidents are due to human error. They don't realize that companies like Nvidia and Waymo are now capable of building neural networks that can put autonomous driving platforms through billions of synthetic testing miles, teaching self-driving car systems at exponential rates.
The truth is simple: Too many consumers still have too much doubt about the technology behind autonomous driving, how fast it's evolving and what it means for safety rates. The risk is that the public will judge autonomous driving negatively if it still results in, say, 100,000 deaths per year, rather than positively because that figure would represent a roughly 90% decline from current levels.