At the 2018 Detroit Auto Show, we were lucky enough to sit down with Danny Shapiro, senior director of automotive at Nvidia (NVDA).

Shapiro was more than happy to fill us in on the inner workings of the self-driving car and the pivotal role that AI is playing in the revolution. The conversation went beyond self-driving capabilities, though, and included a lot of in-cabin features: you will actually be able to have a conversation with the car through its voice-recognition and life-preservation programming.

When people think of AI in the car, they tend to imagine level 4 and level 5 driving capabilities. At these levels -- 5 being the highest -- the vehicle is capable of handling most of the driving scenarios it encounters; at level 5, the drive is fully autonomous.

But the AI advances inside the car shouldn't go unnoticed. Just last week at CES in Las Vegas, Mercedes-Benz unveiled its new MBUX (Mercedes-Benz User Experience) infotainment system. The AI-powered system will be available in the A-Class starting next month.

Dietmar Exler, the CEO of Mercedes-Benz USA, told us that this vehicle appeals to many of the automaker's younger customers. In that sense, it will be exciting to hear their feedback on the new system, he said.

But it won't stop there. It wouldn't be surprising to see Mercedes introduce MBUX to further models, and the technology will spread beyond the automaker as well. Nvidia has already announced a new partnership with Volkswagen (VLKAY) for Drive IX, Nvidia's intelligent experience toolkit. In a nutshell, Nvidia's hardware gives its customers the tools necessary to build their own AI-based applications.

Danny Shapiro, senior director of automotive at Nvidia.

"Software is going to define so much of the user experience, the driving experience and it will continue to evolve and get better and better," Shapiro said. "New applications and new features [will be] added to your car even after it's in your driveway."

In that sense, it will function much like Tesla's (TSLA) over-the-air updates.

Mercedes and Nvidia began working together a few years ago. Rather than looking for a cheap infotainment solution, Mercedes asked: What's the most powerful computer we can put in a car? The answer became the most powerful computer ever put in a production vehicle.

The new MBUX system has plenty of extra computing power to handle future tasks, an important component of future updates. Shapiro explained that for systems like MBUX to gain capabilities down the road via over-the-air updates, they first need enough "processing horsepower," a fitting term for an AI vehicle.

When we speak to Apple's (AAPL) Siri or Amazon's (AMZN) Alexa, these AI-powered voice-recognition systems need tons of computing horsepower. That's why the processing is done in the cloud, Shapiro explained.


As an example, he pointed out that if you ask Siri what time it is, it won't know the answer unless the phone has service. Even though the phone knows what time it is, Siri won't, because it can't connect to the cloud.

For a vehicle AI system, however, much of that processing power is kept within the vehicle. Some features -- like asking about the weather this weekend or for a good restaurant -- will still be cloud-based, but much of the work happens in the vehicle itself.

Saying "it's hot in here," should prompt the system to respond by asking the driver if they'd like the window rolled down or if they want the air conditioning turned on. Even a simple command like "roll my window down" or "what's the temperature outside right now" is done via local processing and not through the cloud.

But a command like "roll my window down" brings up another point. Shapiro pointed out that the system is built to recognize not only what is said, but who is saying it. It doesn't do the driver much good to ask for their window to be rolled down and have the car roll down the passenger's window instead.
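A hedged sketch of the idea Shapiro describes: the system pairs the recognized command with the seat zone it came from, so "my window" means the speaker's own window. The zone names and actuator labels below are hypothetical placeholders; how the zone is detected (for instance, from a microphone array) is assumed to happen upstream.

```python
# Illustrative only: map the seat zone that issued a voice command to the
# matching window. Zone detection is assumed to have happened upstream.

WINDOW_BY_ZONE = {
    "driver": "front_left_window",
    "front_passenger": "front_right_window",
    "rear_left": "rear_left_window",
    "rear_right": "rear_right_window",
}


def handle_window_request(command: str, speaker_zone: str) -> str:
    """Resolve 'my window' to the window belonging to the speaker's seat."""
    window = WINDOW_BY_ZONE.get(speaker_zone)
    if window is None:
        return "ignored: unknown seat zone"
    if "down" in command:
        return f"lowering {window}"
    if "up" in command:
        return f"raising {window}"
    return "no window action recognized"


print(handle_window_request("roll my window down", "front_passenger"))
# -> lowering front_right_window, not the driver's window
```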

Voice recognition has come a long way very quickly thanks to improved natural language understanding (NLU), Shapiro explained.

Essentially, NLU lets users move away from robotic, canned commands toward having more of a conversation with the system, he added.

Beyond entertainment, it has a safety element, too. The system will be able to recognize if the driver is about to open their door while a bicyclist approaches from behind. In another example, while waiting to turn left at an intersection, the system will know if the driver is looking to see whether traffic is clear on the right while missing a vehicle approaching from the opposite direction as the car starts to pull out.

In this sense, AI can use multiple inputs from inside and outside the car to prevent an accident.
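As a rough illustration of that fusion of interior and exterior inputs, the sketch below combines hypothetical driver-monitoring signals (gaze direction, a hand on the door handle) with exterior sensor observations (an approaching cyclist or oncoming car) to decide whether to warn. All field names are invented for the example; a real system would fuse far richer sensor data.

```python
# Illustrative sketch: combine in-cabin signals with exterior sensor
# observations to decide whether the car should warn its occupants.

from dataclasses import dataclass


@dataclass
class CabinState:
    door_handle_touched: bool   # occupant about to open the door
    driver_gaze: str            # "left", "right", "ahead", etc.
    pulling_out: bool           # car starting to move into the intersection


@dataclass
class ExteriorState:
    cyclist_approaching_rear: bool
    oncoming_vehicle_left: bool


def safety_warnings(cabin: CabinState, outside: ExteriorState) -> list:
    warnings = []
    # Door-opening scenario: occupant reaches for the handle while a
    # cyclist closes in from behind.
    if cabin.door_handle_touched and outside.cyclist_approaching_rear:
        warnings.append("Hold the door: cyclist approaching from behind")
    # Left-turn scenario: driver is checking the right side while an
    # oncoming vehicle approaches from the opposite direction.
    if cabin.pulling_out and cabin.driver_gaze == "right" and outside.oncoming_vehicle_left:
        warnings.append("Stop: oncoming vehicle from the left")
    return warnings


print(safety_warnings(
    CabinState(door_handle_touched=True, driver_gaze="ahead", pulling_out=False),
    ExteriorState(cyclist_approaching_rear=True, oncoming_vehicle_left=False),
))
```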

"AI is really at the heart of everything that's happening in the transportation space," Shapiro said and it's clear the industry is moving quickly on this new technology -- both inside and outside the vehicle.

This article is commentary by an independent contributor. At the time of publication, the author had no positions in the stocks mentioned.
