If you're an automaker, self-driving startup, university or anyone else working on autonomous driving development, Nvidia (NVDA) is quickly becoming the one-stop shop to make it all a reality.

The company's latest addition to the lineup? Its DGX SuperPOD.

The DGX SuperPOD consists of 96 Nvidia DGX-2H systems, making it the world's 22nd-fastest supercomputer. Some may wonder why we're not talking about a top-five or top-ten supercomputer announcement, but that would be missing the point.

For starters, 22 of the world's top 25 most powerful supercomputers are powered by Nvidia's GPUs. They can be used for crunching massive amounts of data and running long-range simulations, such as climate modeling. In the case of the SuperPOD, its purpose and capability are aimed at self-driving vehicles.

Nvidia already makes DRIVE Pegasus, the onboard computer that enables fully self-driving features. Then there's DRIVE Constellation, which enables wide-ranging simulation for testing autonomous driving applications. Both are vital to scaling autonomous driving platforms, but the company is plugging yet another hole for companies and researchers.

Its DGX SuperPOD supercomputer is powered by 1,536 Nvidia V100 GPUs and interconnected with Nvidia NVSwitch and Mellanox network fabrics, allowing it to deliver 9.4 petaflops of processing capability. This makes it possible for automakers, startups and researchers to train their neural networks for self-driving applications. Remember, in March, Nvidia announced the acquisition of Mellanox for $6.9 billion. 
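For a rough sense of what those headline numbers imply, here's a minimal Python sketch that divides the aggregate throughput across the GPU count. Treat it as a sanity check under the assumption that the 9.4-petaflop figure is a sustained, system-wide number rather than a theoretical peak:

```python
# Rough per-GPU throughput implied by Nvidia's published SuperPOD figures.
TOTAL_PETAFLOPS = 9.4   # aggregate processing capability cited for the system
NUM_GPUS = 1536         # Nvidia V100 GPUs in the DGX SuperPOD

per_gpu_teraflops = TOTAL_PETAFLOPS * 1000 / NUM_GPUS
print(f"~{per_gpu_teraflops:.1f} TFLOPS per V100")  # ~6.1 TFLOPS each

# A single V100 tops out around 7.8 TFLOPS in double precision, so sustaining
# roughly 6 TFLOPS per GPU across 1,536 of them suggests the NVSwitch and
# Mellanox fabrics are doing their job: the GPUs spend most of their time
# computing rather than waiting on data from one another.
```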

Nvidia's DGX SuperPOD

A self-driving car can generate a terabyte of data per hour. Multiply that across an eight-hour drive, five days a week, for an entire year, and suddenly we're looking at thousands of terabytes of data. Now consider a dozen test cars in a fleet. Then 100. Then consider testing those vehicles for years, and it doesn't take a data scientist to realize that traditional means of data storage and A.I. training make this a difficult, if not impossible, task.
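To put rough numbers on that, here's a back-of-the-envelope tally using the article's one-terabyte-per-hour figure; the fleet sizes and the assumption of 52 driving weeks per year are illustrative, not Nvidia's own estimates:

```python
# Illustrative tally of test-fleet data volumes.
TB_PER_HOUR = 1        # ~1 TB of sensor data per car per hour (per the article)
HOURS_PER_DAY = 8      # eight-hour drives
DAYS_PER_WEEK = 5
WEEKS_PER_YEAR = 52    # assumption: the fleet drives every week of the year

per_car_per_year_tb = TB_PER_HOUR * HOURS_PER_DAY * DAYS_PER_WEEK * WEEKS_PER_YEAR
print(f"One car, one year: ~{per_car_per_year_tb:,} TB")  # ~2,080 TB

for fleet_size in (12, 100):  # hypothetical fleet sizes
    fleet_pb = per_car_per_year_tb * fleet_size / 1000
    print(f"Fleet of {fleet_size}: ~{fleet_pb:,.0f} PB per year")
```

A dozen cars already land near 25 petabytes a year, and 100 cars push past 200 petabytes, which is the scale that makes conventional storage and training pipelines buckle.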

The DGX SuperPOD is what Nvidia's own team uses in-house, and given its immense computational power, it's easy to see why. From the company's post:

"AI leadership demands leadership in compute infrastructure," said Clement Farabet, vice president of AI infrastructure at NVIDIA. "Few AI challenges are as demanding as training autonomous vehicles, which requires retraining neural networks tens of thousands of times to meet extreme accuracy needs. There's no substitute for massive processing capability like that of the DGX SuperPOD."

From a power and performance perspective, the SuperPOD vastly outperforms its CPU-based peers. Further, its relatively compact size makes it an attractive product as well. According to the company, the SuperPOD takes about three weeks to set up, compared to the six to nine months generally needed for systems of similar scale.

What does all of this mean? While its automotive segment certainly doesn't represent the bulk of Nvidia's revenue, it adds one more product to the company's industry-leading portfolio. By creating the tools and products that will power and train self-driving vehicles today, Nvidia is putting itself in position to be tomorrow's big winner from this industry-changing shift in technology.

The company already has hundreds of partners in the automotive world, but recently announced a notable partnership with Toyota (TM) and has deepened its relationship with Daimler (DDAIF), the parent company of Mercedes-Benz. New products like the DGX SuperPOD will only improve these partnerships down the road.

This article is commentary by an independent contributor. At the time of publication, the author had no positions in the stocks mentioned.