Is Tesla Building the World’s Best Supercomputer?

Rob Maurer

In addition to the custom on-board computer chip that powers Autopilot, the driving-assist features in its vehicles, Tesla is working on an incredibly powerful server-side supercomputer. The machine is intended to help automate training of the neural networks Tesla hopes will one day turn those vehicles into fully autonomous robotaxis.

Tesla has been working on the project for at least a couple of years, and refers to it as "Project Dojo". Elon Musk recently shared more details on the project in a series of tweets.

"Tesla is developing a [neural network] training computer called Dojo to process truly vast amounts of video data. It’s a beast! Please consider joining our AI or computer/chip teams if this sounds interesting," wrote Musk.

Musk described the performance of Dojo as a "truly useful exaflop at de facto FP32."

FP32 is a floating-point number format used by computers for calculations. Because real numbers can have infinitely many digits, computers round them to a limited number of significant digits. Each digit consumes memory, measured in bits, so more precise numbers require more bits. FP32 stores each number in 32 bits, allowing more precise calculations than FP16, for example, which uses just 16. The extra precision comes at the expense of speed, since the additional bits take time to process, but it matters when working with extremely large or small numbers, where small variations can change the outcome.
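The precision difference is easy to demonstrate. Here is a small sketch using Python's standard `struct` module, which can round a value through IEEE-754 half precision (FP16, format code `'e'`) and single precision (FP32, format code `'f'`); the value 1.0001 is just an illustrative choice:

```python
import struct

def round_to(fmt: str, x: float) -> float:
    """Round x to the given IEEE-754 format: 'e' = FP16, 'f' = FP32."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

x = 1.0001                  # a small but meaningful variation from 1.0

print(round_to('f', x))     # FP32: ~1.0001, the variation survives
print(round_to('e', x))     # FP16: 1.0, the variation is rounded away
```

Near 1.0, FP16 can only represent values about 0.001 apart, so the 0.0001 variation disappears entirely, while FP32 preserves it to roughly seven significant digits.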

An "exaflop" describes how many floating-point operations per second (FLOPS) a computer can perform. "Exa" is a prefix, like "kilo" or "giga", representing one quintillion: 10 to the 18th power, or one billion billions. A 1-exaflops FP32 computer can therefore perform one quintillion 32-bit floating-point operations per second.

Currently the most powerful supercomputer in the world operates at 415 petaflops (0.415 exaflops), but at a higher precision than FP32. That computer is reportedly capable of exceeding 1 exaflops for calculations at FP32 or lower precision.
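The arithmetic behind these comparisons is straightforward. A quick back-of-envelope sketch, using the figures from the article (and ignoring that the two machines are measured at different precisions):

```python
# Metric prefixes as powers of ten.
PETA = 10**15
EXA = 10**18

dojo_target = 1 * EXA        # Musk's stated goal: 1 exaflops at FP32
current_top = 415 * PETA     # today's fastest machine: 415 petaflops

# 415 petaflops expressed in exaflops.
print(current_top / EXA)     # 0.415

# Rough ratio of Dojo's target to today's leader.
print(dojo_target / current_top)
```

The ratio works out to roughly 2.4x, which is why Dojo's target would rank near, but probably not at, the top of the supercomputer list.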

Tesla's aspirations for Dojo would place the computer among the most powerful in the world once complete, though likely not quite at the top. Musk hopes version 1.0 of Dojo will be complete in about a year. Dojo will likely be highly specialized for Tesla's workloads, and it will be interesting to see how Tesla takes advantage of such massive computing power in the future.

For more Tesla news and analysis, please see the included video.


Disclosure: Rob Maurer is long TSLA stock and derivatives.